downloading study hangs
I am unable to download from the patient list, although I can download from the series list. It appears the hierarchy (Patient/Study/Series/Image) is flawed. Any thoughts on why the Download link on the home page of PacsOne 1.013 hangs?
In that case, you may want to increase the default PHP resource limits in the PHP.INI file as follows:
;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
max_execution_time = 30 ; bump it up to 120 (2 minutes)
memory_limit = 8M ; bump it up to 32 MB
Be sure to restart Apache after modifying the PHP.INI file.
You can try the following:
1. Access http://{HOSTNAME}/pacsone/hello.php and send the output HTML file to mailto:pacsone@pacsone.net to check your PHP configuration.
2. Try enabling PHP error logging by modifying the following PHP setting:
log_errors = On
error_log = C:\Temp\php.log ; or whatever logfile you choose
Restart Apache after you modify the PHP.INI file.
3. You can verify that your PHP resource limit is indeed higher than the default by watching the 'Apache.exe' task in the Windows Task Manager (CTRL-ALT-DEL, then Task Manager):
* Take note of the memory usage before you start zipping patient files
* Watch it grow as soon as you hit the Download link
* Make sure the memory usage can grow beyond the default 8M limit
4. Check for any logged errors in the PHP error log file in #2 above.
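To know ahead of time whether step 3's memory check is likely to fail, you can total up the bytes of the files about to be zipped and compare that against your memory_limit. A rough sketch in Python (the directory layout is an assumption; PacsOne itself is PHP):

```python
import os

def study_size_mb(study_dir):
    """Total size in megabytes of all files under a study directory."""
    total = 0
    for root, _dirs, files in os.walk(study_dir):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total / (1024 * 1024)
```

If the result is well above the 8M default (or whatever memory_limit you set), the zip step is likely to fail before the download even starts.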
I have a similar problem, cannot retrieve large studies
I cannot retrieve large studies either; I have around 1000-2000 images per examination from our CT scanner. I cannot download them to my local computer, probably the same problem as you described.
Did you find a solution?
(I have a similar problem when listing the series in a study: I get only about the first three of 9 series. It seems to be the same timeout. But I can normally see the studies when retrieving via DICOM Q/R, so that problem is just annoying but possible to live with...
More serious is that I also have problems when sending large studies for DVD export; it feels timed out there too. I will have to work more with the problem to close in on the reason for it. Practically, I'm not able to store studies on a DVD or CD right now, and I need that to work. I normally have large studies, 500 up to 3000 images.)
Which version of PacsOne Server are you using?
You may want to upgrade to the latest version 3.1.2 of PacsOne Server, which contains significant optimizations for large database tables like yours.
For the Download issue, there's a file-system limit of 4 GB (2 GB if you're using the Windows FAT file system) on the created ZIP files. So if you're downloading 2000 images of 3 MBytes each, this limit may be the reason for the error.
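The arithmetic behind that limit is easy to check. A hedged sketch (the 2 GB / 4 GB figures come from the post above; the function name is made up for illustration):

```python
def exceeds_zip_limit(num_images, image_size_mb, fat=False):
    """True if the uncompressed payload is over the ZIP file-size limit
    described above: 2 GB on a FAT file system, 4 GB otherwise."""
    limit_mb = (2 if fat else 4) * 1024
    return num_images * image_size_mb > limit_mb

# 2000 images of 3 MB each is 6000 MB, over the 4 GB (4096 MB) limit
exceeds_zip_limit(2000, 3)  # -> True
```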
PHP 5.0.4 Users Please Read This Post
A user has reported a problem downloading large patients/studies/series from the PacsOne Server web user interface, where the downloaded zip file would be limited to about 1.9 MBytes and was corrupted.
This has been confirmed as a PHP 5.0.4 bug:
http://bugs.php.net/bug.php?id=32553
If you need to download images that compress to more than 1.9 MB in a zip file, you should downgrade to PHP 5.0.3 until this bug is fixed in the next PHP 5 release.
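If you suspect you've hit this bug, a quick way to confirm that a downloaded archive is corrupted is Python's standard zipfile module. A diagnostic sketch, not part of PacsOne itself:

```python
import zipfile

def zip_is_intact(path):
    """True if the file is a readable ZIP archive with no corrupt members."""
    if not zipfile.is_zipfile(path):
        return False
    with zipfile.ZipFile(path) as zf:
        # testzip() returns the name of the first bad member, or None
        return zf.testzip() is None
```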
Regarding the original issue of downloading large studies (for us, around 500 MB): could one option be to download the studies unzipped? Given network vs. processor speeds, it would probably be just as fast to skip the ZIP step, or at least not to focus on combining them into one big file.
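A middle ground between zipped and fully unzipped downloads would be to package the files without compressing them. In Python's zipfile terms (a sketch of the idea only; PacsOne's actual code is PHP):

```python
import os
import zipfile

def archive_uncompressed(paths, out_path):
    """Bundle files into a single ZIP without compressing them.
    ZIP_STORED skips the deflate step entirely, trading archive
    size for far less CPU time on large studies."""
    with zipfile.ZipFile(out_path, "w", compression=zipfile.ZIP_STORED) as zf:
        for p in paths:
            zf.write(p, os.path.basename(p))
```

Since CT pixel data often deflates poorly anyway, the size penalty may be small, while the user still gets a single file to download.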
I'd appreciate your thoughts - I'm having the same difficulty with larger studies, even after increasing the timeouts and memory limit.