downloading study hangs
Posted: Wed Jan 14, 2004 7:19 pm
by DDors
I am unable to download from the patient list, although I can download from the series list. It appears something in the Patient/Study/Series/Image hierarchy is flawed. Any thoughts on why the Download link on the home page of PacsOne 1.013 hangs?
Posted: Wed Jan 14, 2004 7:46 pm
by pacsone
How many total images does this study have?
Since the zipping of images is done in memory instead of on disk, there IS a limit on how many images you can put in one zip file, depending on the memory on your computer and the size of the images.
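To illustrate (a rough sketch only, not PacsOne's actual code, and it does not produce a valid ZIP archive): because every image is read into a PHP string before compression, peak memory grows with the total size of the study. The directory path is hypothetical.
<?php
// Sketch: each file is held in RAM before compression, so peak memory
// tracks the total study size; watch the usage grow per iteration.
$buffer = '';
foreach (glob('C:/Images/Study1/*.dcm') as $file) { // hypothetical path
    $buffer .= gzcompress(file_get_contents($file));
    echo 'memory in use: ' . memory_get_usage() . " bytes\n";
}
?>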
Posted: Thu Jan 15, 2004 3:58 am
by DDors
A CT chest coming off a GE multislice scanner, about 80 slices/20 MB.
The PacsOne Server is a standard PC with Windows XP Pro SP1 and 256 MB of RAM.
Posted: Thu Jan 15, 2004 6:03 pm
by pacsone
In that case, you may want to increase the default PHP resources in the PHP.INI file as follows:
;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
max_execution_time = 30 ; bump it up to 120 (2 minutes)
memory_limit = 8M ; bump it up to 32 MB
Be sure to restart Apache after modifying the PHP.INI file.
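To confirm the new limits actually took effect, you can drop a small PHP page into the pacsone web directory (hypothetical name checklimits.php) and load it in a browser; a minimal sketch:
<?php
// Print the limits PHP is actually running with after the Apache restart.
echo 'max_execution_time = ' . ini_get('max_execution_time') . "\n";
echo 'memory_limit = ' . ini_get('memory_limit') . "\n";
?>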
Posted: Fri Jan 16, 2004 5:46 am
by Guest
I had already tried the php.ini resource increase without success. I increased the RAM to 512 MB; no luck. Fewer slices/smaller files download fine, e.g. 20 CT slices, 5 MB files. I have since raised the memory limit to 64 MB, to no avail. Any other ideas?
Posted: Fri Jan 16, 2004 2:37 pm
by pacsone
You can try the following:
1. Access http://{HOSTNAME}/pacsone/hello.php and e-mail the output HTML file to pacsone@pacsone.net so we can check your PHP configuration.
2. Try enabling PHP error logging by modifying the following PHP settings (a quick test sketch follows this list):
log_errors = On
error_log = C:\Temp\php.log ; or whatever logfile you choose
Restart Apache after you modify the PHP.INI file.
3. You can verify that your PHP resource limit is indeed higher than the default by watching the 'Apache.exe' task in the Windows Task Manager (CTRL-ALT-DEL, then Task Manager):
* Take note of the memory usage before you start zipping patient files
* Watch it grow as soon as you hit the Download link
* Make sure the memory usage can increase beyond the 8 MB default limit
4. Check for any logged errors in the PHP error log file in #2 above.
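For step 2, a one-line test page (hypothetical name testlog.php) can confirm that error logging is wired up; a minimal sketch:
<?php
// If log_errors/error_log are set correctly, this message should appear
// in C:\Temp\php.log (or whichever log file you chose).
error_log('PHP error logging test');
?>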
I have a similar problem, cannot retrieve large studies
Posted: Thu May 26, 2005 12:25 am
by Isaac
I cannot retrieve large studies either; I have around 1000-2000 images per examination from our CT scanner. I cannot download them to my local computer, probably the same problem as described by you.
Did you find a solution?
(I have a similar problem when listing the series in a study: I only get the first three of 9 series. It seems to be the same timeout. But I can normally see the studies when retrieving by DICOM Q/R, so that problem is just annoying but possible to live with...
More serious is that I also have problems when sending large studies for DVD export; it seems to time out there too. But I will have to work more with the problem to close in on the reason for it. Practically, I am not able to store studies on DVD or CD right now, and I need that to work. I normally have large studies, 500 up to 3000 images.)
Posted: Fri May 27, 2005 3:40 am
by pacsone
Which version of PacsOne Server are you using?
You may want to upgrade to the latest version 3.1.2 of PacsOne Server, which contains significant optimizations for large database tables like yours.
For the Download issue, there is a file-system limit of 4 GB (2 GB if you are using the Windows FAT file system) for created ZIP files. So if you are downloading 2000 images of 3 MBytes each, that is roughly 6 GB of data, and this limit may be the reason for the error.
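As a rough pre-flight check (a sketch only, not a PacsOne feature; the path is hypothetical), you could total the image file sizes before attempting the download:
<?php
// Sum the sizes of all images in a study directory and compare against
// the 4-GB ZIP/file-system limit (2 GB on FAT).
$limit = 4.0 * 1024 * 1024 * 1024;
$total = 0;
foreach (glob('C:/Images/Study1/*.dcm') as $file) { // hypothetical path
    $total += filesize($file);
}
printf("total %.1f GB, limit %.1f GB\n",
       $total / (1024 * 1024 * 1024), $limit / (1024 * 1024 * 1024));
?>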
PHP 5.0.4 Users Please Read This Post
Posted: Wed Jun 01, 2005 4:31 pm
by pacsone
A user has reported a problem downloading large patients/studies/series from the PacsOne Server web user interface, where the downloaded zip file would be limited to about 1.9 MBytes and was corrupted.
This has been confirmed as a PHP 5.0.4 bug:
http://bugs.php.net/bug.php?id=32553
If you need to download images that compress to more than 1.9 MB in a zip file, you should downgrade to PHP 5.0.3 until this bug is fixed in the next PHP 5 release.
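If you are unsure which PHP version your PacsOne web interface is running under, a minimal check page (a sketch):
<?php
// Warn if the buggy PHP 5.0.4 release is in use.
if (version_compare(PHP_VERSION, '5.0.4', '==')) {
    echo 'PHP 5.0.4 detected: zip downloads over ~1.9 MB will be corrupted';
}
?>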
Posted: Thu Apr 20, 2006 7:42 am
by rayrad
Regarding the original issue of downloading large studies (for us, around 500 MB): could one option be to download all the studies unzipped? Given network vs. processor speeds, it would probably be just as fast to skip the ZIP step, or at least not to combine everything into one big file (see the sketch below).
I'd appreciate your thoughts. I'm having the same difficulty with larger studies, even after increasing the timeouts and memory.
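Something like the following illustrates that idea (a sketch only, not an existing PacsOne feature; the path and filename are hypothetical): readfile() streams from disk in chunks, so memory use stays flat no matter how large the study is.
<?php
// Send one DICOM file straight to the browser without zipping it first.
$file = 'C:/Images/Study1/image001.dcm'; // hypothetical path
header('Content-Type: application/dicom');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="image001.dcm"');
readfile($file); // streams in chunks; memory stays flat
?>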