NYCPHP Meetup

NYPHP.org

[nycphp-talk] memory problems

Rob Marscher rmarscher at beaffinitive.com
Thu Jun 4 12:29:36 EDT 2009


Just noticed you said the images are only 15k. Weird. Well, even  
without seeing the code, I think the suggestions of switching to  
ImageMagick or splitting up the script will solve it.

On Jun 4, 2009, at 11:25 AM, Rob Marscher <rmarscher at beaffinitive.com>  
wrote:

> Sorry guys.  iPhone prematurely sent that.
>
> The other idea is to buffer the download of the file. You can use  
> fopen/fread/fclose to make sure you only keep, say, 1 MB of data in  
> PHP's memory while you download the file.  Usually URLs can be  
> treated like files via PHP's URL file wrappers, unless those have  
> been disabled.
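A minimal sketch of that buffered approach (paths and sizes here are hypothetical; a local file stands in for the remote one):

```php
<?php
// fopen/fread/fclose stream the file in fixed-size chunks, so only
// ~1 MB of it sits in PHP's memory at a time. With allow_url_fopen
// enabled, $srcPath could just as well be an "http://..." URL via
// PHP's URL wrappers.
$srcPath = '/tmp/source_image.jpg';
$dstPath = '/tmp/downloaded_image.jpg';

// Stand-in for the remote file: a fake 3 MB "image".
file_put_contents($srcPath, str_repeat('x', 3 * 1024 * 1024));

$src = fopen($srcPath, 'rb');
$dst = fopen($dstPath, 'wb');
while (!feof($src)) {
    fwrite($dst, fread($src, 1024 * 1024)); // at most 1 MB per chunk
}
fclose($src);
fclose($dst);
```

Peak memory stays near the chunk size no matter how large the file is.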
>
> Or you can just
> ini_set('memory_limit', '1500M');
> or however much memory you have available for the script, and see if  
> you can get by.
>
> Really though, there's no reason a script like this needs to use so  
> much memory.
>
> Sometimes when running batch scripts, I've seen memory usage  
> continue to rise through the iterations for unknown reasons. In that  
> case, I split the script so there's a master script that manages the  
> queue and another script that I exec to run the operation. That way  
> it's a separate process and memory is always cleaned up.
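A sketch of that master/worker split, assuming `php` is on the PATH. The worker here is a hypothetical stub written to a temp file just to make the sketch runnable end to end:

```php
<?php
// The master exec()s a separate PHP process per image; when the worker
// exits, the OS reclaims all of its memory, so nothing accumulates
// across iterations.
$worker = '/tmp/resize_one.php';
$code = <<<'PHP'
<?php
// A real worker would download, resize, and FTP a single image here.
fwrite(STDOUT, "resized " . $argv[1]);
exit(0);
PHP;
file_put_contents($worker, $code);

$queue = ['a.jpg', 'b.jpg', 'c.jpg']; // would come from the real image list
$results = [];
foreach ($queue as $image) {
    $out = [];
    exec('php ' . escapeshellarg($worker) . ' ' . escapeshellarg($image),
         $out, $exitCode);
    if ($exitCode !== 0) {
        error_log("worker failed on $image");
        continue;
    }
    $results[] = $out[0];
}
print_r($results);
```

Because each item runs in its own process, a leak in the worker can never exhaust the master's memory_limit.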
>
> Good luck,
> Rob
>
>
> On Jun 4, 2009, at 11:12 AM, Rob Marscher  
> <rmarscher at beaffinitive.com> wrote:
>
>> GD needs to operate on raw pixel data, so even if the JPEGs are  
>> smaller than your 500 MB limit, once GD decompresses them they can  
>> push you over.
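A quick back-of-envelope for that point: GD holds roughly width x height x 4 bytes (RGBA) per open image, regardless of the compressed file size. The dimensions below are hypothetical:

```php
<?php
// A 15 KB JPEG can still decompress to tens of MB of raw pixels.
$width  = 4000;
$height = 3000;
$bytes  = $width * $height * 4; // ~4 bytes per pixel, plus GD overhead
printf("%.1f MB raw for a %dx%d image\n",
       $bytes / (1024 * 1024), $width, $height);
```

Holding a source image plus several resized copies at once multiplies that figure, which is why calling imagedestroy() as soon as each resource is done matters in a loop.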
>>
>> A couple of ideas... exec ImageMagick's convert instead of using GD  
>> for the resize.
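A sketch of that route, assuming ImageMagick is installed; the paths and the 800-pixel-wide geometry are hypothetical:

```php
<?php
// Shelling out to `convert` does the resize in a separate process, so
// the decompressed pixels never count against PHP's memory_limit.
$src = '/tmp/original.jpg';
$dst = '/tmp/resized.jpg';
$cmd = sprintf('convert %s -resize 800x %s',
               escapeshellarg($src), escapeshellarg($dst));
if (file_exists($src)) {  // skip quietly if there is no source image
    exec($cmd, $output, $exitCode);
    if ($exitCode !== 0) {
        error_log("convert failed for $src");
    }
}
echo $cmd . "\n";
```

One convert invocation per output size maps directly onto the seven-resizes-per-image loop described below.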
>>
>> On Jun 4, 2009, at 10:48 AM, Rahmin Pavlovic  
>> <rahmin at insite-out.com> wrote:
>>
>>> Heya,
>>>
>>> So, I have this script that does the following:
>>>
>>> 1.  Requests jpeg from origin CDN via cURL
>>> 2.  If file doesn't exist... log error, continue.
>>> 3.  Write jpeg to temp file
>>> 4.  Resize original image (GD lib) to temp file. FTP to directory  
>>> on new CDN.  Create directory structure if not present.  Repeat  
>>> seven times per image.
>>>
>>> Not sure how many, but we're talking about 10k; maybe 15k images.
>>>
>>> The script works, but problem we're running into is the memory  
>>> limit.  We got about 31% through the images, and hit:
>>>
>>> PHP Fatal error: Allowed memory size of 524288000 bytes exhausted  
>>> (tried to allocate 41760 bytes) in  
>>> /web/lib/php/populate_images.php on line 135
>>>
>>> Maybe it's me, but if we have to allot more than 500 MB of memory,  
>>> something is wrong.
>>>
>>> Any ideas?  Maybe sleep the script after each image?
>>>
>>> We're running the script in shell.
>>> _______________________________________________
>>> New York PHP User Group Community Talk Mailing List
>>> http://lists.nyphp.org/mailman/listinfo/talk
>>>
>>> http://www.nyphp.org/show_participation.php


