
[nycphp-talk] memory problems

y2rob at aol.com y2rob at aol.com
Thu Jun 4 12:38:22 EDT 2009


Wow, awesome Brent, I like the forking idea.

Question:
How are you getting $files2process? I'm assuming you'd read a directory if the files were local, but can you get this count remotely via cURL?
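
(To make sure I'm asking the right thing, here's a sketch of the only way I can picture doing it remotely: one HEAD request per candidate URL. $url and the list it comes from are assumed; as far as I know cURL can't enumerate a remote directory on its own.)

// Sketch: test whether one remote jpeg exists without downloading it.
// $url is assumed to come from a pre-built list of candidate URLs.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request, skip the body
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // don't echo the response
curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
$exists = ($status == 200);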

thanks
~rob

-----Original Message-----
From: Brent Baisley <brenttech at gmail.com>
To: NYPHP Talk <talk at lists.nyphp.org>
Sent: Thu, 4 Jun 2009 12:29 pm
Subject: Re: [nycphp-talk] memory problems

You are not releasing memory somewhere. Using sleep is not going to
help clear memory. Monitor your memory usage (memory_get_usage) and
see whether it keeps climbing with each image or whether one big
image is causing the problem.
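
Something like this is what I mean. This is just a sketch: it treats
$files2process as an array of local paths, and the GD calls and sizes
stand in for whatever your resize step actually does:

foreach ($files2process as $file) {
    $before = memory_get_usage();

    $src = imagecreatefromjpeg($file);      // original image
    $dst = imagecreatetruecolor(640, 480);  // resized target (sizes assumed)
    imagecopyresampled($dst, $src, 0, 0, 0, 0, 640, 480, imagesx($src), imagesy($src));
    imagejpeg($dst, '/tmp/resized.jpg');

    // Free the GD resources explicitly -- forgetting this is the
    // classic way an image loop eats memory.
    imagedestroy($src);
    imagedestroy($dst);

    echo $file, ': ', memory_get_usage() - $before, " bytes retained\n";
}

If the per-iteration number stays flat, the leak is elsewhere; if it
grows every pass, something in the loop is holding on to memory.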

Alternatively, if you are running the script from the command line on
Unix, fork a child process for each image, along these lines:

$maxForks = 20;
$childPIDs = array();

while ($files2process) {
    // Launch up to $maxForks children, each handling one file
    for ($i = 0; $i < $maxForks && $files2process; $i++) {
        $processFile = getNextFileURL();

        $pid = pcntl_fork();
        if ($pid == -1) {
            die('could not fork');
        } else if ($pid) {
            // we are the parent: remember the child's PID
            $childPIDs[] = $pid;
        } else {
            // we are the child: process one file, then exit
            // ... process $processFile here ...
            exit();
        }
    }

    // Wait for all the children to finish before starting the next batch
    foreach ($childPIDs as $pid) {
        pcntl_waitpid($pid, $status);
        echo '*** ' . $pid . " ENDED\n";
    }
    $childPIDs = array();
}

That way a single bad image won't stop the whole run; it will just
cause one child process to exit. This would also speed things up
considerably, since you can process multiple images at once. There are
obviously holes in the script, but they should be easy to fill.

Be aware that the system you are on may not have forking enabled, in
which case forget my whole response.
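
You can test for it up front with a one-line check:

// pcntl is a compile-time extension and generally CLI-only;
// bail out early if fork support isn't there.
if (!function_exists('pcntl_fork')) {
    die("pcntl not available -- process the images serially instead\n");
}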

Brent Baisley


On Thu, Jun 4, 2009 at 12:00 PM, Donald J. Organ IV
<dorgan at donaldorgan.com> wrote:
> Do you mind sharing the script?
>
>
> ----- Original Message -----
> From: "Rahmin Pavlovic" <rahmin at insite-out.com>
> To: "NYPHP Talk" <talk at lists.nyphp.org>
> Sent: Thursday, June 4, 2009 11:48:40 AM
> Subject: [nycphp-talk] memory problems
>
> Heya,
>
> So, I have this script that does the following:
>
> 1. Requests a jpeg from the origin CDN via cURL
> 2. If the file doesn't exist, log an error and continue.
> 3. Write the jpeg to a temp file
> 4. Resize the original image (GD lib) to a temp file, FTP it to a
> directory on the new CDN, and create the directory structure if not
> present. Repeat seven times per image.
>
> Not sure how many exactly, but we're talking about 10k, maybe 15k images.
>
> The script works, but the problem we're running into is the memory
> limit. We got about 31% through the images and hit:
>
> PHP Fatal error: Allowed memory size of 524288000 bytes exhausted
> (tried to allocate 41760 bytes) in /web/lib/php/populate_images.php on
> line 135
>
> Maybe it's me, but if we have to allot more than 500 MB of memory,
> something is wrong.
>
> Any ideas? Maybe sleep the script after each image?
>
> We're running the script in shell.
> _______________________________________________
> New York PHP User Group Community Talk Mailing List
> http://lists.nyphp.org/mailman/listinfo/talk
>
> http://www.nyphp.org/show_participation.php
>
_______________________________________________
New York PHP User Group Community Talk Mailing List
http://lists.nyphp.org/mailman/listinfo/talk

http://www.nyphp.org/show_participation.php
