
[nycphp-talk] GD w/ Lots of Files

John Campbell jcampbell1 at gmail.com
Thu Aug 30 12:18:30 EDT 2007


> I'm trying to process 100-150 JPEGs that are 2-4 MB each into smaller files.
> What advice/best practices can you guys share in regards to memory
> management?  I can make my job go through about 5 before it blows up,
> and yes, I've upped the memory limit.  I need some method to clear out
> memory after each loop iteration... Any ideas?


If you want a quick fix, write the output file as soon as each image is
processed and then free the image by setting the variable that holds it
to null.  This happens automatically if you reuse the same variable for
each image.  I am curious to see how you coded this such that memory
grows with each additional image.
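
Something along these lines might work -- a minimal sketch, where the
directory names, the 400px target width, and the JPEG quality are all
just assumptions for illustration:

<?php
// Resize every JPEG in originals/ and write the copies to thumbs/.
$files = glob('originals/*.jpg');

foreach ($files as $file) {
    // Load the full-size JPEG -- this is where the memory goes.
    $src = imagecreatefromjpeg($file);
    $w = imagesx($src);
    $h = imagesy($src);

    // Scale down to a 400px-wide copy, keeping the aspect ratio.
    $newW = 400;
    $newH = (int) round($h * ($newW / $w));
    $dst = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);

    // Write the output right away, then free both images before the
    // next iteration so memory use stays flat.
    imagejpeg($dst, 'thumbs/' . basename($file), 85);
    imagedestroy($src);
    imagedestroy($dst);
}
?>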

If it is a one-time job, you should run the script from the command
line (there is no time limit, and the memory limit can be raised or
removed).  This is not a realistic task for a web request; if you want
to implement background processing, a scheduler/queue is the way to go.
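
For example, a one-off run from the shell could look like this (the
script name is hypothetical; -d just overrides the ini setting for that
single run):

php -d memory_limit=-1 resize_images.php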

-John Campbell


