[nycphp-talk] using PHP to import large-ish SQL into mysql

David A. Roth davidalanroth at gmail.com
Tue Oct 20 18:07:38 EDT 2009


If you are getting an "out of memory" error from a PHP program, see if
your shared hosting account gives you access to the php.ini file. If
so, increase the memory limit.

After you locate the php.ini file, look for this line:
memory_limit = 16M      ; Maximum amount of memory a script may consume

The default, at least in my recent experience, is 16M. Try
increasing it from 16M to perhaps 128M or more, and see if that
lets your PHP script finish without error.
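
If you cannot edit php.ini at all, you can also try raising the
limit from inside the script itself. A minimal sketch; some hosts
lock this setting down, in which case ini_set() returns false:

<?php
// Try to raise the limit at runtime, before the heavy work starts.
if (ini_set('memory_limit', '128M') === false) {
    die("Host does not allow changing memory_limit at runtime.\n");
}
echo ini_get('memory_limit'), "\n"; // confirm the effective value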

On shared web hosting accounts, while there might be a system-wide
/etc/php.ini, there is usually a local one for the user. If your
hosting account has cPanel, look there for options to change PHP
settings.
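
If the host runs PHP as an Apache module, you may also be able to
override the limit from an .htaccess file (this only works under
mod_php; under CGI/FastCGI setups it typically causes a 500 error):

php_value memory_limit 128M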

I was once loading a very large XML file and ended up increasing
memory_limit to several times its default before the script would
run without a memory error.
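
For jobs like that, a streaming parser can sometimes sidestep the
limit entirely. A minimal sketch using XMLReader (the filename
"records.xml" and element name "item" are just placeholders), which
keeps memory roughly constant regardless of file size:

<?php
// Stream the document element by element instead of loading it all.
$reader = new XMLReader();
$reader->open('records.xml');             // placeholder filename
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT
        && $reader->name === 'item') {    // placeholder element name
        $xml = $reader->readOuterXml();   // just this one element
        // ... process $xml here ...
    }
}
$reader->close();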

Hope this helps!

David Roth

On Tue, Oct 20, 2009 at 5:57 PM, Marc Antony Vose <suzerain at suzerain.com> wrote:
> Hi there:
>
> I'm on a shared machine, and trying to get some 15 MB or so SQL files into
> MySQL.  No problem doing this at the command line, of course, but I have
> several SQL files that need to be imported a few times a month. So, ideally,
> I'd like to FTP them to a directory and go to a web page and hit a button to
> do the import.
>
> Problem is, the PHP script is terminating with a PHP error; server's support
> thinks it's out of memory.  According to phpinfo(), the script can use 90M,
> so the question is why a 20M sql import would use up 90M?
>
> I'm doing it like this:
>
> $cmd = "/path/to/mysql -h hostname -u username --password=password db_name <
> sqlfile.sql";
> exec( $cmd );
>
> I had thought that using this command-line command would prevent PHP from
> dealing with the memory footprint of the operation, but apparently this is
> not the case.
>
> So...doable in PHP without an approach like busting apart the file into
> small chunks?  Or better to use another language on the server for this...?
>
> Cheers,
> Marc
>


