NYCPHP Meetup

NYPHP.org

[nycphp-talk] importing 650,000 records

Jonathan hendler at simmons.edu
Mon Jan 2 10:10:20 EST 2006


This import happens once?
While a direct MySQL solution will be more reliable and faster, 
wouldn't using prepared statements through PHP/ADODB speed up the import?
More importantly, I agree you shouldn't have any indexes in place until 
the import is complete: an index slows inserts down more and more as the 
table grows, since every insert also has to update the index.
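For illustration, a minimal sketch of the prepared-statement approach using mysqli (the table name, column names, and CSV path are assumptions, not from the thread; the original suggestion was ADODB, which wraps the same idea):

```php
<?php
// Sketch: batched import with a prepared statement, indexes deferred.
// Assumes a hypothetical table `records` with (name, email) columns.
$db = new mysqli('localhost', 'user', 'pass', 'clientdb');

// Defer index maintenance until the import finishes (MyISAM tables).
$db->query('ALTER TABLE records DISABLE KEYS');
$db->query('SET autocommit = 0');

// Prepare once, execute 650,000 times.
$stmt = $db->prepare('INSERT INTO records (name, email) VALUES (?, ?)');

$fh = fopen('/path/to/records.csv', 'r');
while (($row = fgetcsv($fh)) !== false) {
    $stmt->bind_param('ss', $row[0], $row[1]);
    $stmt->execute();
}
fclose($fh);

$db->commit();

// Rebuild the indexes in one pass at the end.
$db->query('ALTER TABLE records ENABLE KEYS');
```

Committing once at the end (or every few thousand rows) avoids paying transaction overhead per row.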


michael wrote:

>On Sat, 31 Dec 2005 12:51:02 -0500
>Joseph Crawford <codebowl at gmail.com> wrote:
>
>  
>
>>Hey everyone.
>>
>>I have a client who has a comma-delimited file, each line being a new
>>record. Currently I am trying to import this data into their new
>>system.  The file is 40mb in size and contains 650,000 records.
>>
>>    
>>
>
>Why use PHP?  Just use MySQL to import the file.
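>
>A sketch of what that looks like, run from the mysql client (file path,
>table name, and field layout are assumptions about the client's CSV):
>
>  LOAD DATA LOCAL INFILE '/path/to/records.csv'
>  INTO TABLE records
>  FIELDS TERMINATED BY ','
>  OPTIONALLY ENCLOSED BY '"'
>  LINES TERMINATED BY '\n';
>
>LOAD DATA INFILE is typically the fastest way to bulk-load a CSV into
>MySQL, since it bypasses per-statement parsing entirely.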
>
>  
>




More information about the talk mailing list