[nycphp-talk] MySQL & 2.5 million rows!

Jim Hendricks jim at
Fri Apr 11 11:34:13 EDT 2003

You may want to save time and not move the data around at all.

The idea would be to have 3 identically structured tables, each uniquely
named.  Start by loading up table1, then point the application at table1.
When doing the next import, detect that table1 contains the live data, so
import into table2.  If the data imports correctly, change the app to point
to table2; table1 now holds your old data.  On the next import, table3
receives the import and then becomes live, table2 becomes old, and table1
gets emptied, ready for the next import.  On the import after that, table1
receives the import and then becomes live, table3 becomes old, and table2
gets emptied, ready for the next import.

Hope you get the idea.
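The rotation above is pure bookkeeping: given whichever table is currently
live, the next one in the cycle receives the import, the live one becomes
old, and the stale one is emptied.  A minimal sketch (the table names and
helper function are illustrative, not from the thread):

```python
# Sketch of the three-table rotation. In practice the "live" name would be
# read from a config value or a one-row helper table the PHP app consults.
TABLES = ["table1", "table2", "table3"]

def next_rotation(live):
    """Given the currently live table, return (import_target, new_old, to_empty).

    The table after the live one receives the import (and becomes live on
    success), the currently live table becomes the old data, and the table
    holding the previous old data is emptied for the import after that.
    """
    i = TABLES.index(live)
    import_target = TABLES[(i + 1) % 3]   # load the new data here
    new_old = live                        # current live data becomes "old"
    to_empty = TABLES[(i + 2) % 3]        # stale table, truncate it
    return import_target, new_old, to_empty
```

For example, when table1 is live, `next_rotation("table1")` yields
`("table2", "table1", "table3")`: import into table2, keep table1 as old,
and empty table3 for next time.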

Jim Hendricks, President, Biz Computing, Inc
Phone:  (201) 599-9380     Email: jim at
Snail:  Jim Hendricks,  Biz Computing, Inc.,  255 McKinley Ave, New Milford,
NJ 07646

----- Original Message -----
From: <jsiegel1 at>
To: "NYPHP Talk" <talk at>
Sent: Friday, April 11, 2003 10:24 AM
Subject: [nycphp-talk] MySQL & 2.5 million rows!

> You read the subject line correctly!!! I'm loading 2.5 million rows from
an ASCII file into a MySQL database. Here's a little background on what
I've done and then a question. (Please keep in mind I'm a PHP/MySQL
newbie...though I'm learnin' fast!!)
> I created three tables - data_new, data_old, data_live. The ASCII file
gets read, line by line, and inserted into data_new. When it's completed and
there are no glitches (i.e., no problem with the ASCII file), I want to move
the data from data_live to data_old and then move the new data from data_new
to data_live. So...the question: is there a fast way to move the data from
one MySQL table to another (from data_new to data_live) other than walking
through data_new row by row...creating an Insert statement on the fly...and
then inserting the row into data_live?
> BTW, in case you are wondering why there are three different tables, I
felt that this was a better way than my client's present system, which
simply wipes out the live data and then reads in the ASCII file. If there is
a glitch, they have to empty the table and reload the ASCII file. Doing it
this way, if they need to go back to the old data, I would move the data from
data_old to data_live.
> Jeff
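[For the record: the move Jeff asks about can be done server-side in a
single statement with INSERT ... SELECT, with no row-by-row loop in PHP;
MySQL's RENAME TABLE can also swap whole tables atomically. A minimal
runnable sketch, using Python's sqlite3 as a stand-in for MySQL since the
INSERT ... SELECT syntax is the same here; table names follow Jeff's:]

```python
# Bulk "move" between tables with one server-side statement.
# sqlite3 stands in for MySQL so the example is self-contained;
# the same INSERT ... SELECT works in MySQL.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE data_new  (id INTEGER, val TEXT)")
cur.execute("CREATE TABLE data_live (id INTEGER, val TEXT)")
cur.executemany("INSERT INTO data_new VALUES (?, ?)",
                [(1, "a"), (2, "b"), (3, "c")])

# One statement copies every row; no client-side loop.
cur.execute("INSERT INTO data_live SELECT * FROM data_new")
cur.execute("DELETE FROM data_new")   # "move" = copy, then clear the source
conn.commit()
```

With 2.5 million rows, the server-side copy avoids shipping every row out
to PHP and back; the rotation scheme above avoids even that copy.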
> --- Unsubscribe at ---
