[nycphp-talk] MySQL & 2.5 million rows!

Kerem Tuzemen keremtuzemen at hotmail.com
Fri Apr 11 10:34:10 EDT 2003


Hey Jeff

Why don't you put all of the data into the same table and add a smallint
field to keep the status of each record (i.e., which group it belongs to)?
If needed, you can make it two or three separate fields. That way you'll
have just one table and, if I'm not missing something, it should work
pretty fast compared to your three-table structure.
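
For instance, something along these lines (the column name, status codes,
and index below are just an example, not tied to your actual schema):

    CREATE TABLE data (
        id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        -- ...the columns you already have...
        status SMALLINT NOT NULL DEFAULT 2,  -- 0 = old, 1 = live, 2 = new
        INDEX (status)
    );

    -- Everything that reads "live" data just filters on status:
    SELECT * FROM data WHERE status = 1;

    -- Promoting a fresh load is then a DELETE and two UPDATEs instead of
    -- copying 2.5 million rows between tables:
    DELETE FROM data WHERE status = 0;            -- discard the previous "old" set
    UPDATE data SET status = 0 WHERE status = 1;  -- live becomes old
    UPDATE data SET status = 1 WHERE status = 2;  -- new becomes live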

Good luck.

----- Original Message -----
From: <jsiegel1 at optonline.net>
To: "NYPHP Talk" <talk at nyphp.org>
Sent: Friday, April 11, 2003 10:24 AM
Subject: [nycphp-talk] MySQL & 2.5 million rows!


> You read the subject line correctly!!! I'm loading 2.5 million rows from
> an ASCII file into a MySQL database. So...here's a little background on what
> I've done and then a question. (Please keep in mind I'm a PHP/MySQL
> newbie...though I'm learnin' fast!!)
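
(As an aside: if the ASCII file is delimited, MySQL's LOAD DATA INFILE can
usually bulk-load it far faster than inserting line by line from PHP. The
path and delimiters below are only placeholders:)

    -- Hypothetical path and tab-delimited format; adjust to the real file.
    LOAD DATA LOCAL INFILE '/path/to/feed.txt'
        INTO TABLE data_new
        FIELDS TERMINATED BY '\t'
        LINES TERMINATED BY '\n';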
>
> I created three tables - data_new, data_old, data_live. The ASCII file
> gets read, line by line, and inserted into data_new. When it's completed and
> there are no glitches (i.e., no problem with the ASCII file), I want to move
> the data from data_live to data_old and then move the new data from data_new
> to data_live. So...the question...is there a fast way to move the data from
> one MySQL table to another (from data_new to data_live) other than walking
> through data_new row by row...creating an INSERT statement on the fly...and
> then inserting the row into data_live?
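
(On the actual question: MySQL can copy or swap whole tables server-side,
so there's no need to walk the rows in PHP at all. A rough sketch, assuming
the three tables have identical structure and the old contents of data_old
can be discarded:)

    -- Bulk copy in one statement:
    INSERT INTO data_live SELECT * FROM data_new;

    -- Or, faster still, swap the tables by renaming them:
    DROP TABLE IF EXISTS data_old;
    RENAME TABLE data_live TO data_old, data_new TO data_live;

    -- Recreate an empty data_new for the next load. CREATE TABLE ... LIKE
    -- needs a reasonably recent MySQL; on older servers, rerun the original
    -- CREATE TABLE statement instead.
    CREATE TABLE data_new LIKE data_live;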
>
> BTW, in case you are wondering why there are three different tables, I
> felt that this was a better way than my client's present system, which
> simply wipes out the live data and then reads in the ASCII file. If there is
> a glitch, then they have to empty the table and reload the ASCII file. Doing
> it this way, if they need to go back to the old data, I would move the data
> from data_old to data_live.
>
> Jeff


