NYCPHP Meetup

NYPHP.org

[nycphp-talk] DB for large datasets

Hans Zaunere hans at nyphp.com
Thu Aug 26 23:31:25 EDT 2004


> I'm developing an internal application that takes information about
> network traffic, and stores it in tables, currently MySQL, for each
> month. A merge table gets queried instead of having to look through
> each table. Now, the problem is, we're looking at more than 30M
> records/mo and MySQL is just barfing. I'm getting the notorious error
> 127 from the

Can you expand on "barfing"?  What exactly is the error output?

Error 127 can come from quite a few different problems.  First, double
check the MERGE table caveats at

http://dev.mysql.com/doc/mysql/en/MERGE_table_problems.html

to verify that you're not doing something that could cause problems.
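The most common caveat is that every table in the union must be an
identically-defined MyISAM table (same columns, types, and key order).
A minimal sketch of the monthly setup, with hypothetical table and
column names (on 4.0 write TYPE=... instead of ENGINE=...):

```sql
-- One MyISAM table per month; all must be structurally identical.
CREATE TABLE traffic_2004_07 (
  ts    DATETIME NOT NULL,
  src   INT UNSIGNED NOT NULL,
  dst   INT UNSIGNED NOT NULL,
  bytes BIGINT UNSIGNED NOT NULL,
  KEY (ts)
) ENGINE=MyISAM;

-- The MERGE table the application queries; repeat the same column
-- definitions, then list the monthly tables in the UNION.
CREATE TABLE traffic_all (
  ts    DATETIME NOT NULL,
  src   INT UNSIGNED NOT NULL,
  dst   INT UNSIGNED NOT NULL,
  bytes BIGINT UNSIGNED NOT NULL,
  KEY (ts)
) ENGINE=MERGE UNION=(traffic_2004_07) INSERT_METHOD=LAST;
```

Each month you ALTER TABLE traffic_all UNION=(...) to add the new
table; any drift in the underlying definitions is a classic source of
error 127 on the merge.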

Maybe you're running into a 4GB file-size limit, either at the OS level
or from MyISAM's default table size?  What version of MySQL?  What OS?
There are many factors, including tuning parameters, that could explain
or solve this.
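Two things worth trying from the mysql client (table name here is
hypothetical; the statements themselves are standard MySQL):

```sql
-- See whether the data or index file is actually damaged, and rebuild
-- it if so -- error 127 is "record file is crashed" for MyISAM.
CHECK TABLE traffic_2004_08;
REPAIR TABLE traffic_2004_08;

-- If the table hit MyISAM's default ~4GB data-file ceiling, enlarge
-- the internal row pointer; this rebuilds the table.
ALTER TABLE traffic_2004_08
  MAX_ROWS = 200000000
  AVG_ROW_LENGTH = 100;

-- Max_data_length in the output should now be much larger.
SHOW TABLE STATUS LIKE 'traffic_2004_08';
```

If CHECK TABLE comes back clean and Max_data_length is already well
above your file sizes, then the limit is elsewhere and we should look
at the OS and filesystem.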

30 million records is not that big and I've worked with prospects
supporting many more than that.  One option would be to archive monthly
tables to the compressed MyISAM storage handler, since they're probably
read-only.  Of course, you could consider InnoDB, which often works
better with larger tables.
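A sketch of both options, again with hypothetical names and paths
(myisampack and myisamchk run from the shell, not the mysql client):

```sql
-- Option 1: convert to InnoDB (TYPE= on 4.0).  Note that InnoDB
-- tables cannot be part of a MERGE table, so this means moving away
-- from the merge approach -- InnoDB handles one large table well.
ALTER TABLE traffic_2004_08 ENGINE = InnoDB;

-- Option 2: compress a finished, read-only month from the shell:
--   myisampack /var/lib/mysql/traffic/traffic_2004_07
--   myisamchk -rq /var/lib/mysql/traffic/traffic_2004_07
-- then tell the server to reopen the packed files:
FLUSH TABLES;
```

Packed tables stay queryable through the MERGE table but become
read-only, which sounds fine for archived months of traffic data.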


---
Hans Zaunere, Sales Engineer
MySQL, Inc.  www.mysql.com
Office: +1 212.213.1131

Are you MySQL certified?
www.mysql.com/certification
