
[nycphp-talk] Searching an Entire MySQL Database

Michele Waldman mmwaldman at nyc.rr.com
Thu Nov 4 11:03:06 EDT 2010


And, make a backup.

Michele


From: Andrew Yochum 
Sent: Thursday, November 04, 2010 11:00 AM
To: NYPHP Talk 
Cc: NYPHP Talk 
Subject: Re: [nycphp-talk] Searching an Entire MySQL Database




On Nov 4, 2010, at 9:41 AM, Rob Marscher <rmarscher at beaffinitive.com> wrote:


      On 11/4/2010 08:32, ps at blu-studio.com wrote:
      > I am wondering if anyone has had to search an entire database and do a
      > search and replace on a string. That is to search in every table, in
      > every row, in every field.
      > export the entire database into a file
      > use PHP (or a good editor) to make the replacement line by line
      > drop the database
      > re-create it by importing the modified file

  I've had to do something similar.  Yes, use mysqldump.  Then use sed (if you're on a *nix machine).  Sed does search and replace by streaming the file rather than buffering the whole thing so it won't run out of memory no matter how big your dump is.  You can send the output to a new file.  Definitely test out importing it back in on a non-production machine.
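
  If you're not on a *nix box and would rather stay in PHP, the same streaming idea looks roughly like the sketch below -- the file names and the search/replace strings are placeholders, adjust them to your own dump:

  <?php
  // Stream the dump line by line, replace, and write out a new file,
  // so the whole dump never has to sit in memory at once.
  $search  = 'old-string';   // placeholder
  $replace = 'new-string';   // placeholder

  $in  = fopen('dump.sql', 'r');
  $out = fopen('dump-replaced.sql', 'w');

  while (($line = fgets($in)) !== false) {
      fwrite($out, str_replace($search, $replace, $line));
  }

  fclose($in);
  fclose($out);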


This is a sane method. But you probably want to either:
1. Run it on a schema-only dump first and make sure it doesn't touch any column names or table DDL before you run it on the data dump, or
2. Make sure your search pattern only matches things within a SQL quoted string (see the sketch below).
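
For #2, one rough way to do that in PHP is to only run the replacement inside single-quoted literals, e.g. by swapping something like this in for the plain str_replace() on each line. It assumes mysqldump's default backslash escaping and PHP 5.3+ closures, and it's only a sketch -- it won't catch every edge case:

// Replace only inside single-quoted SQL string literals on a dump line.
$line = preg_replace_callback(
    "/'((?:[^'\\\\]|\\\\.)*)'/",
    function ($m) use ($search, $replace) {
        return "'" . str_replace($search, $replace, $m[1]) . "'";
    },
    $line
);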


Dan's solution of looping over tables and columns works well too. MySQL has regex functions (REGEXP/RLIKE) you can use for matching if necessary.
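
For reference, a bare-bones sketch of that in-database approach might look like this -- the credentials and search/replace strings are placeholders, it only touches char/varchar/text columns, and as Michele said, back up first:

<?php
// Loop over every string column in the current schema and run a
// literal REPLACE() on it. Placeholder credentials and strings.
$db      = new mysqli('localhost', 'user', 'pass', 'mydb');
$search  = $db->real_escape_string('old-string');
$replace = $db->real_escape_string('new-string');

$cols = $db->query(
    "SELECT TABLE_NAME, COLUMN_NAME
       FROM information_schema.COLUMNS
      WHERE TABLE_SCHEMA = DATABASE()
        AND DATA_TYPE IN ('char', 'varchar', 'tinytext', 'text', 'mediumtext', 'longtext')"
);

while ($col = $cols->fetch_assoc()) {
    $db->query(sprintf(
        "UPDATE `%s` SET `%s` = REPLACE(`%s`, '%s', '%s')",
        $col['TABLE_NAME'], $col['COLUMN_NAME'], $col['COLUMN_NAME'],
        $search, $replace
    ));
}

Note that REPLACE() here is a literal string replace; REGEXP/RLIKE only help you match rows, not rewrite them.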


HTH.


Regards,
Andrew




_______________________________________________
New York PHP Users Group Community Talk Mailing List
http://lists.nyphp.org/mailman/listinfo/talk

http://www.nyphp.org/Show-Participation

