NYCPHP Meetup

[nycphp-talk] dbdeploy

Gary Mort garyamort at gmail.com
Mon Mar 8 03:28:02 EST 2010


Was wondering if anyone here has been using dbdeploy to help manage
database schemas and, if so, what you think of it?
http://www.davedevelopment.co.uk/2008/04/14/how-to-simple-database-migrations-with-phing-and-dbdeploy/

I'm also thinking that for me... one of the biggest issues I tend to have is
with the data in the database.

For me, such data can be split into four types [I'm thinking primarily of
using something like Joomla here]:

Config data that MUST be installed when the application is installed, i.e.
all the module and menu entries that get installed when you install Joomla.

Config data that OUGHT to be installed when a sub-application is installed,
i.e. all the module and menu entries that get installed when you install a
new set of components.

Sample data for testing and such.

Actual live data.

Since space for ints is cheap... my gut feeling would be to use the
auto-increment IDs to help separate this data.

I.e. any id under 1000 is system data that should always be migrated, and
any id between 1000 and 2000 is data that goes with an application and
probably should be migrated.

Now, this next part is trickier.  If you are doing work for someone on a
shared host, you're somewhat limited, so for that I would say that anything
above 2000 and below 5000 is sample data, and anything above 5000 is live
data.
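The id-range convention above can be sketched as a small helper. This is
just an illustration: the range boundaries (1000, 2000, 5000) come from the
scheme described here, while the function name and category labels are my
own invention.

```python
def classify_row(row_id: int) -> str:
    """Map an auto-increment id onto the four data categories."""
    if row_id < 1000:
        return "system"        # config data installed with the application
    if row_id < 2000:
        return "application"   # config data installed with a sub-application
    if row_id < 5000:
        return "sample"        # sample data for testing and such
    return "live"              # actual live data
```

So `classify_row(42)` is system data, `classify_row(3000)` is sample data,
and anything from 5000 up is treated as live.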

So basically, upon installing an application and its related data set, go
through every auto-increment id field and change the floor to 5000.
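Here is one way "changing the floor" might look, sketched with SQLite's
`sqlite_sequence` table since that's easy to run anywhere (on MySQL the
rough equivalent would be `ALTER TABLE ... AUTO_INCREMENT = 5000`). The
table and row contents are made up for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE modules (id INTEGER PRIMARY KEY AUTOINCREMENT, title TEXT)"
)
# Config data installed with the application, below the floor.
conn.execute("INSERT INTO modules (id, title) VALUES (1, 'Main Menu')")
# Raise the auto-increment floor so any new (live) rows start at 5000.
conn.execute("UPDATE sqlite_sequence SET seq = 4999 WHERE name = 'modules'")
# The next row inserted without an explicit id lands in the live range.
conn.execute("INSERT INTO modules (title) VALUES ('User Post')")
live_id = conn.execute(
    "SELECT id FROM modules WHERE title = 'User Post'"
).fetchone()[0]
print(live_id)  # 5000
```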

It means a little more work when setting up tests and doing dev work to
renumber any field entries which are not live data and bring them down to
the appropriate range [first 1000 config data, second 1000 application data,
and next 3000 sample data].
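The renumbering step might look something like this sketch: compress each
non-live category's ids into its designated range, preserving relative
order. The range bases follow the scheme above; the function and data
shapes are hypothetical.

```python
# Base id for each non-live category; live rows keep their original ids.
BASES = {"system": 0, "application": 1000, "sample": 2000}

def renumber(rows):
    """rows: list of (old_id, category). Returns {old_id: new_id}."""
    counters = dict(BASES)
    mapping = {}
    for old_id, category in sorted(rows):
        if category not in counters:
            mapping[old_id] = old_id          # live data is left alone
        else:
            counters[category] += 1
            mapping[old_id] = counters[category]
    return mapping
```

For example, `renumber([(7, "system"), (2500, "sample"), (6001, "live")])`
would move the system row to id 1 and the sample row to id 2001, while the
live row keeps 6001. A real version would also have to rewrite any foreign
keys pointing at the old ids.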

Alternatively, if you have the ability to set your auto-increment numbering
system, then simply set the live system to an even number and all dev
systems to an odd number, and increment by 2s [so 5000, 5002, 5004 is the
live data and 5001, 5003, 5005 is the sample data].
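The interleaved-id idea boils down to two counters stepping by 2 from
different offsets, which can never collide. On MySQL this maps to the
`auto_increment_increment` and `auto_increment_offset` system variables;
the allocator below is just a plain-Python sketch of the same behavior.

```python
def make_allocator(offset, step=2):
    """Return a function that yields offset, offset+step, offset+2*step..."""
    current = offset - step
    def next_id():
        nonlocal current
        current += step
        return current
    return next_id

live = make_allocator(5000)    # even ids: 5000, 5002, 5004, ...
sample = make_allocator(5001)  # odd ids:  5001, 5003, 5005, ...
```

With this, `id % 2` alone tells you whether a row is live or sample data,
which is what makes the visual inspection mentioned below so quick.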

To me, this makes it easy, when visually inspecting the tables, to
immediately spot which data goes where - and you can take a snapshot of a
live server if you need to work with its data...

Just wondering what others do in this regard.


[I generally work as the sole PHP programmer fixing someone else's
code... making a local snapshot of their environment works great, but then
deployment time comes, and since some of it includes upgrading/adding
modules that install config data/files, it can get... difficult to capture
all that config data without actually installing it.  So a little more
prep work during setup/dev will yield a smoother transition in the
future.]

