Solution for database version control and deployment?

Currently I use a dump script and commit the database dump to the git repo.
Dumping with --skip-extended-insert --skip-comments --skip-dump-date means a diff gives me a fair idea of what has changed, but it all falls apart if I try to merge.
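
For reference, the dump step is roughly the following (a sketch rather than my exact script; the credentials, database name and output path are placeholders):

    # One INSERT per row, no comments and no dump date, so that
    # successive dumps diff cleanly in git.
    mysqldump -u root -p --skip-extended-insert --skip-comments --skip-dump-date dbname > db.sql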

WP_SITEURL, WP_HOME and all the other places WordPress stores full URLs need updating when importing to another host (testing, staging, production).
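
After each import the URLs have to be fixed up, roughly like this (a sketch; the URLs and credentials are placeholders, and a plain UPDATE like this misses serialized option values, which is why tools such as wp-cli's search-replace exist):

    # Point the imported copy at the new host (placeholder URLs).
    mysql -u root -p dbname -e "UPDATE wp_options SET option_value = 'https://staging.example.com' WHERE option_name IN ('siteurl', 'home');"
    # Serialized values (widgets, plugin settings) still hold the old URL;
    # wp-cli can rewrite those safely:
    # wp search-replace 'https://www.example.com' 'https://staging.example.com'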

Is anyone using a better method?

Main issues:

  • WordPress stores full URLs all through the database (non-portable)
  • Lots of other, irrelevant records change:
    • auto_increment values (I just strip these out, as in the sketch after this list, but have run into ID issues)
    • timestamps (these can potentially be stripped out too)
    • transient* records … no idea what to do with them
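
For reference, the stripping looks roughly like this (a sketch; the credentials are placeholders, it assumes the default wp_ table prefix, and deleting transients is only an option because WordPress regenerates them on demand):

    # Transients are regenerable caches; deleting them stops them churning the dump.
    mysql -u root -p dbname -e "DELETE FROM wp_options WHERE option_name LIKE '\_transient\_%' OR option_name LIKE '\_site\_transient\_%';"
    # Strip the volatile AUTO_INCREMENT counters out of the dump itself.
    mysqldump -u root -p --skip-extended-insert --skip-comments --skip-dump-date dbname | sed 's/ AUTO_INCREMENT=[0-9]*//' > db.sql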

A process that created timestamped migrations, containing only the things added or removed, would be ideal… but I’m not sure if it’s even possible?
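
Something like the following would only approximate it (a sketch; it records the changes as a diff, which is not replayable SQL, and the paths are placeholders):

    # After refreshing db.sql, capture what changed since the last commit
    # as a timestamped file that can itself be committed.
    mkdir -p migrations
    git diff -- db.sql > "migrations/$(date +%Y%m%d%H%M%S)_changes.diff"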

2 comments

  1. Here are two possible solutions. Both are actually generic MySQL version control tools, but they can be adapted to your workflow:

    dbv.php

    This tool creates “migrations”, which are basically SQL scripts, from the changes detected in the database. These scripts are stored in a local directory and can thus be committed to your current VCS (git, for example).

    It’s used through a PHP web interface.
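
    The generated scripts can then be tracked like any other source file (a sketch; the directory name depends on how dbv.php is configured, so data/migrations here is only a placeholder):

    # Commit the auto-generated migration scripts alongside the code.
    git add data/migrations/*.sql
    git commit -m "Record schema change as a dbv.php migration"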

    DBVC

    Fundamentally similar to the previous tool, but driven from a command line interface and configured through a JSON file. The main difference is that it doesn’t auto-generate the migration files.

    There’s a pending issue to integrate it with the previous tool, so that’s something to watch.

    WordPress Plugins

    There are also some plugins that could aid in building a repeatable workflow.

  2. I’m doing this with MySQL.

    It dumps each table’s schema and data into its own file, so I can easily see what has changed.

    Unlike most of the other solutions in this thread, this one captures the data as well as the schema, which is important for a CMS.

    This solution doesn’t use any tools, just a command line script.

    Edit: I found that my older code had a bug where import order mattered. Removing the --compact flag fixes it.

    # Dump each table to its own file so per-table diffs in git stay readable.
    for x in $(mysql --skip-column-names -u root -ppassword dbname -e 'show tables;'); do
         echo "exporting $x"
         mysqldump -u root -ppassword --skip-add-drop-table --skip-add-locks --skip-disable-keys --skip-set-charset --extended-insert=FALSE --replace --skip-dump-date dbname "$x" > "./db/$x.sql"
    done
    

    Older code

    for x in `mysql --skip-column-names -u root -ppassword dbname -e 'show tables;'`; do
         mysqldump -u root -ppassword --compact --extended-insert=FALSE --replace dbname $x > "./db/$x.sql"
    done
    

    And here is how to import:

    # Import every per-table dump; --force keeps going past errors.
    for x in ./db/*.sql; do
         echo "importing $x"
         mysql -pdbpassword dbname --force < "$x"
    done
    
