Multiple developers / editors working on a site in progress

Background

I’m nearing the final stages of constructing my first fairly large WordPress site, and I’m now encountering some friction. For the most part, the site was developed on my local machine and I would push changes up to a staging server for review (see this question for more background). The solution I ended up with worked quite well when it was just me editing content, but now some other people are editing the content while I still have features to add. The idea was: we could get things done more quickly if the features and the content came together in concert… but now I’m not so sure.

Currently there is different content in the database on the staging server than on my local machine. That’s fine by itself, as I don’t need the final body copy on my local machine, but I need to do more development which will affect the database (install/write a couple more plugins which need their own tables).

My question is:

Is there an easy way to automate the merging of databases so that multiple people can work on a WordPress install? I could, of course, just export the tables which I know have changed on my local machine and push them to the staging server, but it’s possible that there are also things on the staging server I would like to bring down. I could grab the SQL output of both DBs and diff them… but that seems tedious and hackish. I’m wondering if this is a problem others have solved; if there’s a community-accepted way to handle this sort of thing.

Thanks!

3 comments

  1. I asked this question over a year ago, and during that time we’ve added more people to our team and developed a much larger number of sites in WordPress. I wanted to walk through our process in case it might help anyone else.

    Everything in Git

    This was something I was doing even as I asked the question, but it’s good to call this point out. Using Git has not only helped us be more productive, but it’s also saved our collective asses several times.

    Have you ever needed to make major structural renovations to a site, get approval for those renovations from a client, and all the while make minor updates to the non-renovated version? We have, and Git let us do it. Describing this setup would get a bit long-winded, but the basics are that we made a new branch, pulled that branch onto the server, and attached a subdomain to that branch.

    We’ve also been saved by Git. It of course allows us to roll back changes, which is great, but it also allows us to bring old versions of files back. This means that if a client asks, “Remember how this part of the site worked about a year ago? Can we bring that back?”, the answer is yes — even if the person being asked wasn’t on that project a year ago.

    Besides these points, it also means that we’re never stuck without the files we need. We can always pull down the newest version of the site from any machine and start making changes.

    Use Git to deploy

    We do our WordPress hosting on Media Temple, and we really like them. They’re not the cheapest provider, but their service is excellent and their servers are really well set up. They also provide Git by default. This means we can set the server up as a Git repository and pull changes in that way instead of using SFTP. It also means that work done directly on the server isn’t in danger of being overwritten (as those changes can just be merged and pushed back up).
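
    Roughly, the deploy side looks like this (the BitBucket URL, directory names and branch below are placeholders for whatever the project actually uses, and the public HTML directory lives inside the clone):

    # One-time setup on the server: clone the repository one level above public HTML.
    git clone git@bitbucket.org:team/sitename.git ~/sitename

    # Deploying a change later: log in to the server and pull.
    cd ~/sitename && git pull origin master

    # If files were edited directly on the server, capture that work first so the
    # pull merges it instead of clobbering it.
    git add -A && git commit -m "Changes made on the server" && git push origin master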

    Because we use BitBucket as our Git host, there’s a little bit of extra work required here. First of all, we use .ssh/config files so that we can type things like ssh sitename to log into our servers (we also use passwordless SSH, which makes this super easy). We also make sure to always use SSH passphrases (Mac OS X makes this very easy by allowing you to store your passphrase in Keychain.app). Finally, we add a ForwardAgent line to the .ssh/config entry on hosts we want to pull from. This means that we only need each person’s SSH public key in BitBucket, and not the public key of each server. We also make sure to keep the .git directory one directory above the public HTML directory.
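
    For what it’s worth, the kind of entry I mean looks something like this (the host alias, hostname and user are invented):

    # Entry in ~/.ssh/config (host alias, hostname and user are made up).
    # With this in place, `ssh sitename` logs straight in, and ForwardAgent
    # lets git on the server authenticate against BitBucket with your local key.
    Host sitename
        HostName sitename.example.com
        User deploy
        IdentityFile ~/.ssh/id_rsa
        ForwardAgent yes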

    Automated database dumps

    Once the server is in production mode, we make sure to automatically back up our database, just in case.
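
    The exact script isn’t important; the general shape is a nightly cron job wrapped around mysqldump, something like this (credentials and paths are placeholders):

    #!/bin/sh
    # Dump the WordPress database to a dated file and prune dumps older than
    # two weeks. Run from cron, e.g.: 0 3 * * * /home/deploy/bin/backup-db.sh
    mysqldump --user=wp_user --password=secret wp_database \
        > /home/deploy/backups/wp_database-$(date +%Y%m%d).sql
    find /home/deploy/backups -name 'wp_database-*.sql' -mtime +14 -delete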

    Everyone has their own wp-config

    Because we’ve all got our own local database usernames and passwords, and because we might use different database names and serving mechanisms, we each keep our own wp-config file. Each of these is stored in Git with a name like wp-config-gavin.php, and when we want to use that config, we symlink it to wp-config.php (which is ignored by Git using .gitignore).
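
    In practice that’s just a symlink and a .gitignore entry, along these lines:

    # Track the per-developer config, ignore the symlink, and link your own
    # config into place (wp-config-gavin.php is the naming convention above).
    git add wp-config-gavin.php
    echo "wp-config.php" >> .gitignore
    ln -s wp-config-gavin.php wp-config.php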

    This also allows us to override the siteurl option in the wp_options database table like so:

    define('WP_SITEURL', 'http://sitename.localhost');
    define('WP_HOME', 'http://sitename.localhost');
    

    This prevents WordPress from looking at the database for the server location, and means there aren’t weird differences in location between local and server installs.

    One final note about wp-config.php files: make sure to store them above the public HTML directory and make the permissions read only for the web user. This makes a huge difference in securing WordPress.
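
    As a rough sketch, assuming the web server runs as www-data and public HTML lives in ~/sitename/html (both assumptions about your particular host):

    # WordPress automatically looks one directory above its root for wp-config.php,
    # so move it up and make it readable but not writable by the web user.
    mv ~/sitename/html/wp-config.php ~/sitename/wp-config.php
    chown deploy:www-data ~/sitename/wp-config.php
    chmod 640 ~/sitename/wp-config.php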

    The database issue

    Finally, the meat of the matter.

    What I had to accept is that, when using WordPress, there’s no good way to “merge” database changes. Instead, we needed to develop rules of conduct to solve this. The rules are fairly simple, and have served us well so far.

    During development, there’s a single person who “owns” the site. That person usually does the setup (getting the hosting package together, starting the Basecamp project, slicing the design, that sort of thing). Once that person gets to a reasonable point, they dump the database for the WordPress install and put it into Git. From that point forward, everyone doing development uses that database dump, and the owner is the only one who makes changes to the database.

    Once the site build gets a bit further along, the site is put on a server. From that point on, the server’s database is canonical. Everyone (including the owner) must make all database changes on the server and pull the changes down for local development and testing.
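
    Pulling the canonical database down is then a one-liner over SSH, something like this (the host alias and credentials are placeholders; because WP_SITEURL and WP_HOME are defined in wp-config, the siteurl/home values in the dump don’t matter locally):

    # Refresh a local database from the server copy.
    ssh sitename 'mysqldump --user=wp_user --password=secret wp_database' \
        | mysql --user=root --password=root wp_database_local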

    This process isn’t perfect. It’s still possible that someone might need to make changes in the WordPress backend locally during development, and then have to make those changes again in production. However, we’ve found that sort of thing to be rare, and this process works fairly well for us.

  2. I work on such an install and have answered questions like this before. Below is my preferred setup for this kind of work. Because you want to merge databases instead of replacing an existing database, I would add one caution to these: don’t use the --add-drop-table flag when doing the MySQL dump.


    • Step 1. Mysqldump your development database
    • Step 2. Replace all instances of development.domain.com with production.domain.com^^
    • Step 3. Log in to MySQL and run a SOURCE command to import the data, e.g. source /path/to/file

    ^^ How to replace all instances of old domain with new: (1) Copy the script below. (2) chmod +x it. (3) Run it.

    Usage: ./script.sh development-dump.sql > production-dump.sql

    #!/bin/sed -f
    s/'\([^']*\)development\.domain\.com\([^']*\)'/'\1production.domain.com\2'/g
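
    Putting the three steps together, it might look something like this (database names and credentials are placeholders; many mysqldump builds emit DROP TABLE statements by default, so --skip-add-drop-table spells out the intent):

    # Step 1: dump the development database without DROP TABLE statements.
    mysqldump --skip-add-drop-table --user=dev_user --password=secret dev_db \
        > development-dump.sql

    # Step 2: rewrite the domain with the sed script above.
    ./script.sh development-dump.sql > production-dump.sql

    # Step 3: import into the production database from the MySQL prompt:
    #   mysql> SOURCE /path/to/production-dump.sql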
    
  3. I’m experimenting with different solutions to this same problem right now. It’s definitely a tricky one.

    My current solution is to do a local mysql dump using the --skip-extended-insert flag. I believe this flag causes an INSERT statement to be generated for every row of the database, making the dump a little more merge-friendly. I picked up that trick from this article: http://www.viget.com/extend/backup-your-database-in-git/.
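
    For reference, the dump and the re-import on the other machine are roughly this (the database name and MAMP credentials are placeholders):

    # Dump with one INSERT per row so diffs and merges stay line-based.
    mysqldump --skip-extended-insert --user=root --password=root wp_local \
        > database/dump.sql

    # The other developer loads the committed dump into his copy of the database.
    mysql --user=root --password=root wp_local < database/dump.sql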

    I then source control the resulting .sql dump file using Git along with the site’s source files. When another developer pulls down code changes, the .sql file comes along with them. He then imports this file into his local version of the database. We have both set up our local databases the same way, using MAMP, so there’s no need to do any search-and-replace.

    There have been merge issues, so we’re trying to adopt a “take turns” approach to anything that will cause a database change. Before I do anything in WordPress that will change the database, I make sure he’s checked in his latest dump, pull it and import it, and then ask him not to make any DB changes until I’ve made and checked in mine. This is obviously not ideal, and I’m looking for a better solution, but it’s a start. It also gives us version control of the DB, which is nice.

    I may end up setting up a shared dev database on the server and connecting both of our local copies of the website to the same DB via SSH tunneling. This approach, however, will run into problems whenever one of us installs a plugin: the PHP files and the MySQL DB will be out of sync.
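
    The tunneling part itself would be simple enough, something like this (the host, port and user are just illustrative):

    # Forward a local port to MySQL on the shared dev server, then point
    # DB_HOST in wp-config.php at the forwarded port.
    ssh -N -L 3307:127.0.0.1:3306 deploy@dev.example.com

    # In wp-config.php on each local machine:
    #   define('DB_HOST', '127.0.0.1:3307');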

    I’m eager to hear how others are dealing with this problem.