I have 2 databases on 2 different servers. The source database is a massive real estate database and the second is a WordPress database. I have to import the data from the source to the WordPress database’s posts and postmeta tables.
Currently, I pull everything out of the source db, then loop through those results to insert each row into the posts table, with a nested query that inserts every field as a meta value using the insert id from the post insert. In other words, it’s a memory hog.
My question is, can the loop I have below be improved so that there aren’t so many individual insert lines? Or does anyone have any suggestions for making this faster / not as sloppy?
// SOURCE DB
$query = "select $fields from listings where data_id = 'B'";
$result = mysql_query($query);
// WORDPRESS DB
while ($row = mysql_fetch_assoc($result)) {
    // Insert the listing as a post
    $query2 = "insert into wp_posts (post_author, post_content, post_title, post_status, comment_status, ping_status, post_name, post_type) values";
    $query2 .= " ('1', '" . mysql_real_escape_string($row['remarks']) . "', '{$row['mls_acct']}', 'publish', 'closed', 'closed', '{$row['mls_acct']}', 'properties')";
    $result2 = mysql_query($query2);
    $id = mysql_insert_id();
    // Insert every source field as postmeta for that post
    foreach ($row as $key => $val)
    {
        $query3 = "insert into wp_postmeta (post_id, meta_key, meta_value) values ";
        $query3 .= "('$id', '$key', '" . mysql_real_escape_string($val) . "')";
        $result3 = mysql_query($query3);
    }
}
The first thought I have on this is to insert the records in batches rather than one at a time. So your inner insert would end up being something like:
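A rough sketch of that, reusing the $id and $row from your loop (illustrative only, not tested against your schema):

$values = array();
foreach ($row as $key => $val) {
    // Collect one tuple per meta field instead of running a query per field
    $values[] = "('$id', '" . mysql_real_escape_string($key) . "', '" . mysql_real_escape_string($val) . "')";
}
// One multi-row insert covers all of this post's meta values
$query3 = "insert into wp_postmeta (post_id, meta_key, meta_value) values " . implode(", ", $values);
$result3 = mysql_query($query3);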
I’m not sure what’s the optimum number of records per batch, but I think you get the idea.
You could probably do an
INSERT ... SELECT FROM ...
as a single query rather than the looped version you’re doing internally; something like the sketch below. I’m not particularly familiar with WP’s internals (and I really, really don’t ever want to become so…), but if you can tweak this to work you’ll save yourself a few dozen separate insert queries inside the loop.
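A rough sketch, assuming the source listings table can be made reachable from the WordPress connection (dumped into the same server, exposed via a FEDERATED table, etc.), since INSERT ... SELECT can’t span two servers on its own:

// Build each post straight from the source rows in one statement;
// the column mapping mirrors the per-row insert from the question.
$query = "insert into wp_posts (post_author, post_content, post_title, post_status, comment_status, ping_status, post_name, post_type)
          select '1', remarks, mls_acct, 'publish', 'closed', 'closed', mls_acct, 'properties'
          from listings
          where data_id = 'B'";
$result = mysql_query($query);

The postmeta rows could then be filled with a similar insert-select per source column (joining wp_posts back to listings on mls_acct), which is one query per field rather than one per field per row.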