I’ve made a script that runs a WP_Query() returning maybe 2,000 records (and growing), and I want to write these results to a CSV file. I have that working when posts_per_page is set to 300 records or fewer.
If I change posts_per_page to -1 then I get this error:
Fatal error: Allowed memory size of 134217728 bytes exhausted
I’m wondering if there is a way I can send 300 records at a time to the CSV file until there are no more, and then trigger the file for download?
Or maybe stream each line to the CSV file as it’s built? What would be the best way to manage memory when writing the CSV with fopen()/fputcsv()?
Here is what I have now:
$export_query = array(
    'post_type'      => 'videorepot',
    'posts_per_page' => -1,
    'post_status'    => 'publish',
    'meta_query'     => array(
        array(
            'key'     => 'agree',
            'value'   => 1,
            'compare' => '=',
        ),
        array(
            'key'     => 'opt-in',
            'value'   => 1,
            'compare' => '=',
        ),
        array(
            'key'     => 'status',
            'value'   => 'Video Ready',
            'compare' => '=',
        ),
    ),
);
// Posts query
$wp_query = new WP_Query( $export_query );
if ( $wp_query->have_posts() ) :
    $list = array( 'Email,Photo1,Photo2,Photo3,VideoURL' );
    while ( $wp_query->have_posts() ) : $wp_query->the_post();
        $postID     = get_the_ID();
        $user_email = get_post_meta( $postID, 'user_email', true );
        $photo1     = get_post_meta( $postID, 'photo1', true );
        $photo2     = get_post_meta( $postID, 'photo2', true );
        $photo3     = get_post_meta( $postID, 'photo3', true );
        $videourl   = get_post_meta( $postID, 'video_file', true );
        $list[]     = $user_email . ',' . $photo1 . ',' . $photo2 . ',' . $photo3 . ',' . $videourl;
    endwhile;
endif;
// Output file stream
$output_filename = 'export_' . strftime( '%Y-%m-%d' ) . '.csv';
$output_handle   = @fopen( $output_filename, 'w' );

header( 'Cache-Control: must-revalidate, post-check=0, pre-check=0' );
header( 'Content-Description: File Transfer' );
header( 'Content-type: text/csv' );
header( 'Content-Disposition: attachment; filename=' . $output_filename );
header( 'Expires: 0' );
header( 'Pragma: public' );

foreach ( $list as $line ) {
    // Add row to file
    fputcsv( $output_handle, explode( ',', $line ), ',', '"' );
}

// Close output file stream
fclose( $output_handle );

// We're done!
exit;
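For reference, a minimal sketch of the batching idea from the question: keep the same loop, but fetch 300 posts per page and write each row straight to php://output with fputcsv(), so nothing accumulates in $list. The batch size of 300 and writing to php://output instead of a file on disk are assumptions on my part, not part of the original script:

// Sketch: page through the query 300 posts at a time and stream each row.
header( 'Content-Type: text/csv' );
header( 'Content-Disposition: attachment; filename=export_' . date( 'Y-m-d' ) . '.csv' );

$output_handle = fopen( 'php://output', 'w' );
fputcsv( $output_handle, array( 'Email', 'Photo1', 'Photo2', 'Photo3', 'VideoURL' ) );

$export_query['posts_per_page'] = 300; // batch size (assumption)
$paged = 1;

do {
    $export_query['paged'] = $paged;
    $batch = new WP_Query( $export_query );

    while ( $batch->have_posts() ) : $batch->the_post();
        $postID = get_the_ID();
        fputcsv( $output_handle, array(
            get_post_meta( $postID, 'user_email', true ),
            get_post_meta( $postID, 'photo1', true ),
            get_post_meta( $postID, 'photo2', true ),
            get_post_meta( $postID, 'photo3', true ),
            get_post_meta( $postID, 'video_file', true ),
        ) );
    endwhile;

    wp_reset_postdata();
    $paged++;
} while ( $batch->post_count === 300 );

fclose( $output_handle );
exit;

Because every row is flushed to the output stream as soon as it is produced, memory use stays roughly constant no matter how many posts match.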
For some testing I removed the part of the script that writes the items to the CSV file, and I found out that it’s really the query itself that makes it go bonkers. There are maybe 3,000 records in my custom post type. Why would this make my query run out of memory?
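By default WP_Query builds a full WP_Post object for every match and also preloads the post meta and term caches for all of them in one pass, which is likely why posts_per_page => -1 over a few thousand posts blows through the 128 MB limit. If the loop only needs post IDs and a handful of meta values, flags along these lines (my suggestion, not part of the original query) cut most of that overhead:

// Suggested flags to keep WP_Query lean for an export (not in the original query):
$export_query['fields']                 = 'ids';  // return post IDs only, not full WP_Post objects
$export_query['no_found_rows']          = true;   // skip the SQL_CALC_FOUND_ROWS pagination count
$export_query['update_post_meta_cache'] = false;  // don't preload every post's meta into memory
$export_query['update_post_term_cache'] = false;  // don't preload taxonomy terms

With 'fields' => 'ids' the loop gets plain IDs back, so each meta value is then fetched with get_post_meta( $id, ... ) instead of through the post object.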
I decided to go a different route because I was still experiencing memory issues with WP_Query. So I wrote my own MySQL query (hacked together from other examples). It runs really fast, so I think I’ll have better luck with it.
I have a new question related to meta_keys/meta_values. Currently I am using SELECT * in my query, but I don’t want to select everything; I only need about 5 fields, and a few of them are meta_key/meta_value pairs. Is there a way to specify those meta_keys so that the export gives me just 5 columns in the results?
My query looks like this now:
$values = mysql_query("SELECT * FROM wp_posts
INNER JOIN wp_postmeta ON (wp_posts.ID = wp_postmeta.post_id)
INNER JOIN wp_postmeta AS mt1 ON (wp_posts.ID = mt1.post_id)
INNER JOIN wp_postmeta AS mt2 ON (wp_posts.ID = mt2.post_id) WHERE 1=1
AND ( ( post_date >= '2014-11-03 00:00:00' AND post_date <= '2014-12-31 23:59:59' ) )
AND wp_posts.post_type = 'videorepot'
AND (wp_posts.post_status = 'publish')
AND ( (wp_postmeta.meta_key = 'agree' AND CAST(wp_postmeta.meta_value AS CHAR) = '1')
AND (mt1.meta_key = 'opt-in' AND CAST(mt1.meta_value AS CHAR) = '1')
AND (mt2.meta_key = 'status' AND CAST(mt2.meta_value AS CHAR) = 'Video Ready') )
GROUP BY wp_posts.ID
ORDER BY wp_posts.post_date DESC
LIMIT 0,20");
For now I am setting the limit to 20, but I’ve tested it with the limit set to 0,9999 and it runs very fast in phpMyAdmin. The query above gives me around 1282 results when I don’t set the limit.
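One way to avoid SELECT * is to ask MySQL for exactly those meta values, one column each, using conditional aggregation over wp_postmeta. A sketch using $wpdb; the column aliases and the HAVING-based filters are my phrasing of the same conditions as the query above:

global $wpdb;

// Sketch: pull only the five meta values needed, one row per post.
$rows = $wpdb->get_results( "
    SELECT p.ID,
           MAX(CASE WHEN pm.meta_key = 'user_email' THEN pm.meta_value END) AS user_email,
           MAX(CASE WHEN pm.meta_key = 'photo1'     THEN pm.meta_value END) AS photo1,
           MAX(CASE WHEN pm.meta_key = 'photo2'     THEN pm.meta_value END) AS photo2,
           MAX(CASE WHEN pm.meta_key = 'photo3'     THEN pm.meta_value END) AS photo3,
           MAX(CASE WHEN pm.meta_key = 'video_file' THEN pm.meta_value END) AS video_url
    FROM {$wpdb->posts} p
    INNER JOIN {$wpdb->postmeta} pm ON pm.post_id = p.ID
    WHERE p.post_type   = 'videorepot'
      AND p.post_status = 'publish'
      AND p.post_date BETWEEN '2014-11-03 00:00:00' AND '2014-12-31 23:59:59'
    GROUP BY p.ID
    HAVING MAX(CASE WHEN pm.meta_key = 'agree'  THEN pm.meta_value END) = '1'
       AND MAX(CASE WHEN pm.meta_key = 'opt-in' THEN pm.meta_value END) = '1'
       AND MAX(CASE WHEN pm.meta_key = 'status' THEN pm.meta_value END) = 'Video Ready'
    ORDER BY p.post_date DESC
", ARRAY_A );

Each row then comes back with just the ID and the five meta columns, which maps directly onto the CSV columns.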
I think I’ve arrived at a good solution. It works for me, and I did not have to increase the PHP memory limit or anything like that.
Here is the final, unedited script. Keep in mind the MySQL query I’m using relies on a date range and 3 meta_keys to get a subset of data from our database. I’m getting about 1,290 results, all written to the CSV file. (Thanks David for the assist.)
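A minimal sketch of the approach described, assuming $rows comes from the $wpdb query sketched earlier and $output_handle is opened on php://output as in the batching example above; this is a reconstruction of the described approach, not the original script:

// Sketch: write each database row straight to the CSV output and finish.
foreach ( $rows as $row ) {
    fputcsv( $output_handle, array(
        $row['user_email'],
        $row['photo1'],
        $row['photo2'],
        $row['photo3'],
        $row['video_url'],
    ) );
}
fclose( $output_handle );
exit;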
I hope this helps someone else who is dealing with memory issues.
I was also getting the “Tried to allocate more memory” fatal error, but my solution was that inside the while loop I had missed this line:
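Presumably that refers to the call that advances the loop; without it, have_posts() keeps returning true and the loop never ends, eating memory until the limit is hit. A guess at the missing line in context:

while ( $wp_query->have_posts() ) :
    $wp_query->the_post(); // the easily-missed line: advances the loop to the next post
    // ... build the CSV row here ...
endwhile;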
This might be helpful if somebody else is facing the same issue.