Inserting large amounts of data into a custom table during plugin install

I am working on a WordPress plugin that creates several new tables in the database and loads some default data into them from CSV files. Most of the files are small and loading the data works fine. One, however, is a zip code database meant to be loaded with just over 43,000 rows of data.

At first I tried exactly what I had done for the tables with significantly smaller amounts of data. WordPress responded with, “Plugin could not be activated because it triggered a fatal error.” Checking the database, I saw that it got through just over 1,000 zip codes before it stopped. So I took the first 1,500 lines from that CSV and split them into two CSV files of 750 lines each, then used the code below to loop through loading the two files, to test whether this could at least work as an incredibly slow fallback. It turned out it still only got through 1,099 zip codes before stopping.

Does anyone have a solution for inserting very large amounts of data into a table from a WordPress plugin? Thanks in advance to anyone who tries to help me here.

Here is an example line from the zips CSV:

%1%;%00544%;%NY%;%HOLTSVILLE%;%-73.047623%;%40.813296%;%0%
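For reference, that line parses cleanly with PHP's built-in CSV functions using `;` as the delimiter and `%` as the enclosure; a quick standalone check (not part of the plugin):

```php
<?php
// Quick standalone check that the CSV format parses as expected:
// ';' is the delimiter, '%' is the enclosure character.
$line = '%1%;%00544%;%NY%;%HOLTSVILLE%;%-73.047623%;%40.813296%;%0%';
$fields = str_getcsv($line, ';', '%', '\\');

// $fields is array('1', '00544', 'NY', 'HOLTSVILLE', '-73.047623', '40.813296', '0')
```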

Here is the create table function:

function zip_table_create() {
    global $wpdb;
    $table_name = $wpdb->prefix . "zip";

    $sql = "CREATE TABLE $table_name (
        `zip_id` bigint(20) NOT NULL AUTO_INCREMENT,
        `zip` char(5) DEFAULT NULL,
        `state` char(2) NOT NULL DEFAULT '',
        `name` char(40) DEFAULT NULL,
        `lng` double NOT NULL DEFAULT '0',
        `lat` double NOT NULL DEFAULT '0',
        `population` int(10) unsigned NOT NULL DEFAULT '0',
        PRIMARY KEY  (`zip_id`)
    );";

    // dbDelta() lives in upgrade.php and is not loaded on normal requests
    require_once ABSPATH . 'wp-admin/includes/upgrade.php';
    dbDelta($sql);

    // Check to see if any records are stored in the table;
    // if not, load the default data from the CSV files
    $zip = $wpdb->get_row("SELECT * FROM $table_name WHERE zip_id = 1");
    if ($zip == null) {
        for ($z = 1; $z < 3; $z++) {
            $csvpath = plugin_dir_path(__FILE__) . "csv/zips" . $z . ".csv";

            $csv_array = csv2array($csvpath, ";", "%");

            for ($x = 0; $x < count($csv_array); $x++) {
                $wpdb->insert(
                    $table_name,
                    array(
                        'zip_id'     => $csv_array[$x][0],
                        'zip'        => $csv_array[$x][1],
                        'state'      => $csv_array[$x][2],
                        'name'       => $csv_array[$x][3],
                        'lng'        => $csv_array[$x][4],
                        'lat'        => $csv_array[$x][5],
                        'population' => $csv_array[$x][6]
                    )
                );
            }
        }
    }
}

Here is the csv2array function called in the create function:

function csv2array($file, $delimiter, $enclosure) {
    $data2DArray = array(); // initialize so the function is safe if fopen() fails
    if (($handle = fopen($file, "r")) !== FALSE) {
        $i = 0;
        while (($lineArray = fgetcsv($handle, 4000, $delimiter, $enclosure)) !== FALSE) {
            for ($j = 0; $j < count($lineArray); $j++) {
                $data2DArray[$i][$j] = $lineArray[$j];
            }
            $i++;
        }
        fclose($handle);
    }
    return $data2DArray;
}
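One likely cause of the fatal error is memory: csv2array() builds the entire 43,000-row file into a PHP array before a single insert runs. A hedged alternative sketch (csv_each() is a name invented here, not part of the plugin or of WordPress) streams the file and hands each row to a callback, so only one line is held in memory at a time:

```php
<?php
// Sketch: stream a CSV file row by row instead of building a full 2D array.
// csv_each() is a hypothetical helper, not part of the original plugin.
function csv_each($file, $delimiter, $enclosure, callable $callback) {
    $count = 0;
    if (($handle = fopen($file, "r")) !== FALSE) {
        while (($row = fgetcsv($handle, 4000, $delimiter, $enclosure, "\\")) !== FALSE) {
            $callback($row);   // e.g. insert this one row, or buffer it for batching
            $count++;
        }
        fclose($handle);
    }
    return $count;             // number of rows handed to the callback
}
```

In zip_table_create() the inner insert loop would then move into the callback, e.g. `csv_each($csvpath, ";", "%", function ($row) use ($wpdb, $table_name) { $wpdb->insert($table_name, array('zip_id' => $row[0] /* ... */)); });`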

2 comments

  1. I have some suggestions for you:

     1. Build the MySQL INSERT queries yourself.
     2. Combine multiple rows into a single multi-row INSERT, like
        VALUES (1,2,3),(4,5,6)
     3. Then run each combined query with $wpdb->query().

     You need to figure out what an appropriate batch size is. I’ve tried 100 rows per query and it worked for me, but I think the number varies with server settings.

  2. One more suggestion is to extend the PHP execution time limit within the script, for example:

     set_time_limit($seconds);

     where $seconds is the number of seconds the script is allowed to keep running from the point of the call (the call restarts the timer).
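Putting the two comments together, here is a hedged sketch of the batched approach; build_zip_insert() is a name invented for illustration, and the escaping shown is simplified (inside WordPress you would prefer $wpdb->prepare()):

```php
<?php
// Sketch: combine many rows into one multi-row INSERT statement.
// build_zip_insert() is a hypothetical helper, not a WordPress function.
function build_zip_insert($table, array $rows) {
    $tuples = array();
    foreach ($rows as $r) {
        // Simplified escaping for illustration; use $wpdb->prepare() in practice.
        $tuples[] = "('" . implode("','", array_map('addslashes', $r)) . "')";
    }
    return "INSERT INTO $table (zip_id, zip, state, name, lng, lat, population) VALUES "
         . implode(',', $tuples);
}

// Inside zip_table_create(), roughly 100 rows per query as suggested above:
// set_time_limit(300); // give the activation request more time
// foreach (array_chunk($csv_array, 100) as $batch) {
//     $wpdb->query(build_zip_insert($table_name, $batch));
// }
```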