On one of my sites I bulk import data from a JSON file and publish it as posts. To prevent duplicate posts, I first run the following query against the WP database:
$identificatore = $deals->uuid;
$output = $wpdb->get_var( $wpdb->prepare(
    "SELECT COUNT(wpo.ID)
     FROM {$wpdb->posts} wpo
     INNER JOIN {$wpdb->postmeta} wpm ON wpo.ID = wpm.post_id
     WHERE wpm.meta_key = 'deal_id'
     AND wpm.meta_value = %s",
    $identificatore
) );
if(empty($output)) {
// retrieve data from json and publish as post
}
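For context, inside that if the import does roughly this (a simplified sketch; the field names on $deals are just placeholders for whatever my JSON actually contains):

$post_id = wp_insert_post( array(
    'post_title'   => $deals->title,       // placeholder: real JSON fields differ
    'post_content' => $deals->description, // placeholder
    'post_status'  => 'publish',
    'post_type'    => 'post',
) );

if ( $post_id && ! is_wp_error( $post_id ) ) {
    // Save the UUID so the next import run can detect this deal as a duplicate.
    update_post_meta( $post_id, 'deal_id', $identificatore );
}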
The problem:
When the blog has 1,500-2,000 published posts it's fine, but once it has 10,000-15,000, the query takes more than 15-20 seconds.
What I'm thinking of doing is a preliminary check on the post: first see whether it is filed under the term "sold out" in the "status" taxonomy, and only for posts under that term jump to the query that checks whether a custom field with the value $identificatore exists (see the sketch below).
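Something like this is what I have in mind (a rough sketch only; I'm assuming the term slug is "sold-out"):

$check = new WP_Query( array(
    'post_type'      => 'post',
    'posts_per_page' => 1,
    'fields'         => 'ids',   // we only need to know whether a match exists
    'no_found_rows'  => true,    // skip the pagination count query
    'tax_query'      => array(
        array(
            'taxonomy' => 'status',
            'field'    => 'slug',
            'terms'    => 'sold-out', // assumed slug for the "sold out" term
        ),
    ),
    'meta_query'     => array(
        array(
            'key'   => 'deal_id',
            'value' => $identificatore,
        ),
    ),
) );

if ( ! $check->have_posts() ) {
    // no duplicate found, safe to insert the new post
}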
However, if you know a lighter way to prevent post duplication with wp_insert_post, I'd be grateful for your help.
Thanks