PHP best way to send post requests to hundreds of sites?

I’ve tried using Rolling Curl, Epi Curl, and the other PHP multi-cURL solutions that are out there, and it takes an average of 180 seconds to send POST requests to JUST 40 sites and receive data (just small success/fail strings) back from them. That is dog slow!

It only does well with a single POST request, which takes around 3–6 seconds, and I don’t even know if that’s good, because I see others talking about getting 1-second responses, which is crazy.


I’ve also tried using proc_open to run Linux shell commands (curl, wget), but that is slow too, and not server-friendly.

What I’m essentially trying to build is a WordPress plugin that can manage multiple WordPress sites and do mass upgrades, remote publishing, blogroll management, etc. I know there is a site out there called managewp.com, but I don’t want to use their service because I want to keep the sites I manage private and develop my own. What I notice about them is that their request/response is ridiculously fast, and I’m puzzled at how they’re able to do that, especially with hundreds of sites.

So can someone please shed light on how I can make these POST requests faster?

Edit

I’ve been doing some thinking and I asked myself, “What is so important about fetching the response? It’s not as if the requests don’t get processed properly; they do, 99% of the time!”

So I was thinking: maybe I can just send all the requests without waiting for the responses. If I really want to track those processes and how they went, I can have the child sites send a POST request back with the status of the process, have the master site add those statuses to a database table, and have an AJAX request poll every 10 seconds or so for updates. How does that sound?
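
For what it’s worth, here is a minimal fire-and-forget sketch of that idea, assuming plain HTTP on port 80; post_async() is a hypothetical helper that writes the request and closes the socket without reading the response:

    // Hypothetical fire-and-forget POST: send the request, then close the
    // socket without waiting for the response body.
    function post_async($host, $path, array $data) {
        $body = http_build_query($data);
        $fp = @fsockopen($host, 80, $errno, $errstr, 5); // 5-second connect timeout
        if (!$fp) {
            return false; // could not connect
        }
        $req  = "POST $path HTTP/1.1\r\n";
        $req .= "Host: $host\r\n";
        $req .= "Content-Type: application/x-www-form-urlencoded\r\n";
        $req .= "Content-Length: " . strlen($body) . "\r\n";
        $req .= "Connection: Close\r\n\r\n";
        $req .= $body;
        fwrite($fp, $req);
        fclose($fp); // don’t wait for a reply
        return true;
    }

This trades away error reporting for speed, which is exactly why the status-callback idea above would still matter.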



2 comments

  1. cURL takes about 0.6–0.8 seconds per request

    So for about 500 websites it could take from 300 to 400 seconds.

    You could whip this through a loop (see the loop sketch at the end of this comment).

    $ch = curl_init(); // Init cURL
    
    curl_setopt($ch, CURLOPT_URL, "http://www.example.com/post.php"); // POST target
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // 1 = return the response as a string, 0 = print it directly
    curl_setopt($ch, CURLOPT_POST, true); // Use HTTP POST
    
    // Our data
    $postdata = array(
        'name1' => 'value1',
        'name2' => 'value2',
        'name3' => 'value3',
        'name4' => 'value4'
    );
    
    curl_setopt($ch, CURLOPT_POSTFIELDS, $postdata); // Attach the data to the request
    
    $o = curl_exec($ch); // Execute the request and capture the response
    
    curl_close($ch); // Finish the request. Close the handle.
    

    This also depends on your connection speed. From a datacenter it should be fine; if you’re testing from a home connection the results may not be as good.
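
    For completeness, that loop might look something like the sketch below; reusing one handle across sites is my own addition, and $sites is a hypothetical array of POST URLs:

    // Sequential loop sketch: reuse one cURL handle across sites.
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $postdata);
    
    foreach ($sites as $site) {
        curl_setopt($ch, CURLOPT_URL, $site); // retarget the same handle
        $o = curl_exec($ch);                  // one site at a time
        // ... record $o per site here ...
    }
    curl_close($ch);

    Being sequential, this is exactly where the 0.6–0.8 seconds per request adds up; the curl_multi approach in the next comment runs requests concurrently instead.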

  2. I’m currently working on a project that downloads hundreds of URLs at a time with PHP and curl_multi. Do batches of up to 250 URLs and play with CURLOPT_TIMEOUT and CURLOPT_CONNECTTIMEOUT to refine your code’s speed.
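
    A rough batching sketch along those lines; $urls and $postdata are hypothetical names, and the batch size and the two timeouts are the knobs worth tuning:

    // curl_multi batch sketch: run up to $batchSize POSTs concurrently.
    function multi_post(array $urls, array $postdata, $batchSize = 250) {
        $results = array();
        foreach (array_chunk($urls, $batchSize) as $batch) {
            $mh = curl_multi_init();
            $handles = array();
            foreach ($batch as $url) {
                $ch = curl_init($url);
                curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
                curl_setopt($ch, CURLOPT_POST, true);
                curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($postdata));
                curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // tune this
                curl_setopt($ch, CURLOPT_TIMEOUT, 15);       // and this
                curl_multi_add_handle($mh, $ch);
                $handles[$url] = $ch;
            }
            do { // drive the whole batch concurrently
                curl_multi_exec($mh, $running);
                if ($running && curl_multi_select($mh) === -1) {
                    usleep(100000); // select failed; back off briefly
                }
            } while ($running > 0);
            foreach ($handles as $url => $ch) {
                $results[$url] = curl_multi_getcontent($ch);
                curl_multi_remove_handle($mh, $ch);
                curl_close($ch);
            }
            curl_multi_close($mh);
        }
        return $results;
    }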

    I have a cURL class (2500+ lines) handling all the cURL magic, including multi and straight-to-file downloads. 250 URLs in 15–25 seconds using decent timeouts. (But I’m not sharing it for free…)

    PS: Downloading that many URLs would require using temporary files as cURL download targets rather than memory. Just a thought…
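
    For instance, CURLOPT_FILE streams the response body straight to a file handle instead of holding it in memory; the temp-file naming here is purely illustrative:

    // Stream the response to a temp file instead of memory via CURLOPT_FILE.
    $tmp = tempnam(sys_get_temp_dir(), 'dl_'); // illustrative temp file
    $fp  = fopen($tmp, 'w');
    $ch  = curl_init("http://www.example.com/big-file");
    curl_setopt($ch, CURLOPT_FILE, $fp); // write the body to $fp, not to a string
    curl_setopt($ch, CURLOPT_TIMEOUT, 60);
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);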