I'm trying to implement a cache for a high-traffic WordPress site in PHP. So far I've managed to store the results on a ramfs and load them directly from the .htaccess. However, during peak hours more than one process is generating certain pages, and this is becoming an issue.

I was thinking that a mutex would help, and I was wondering if there is a better way than `system("mkdir cache.mutex")`.
From what I understand, you want to make sure only a single process at a time is running a certain piece of code. A mutex or a similar mechanism could be used for this. I myself use lockfiles, since they work on many platforms and don't rely on a specific library that is only available on Linux, etc.
For that, I have written a small `Lock` class. Do note that it uses some non-standard functions from my library, for instance to determine where to store temporary files, but you could easily change that.

Usage
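The original class is not shown here, so to make the usage below concrete, here is a minimal sketch of what such a `Lock` class could look like, using `flock()` on a lockfile and `sys_get_temp_dir()` in place of the library helpers. The class and method names are assumptions, not the author's actual code:

```php
<?php
// Hypothetical sketch of a lockfile-based Lock class.
class Lock
{
    private $handle;
    private $path;

    public function __construct(string $name)
    {
        // The library helper for the temp directory is replaced
        // by PHP's built-in sys_get_temp_dir().
        $this->path = sys_get_temp_dir() . '/' . $name . '.lock';
    }

    // Try to acquire the lock without blocking; returns false
    // if another process already holds it.
    public function lock(): bool
    {
        $this->handle = fopen($this->path, 'c');
        if ($this->handle === false) {
            return false;
        }
        return flock($this->handle, LOCK_EX | LOCK_NB);
    }

    public function unlock(): void
    {
        if ($this->handle) {
            flock($this->handle, LOCK_UN);
            fclose($this->handle);
            $this->handle = null;
        }
    }
}
```

The lock is released automatically by the OS if the process dies, which is the main advantage of `flock()` over the `mkdir`-as-mutex trick from the question.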
Now you can have two processes that run at the same time and execute the same script:
Process 1
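A hypothetical usage sketch for the first process, assuming a `Lock` class with non-blocking `lock()`/`unlock()` methods as described above:

```php
<?php
// Process 1: acquires the lock and does the expensive work.
$lock = new Lock('generate-page');
if ($lock->lock()) {
    // Only one process at a time gets here.
    sleep(10); // stands in for generating the cached page
    $lock->unlock();
}
```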
Process 2
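And the matching sketch for the second process: the non-blocking `lock()` fails while process 1 holds the lock, so the guarded code is simply skipped:

```php
<?php
// Process 2: started while process 1 is still working.
$lock = new Lock('generate-page');
if ($lock->lock()) {
    // Not reached while process 1 holds the lock.
    $lock->unlock();
} else {
    // Skip regeneration, e.g. serve the stale cached copy instead.
}
```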
Another alternative would be to have the second process wait for the first one to finish instead of skipping the code.
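That waiting variant can be sketched with a blocking `flock()` call: without the `LOCK_NB` flag it suspends the caller until the first process releases the lock (the lockfile path here is illustrative):

```php
<?php
// Hypothetical blocking variant: flock() without LOCK_NB waits
// instead of failing immediately.
$handle = fopen(sys_get_temp_dir() . '/generate-page.lock', 'c');
if ($handle !== false && flock($handle, LOCK_EX)) {
    // By the time we get here, the other process has finished,
    // so the freshly generated cache can simply be served.
    flock($handle, LOCK_UN);
}
if ($handle !== false) {
    fclose($handle);
}
```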
Ram disk
To limit the number of writes the lockfiles cause on your HDD/SSD, you could create a RAM disk to store them in.
On Linux you could add something like the following to `/etc/fstab`:
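For example, a tmpfs entry along these lines (the mount point and size are illustrative, not taken from the original answer):

```
tmpfs  /mnt/ramdisk  tmpfs  defaults,size=64M  0  0
```

After `mount /mnt/ramdisk`, lockfiles written there live purely in RAM.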
On Windows you can download something like the ImDisk Toolkit and create a RAM disk with that.
I agree with @gries, a reverse proxy is going to be a really good bang-for-the-buck way to get high performance out of a high-volume WordPress site. I’ve leveraged Varnish with quite a lot of success, though I suspect you can do so with nginx as well.