Wanted some opinions on a solution I’m trying to come up with:
I have a fan/enthusiast plugin I’m working on for fans of TV shows. One of the custom post types attached to the plugin is a release type, which stores data on DVD releases of a show. Some of the metadata that I want stored is info from Amazon via their Product Advertising API. I wanted to cache some of the pricing data available there, such as the current price of the DVD.
I was thinking of using transients, but from what I can tell they’re nothing more than a layer on top of WP options; they’re too global. I’d have to create numerous transients with a naming scheme that would point to a certain post.
The other option would be to use custom fields and the WP-Cron API to update them. I’d either have one cron job update all the custom fields hourly, or create one cron job per post that did so.
My question is: which solution would be more practical, performance- and convenience-wise?
I think I would write a wrapper function around custom fields, something like this:
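A minimal sketch of what such a wrapper might look like (the function name, meta key, and callback are all placeholders, not part of any real API):

```php
<?php
/**
 * Hypothetical wrapper: returns cached Amazon data from postmeta,
 * refreshing it via a caller-supplied callback once it has gone stale.
 */
function wpse_get_amazon_meta( $post_id, $update_callback, $ttl = 3600 ) {
	$cache = get_post_meta( $post_id, '_amazon_pricing', true );

	// Missing or expired? Re-fetch and store a fresh copy.
	if ( empty( $cache ) || $cache['expires'] < time() ) {
		$cache = array(
			'expires' => time() + $ttl,
			'data'    => call_user_func( $update_callback, $post_id ),
		);
		update_post_meta( $post_id, '_amazon_pricing', $cache );
	}

	return $cache['data'];
}
```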
and save the meta as an array, of the form:
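Something along these lines, say (the fields and values are purely illustrative):

```php
<?php
// Illustrative shape of the cached postmeta value:
// a timestamp plus whatever the API returned.
array(
	'expires' => 1672531200,     // Unix timestamp when the cache goes stale
	'data'    => array(
		'price'    => '19.99',
		'currency' => 'USD',
	),
);
```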
You would just need to have your update function return the same array structure. I don’t know if there’s any real benefit to this approach over using the options table through transients, but it would make more sense to me to have post meta stored in the postmeta table.
When implementing caching with an API, I’ve gone the transients route, mainly because they’re set up for exactly this and easy to use. There aren’t any big problems in terms of table size or queries; MySQL is made for large datasets.
My approach was to take a hash of the URL and/or arguments I was passing to the API and use that as my transient identifier. The function that communicates with the API first checks for the transient, and if it doesn’t exist or has expired, it carries on with the regular API call. The advantage here over what you’re suggesting with post meta is that it’s a single table and key we’re checking, so it’s quick; there’s no looping or complex code needed, and the API is only called when the transient has expired and the data is actually requested, which should reduce the number of API calls you make.
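A rough sketch of that pattern, assuming a JSON-returning API (the function name and key prefix are made up):

```php
<?php
/**
 * Sketch of the hash-keyed transient approach. The request itself
 * (URL + args) is hashed into a unique, length-safe transient key.
 */
function wpse_api_request( $url, $args = array() ) {
	$key  = 'api_' . md5( $url . serialize( $args ) );
	$data = get_transient( $key );

	if ( false === $data ) {
		// Transient missing or expired: make the real API call.
		$response = wp_remote_get( add_query_arg( $args, $url ) );
		if ( is_wp_error( $response ) ) {
			return false; // Leave any old transient alone on failure.
		}
		$data = json_decode( wp_remote_retrieve_body( $response ), true );
		set_transient( $key, $data, HOUR_IN_SECONDS );
	}

	return $data;
}
```

Callers never think about caching at all; they just call `wpse_api_request()` and the transient layer is invisible.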
This works well for both my Flickr and themoviedb.org plugins.
If it’s connected to a specific post, it really should be in postmeta. I’ve violated this principle many times myself for speed and ease of use reasons, but I felt dirty each time. 🙂
The best way, I think, would be to store both the data and an expiration time in postmeta instead. Rather than checking every post hourly, check each one only when it’s viewed. Major benefit: you can do this lazily!
Example:
On post view, the meta is pulled. If the data has expired, the stale data is still shown, but a flag is set somewhere in a global variable (storing just the post ID, probably) telling another part of the code to refresh that data after the page has been displayed.
Then you use the “shutdown” action hook. Here, you check that global variable, and if any posts’ data needs refreshing, you pull the data from Amazon, update the postmeta, and bump the expiration by some amount of time.
This is the best of all worlds, really. The post is still displayed fast, and your viewer doesn’t have to wait for the new data to come from Amazon. The Amazon data gets refreshed after the page is already finished. And you’re only refreshing data people are looking at, and only when they’re looking at it, minimizing processing.
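The whole lazy-refresh pattern might look roughly like this (all names here are hypothetical, and `wpse_fetch_amazon_data()` stands in for your actual API call):

```php
<?php
// Sketch of the lazy-refresh pattern described above.
$wpse_stale_posts = array();

/**
 * Read the cached price from postmeta. If it has expired, still
 * return the stale data, but queue the post for a refresh later.
 */
function wpse_get_cached_price( $post_id ) {
	global $wpse_stale_posts;
	$cache = get_post_meta( $post_id, '_amazon_pricing', true );

	if ( empty( $cache ) || $cache['expires'] < time() ) {
		$wpse_stale_posts[] = $post_id;
	}

	return isset( $cache['data'] ) ? $cache['data'] : false;
}

// After the page has been sent, refresh whatever was flagged.
add_action( 'shutdown', 'wpse_refresh_stale_posts' );
function wpse_refresh_stale_posts() {
	global $wpse_stale_posts;
	foreach ( array_unique( $wpse_stale_posts ) as $post_id ) {
		$data = wpse_fetch_amazon_data( $post_id ); // your Amazon API call
		update_post_meta( $post_id, '_amazon_pricing', array(
			'expires' => time() + HOUR_IN_SECONDS,
			'data'    => $data,
		) );
	}
}
```

Note that on `shutdown` the output has already been sent, so the visitor never waits on Amazon; the slow API work happens entirely after their page is done.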