3 comments

  1. I think I would write a wrapper function around custom fields, something like this:

    function get_post_transient( $post_ID, $meta_key, $update_function ) {
        $current_value = get_post_meta( $post_ID, $meta_key, true );

        // Serve the cached data while its expiration timestamp is still in the future.
        if ( is_array( $current_value ) && $current_value['expiration'] > time() )
            return $current_value['data'];

        // Stale or missing: regenerate via the supplied callback and store it.
        $new_value = call_user_func( $update_function, $post_ID );
        update_post_meta( $post_ID, $meta_key, $new_value );

        return $new_value['data'];
    }
    

    and save the meta as an array of the form:

    array(
        'expiration' => // timestamp at which this field expires
        'data'       => // all the custom data you want to save
    );
    

    You would just need to have your update function return the same array structure. I don’t know if there’s any real benefit of this approach over using the options table through transients, but it would make more sense to me to have post meta stored in the postmeta table.
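
    For example (just a sketch), an update function for, say, cached Amazon data could return that structure directly. Here my_fetch_amazon_data_for_post() is a made-up placeholder for whatever performs the real API call:

    function my_update_amazon_data( $post_ID ) {
        // Hypothetical helper that performs the actual API request.
        $data = my_fetch_amazon_data_for_post( $post_ID );

        return array(
            'expiration' => time() + 3600, // fresh for one hour
            'data'       => $data,
        );
    }

    // Usage:
    $amazon_data = get_post_transient( $post_ID, '_amazon_data', 'my_update_amazon_data' );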

  2. When implementing caching with an API I’ve used the transients route, mainly because transients are set up for exactly this and are easy to use. Table size and query volume haven’t been a problem in practice; MySQL is made for big datasets.

    My approach was to take a hash of the URL and/or the arguments I was passing to the API and use that as my transient identifier. The function that communicates with the API first checks for the transient, and if it doesn’t exist or has expired, it carries on with the regular API call. The advantage over what you’re suggesting with post meta is that it’s a single table and key being checked, so it’s quick: there’s no looping or complex code needed, and the API is only called when the transient has expired and the data is actually requested, which should reduce the number of API calls you make.
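
    Roughly, the pattern looks like this (a sketch; do_api_request() is a stand-in for whatever performs the actual HTTP call):

    function get_api_response( $url, $args = array() ) {
        // Hash the URL and arguments to build a unique transient key.
        $key = 'api_' . md5( $url . serialize( $args ) );

        $response = get_transient( $key );
        if ( false === $response ) {
            // Transient missing or expired: hit the API and cache the result.
            $response = do_api_request( $url, $args ); // hypothetical call
            set_transient( $key, $response, 3600 );    // cache for one hour
        }

        return $response;
    }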

    This works well for both my flickr and themoviedb.org plugins.

  3. If it’s connected to a specific post, it really should be in postmeta. I’ve violated this principle many times myself for speed and ease of use reasons, but I felt dirty each time. 🙂

    The best way, I think, would be to store both the data and an expiration time in postmeta. Instead of refreshing every post’s data on an hourly schedule, check it only when the post is viewed. Major benefit: you can do this lazily!

    Example:

    On post view, the meta is pulled. If the data has expired, the stale data is still shown, but a flag is set in a global variable (storing just the post ID, probably) telling another part of the code to refresh that data at the end of the page display.

    Then you use the “shutdown” action hook. There, you check that global variable, and if any posts’ data needs refreshing, you pull the data from Amazon, update the postmeta, and bump the expiration forward by some amount of time.
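
    A rough sketch of that flow (all names here are illustrative, not from the post):

    $GLOBALS['stale_amazon_posts'] = array();

    // On post view: serve whatever is stored, but remember stale posts.
    function show_amazon_data( $post_ID ) {
        $meta = get_post_meta( $post_ID, '_amazon_data', true );
        if ( ! is_array( $meta ) || $meta['expiration'] < time() )
            $GLOBALS['stale_amazon_posts'][] = $post_ID; // refresh later
        return isset( $meta['data'] ) ? $meta['data'] : '';
    }

    // After the page has been sent: refresh anything that went stale.
    function refresh_stale_amazon_data() {
        foreach ( $GLOBALS['stale_amazon_posts'] as $post_ID ) {
            $data = my_fetch_amazon_data_for_post( $post_ID ); // hypothetical
            update_post_meta( $post_ID, '_amazon_data', array(
                'expiration' => time() + 3600, // bump by one hour
                'data'       => $data,
            ) );
        }
    }
    add_action( 'shutdown', 'refresh_stale_amazon_data' );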

    This is the best of all worlds, really. The post is still displayed quickly, and your viewer doesn’t have to wait for the new data to come from Amazon; the Amazon data gets refreshed after the page has already finished. And you’re only refreshing data people are actually looking at, and only when they’re looking at it, minimizing processing.