get_posts / WP_Query Memory size of 134217728 bytes exhausted

  1. If all you want is to print the IDs to a file, you can write a custom query for it. That way you avoid some of the internal processing WordPress does.

  2. A large number of posts can exhaust your RAM, but just selecting the IDs of 2100 posts should not come close to eating 134 MB. Do the math: an ID fits in a few bytes, but even at 4 bytes each, 2100 x 4 = 8400 bytes ≈ 8.4 KB. PHP obviously needs extra memory for internal processing, object creation and so on, but with 134 MB you could easily process a few hundred thousand IDs. So something else in your code is going wrong — most likely WordPress is building full post objects you don't need, as shown in the sketch right after this list.
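
If you only need the IDs, the simplest fix is to tell WordPress not to build full post objects at all. The following is just a minimal sketch using the standard get_posts() arguments ( 'fields' => 'ids' is a documented WP_Query parameter ); adapt the post type and status to your own setup:

$post_ids = get_posts( array(
    'post_type'      => 'post',
    'post_status'    => 'publish',
    'posts_per_page' => -1,        // all posts; lower this if it is still too heavy
    'fields'         => 'ids',     // return plain integers instead of WP_Post objects
) );

foreach ( $post_ids as $post_id ) {
    // write $post_id to your file here
}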

Anyway, if for whatever reason you need more than that (maybe you need to select everything for each product, not just the ID), you can segment the query with LIMIT clauses, like the following code:

if ( ! defined('ABSPATH') ) {
    /** Set up WordPress environment */
    require_once( dirname( __FILE__ ) . '/wp-load.php' );
}
// $limit determines how many rows you want to handle at any given time
// increase / decrease this limit to see how much your server can handle at a time 
$limit = 100;
$start = 0;

// open file handle
$myfile = fopen( dirname( __FILE__ ) . '/wp_all_import.txt', 'a' );

$qry = "SELECT ID FROM `{$wpdb->posts}` WHERE post_type = 'post' AND post_status = 'publish' LIMIT %d, %d";
while( $result = $wpdb->get_results( $wpdb->prepare( $qry, array( $start, $limit ) ) ) ) {
    $write_data = '';
    foreach ( $result as $row ) {
        $write_data = $write_data . $row->ID . "\n";
    }
    // Generally speaking, writing immediately to the file is better than
    // string concatenation, because depending on what you concatenate and how many
    // times, the generated string may become too big (megabytes in size).
    // On the other hand, writing to the file thousands of times in a single script
    // may cause I/O delays. So here I've done a combination of the two to keep both
    // the string size and the I/O within limits.
    // Adjust this according to your own situation.
    fwrite( $myfile, $write_data );
    $start = $start + $limit;
}

// close file handle
fclose( $myfile );
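
As written, the script resolves wp-load.php relative to its own directory, so it assumes it sits in the WordPress root (next to wp-load.php) and is run directly, for example from the command line with php. If you place it elsewhere, adjust the require_once path accordingly.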

This way PHP only ever handles at most $limit rows at a time, so the memory limit should not be exceeded.

Note: never concatenate your way to very long strings (megabytes long); write to the file before the string grows too large. That may add some I/O delay, but it won't exhaust the memory limit.
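
The same batching idea can also be written with WP_Query instead of raw SQL. The sketch below is only an illustration using the stock WP_Query arguments ( fields, paged, no_found_rows, update_post_meta_cache, update_post_term_cache ); it writes each batch of IDs straight to the file, just like the SQL version:

$myfile = fopen( dirname( __FILE__ ) . '/wp_all_import.txt', 'a' );
$paged  = 1;
$batch  = 100;

do {
    $query = new WP_Query( array(
        'post_type'              => 'post',
        'post_status'            => 'publish',
        'posts_per_page'         => $batch,
        'paged'                  => $paged,
        'fields'                 => 'ids',   // integers only, no WP_Post objects
        'no_found_rows'          => true,    // skip the extra "count all rows" query
        'update_post_meta_cache' => false,   // don't prime the meta cache
        'update_post_term_cache' => false,   // don't prime the term cache
    ) );

    if ( ! empty( $query->posts ) ) {
        fwrite( $myfile, implode( "\n", $query->posts ) . "\n" );
    }

    $paged++;
} while ( ! empty( $query->posts ) );

fclose( $myfile );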
