I don’t have full confirmation that they no longer see out-of-memory issues, but from logging memory usage at shutdown it appears the problem has likely been reduced by setting the following option in the get_posts() call in CartFlows:
'update_post_meta_cache' => false
Obviously this isn’t ideal, as a plugin update will overwrite the change. I may look at using the parse_query hook to see if I can identify the CartFlows request and set this flag via functions.php or a simple custom plugin.
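A rough sketch of that approach, assuming the CartFlows query can be recognised by its post type (the 'cartflows_step' post type check here is an assumption — verify against the post type the plugin actually queries):

```php
// Sketch: disable postmeta cache priming for CartFlows queries.
// parse_query fires after the query vars are parsed, passing WP_Query by reference.
add_action( 'parse_query', function ( $query ) {
    $post_type = $query->get( 'post_type' );

    // Assumed post type — confirm what CartFlows really queries for.
    $is_cartflows = ( 'cartflows_step' === $post_type )
        || ( is_array( $post_type ) && in_array( 'cartflows_step', $post_type, true ) );

    if ( $is_cartflows ) {
        $query->set( 'update_post_meta_cache', false );
    }
} );
```

This survives plugin updates since it lives in functions.php or a small must-use plugin rather than in the CartFlows source.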
It does identify a larger issue though: some plugins (looking at you, Beaver Builder) use postmeta not for storing small flags or additional values, but for large blocks of data, sometimes several megabytes in size. By default WordPress assumes that postmeta will be fairly small and caches all post meta for every post that is loaded, leading to a cache array that can be several hundred megabytes in size. Beaver Builder exacerbates this by storing not only its “current” data structure in meta, but numerous previous revisions as well.
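The default behaviour is visible in any plain get_posts() call — unless told otherwise, WordPress primes the meta cache for every post returned (a sketch; the query args are illustrative):

```php
// By default WordPress loads ALL meta rows for every returned post into the
// in-memory object cache — including any multi-megabyte page-builder blobs.
$posts = get_posts( array(
    'post_type'      => 'post',
    'posts_per_page' => 500,

    // Opting out keeps those rows out of memory up front; get_post_meta()
    // will then fetch meta lazily, per post, only when it is actually called.
    'update_post_meta_cache' => false,
) );
```

The trade-off is extra database round trips if you do go on to read meta for many of those posts, so it makes most sense for queries that never touch the heavy meta at all.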
As the site grows, with more and more posts and revisions, coupled with other plugins that may use get_posts() to retrieve a decent-sized list of posts, you end up with a site that needs hundreds of megabytes, if not gigabytes, of memory to successfully handle a request.