Chunk multiple cache sets into groups of 100 to avoid OOM/max_allowed_packet issues

Created on 14 December 2022, almost 2 years ago
Updated 1 November 2023, about 1 year ago
🐛 Bug report
Status

Fixed

Version

10.1

Component
Cache 


Comments & Activities

Not all content is available!

It's likely this issue predates Contrib.social: some issue and comment data are missing.

  • 🇫🇷France 5

    I am also facing the issue on a taxonomy term page.

Before searching the core issue queue, I realized my MariaDB configuration was the default one, and I tuned it according to https://www.drupal.org/node/259580 with a single effect: the "MySQL server has gone away" error was replaced by a PHP out-of-memory error (512 MB allowed).

    Patch from #15 has fixed the situation (thanks Defcon0).

  • 🇫🇮Finland iSampo

Having this issue as well: MySQL goes away when it tries to cache all of the site's users (~12k) in the cache_entity table, and then dblog compounds it by trying to record a single log message for that insert whose @message contains ~85k placeholders.

The patch in #15 does get rid of the error but has issues, including all cids ending up wrong in every cache table because the array keys are not preserved when chunking.

Here's a patch that does the following:
    - Move the chunking into doSetMultiple() so it is done only once. I'm not really a fan of looping over the chunks and then over the items, but here it is anyway; there are no other logic changes inside the function. Doing it this way will of course increase the number of inserts quite a bit, so I'm not sure what implications that would have.
    - Preserve the cids (see the small illustration after this list).
    - Rename MAX_ENTITIES_PER_CACHE_SET to MAX_ITEMS_PER_CACHE_SET.
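
    A standalone illustration of the key-preservation point above (just a demonstration of the PHP behaviour that broke the cids, not the patch itself): array_chunk() re-indexes the array numerically unless its third argument is TRUE.

      <?php
      $items = [
        'entity_view:node:1' => ['data' => 'a'],
        'entity_view:node:2' => ['data' => 'b'],
        'entity_view:node:3' => ['data' => 'c'],
      ];

      // Without preserve_keys the cids are replaced by 0, 1, 2, ...
      $lost = array_chunk($items, 2);
      // With preserve_keys each chunk keeps the original cids.
      $kept = array_chunk($items, 2, TRUE);

      var_dump(array_keys($lost[0]));  // 0, 1
      var_dump(array_keys($kept[0]));  // 'entity_view:node:1', 'entity_view:node:2'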

  • 🇫🇷France fgm Paris, France

FWIW, in the other issue I had to raise max_allowed_packet to 1024M to support the use case. Maybe we could look up the value of that parameter at some point, perhaps during cache flushes/container rebuilds, and adjust the MAX_ITEMS_PER_CACHE_SET constant accordingly: 100 seems low for most setups.
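
    A rough sketch of that idea (hypothetical, not anything in core): read the server's max_allowed_packet through the database connection and derive a chunk size from it, using an assumed average row size.

      <?php
      $connection = \Drupal::database();
      // SHOW VARIABLES returns (Variable_name, Value); fetch the second column.
      $packet = (int) $connection
        ->query("SHOW VARIABLES LIKE 'max_allowed_packet'")
        ->fetchField(1);

      // Hypothetical heuristic: assume 64 KB per cache row and keep some
      // headroom, never going below the current constant of 100.
      $assumed_row_size = 64 * 1024;
      $chunk_size = max(100, (int) floor($packet / ($assumed_row_size * 2)));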

  • 🇬🇧United Kingdom catch

I don't think the 100 limit is much of a problem even on systems that can handle more; ::setMultiple() is an optimisation over setting items one by one. Even with 1,000 items to set, chunks of 100 mean ten queries instead of one, so that's only nine extra database queries, and each query is likely to be faster given that it's smaller.

  • 🇫🇷France fgm Paris, France

Yeah, maybe just make it a public or even protected property so that anyone with an actual need for higher values can tweak it from the outside (using setAccessible() if it is protected).
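
    A sketch of the kind of override being suggested (hypothetical: core currently uses a class constant, so this only works if it became an instance property; the property name below is made up).

      <?php
      $backend = \Drupal::service('cache.default');
      $property = new \ReflectionProperty($backend, 'maxItemsPerCacheSet');
      $property->setAccessible(TRUE);
      $property->setValue($backend, 500);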

  • Hi, I have a website with about 30 taxonomies. Since 9.4.9, two of my taxonomies crash, the two biggest ones with about 700 terms each, so I applied this patch and now it works nicely, thanks :)

    But I also have Book content where each book has about 100-200 pages; it is now impossible to access them and my View crashes. 9.4.9 introduces big limitations and breaks many, many sites! xD

    Right now WordPress does classification better.... xD

    What can I do? Thanks

  • The patch fixes the Views for taxonomy, but big content (about 100 pages) still blocks, and in the back-end reports the log is broken with the message:

    "The website encountered an unexpected error. Please try again later."

    Every page with a fair amount of data still seems to hit the bug; the patch does not fix this. :(

  • I found a workaround for the moment: I set my View on the big taxonomy to show only 10 items and it works; with more than 20 it fails with a server error....

  • last update about 1 year ago
    Custom Commands Failed
  • Status changed to Needs review about 1 year ago
  • 🇬🇧United Kingdom catch

#21 seems OK to me; it would be worth adding explicit test coverage to the database cache backend that you can set at least 101 cache items.

  • 🇬🇧United Kingdom catch

    Re-titling to make it clearer what the problem is.

  • last update about 1 year ago
    Custom Commands Failed
  • 🇸🇰Slovakia poker10

Added a new test case to DatabaseBackendTest::testSetGet() to test exceeding the chunk size. No other changes made.
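
    A rough sketch of what such coverage looks like (assumed shape, not the exact test that was committed; it leans on the standard cache backend test helpers): set more items than the chunk size in a single call and verify that every cid comes back with its own data.

      <?php
      $backend = $this->getCacheBackend();
      $items = [];
      for ($i = 0; $i < 101; $i++) {
        $items["test:chunk_$i"] = ['data' => $i];
      }
      $backend->setMultiple($items);
      foreach ($items as $cid => $item) {
        $cached = $backend->get($cid);
        $this->assertNotFalse($cached);
        $this->assertSame($item['data'], $cached->data);
      }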

  • last update about 1 year ago
    30,148 pass
  • 🇦🇺Australia acbramley

Fixing PHPStan failures, hiding previous patches.

  • Status changed to Needs work about 1 year ago
  • 🇺🇸United States smustgrave

Only moving to Needs work for an issue summary (IS) update.

  • Status changed to Needs review about 1 year ago
  • 🇦🇺Australia acbramley

Updated the issue summary.

  • Status changed to RTBC about 1 year ago
  • 🇺🇸United States smustgrave

    Thanks @acbramley!

The issue summary lines up with the solution, and the test coverage seems to cover #31.

  • last update about 1 year ago
    30,150 pass
    • catch committed 37d64e23 on 10.1.x
      Issue #3327118 by Defcon0, acbramley, iSampo, poker10, Nitin shrivastava...
    • catch committed a04bdb08 on 11.x
      Issue #3327118 by Defcon0, acbramley, iSampo, poker10, Nitin shrivastava...
  • Status changed to Fixed about 1 year ago
  • 🇬🇧United Kingdom catch

    Committed/pushed to 11.x and cherry-picked to 10.1.x, thanks!

  • Automatically closed - issue fixed for 2 weeks with no activity.

  • Status changed to Fixed about 1 year ago
  • Patch #34 is still OK for the big taxonomies, but the problem is still there when displaying big content; all my big content is blocked with the error:

    Error 503 Backend fetch failed
    Backend fetch failed

    Guru Meditation:
    XID: 699419813

    Varnish cache server

    It is also impossible to build the sitemap or a View with this content.... The limitation is still there and breaks big websites, please help us find a solution :)

    Thanks for all the work, team.

  • 🇫🇷France fgm Paris, France

The 503 in Varnish is not necessarily related to this issue: any page that takes too long to serve will trigger it if your VCL has too short a timeout.

    Can you reproduce the issue without Varnish?

  • The patch does not work with Drupal 10.1.5....
    It was OK from D10.1.0 to 10.1.4.

    The problem with big content is still there....

    Big sites are still broken and blocked with Drupal 10....

  • 🇫🇷France fgm Paris, France

Great to see this merged. However, unless I am missing something, it does not address the potential issue of saturating the static cache, as mentioned as part of 📌 Add an entity iterator to load entities in chunks (Needs work), does it?

    Do we want to track that along with the #2577417 iterator, or separately?

  • I fixed it on my Drupal 10 website:

    I raised the SQL "max_allowed_packet" on my hosting to its maximum of 64 MB and everything is OK. It is set to 1 MB by default, which is too low for Drupal. Maybe the best thing would be to document checking this in a tutorial? I read that 8 MB is the minimum for Drupal 9....
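
    A quick way to check the current value from a running site (for example via drush php:eval); the 64 MB figure is just the value from this comment, not an official requirement:

      <?php
      $bytes = (int) \Drupal::database()
        ->query("SHOW VARIABLES LIKE 'max_allowed_packet'")
        ->fetchField(1);
      printf("max_allowed_packet is %.1f MB\n", $bytes / 1024 / 1024);
      if ($bytes < 64 * 1024 * 1024) {
        print "Consider raising max_allowed_packet in my.cnf or via your host.\n";
      }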
