Potential memory limit issue for pages with many words

Created on 12 September 2023, 10 months ago
Updated 12 April 2024, 3 months ago

Problem/Motivation

There is a potential memory limit issue on pages that contain many words from the glossary.

Steps to reproduce

It depends on the server configuration and on the number of glossary words per page.

Proposed resolution

Add the ability to attach CacheableMetadata in the process() method of the Definition filter class.
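
A minimal sketch of what this could look like, assuming a FilterBase-derived plugin; the class name, plugin ID, cache tags and cache contexts below are illustrative assumptions, not the module's actual code:

```php
<?php

namespace Drupal\g2\Plugin\Filter;

use Drupal\filter\FilterProcessResult;
use Drupal\filter\Plugin\FilterBase;

/**
 * Illustrative filter plugin; class name and plugin ID are assumptions.
 *
 * @Filter(
 *   id = "g2_definition",
 *   title = @Translation("G2 definition links"),
 *   type = Drupal\filter\Plugin\FilterInterface::TYPE_TRANSFORM_REVERSIBLE
 * )
 */
class Definition extends FilterBase {

  /**
   * {@inheritdoc}
   */
  public function process($text, $langcode) {
    // ... replace glossary terms in $text with links to their definitions ...
    $result = new FilterProcessResult($text);

    // Bubble up cacheable metadata so the render cache is invalidated when
    // glossary entries change, instead of re-running the filter on every hit.
    // The tag and context names here are assumptions for this sketch.
    $result->addCacheTags(['node_list']);
    $result->addCacheContexts(['languages:language_content']);

    return $result;
  }

}
```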

Remaining tasks

-

User interface changes

-

API changes

-

Data model changes

-

πŸ› Bug report
Status

Closed: cannot reproduce

Component

Code

Created by

🇫🇷 France adevweb


Comments & Activities

  • Issue created by @adevweb
  • Status changed to Postponed: needs info 10 months ago
  • 🇫🇷 France fgm Paris, France

    Hi @adevweb, thanks for your report.

    The cache metadata issue means the filter will run on each page, so it consumes more time after the first hit on a content (field) than it would without the cache metadata issue; but it does not affect memory usage when the filter processing actually runs, whether on the first hit on a content after a flush (with cache) or on every hit (without cache).

    So I'll split the metadata issue to another issue so we can focus on the memory usage.

    Since this is a new issue I never saw in any release for Drupal 4, 5, 6 or 7, even on a glossary with 6k definitions, I need some more details. Ideally you would provide an anonymized DB dump, but assuming you do not have that available, can you please confirm:

    Downgrading to major, as it does not prevent operation of the module in the general case; it only requires a higher memory_limit than should be necessary.

  • 🇫🇷 France fgm Paris, France

    Might be related to 🐛 Chunk multiple cache sets into groups of 100 to avoid OOM/max_allowed_packet issues (Fixed), since we load the definitions. Needs investigation.

  • 🇫🇷 France gdevreese

    Hi @fgm,

    D10 version: 10.1.4 (the issue was first noticed on D10.0)
    G2 version: 1.0.0-beta2
    G2 entries: ~100, 1 translation, no revision
    Filter: g2:automatic
    Stop list: empty

    We notice an average 4 MB increase in memory_get_usage() and a 6 MB increase in memory_get_peak_usage() between a text field containing all the G2 entries and another text field with no entries at all (a rough sketch of how such a comparison can be scripted appears after the thread).

    Not sure whether the patch mentioned above fixed the memory usage, but it now seems fairly acceptable.
    Please note that we had only one concurrent request for this test; the original issue was noticed only on production, with an average of 7k concurrent users.

  • Status changed to Closed: cannot reproduce 3 months ago
  • 🇫🇷 France fgm Paris, France

    Cannot reproduce so closing.
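
A rough, hypothetical sketch of the memory comparison described in the thread above, for example run via drush php:script; the node ID and the rendering path are placeholder assumptions, not values from the report:

```php
<?php

// Hypothetical measurement script; the node ID below is a placeholder.
$renderer = \Drupal::service('renderer');
$view_builder = \Drupal::entityTypeManager()->getViewBuilder('node');

$before = memory_get_usage();

// Render a node whose text field contains (many) G2 glossary terms, then
// repeat with a node whose text field contains none, and compare the deltas.
$node = \Drupal\node\Entity\Node::load(123);
$build = $view_builder->view($node, 'full');
$renderer->renderPlain($build);

$after = memory_get_usage();
$peak = memory_get_peak_usage();

printf(
  "usage delta: %.1f MB, peak: %.1f MB\n",
  ($after - $before) / 1048576,
  $peak / 1048576
);
```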
