- Issue created by @adevweb
- Status changed to Postponed: needs info
about 1 year ago 1:05pm 13 September 2023 - 🇫🇷France fgm Paris, France
Hi @adevweb, thanks for your report.
The cache metadata issue means the filter will run on every page view, so it consumes more time on each hit after the first one on a given content field than it would if the metadata allowed the result to be cached; but it does not affect memory usage when the filter processing actually runs, whether on the first hit on a content after a flush (with cache) or on every hit (without cache).
So I'll split the metadata issue into a separate issue, so we can focus on the memory usage here.
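For reference, here is a minimal sketch of how a Drupal filter plugin attaches cacheability metadata to its output, so the processed text can be cached per field instead of being recomputed on every page view. The replaceDefinitions() helper and the cache tag are hypothetical, not the module's actual code:

```php
use Drupal\filter\FilterProcessResult;

public function process($text, $langcode) {
  // Hypothetical helper standing in for the actual definition markup logic.
  $processed = $this->replaceDefinitions($text);
  $result = new FilterProcessResult($processed);
  // Invalidate the cached result whenever node content changes
  // (hypothetical tag choice; the real tags depend on the module).
  $result->addCacheTags(['node_list']);
  return $result;
}
```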
Since this is a new issue I never saw in any release for Drupal 4, 5, 6 and 7, even on a glossary with 6k definitions, I need some more details. Ideally I would get an anonymized DB dump, but assuming you do not have one available, can you please confirm:
- the number of definition nodes
- their average number of revisions and translations (I guess French only, no English, so just 1?)
- the number of DFN elements in each field in the content on which you notice this. If the content type has multiple fields with a format including the Definition filter, please separate the number of DFN elements per field, as the content is filtered per field, not as a whole.
- if you can perform profiling (e.g. using https://www.drupal.org/project/xhprof, https://xdebug.org/docs/profiler, or https://www.blackfire.io/), any trace you could provide would be useful
- alternatively, if you can reproduce the problem at will, it would be useful if you could temporarily edit the filter to display the results of https://www.php.net/manual/en/function.memory-get-usage.php and https://www.php.net/manual/en/function.memory-get-peak-usage.php at the beginning and end of the filter (see the sketch after this list), both on nodes with multiple DFN elements that do not trigger the issue and on those that do, so we can understand what the numbers involved are
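A minimal sketch of that instrumentation, assuming the G2 filter is a standard Drupal filter plugin with a process() method; the doProcess() helper stands in for the existing filter logic and is hypothetical, and the values are sent to the logger rather than displayed inline:

```php
// Temporary instrumentation only: remove after collecting the numbers.
public function process($text, $langcode) {
  $logger = \Drupal::logger('g2');
  $logger->debug('Filter start: usage=@usage peak=@peak', [
    '@usage' => memory_get_usage(TRUE),
    '@peak' => memory_get_peak_usage(TRUE),
  ]);

  // Hypothetical stand-in for the existing filter processing.
  $result = $this->doProcess($text, $langcode);

  $logger->debug('Filter end: usage=@usage peak=@peak', [
    '@usage' => memory_get_usage(TRUE),
    '@peak' => memory_get_peak_usage(TRUE),
  ]);
  return $result;
}
```

Comparing the start/end values between fields with many DFN elements and fields with none would show how much of the growth is attributable to the filter itself.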
Downgrading to major, as it does not prevent operation of the module in the general case; it only requires a higher memory_limit than ought to be necessary.
- 🇫🇷France fgm Paris, France
Might be related to 🐛 Chunk multiple cache sets into groups of 100 to avoid OOM/max_allowed_packet issues (Fixed), since we load the definitions. Needs investigation.
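For reference, a minimal sketch of the chunking approach from that issue, assuming $items is the array of cache items, keyed by cid, that was previously written in a single setMultiple() call; the cache bin name is an assumption:

```php
// Writing the definitions in groups of 100 bounds both PHP memory use and
// the size of the resulting MySQL packet (max_allowed_packet).
$cache = \Drupal::cache('g2');
foreach (array_chunk($items, 100, TRUE) as $chunk) {
  $cache->setMultiple($chunk);
}
```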
- 🇫🇷France gdevreese
Hi @fgm,
D10 version: 10.1.4 (issue was first noticed on d10.0)
G2 version: 1.0.0-beta2
G2 entries: ~100, 1 translation, no revisions
Filter: g2:automatic
Stop list: empty
We notice an average 4 MB increase in memory_get_usage() and a 6 MB increase in memory_get_peak_usage() between a text field containing all the G2 entries and another text field with no entries at all.
Not sure if the patch mentioned above fixed the memory usage, but it now seems fairly acceptable.
Please note that we had only one concurrent request for this example. The original issue was noticed only in production, with an average of 7k concurrent users.
- Status changed to Closed: cannot reproduce
7 months ago 7:46am 12 April 2024