Memcache performance issue with large (i.e. chunked) cache items

Created on 12 September 2023, over 1 year ago
Updated 6 December 2023, about 1 year ago

Problem/Motivation

When storing larger items in the cache, the Memcache module breaks the item down into smaller chunks and re-combines them before returning the item on load. However, the code re-combines the chunks by passing a closure to array_reduce (a simplified sketch follows the list below), which creates two performance problems:

1. Function calls are relatively expensive in PHP, so each closure invocation in this example adds roughly 9 ms.

2. Because the accumulated string keeps growing and is passed to the closure on every call, it has to be copied and extended in memory repeatedly, so each consecutive call gets slower. In this example, by the 50th chunk an individual call takes nearly 30 ms, up from about 9 ms at the start.
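
For context, the reassembly roughly follows the pattern below (a simplified sketch based on the description above, not the module's verbatim code):

// Simplified sketch: each $item->data holds one chunk of the serialized
// value, and the growing $carry string is copied on every closure call.
$serialized = array_reduce($items, function ($carry, $item) {
  return $carry . $item->data;
}, '');
return unserialize($serialized);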

Steps to reproduce

Store an item in the cache that is large enough to be chunked, e.g. into 50 chunks, and profile how long loading it takes.
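
For example, a throwaway snippet along these lines could be used (the cache bin, cache ID, and the ~50 MB payload are illustrative assumptions; with memcached's default 1 MB item size limit this would be split into roughly 50 chunks):

// Hypothetical snippet to create a chunked cache item for profiling.
// Raise memory_limit if necessary for a payload this size.
$value = str_repeat('x', 50 * 1024 * 1024);
\Drupal::cache('default')->set('memcache_chunk_test', $value);

// Profile this load, e.g. with Xdebug or Blackfire.
$cached = \Drupal::cache('default')->get('memcache_chunk_test');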

Proposed resolution

Replace array_reduce and the inline function with native PHP functions that do not require a userland loop at all:

return unserialize(implode(array_column($items, 'data')));

https://git.drupalcode.org/project/memcache/-/blob/8.x-2.x/src/MemcacheB...
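
To get a rough feel for the difference, the two reassembly strategies can be compared outside Drupal with a small standalone benchmark (the chunk count and chunk size below are assumptions for illustration, not measurements from this issue):

// Rough standalone benchmark of the two reassembly strategies,
// assuming 50 chunks of 1 MB each.
$items = [];
for ($i = 0; $i < 50; $i++) {
  $items[] = (object) ['data' => str_repeat('x', 1024 * 1024)];
}

$start = microtime(TRUE);
$reduced = array_reduce($items, function ($carry, $item) {
  return $carry . $item->data;
}, '');
printf("array_reduce + closure:  %.1f ms\n", (microtime(TRUE) - $start) * 1000);

$start = microtime(TRUE);
$imploded = implode(array_column($items, 'data'));
printf("implode(array_column()): %.1f ms\n", (microtime(TRUE) - $start) * 1000);

Both approaches produce the same concatenated string, but the implode()/array_column() version avoids the per-chunk closure calls and the repeated copying of the growing string.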

Remaining tasks

Test and merge

User interface changes

None.

API changes

None.

Data model changes

None.

πŸ“Œ Task

Status: Fixed
Version: 2.0
Component: Code
Created by: thiemo (πŸ‡©πŸ‡ͺ Darmstadt, Germany)


