Improve dfp_token cache; consider moving to token_cache module

Created on 11 September 2016
Updated 20 July 2025

I am currently investigating the performance of some sites using dfp.

The dfp token cache works well overall, but it can still be optimized a lot: it currently caches per text, which is unnecessary when all tokens returned by token_scan() are already known. Also, some tokens are per page, while others are global ($tag) or per user ($user). Right now that is suboptimal, because the cache key contains the user but the cache entry is effectively per page, so every text has to be processed again for each user - always.

So if you have 100 users and 10,000 pages, you potentially end up with 100 * 10,000 = 1,000,000 cache entries.

This gets especially expensive when dfp_token_scan() is called 20-30 times.

The new token cache should work like this:

- Define token objects with their granularity and group them accordingly

e.g.

$token_objects['node'] = [
  'id' => $node->nid,
  'object' => $node,
  'granularity' => DRUPAL_CACHE_PER_PAGE,
];
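
The other granularities could be declared and grouped the same way. A rough sketch - the $user / $tag objects, the property names and the $groups structure are just illustrations, only the DRUPAL_CACHE_* constants come from core:

$token_objects['user'] = [
  'id' => $user->uid,
  'object' => $user,
  'granularity' => DRUPAL_CACHE_PER_USER,
];
$token_objects['dfp_tag'] = [
  // Property name assumed for illustration.
  'id' => $tag->machinename,
  'object' => $tag,
  'granularity' => DRUPAL_CACHE_GLOBAL,
];

// Group the token types by granularity, so that each group can later share
// one cache entry and one cache_get_multiple() lookup.
$groups = [];
foreach ($token_objects as $type => $info) {
  $groups[$info['granularity']][$type] = $info;
}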

- Use cache_get_multiple(), grouped by granularity, to get all cached tokens and their replacements for e.g. the current page
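
A minimal sketch of that lookup, assuming one cache entry per granularity group that holds an array of raw token => replacement; the cid format and the cache_dfp bin are made up here (see the bin suggestion further down):

global $user;

// One cache id per granularity group.
$cids = [
  DRUPAL_CACHE_GLOBAL => 'dfp_tokens:global',
  DRUPAL_CACHE_PER_PAGE => 'dfp_tokens:page:' . current_path(),
  DRUPAL_CACHE_PER_USER => 'dfp_tokens:user:' . $user->uid,
];

// cache_get_multiple() removes every cid it could load from the passed
// array, so use a copy and keep $cids around for the write-back later.
$lookup = $cids;
$hits = cache_get_multiple($lookup, 'cache_dfp');

// Merge all hits into one raw token => replacement map,
// e.g. ['[user:name]' => 'admin'].
$cached_replacements = [];
foreach ($hits as $hit) {
  $cached_replacements += (array) $hit->data;
}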

- Run token_scan() - find all tokens

- See which tokens can be satisfied from the token cache

Cache hit (all tokens found): Replace tokens in text, return
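
Continuing the sketch, the scan and the hit path could look like this ($cached_replacements is the merged map from above):

// token_scan() returns the tokens grouped by type, e.g.
// ['user' => ['name' => '[user:name]'], 'node' => ['nid' => '[node:nid]']].
$tokens = token_scan($text);

// Flatten to the raw token strings so they can be compared with the cache.
$raw_tokens = [];
foreach ($tokens as $type => $names) {
  foreach ($names as $raw) {
    $raw_tokens[] = $raw;
  }
}

$missing = array_diff($raw_tokens, array_keys($cached_replacements));
if (empty($missing)) {
  // Full cache hit: every token is already known, so the text can be
  // rewritten without ever calling token_generate().
  return strtr($text, $cached_replacements);
}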

Note: As token_generate() always calls all hook_tokens() implementations, this is only useful if it is not called at all - all other calls to token_generate() afterwards are already statically cached for most tokens anyway.

Cache miss:

- Put all remaining scanned tokens into a single string:

%user:name%=[user:name]|MAGIC_SEPARATOR|%node:nid%=[node:nid]

- Run token_replace() on this string

- Create an array of key-value pairs for each replaced token and update the cache (@todo: we might need locking here for the cache collector, to avoid playing cache ping-pong)
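
Sketched out with the variables from the snippets above ($missing, $cids, $groups, $token_objects); the separator handling and the subset filtering are just one way to do it:

$separator = '|MAGIC_SEPARATOR|';

// Build one lookup string containing every still-missing token, e.g.
// %user:name%=[user:name]|MAGIC_SEPARATOR|%node:nid%=[node:nid]
$pairs = [];
foreach ($missing as $raw) {
  $pairs[] = '%' . trim($raw, '[]') . '%=' . $raw;
}
$lookup_text = implode($separator, $pairs);

// token_replace() needs the objects keyed by token type.
$data = [];
foreach ($token_objects as $type => $info) {
  $data[$type] = $info['object'];
}

// One single token_replace() call resolves all missing tokens at once.
$replaced = token_replace($lookup_text, $data, ['clear' => TRUE]);

// Parse the result back into raw token => replacement value.
$new_replacements = [];
foreach (explode($separator, $replaced) as $chunk) {
  list($key, $value) = explode('%=', $chunk, 2);
  $new_replacements['[' . ltrim($key, '%') . ']'] = $value;
}

// Write each granularity group back to its cache entry. This is the spot
// where a cache collector / lock_acquire() would prevent two requests from
// overwriting each other (the cache ping-pong mentioned in the @todo).
foreach ($cids as $granularity => $cid) {
  $subset = [];
  foreach ($new_replacements as $raw => $value) {
    $type = substr($raw, 1, strpos($raw, ':') - 1);
    if (isset($groups[$granularity][$type])) {
      $subset[$raw] = $value;
    }
  }
  if ($subset) {
    $cache = cache_get($cid, 'cache_dfp');
    $existing = $cache ? (array) $cache->data : [];
    cache_set($cid, $subset + $existing, 'cache_dfp');
  }
}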

- Gradually build up the term / node / user / tag caches and replace tokens directly in the text.

- Store tokens in the cache

- Directly replace the tokens instead of calling token_generate(), then return
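
With the maps from the sketches above, that last step is a plain string operation, no token_generate() involved:

// Both arrays are keyed by raw token such as '[node:nid]'.
return strtr($text, $new_replacements + $cached_replacements);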

=> Much better cache hit ratio, no dependency on the text changing, and clean invalidation.

Also, let's use a dedicated cache_dfp or cache_token bin.
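
For the dedicated bin the usual D7 pattern should be enough - sketch only, the bin name is just the suggestion from above, and if dfp.install already defines hook_schema() the table would simply be added there:

/**
 * Implements hook_schema().
 */
function dfp_schema() {
  $schema['cache_dfp'] = drupal_get_schema_unprocessed('system', 'cache');
  $schema['cache_dfp']['description'] = 'Cache table for DFP token replacements.';
  return $schema;
}

/**
 * Implements hook_flush_caches().
 */
function dfp_flush_caches() {
  return ['cache_dfp'];
}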

Type: 📌 Task
Status: Closed (outdated)
Version: 1.0
Component: Code
Created by: Fabianx (Germany)
