Alternative approach to unique queue using upsert

Created on 7 August 2023
Updated 20 September 2024

Problem/Motivation

Hello - and thank you for this module! We have been using the database_unique queue, and it has greatly improved on the default database queue from the purge module.

We found that, in certain situations, checking each item individually in findItem became a bottleneck: when multiple requests arrived in a short time and invalidated the same cache tags (in combination with purge_queuer_url), this caused thousands of queries checking for duplicate items.

My colleague suggested we try an alternative approach: create a unique identifier for each queued item and use upsert queries to avoid the individual findItem calls.
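To make the idea concrete, here is a minimal sketch of what an upsert-based createItem() could look like using Drupal core's upsert() query builder. The table name, column names and function name are hypothetical, used only for illustration, and are not the actual schema or code in the patch; the queue table is assumed to have a unique index on the unique_key column.

<?php

use Drupal\Core\Database\Connection;

/**
 * Illustrative sketch: queue an item with a single upsert query instead of
 * a findItem() lookup followed by an insert. Table and column names are
 * hypothetical; the table is assumed to have a unique index on 'unique_key'.
 */
function example_upsert_queue_item(Connection $connection, string $name, $data, string $unique_key) {
  // If a row with the same unique key already exists the upsert updates it,
  // otherwise a new row is inserted - either way it is one query per item,
  // so no per-item findItem() lookup is needed.
  $connection->upsert('queue_unique_example')
    ->key('unique_key')
    ->fields([
      'unique_key' => $unique_key,
      'name' => $name,
      'data' => serialize($data),
      'created' => \Drupal::time()->getRequestTime(),
    ])
    ->execute();
}

On MySQL, for example, this compiles down to INSERT ... ON DUPLICATE KEY UPDATE, so duplicate detection is handled by the database's unique index rather than by application-side lookups.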

In our scenario, the ideal solution would be for the ID of the URL from the URL registry to flow through as the unique ID of the invalidation (see https://www.drupal.org/project/purge_queuer_url/issues/2941913#comment-1... , "At runtime track URLs that have already been queued and skip those"), but working with what the framework offers, I have hashed the two values used by the database_unique queue and stored the result as the unique index.
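For the sketch above, the unique key could be derived by hashing the same two values the database_unique queue already uses for its uniqueness check - assumed here to be the queue name and the serialized item data. The helper below is illustrative only; the exact inputs and hash function in the patch may differ.

<?php

use Drupal\Component\Utility\Crypt;

/**
 * Illustrative sketch: derive a deterministic, fixed-length unique key for a
 * queue item so it can be stored in a unique-indexed column.
 */
function example_unique_key(string $name, $data): string {
  // Hash the queue name together with the serialized item data; identical
  // items in the same queue always produce the same key.
  return Crypt::hashBase64($name . serialize($data));
}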

Steps to reproduce

We have been testing this scenario with the following setup:

  1. Enable purge_queuer_url and add thousands of items to the registry with the media_list tag.
  2. Install and enable media_bulk_upload - this module gives a quick way to produce multiple batch requests that invalidate media_list.

I am adding a patch here in case you or anybody else wants to explore an alternative approach to queues.

Category

Feature request
Status

Fixed

Version

2.0

Component

Code

Created by

ericgsmith (New Zealand)


