Add command to process all items using queues

Created on 2 September 2025

Problem/Motivation

The existing Drush commands in the entity_mesh module use the batch API (drush_batch) to process items. On large sites with many entities to process, this can exhaust memory, causing the command to fail before all items are processed.

Because processing happens in a single batch run, the operation cannot be resumed from where it left off: if the process fails, the entire task must be restarted from the beginning.

Steps to reproduce

This issue is a feature request, so there are no specific steps to reproduce a bug. The problem concerns the performance of the existing Drush commands on large-scale sites.

Proposed resolution

The entity_mesh module already contains a Queue Worker at src/Plugin/QueueWorker/EntityMeshQueueWorker.php, which provides a more robust way to process large numbers of items asynchronously.

The proposed solution is to:

  1. Create a new Drush command that enqueues all the items to be processed into the queue.
  2. Users can then process the queue by running drush queue:run entity_mesh or by configuring a cron job to handle the processing automatically.

This approach will:

  • Prevent memory issues on large datasets.
  • Allow the process to be run in the background.
  • Enable the processing to be easily restarted from the last item if an error occurs.
  • Leverage the existing queue system within Drupal, which is designed for this type of task.
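The two-step workflow above could look like this from the command line; a sketch, assuming the queue name matches the `EntityMeshQueueWorker` plugin ID (`entity_mesh`) and that the new enqueue command is exposed as `entity_mesh:enqueue` (a hypothetical name, to be decided in the implementation):

```shell
# Step 1: enqueue every item to be processed (hypothetical command name).
drush entity_mesh:enqueue

# Step 2: drain the queue. --time-limit splits long runs into resumable
# chunks; items claimed by a failed run are released back to the queue.
drush queue:run entity_mesh --time-limit=300

# Or let cron drain the queue automatically, e.g. every 15 minutes:
# */15 * * * * /usr/bin/env drush queue:run entity_mesh --time-limit=600
```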

Remaining tasks

Create the new Drush command that enqueues the items. This will involve writing code to query the relevant entities and add them to the queue for processing by the existing `EntityMeshQueueWorker`.
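A minimal sketch of what such a command could look like, assuming Drush 12+ attribute-based commands; the class name, command name (`entity_mesh:enqueue`), target entity type (`node`), and queue item shape are illustrative assumptions, not part of the module:

```php
<?php

namespace Drupal\entity_mesh\Drush\Commands;

use Drupal\Core\Entity\EntityTypeManagerInterface;
use Drupal\Core\Queue\QueueFactory;
use Drush\Attributes as CLI;
use Drush\Commands\DrushCommands;

/**
 * Enqueues entities for processing by EntityMeshQueueWorker.
 */
final class EntityMeshCommands extends DrushCommands {

  public function __construct(
    private readonly EntityTypeManagerInterface $entityTypeManager,
    private readonly QueueFactory $queueFactory,
  ) {
    parent::__construct();
  }

  /**
   * Enqueues all nodes into the entity_mesh queue.
   */
  #[CLI\Command(name: 'entity_mesh:enqueue')]
  public function enqueue(): void {
    // The queue name must match the QueueWorker plugin ID.
    $queue = $this->queueFactory->get('entity_mesh');

    // Query only entity IDs to keep memory usage flat; full entities
    // are loaded later, one at a time, by the queue worker.
    $ids = $this->entityTypeManager->getStorage('node')
      ->getQuery()
      ->accessCheck(FALSE)
      ->execute();

    foreach ($ids as $id) {
      $queue->createItem(['entity_type' => 'node', 'entity_id' => $id]);
    }

    $this->logger()->success(sprintf('Enqueued %d items.', count($ids)));
  }

}
```

Keeping the command down to "query IDs, create queue items" is what makes it safe on large sites: the expensive per-entity work stays in the queue worker, where each item is an independent, retryable unit.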

Feature request
Status

Active

Version

1.0

Component

Code

Created by

eduardo morales alberti (🇪🇸 Spain, 🇪🇺)
