In some cases duplicated jobs are created

Created on 9 July 2024, 3 months ago
Updated 11 July 2024, 3 months ago

Problem/Motivation

Related issue Add support for unique jobs Fixed
When testing the fixes from the issue Add support for unique jobs Fixed , I noticed that it's not always working properly.

I will try to explain. I tested it with commerce_recurring.
The job payload is simple: ['order_id' => $order->id()]
I ran the 'Commerce Recurring' cron job to process a recurring order. The job created two queue items,
with types commerce_recurring_order_close and commerce_recurring_order_renew, both with payload {"order_id":"230185"}.
Then I ran the 'Advanced Queue' cron job, and it processed the queue items. One of them failed due to another issue not related to this one, and the other one was processed successfully. So at that moment I had two queue items, in states failure and success.
But after running the 'Commerce Recurring' cron job again, two new queue items with the same payloads and fingerprints were created. So the items were duplicated.
If I ran the 'Commerce Recurring' cron job again without running the 'Advanced Queue' cron job before it, no duplicates were created, because the items were still in state queued and they were found as duplicates.

So I think we should change the condition in the getDuplicateJobs() method, which currently checks only two states, for example to:
->condition('state', [Job::STATE_QUEUED, Job::STATE_PROCESSING, Job::STATE_FAILURE], 'IN');
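For context, here is a minimal sketch (not the module's exact code) of what such a duplicate lookup on the database backend might look like, assuming a fingerprint column on the advancedqueue table and a $this->connection database connection on the backend plugin:

```php
<?php

use Drupal\advancedqueue\Job;

// Hedged sketch: find existing jobs in this queue whose fingerprint matches
// the new job's. With only two states in the condition, jobs that already
// finished in 'success' or 'failure' are not counted as duplicates, so a
// new copy can be queued after processing.
$duplicate_ids = $this->connection->select('advancedqueue', 'aq')
  ->fields('aq', ['job_id'])
  ->condition('queue_id', $queue_id)
  ->condition('fingerprint', $fingerprint)
  ->condition('state', [Job::STATE_QUEUED, Job::STATE_PROCESSING], 'IN')
  ->execute()
  ->fetchCol();
```

Adding Job::STATE_FAILURE to the list, as proposed above, would also treat already-failed jobs as duplicates and block re-queuing them.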

I've already attached an updated patch for the 1.0 version of the module: https://www.drupal.org/project/advancedqueue/issues/2918866#comment-1567... Add support for unique jobs Fixed

And I've opened this issue to fix, review and test the bug.

Steps to reproduce

See above.

Proposed resolution

Change the condition in the getDuplicateJobs() method, which currently checks only two states, for example to:
->condition('state', [Job::STATE_QUEUED, Job::STATE_PROCESSING, Job::STATE_FAILURE], 'IN');

If similar jobs can legitimately be duplicated, then the payload should contain some unique value, e.g. a timestamp.
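A minimal sketch of that workaround, assuming the Advanced Queue Job/Queue API; the 'queued_at' key is a hypothetical addition, not part of commerce_recurring's actual payload:

```php
<?php

use Drupal\advancedqueue\Job;
use Drupal\advancedqueue\Entity\Queue;

// Hedged sketch: including a unique value in the payload makes each
// enqueue produce a distinct fingerprint, so the duplicate check never
// matches an earlier copy of the "same" job.
$job = Job::create('commerce_recurring_order_renew', [
  'order_id' => $order->id(),
  // Hypothetical extra key used only to vary the fingerprint.
  'queued_at' => \Drupal::time()->getRequestTime(),
]);

$queue = Queue::load('commerce_recurring');
$queue->enqueueJob($job);
```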

Remaining tasks

🐛 Bug report
Status

Closed: works as designed

Version

1.0

Component

Code

Created by

🇺🇦Ukraine khiminrm


Merge Requests

Comments & Activities

  • Issue created by @khiminrm
  • Pipeline finished with Success
    3 months ago
    Total: 164s
    #219681
  • Status changed to Needs review 3 months ago
  • Status changed to Closed: works as designed 3 months ago
  • 🇬🇧United Kingdom jonathanshaw Stroud, UK

    I don't agree this is a problem.

    The database backend here is being deliberately set up so that it is possible to queue the same job many times, but at any one time there can only be one of that job in the queue.

    I think this is correct, because I think it's the more common requirement. And anyway, it would be a serious BC break to change it now.

    The simple solution is to make a custom backend for your project extending the provided one.

    We could in theory provide an API for this by allowing the job plugin annotation to define which states the backend should consider for duplicates. Whether the maintainer would support this added complexity, I'm unsure.

  • Status changed to Active 3 months ago
  • 🇮🇱Israel jsacksick

    Calling this a BC break is a stretch; this was released as part of an RC release, which isn't even visible on the project page.
    I think the job type should be responsible for making the fingerprint unique. I agree with bojanz's take from https://www.drupal.org/project/advancedqueue/issues/2918866#comment-1567... Add support for unique jobs Fixed .

    A fingerprint should be unique... It should be the job type's responsibility to make sure it's unique in case you'd want to queue the same job multiple times...

    TBH, the unique jobs support is a feature I was expecting specifically for Commerce Recurring. If we can't even leverage it there without defining a custom backend, we may as well revert the whole feature as it is useless in its current form...

  • 🇮🇱Israel jsacksick

    Coming back to this again, I've been discussing this with Bojan on Slack. I understand the original intent, and merging the MR would prevent a job from ever being requeued.

    That said, I think the problem we have to fix, either in AQ or in Commerce Recurring, is ensuring we don't requeue the job if it didn't permanently fail... In other words, requeuing should only be allowed after the maximum number of retries (i.e. after a permanent failure).

  • 🇷🇸Serbia bojanz

    The original code has the right intent, we can't make the fingerprint unique across job statuses, because that would mean that once a job completes, it can never be requeued again. Same with failure, if we gave up on a job, it needs to be requeueable.

    However, that doesn't cover failing-but-retrying jobs, if a job is retrying, we should not allow a duplicate to be queued. I think that's what we missed.

  • 🇮🇱Israel jsacksick

    @khiminrm: I see that the recurring jobs aren't configured to "retry"... So why is it a problem to have duplicate jobs? Once the job has failed, the job state is "failure" and it shouldn't be reprocessed again, unless it's manually requeued (which shouldn't be the case)?
    I thought what you were complaining about was the fact that the job would be retried, and we were still somehow queuing duplicate jobs...

    I actually agree with @jonathanshaw; we should probably close this...

  • 🇬🇧United Kingdom jonathanshaw Stroud, UK

    if a job is retrying, we should not allow a duplicate to be queued.

    Agreed. But the database backend sets the state of a failed job back to 'queued' if it should be retried, so duplicates will not be created. Not sure yet that we've missed anything.

  • Status changed to Closed: works as designed 3 months ago
  • 🇷🇸Serbia bojanz

    Based on #9, it sounds like we haven't.
