Add DelayedRequeueException feature to feeds queue - D9.1

Created on 29 April 2022, over 2 years ago
Updated 2 November 2023, about 1 year ago

Problem/Motivation

Look at using the new DelayedRequeueException (change record: https://www.drupal.org/node/3127239).

I was thinking of delaying a fetch retry by, say, 30 minutes. That way the logs would get bombarded less with potentially the same message. Delaying a retry is possible since Drupal 9.1. - megachriz in slack

It would also be good to add configuration for this, so the delay time can be changed if desired. I wonder whether this should be configurable per feed type or as a global setting. Making it configurable per feed type is somewhat easier, because a settings page for global Feeds settings doesn't exist yet. But I wonder whether configuring it per feed type really makes sense.
In the future, we would probably get global Feeds settings anyway. I plan to make the directory for storing files to import (for example, files fetched via HTTP) configurable. - megachriz in slack
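
For reference, DelayedRequeueException lives in Drupal core (\Drupal\Core\Queue\DelayedRequeueException) and takes the delay in seconds as its first constructor argument; cron only honours the delay when the underlying queue implements DelayableQueueInterface, otherwise the item is simply released. Below is a minimal sketch of how a fetch worker could use it; the plugin ID, class name and the fetch() helper are placeholders, not the actual Feeds code.

<?php

namespace Drupal\feeds_delay_example\Plugin\QueueWorker;

use Drupal\Core\Queue\DelayedRequeueException;
use Drupal\Core\Queue\QueueWorkerBase;

/**
 * Illustrative worker: requeue a failed fetch with a delay.
 *
 * The plugin ID and the fetch() helper are placeholders; Feeds' real queue
 * workers are more involved.
 *
 * @QueueWorker(
 *   id = "feeds_delay_example_fetch",
 *   title = @Translation("Example: delayed retry for a failed fetch"),
 *   cron = {"time" = 60}
 * )
 */
class ExampleFetchWorker extends QueueWorkerBase {

  /**
   * {@inheritdoc}
   */
  public function processItem($data) {
    try {
      // Placeholder for the real fetch logic (e.g. an HTTP request).
      $this->fetch($data);
    }
    catch (\RuntimeException $e) {
      // Hand the item back to the queue and ask cron to retry it 30 minutes
      // later instead of on the very next run, so the logs are not flooded
      // with the same failure message.
      throw new DelayedRequeueException(1800, $e->getMessage(), 0, $e);
    }
  }

  /**
   * Stand-in for the fetch stage.
   */
  protected function fetch($data): void {
    // ...
  }

}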

Steps to reproduce

Proposed resolution

Remaining tasks

  • Add configuration for the delay time (see the sketch after this list).
  • Add tests:
    • Ensure that when a fetch task fails, it isn't immediately processed again at the next cron run but at a later time.
    • Ensure that when something fails during the process stage, the feed remains locked (this might be handled in another issue, but it is closely related).
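
A rough illustration of what the configuration task could look like. Both fragments are hypothetical (the form element name, the get_configured_delay() helper and the 1800-second default are placeholders); they only show the shape of a per-feed-type or global setting feeding into the exception.

<?php

use Drupal\Core\Queue\DelayedRequeueException;

// Fragment 1: a form element on the feed type form (or a future global
// Feeds settings form) that lets an admin choose the retry delay.
$form['fetch_retry_delay'] = [
  '#type' => 'number',
  '#title' => t('Delay before retrying a failed fetch (seconds)'),
  '#min' => 0,
  '#default_value' => 1800,
];

// Fragment 2: in the queue worker, the stored value replaces a hard-coded
// delay. get_configured_delay() stands in for wherever Feeds ends up
// storing the setting (feed type config or a global settings object).
$delay = get_configured_delay();
throw new DelayedRequeueException($delay, 'Fetch failed; retrying later.');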

User interface changes

API changes

Data model changes

📌 Task
Status

Active

Version

3.0

Component

Code

Created by

🇨🇦Canada joelpittet Vancouver

  • Needs tests

    The change is currently missing an automated test that fails when run with the original code, and succeeds when the bug has been fixed.


Comments & Activities


  • 🇳🇱Netherlands megachriz

    I've looked at this one and it appears to be quite complicated.

    The issue is about picking up a queue task later when it fails. Examples of a failure are a timeout while fetching data, or an SQL error.

    With what is proposed, the queue task that fails is scheduled to run one hour later. But the other tasks in the queue still go through. During my testing, that resulted in reaching the finish task, at which point Feeds marks the import as finished and cleans up the queue. The delayed queue task does not get picked up (unless the other tasks take longer than an hour to complete).

    So I'm not sure yet what to do with that. I thought about having the finish task check whether there are any tasks left, but then how does Feeds know when it should actually finish later on? Perhaps the finish task should become a delayed task itself? But that could cause the import to take up to an hour longer than necessary.
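
    To make that last idea concrete, here is a purely illustrative sketch, assuming the finish task delays itself whenever other items are still waiting in the queue. The queue name and the surrounding finish handler are placeholders, not the actual Feeds implementation.

    <?php

    use Drupal\Core\Queue\DelayedRequeueException;

    // Hypothetical finish handler: if other items (such as a delayed fetch
    // retry) are still in the queue, push the finish step back as well
    // instead of marking the import as finished and cleaning up.
    $queue = \Drupal::queue('feeds_example_import');
    // With the core database queue, numberOfItems() also counts the item
    // currently being processed, hence the comparison with 1.
    if ($queue->numberOfItems() > 1) {
      // Delaying by the same interval as the fetch retry is what could make
      // the import take up to an hour longer than necessary.
      throw new DelayedRequeueException(3600);
    }
    // Otherwise: mark the import as finished and clean up the queue as usual.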
