Throttling and batch limiting for large datasets

Created on 7 April 2025

Problem/Motivation

The FileMaker API Fetcher for Drupal Feeds module struggles with large datasets because it lacks throttling and batch-limiting capabilities. Without a delay between requests or a cap on how many batches a single run processes, large imports can hit API failures and timeouts, and multi-run imports are handled inefficiently.

Steps to reproduce

Configure a feed that uses the FileMaker API Fetcher against a large dataset.
Attempt to import the feed.
Observe import failures or inefficiencies with large datasets.

Proposed resolution

Implement throttling, batch limiting, and improved offset management in the FileMakerApiFetcher class so that large datasets are fetched in rate-limited chunks and import progress carries over cleanly across multiple import runs.
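
A minimal sketch of the intended fetch loop, assuming a Feeds fetcher plugin; the fetchPage() helper and the configuration keys (throttle_enabled, throttle_delay, batch_size, max_batches) are hypothetical names for illustration, not confirmed module API:

  // Inside FileMakerApiFetcher (sketch only; names are illustrative).
  public function fetch(FeedInterface $feed, StateInterface $state) {
    $config = $this->getConfiguration();
    // Resume from the offset persisted by a previous run (0 = start).
    $state_key = 'filemaker_api_fetcher.offset.' . $feed->id();
    $offset = \Drupal::state()->get($state_key, 0);

    $batches = 0;
    $records = [];
    do {
      // Fetch one page of records starting at the stored offset.
      $page = $this->fetchPage($offset, $config['batch_size']);
      $records = array_merge($records, $page);
      $offset += count($page);
      $batches++;
      // Throttle: pause between API requests when enabled.
      if (!empty($config['throttle_enabled']) && !empty($page)) {
        usleep((int) ($config['throttle_delay'] * 1000000));
      }
    } while (!empty($page) && $batches < $config['max_batches']);

    if (empty($page)) {
      // Dataset exhausted: reset so the next import starts from the top.
      \Drupal::state()->delete($state_key);
    }
    else {
      // Batch cap reached: save the offset so the next run resumes here.
      \Drupal::state()->set($state_key, $offset);
    }
    // ... wrap $records in a fetcher result as usual.
  }

With the batch cap in place, a recurring import simply picks up at the saved offset on its next run instead of retrying the whole dataset.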

Remaining tasks

Update the FileMakerApiFetcher class with the new throttling, batch-limiting, and offset-management logic.
Modify the configuration form to include the new options.
Test with various dataset sizes and import scenarios.
Update documentation.

User interface changes

Add fields to the feed type configuration (see the form sketch after this list):

Throttle Enable/Disable
Throttle Delay
Maximum Batches per Run
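
A sketch of how these options could be exposed on the fetcher's configuration form using the Drupal Form API; the element keys and defaults below are illustrative, not the module's confirmed names:

  // Sketch of the new elements for the fetcher's configuration form.
  public function buildConfigurationForm(array $form, FormStateInterface $form_state) {
    $form['throttle_enabled'] = [
      '#type' => 'checkbox',
      '#title' => $this->t('Enable throttling'),
      '#default_value' => $this->configuration['throttle_enabled'] ?? FALSE,
    ];
    $form['throttle_delay'] = [
      '#type' => 'number',
      '#title' => $this->t('Throttle delay (seconds)'),
      '#min' => 0,
      '#step' => 0.1,
      '#default_value' => $this->configuration['throttle_delay'] ?? 1,
      '#description' => $this->t('Pause between API requests when throttling is enabled.'),
    ];
    $form['max_batches'] = [
      '#type' => 'number',
      '#title' => $this->t('Maximum batches per run'),
      '#min' => 1,
      '#default_value' => $this->configuration['max_batches'] ?? 10,
      '#description' => $this->t('Stop after this many batches and resume on the next run.'),
    ];
    return $form;
  }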

API changes

None. Changes are internal to the FileMakerApiFetcher class.

Data model changes

None. Import progress will be tracked using Drupal's existing state storage, as illustrated below.
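
For illustration, progress could be tracked with the core State API roughly as follows; the state key name is hypothetical:

  // Save the next offset when a run stops at the batch cap (key name illustrative).
  \Drupal::state()->set('filemaker_api_fetcher.offset.' . $feed->id(), $offset);

  // Resume from the saved offset on the next run (0 = start of dataset).
  $offset = \Drupal::state()->get('filemaker_api_fetcher.offset.' . $feed->id(), 0);

  // Clear the stored progress once the dataset has been fully imported.
  \Drupal::state()->delete('filemaker_api_fetcher.offset.' . $feed->id());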

Feature request

Status: Active
Version: 2.0
Component: Code
Created by: baikho (Antwerp, Belgium)
