- 🇬🇧United Kingdom 2dareis2do
I have tested migrate_queue_importer. One interesting thing about it is that it uses hook_cron, which effectively elevates the priority of the module's work, because hook_cron implementations tend to be:
1. executed before the queue jobs are run.
2. executed sequentially, i.e. one hook implementation followed by another.

One issue is that it still treats a migration as a single batch. For example, say I have an RSS feed with 100 items on it, which may in turn have one or more dependencies with a similar number of items; the chances of such a job completing within 30 seconds, the default value for suspendMaximumWait, are slim.

Now if a migration were executed and imported row by row, using the queue could be more beneficial: a single row is much more likely to be imported within that 30 second window. This is different from wrapping the whole import via the Migrate Tools UI or drush (much of the functionality from Migrate Tools has been replicated in drush), where the queue worker processes the entire migration as a single job. The Migrate UI, by contrast, does use the Batch API to process items one by one, which allows a job to continue from where it was interrupted. A rough sketch of the current whole-migration-per-item model is below.
Of course migrate also has its own state (in the database) and its own ways of deciding how to handle an import, depending on whether a row has already been imported successfully or not.
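For reference, that state is readable straight off the migration plugin; something along these lines (from memory of the core API):

```php
<?php

use Drupal\migrate\Plugin\MigrationInterface;

/** @var \Drupal\migrate\Plugin\MigrationInterface $migration */
$migration = \Drupal::service('plugin.manager.migration')->createInstance('my_rss_items');

// The ID map records which source rows have already been processed/imported.
$processed = $migration->getIdMap()->processedCount();
$imported = $migration->getIdMap()->importedCount();

// Total rows the source currently offers (may be -1 if the source is uncountable).
$total = $migration->getSourcePlugin()->count();

// The migration also tracks whether it is idle, importing, stopping, etc.
$is_busy = $migration->getStatus() !== MigrationInterface::STATUS_IDLE;
```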
I think it might be beneficial to have a way of importing a migration row by row, while at the same time allowing users to override suspendMaximumWait to suit, so that a migration import and any of its dependencies work more optimally with the Queue API and are less likely to time out or fail.
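As a very rough sketch of what I have in mind (entirely hypothetical naming, and assuming Migrate Tools' MigrateExecutable, which I believe accepts the same kind of options array that drush migrate:import uses, e.g. idlist/limit), each queue item would carry a single source ID rather than a whole migration:

```php
<?php

namespace Drupal\mymodule\Plugin\QueueWorker;

use Drupal\Core\Queue\QueueWorkerBase;
use Drupal\migrate\MigrateMessage;
use Drupal\migrate_tools\MigrateExecutable;

/**
 * Imports a single migration row per queue item (hypothetical).
 *
 * @QueueWorker(
 *   id = "mymodule_migration_row_import",
 *   title = @Translation("Import one migration row"),
 *   cron = {"time" = 30}
 * )
 */
class MigrationRowImportWorker extends QueueWorkerBase {

  public function processItem($data) {
    $migration = \Drupal::service('plugin.manager.migration')
      ->createInstance($data['migration']);
    // Only the row(s) named in this queue item are imported, so each item
    // should fit comfortably inside the suspendMaximumWait window.
    $options = ['idlist' => $data['idlist'], 'update' => TRUE];
    (new MigrateExecutable($migration, new MigrateMessage(), $options))->import();
  }

}
```

hook_cron (or a separate seeding worker) would then push one item per unprocessed source ID, using the ID map checks above, and suspendMaximumWait would stay overridable so sites can tune the window to their slowest single row.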