- 🇮🇹Italy trickfun
When I run a feed from the feed process page, I can't process more than 200 items.
If I enable "Force update", I can update the first 200 items.
If I don't enable "Force update", I can update 0 items.
Only 200 items are processed. To update more than 200 items I need to change line_limit in the YML file.
So line_limit is the maximum number of items processed. I think it is important to expose this value as a field in the feed configuration form.
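For reference, the setting being discussed lives in the feed type's exported configuration. A sketch of the relevant fragment (the file name is an example and the exact keys depend on your Feeds version, so check your own export):

```yaml
# Hypothetical excerpt of config/sync/feeds.feed_type.my_csv_import.yml
parser: csv
parser_configuration:
  delimiter: ','
  no_headers: false
  line_limit: 100
```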
Thank you - 🇳🇱Netherlands megachriz
@trickfun
I generated a CSV file with 1 header row and 600 data rows (see attached) and could import the complete file just fine. I tried this via file upload, running the import in the UI, and also by running the import in the background and then running cron. All 600 rows got imported as nodes. I also tried to import a file with 250,000 items using the "Import in background" option; on the command line I ran cron 5 times using drush core-cron. The import was not completed yet, but more than 5,000 nodes had already been imported. See the attached feed type for the configuration that I used; line_limit is set to 100.
line_limit really only affects the number of items parsed at a time and is not a cap on the total number of items that you can import. At least, that is not the intention of that setting. So there must be something else causing the import to stop after 200 items. Perhaps there is a bug in Feeds that only occurs in very specific circumstances? Or perhaps a bug in another module disrupts the import process?
Can you try to reproduce your issue on a clean install?
- 🇮🇹Italy trickfun
Thanks megachriz, some info for you:
If I select the "directory" fetcher and run the process using "Import", I can process only 200 items.
If I select the "upload file" fetcher and run the process using "Import", I can process all items. - 🇳🇱Netherlands megachriz
@trickfun
Interesting. Though if I use the Directory fetcher instead of File Upload with the feed type provided earlier, put the file 'data.csv' in sites/default/files/csv-import-test, set "Server file or directory path" on the feed form to "public://csv-import-test", and then run the import in the UI, all 600 items get imported. I tried it again with the 250,000-item file and imported it in the background; after a few cron runs, a few thousand items had been imported. I did test with only one file in that directory, however. Do you use more than one file?
In order to debug this, it would be helpful if you:
- Can reproduce this issue on a clean install;
- Provide the feed type configuration that you used on that clean install;
- Provide the file(s) that you tried to import. This may also be a sample file, if the issue can be reproduced with that.
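For comparison, the directory-fetcher layout described above can be reproduced with a short script. This is a sketch only: it assumes a default Drupal setup where the public:// scheme maps to sites/default/files, and the sample rows are invented.

```python
import csv
from pathlib import Path

# public://csv-import-test resolves to this path in a default Drupal setup.
target = Path("sites/default/files/csv-import-test")
target.mkdir(parents=True, exist_ok=True)

# Write a tiny sample CSV (1 header row + 2 data rows) for the
# Directory fetcher to pick up on the next import.
with open(target / "data.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["guid", "title"])
    writer.writerow(["1", "First item"])
    writer.writerow(["2", "Second item"])
```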
- 🇮🇹Italy trickfun
@megachriz
In production I don't have the problem; all items are processed.
Strange behavior. - 🇳🇱Netherlands megachriz
@trickfun
Your issue looks like the one reported in #3261011: Wrong CSV parser lines when feed is using CRON on large files → . It could be related to cron being run very often, or perhaps to multiple Feeds tasks running in parallel.