nikunjkotecha: Closing in favour of https://www.drupal.org/project/search_api_algolia/issues/3256840, "Item splitter processor to avoid record size limit" (Postponed: needs info).
We have a site with about 3,000 nodes that we are indexing to Algolia.
Only a single one of those nodes is over the 10 kB record size limit; all the others average 1.58 kB in size.
When we clear the index and attempt to re-index all of our content, the process halts as soon as it encounters this 20 kB+ node. The only error message we see is that the object at position X was over the 10 kB limit and could not be indexed.
What we DON'T get is any message or error indicating that this failure caused all remaining items to be rejected.
The expected behavior is that when an over-limit or otherwise problematic entity throws an error, that particular entity is logged and skipped, and the indexing process continues normally with the remaining items.
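For illustration, here is a minimal sketch of that expected skip-and-continue behaviour. This is not the module's actual code: it only assumes Drupal's \Drupal::logger() and the Algolia PHP client's saveObjects(); the $items variable, the prepareRecord() helper, and the $max_record_bytes value are hypothetical names introduced for this example.

```php
<?php

// Sketch only: index items in a batch, but log and skip any record that
// exceeds Algolia's per-record size limit (or throws), instead of letting
// one oversized item abort the whole run.
$max_record_bytes = 10000; // e.g. the 10 kB limit reported in the error.
$records = [];

foreach ($items as $id => $item) {
  try {
    // Hypothetical helper that builds the Algolia record for an entity.
    $record = prepareRecord($item);
    if (strlen(json_encode($record)) > $max_record_bytes) {
      \Drupal::logger('search_api_algolia')
        ->warning('Skipping item @id: record exceeds the size limit.', ['@id' => $id]);
      continue; // Skip this item, keep indexing the rest.
    }
    $records[] = $record;
  }
  catch (\Exception $e) {
    \Drupal::logger('search_api_algolia')
      ->error('Skipping item @id: @message', ['@id' => $id, '@message' => $e->getMessage()]);
  }
}

// Send only the records that passed the checks.
$index->saveObjects($records);
```

The point of the sketch is simply that per-item failures are caught and logged inside the loop, so the remaining items still reach saveObjects().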
Closed: duplicate