Row size too large, Too many keys specified

Created on 20 February 2023, about 2 years ago
Updated 25 February 2023, about 2 years ago

Problem/Motivation

88 fields have been added to the index, each with type "string".
Trying to index the items fails with the following error:

Couldn't index items. Check the logs for details.

dblog:

Drupal\Core\Database\DatabaseExceptionWrapper while trying to add a database index for column field_collectors_type_2 to table search_api_db_products: SQLSTATE[42000]: Syntax error or access violation: 1069 Too many keys specified; max 64 keys allowed: ALTER TABLE "search_api_db_products" ADD INDEX `_field_collectors_type_2` (`field_collectors_type_2`(10)); Array ( ) in Drupal\search_api_db\Plugin\search_api\backend\Database->createFieldTable() (line 870 of /d9/modules/contrib/search_api/modules/search_api_db/src/Plugin/search_api/backend/Database.php).

Drupal\search_api\SearchApiException: SQLSTATE[42000]: Syntax error or access violation: 1118 Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs: ALTER TABLE "search_api_db_products" ADD `field_drip_tube_type_1` VARCHAR(255) DEFAULT NULL COMMENT 'The field\'s value for this item'; Array ( ) in Drupal\search_api_db\Plugin\search_api\backend\Database->fieldsUpdated() (line 1143 of /d9/modules/contrib/search_api/modules/search_api_db/src/Plugin/search_api/backend/Database.php).

Drupal\search_api\SearchApiException: SQLSTATE[42000]: Syntax error or access violation: 1118 Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs: ALTER TABLE "search_api_db_prod" ADD `field__surface_engine_power_1` VARCHAR(255) DEFAULT NULL COMMENT 'The field\'s value for this item'; Array ( ) in Drupal\search_api_db\Plugin\search_api\backend\Database->fieldsUpdated() (line 1143 of /d9/modules/contrib/search_api/modules/search_api_db/src/Plugin/search_api/backend/Database.php).

If I click "Execute tasks now", I see this error:

Drupal\search_api\SearchApiException: SQLSTATE[42000]: Syntax error or access violation: 1118 Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs: ALTER TABLE "search_api_db_prod" ADD `field__surface_engine_power_1` VARCHAR(255) DEFAULT NULL COMMENT 'The field\'s value for this item'; Array
(
)
in Drupal\search_api_db\Plugin\search_api\backend\Database->fieldsUpdated() (line 1143 of modules/contrib/search_api/modules/search_api_db/src/Plugin/search_api/backend/Database.php). Drupal\Core\Database\StatementWrapper->execute(Array, Array) (Line: 944)
Drupal\Core\Database\Connection->query('ALTER TABLE {search_api_db_prod} ADD `field__surface_engine_power_1` VARCHAR(255) DEFAULT NULL COMMENT 'The field\'s value for this item'') (Line: 437)
Drupal\mysql\Driver\Database\mysql\Schema->addField('search_api_db_prod', 'field__surface_engine_power_1', Array) (Line: 833)
Drupal\search_api_db\Plugin\search_api\backend\Database->createFieldTable(Object, Array, 'index') (Line: 1087)
Drupal\search_api_db\Plugin\search_api\backend\Database->fieldsUpdated(Object) (Line: 700)
Drupal\search_api_db\Plugin\search_api\backend\Database->updateIndex(Object) (Line: 154)
Drupal\search_api\Task\ServerTaskManager->executeTask(Object) (Line: 114)
Drupal\search_api\Task\ServerTaskManager->processEvent(Object, 'search_api.task.updateIndex', Object)
call_user_func(Array, Object, 'search_api.task.updateIndex', Object) (Line: 142)
Drupal\Component\EventDispatcher\ContainerAwareEventDispatcher->dispatch(Object, 'search_api.task.updateIndex') (Line: 215)
Drupal\search_api\Task\TaskManager->executeSpecificTask(Object) (Line: 359)
Drupal\search_api\Task\TaskManager->processBatch(Array, Array, Array) (Line: 295)
_batch_process() (Line: 137)
_batch_do() (Line: 93)
_batch_page(Object) (Line: 55)
Drupal\system\Controller\BatchController->batchPage(Object)
call_user_func_array(Array, Array) (Line: 123)
Drupal\Core\EventSubscriber\EarlyRenderingControllerWrapperSubscriber->Drupal\Core\EventSubscriber\{closure}() (Line: 580)
Drupal\Core\Render\Renderer->executeInRenderContext(Object, Object) (Line: 124)
Drupal\Core\EventSubscriber\EarlyRenderingControllerWrapperSubscriber->wrapControllerExecutionInRenderContext(Array, Array) (Line: 97)
Drupal\Core\EventSubscriber\EarlyRenderingControllerWrapperSubscriber->Drupal\Core\EventSubscriber\{closure}() (Line: 169)
Symfony\Component\HttpKernel\HttpKernel->handleRaw(Object, 1) (Line: 81)
Symfony\Component\HttpKernel\HttpKernel->handle(Object, 1, 1) (Line: 58)
Drupal\Core\StackMiddleware\Session->handle(Object, 1, 1) (Line: 48)
Drupal\Core\StackMiddleware\KernelPreHandle->handle(Object, 1, 1) (Line: 48)
Drupal\Core\StackMiddleware\ReverseProxyMiddleware->handle(Object, 1, 1) (Line: 51)
Drupal\Core\StackMiddleware\NegotiationMiddleware->handle(Object, 1, 1) (Line: 23)
Stack\StackedHttpKernel->handle(Object, 1, 1) (Line: 713)
Drupal\Core\DrupalKernel->handle(Object) (Line: 19)

PHP 8.1.14
MySQL 8.0.26-16 / MySQL 5.7
Drupal Commerce + Search API + Search API DB

๐Ÿ› Bug report
Status

Postponed: needs info

Version

1.28

Component

General code

Created by

🇺🇦 Ukraine proweb.ua


Comments & Activities

  • Issue created by @proweb.ua
  • Status changed to Postponed: needs info about 2 years ago
  • 🇦🇹 Austria drunken monkey Vienna, Austria

    The problem with the maximum number of keys is well-known (see, e.g., #2403275: Syntax error or access violation: 1069 Too many keys specified; max 64 keys allowed in SearchApiDbService->fieldsUpdated() and #2844945: SearchApiException Too many keys specified; max 64 keys allowed), but won't do much harm. If you want, you can just manually remove or add indexes on the database table as you think will be most useful. (E.g., fields you often sort by, or single-valued fields often used for filtering.) Not really much we can do, except try to be smarter about which columns to place indexes on – see 📌 Is adding an index for each column in the denormalized index table really a good idea? (Active).
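A sketch of that manual index pruning, in case it helps: the table and the `_field_collectors_type_2` index name are taken from the log output above, but `field_price` is an invented placeholder for whichever field you actually filter or sort on. Note that the module may try to recreate indexes on a later schema update, so treat this as a stopgap.

```sql
-- See which indexes currently exist (and how close to the 64-key limit you are):
SHOW INDEX FROM search_api_db_products;

-- Drop an index on a field you never filter or sort by:
ALTER TABLE search_api_db_products DROP INDEX `_field_collectors_type_2`;

-- Add one on a (hypothetical) field you do use, following the module's
-- pattern of a 10-character prefix index:
ALTER TABLE search_api_db_products ADD INDEX `_field_price` (`field_price`(10));
```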

    The second error is a new one, though, and of course both more serious and harder to work around.
    Some quick research suggests, though, that setting innodb_strict_mode = 0 in the MySQL settings might prevent the error for you, at least during table creation. An error would still occur if an actual row grew above that size, but given our generous VARCHAR field lengths, that is probably far less likely than the theoretical maximum row size exceeding the limit. So it could be worth a try.
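The arithmetic behind the second error is easy to verify. Assuming the utf8mb4 charset (the Drupal default; latin1 or utf8mb3 would give smaller numbers), each VARCHAR(255) column can need up to 255 × 4 bytes plus a 2-byte length prefix, so 88 such columns already exceed the 65,535-byte limit from the error message:

```python
# Worst-case row-size arithmetic for the denormalized Search API DB table.
# Assumption: utf8mb4 charset (4 bytes per character).

MAX_ROW_BYTES = 65535        # MySQL's row-size limit, not counting BLOB/TEXT
BYTES_PER_CHAR = 4           # utf8mb4 worst case
VARCHAR_LENGTH = 255         # the module's VARCHAR(255) columns
LENGTH_PREFIX = 2            # length-prefix bytes once max byte length > 255

bytes_per_column = VARCHAR_LENGTH * BYTES_PER_CHAR + LENGTH_PREFIX
num_fields = 88
worst_case_row = num_fields * bytes_per_column

print(bytes_per_column)                   # 1022
print(worst_case_row)                     # 89936
print(worst_case_row > MAX_ROW_BYTES)     # True

# How many such columns would still fit under the limit:
print(MAX_ROW_BYTES // bytes_per_column)  # 64
```

So with these column definitions the ceiling is roughly 64 string fields per table, which the 88-field index comfortably exceeds.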

    Otherwise, I'm not really sure how to deal with this generically in the module. If we could detect this problem, one thing we could probably always do is split the table into two (or more) parts and JOIN them at query time. Less than ideal for performance, but better than an exception preventing any indexing, I'd say. However, that's not trivial to implement, and as this is (I think) the first complaint about it in 15 years, it doesn't seem like a very common problem. We might also be fine just ignoring it unless there are more reports about it.
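For illustration only, the split-table idea might look roughly like this; the table and column names are invented, and this is not how the module currently works:

```sql
-- Hypothetical sketch: the single denormalized table is divided into two
-- narrower tables sharing the item_id key, each staying under the row-size
-- limit.
CREATE TABLE search_api_db_products_part1 (
  item_id VARCHAR(150) NOT NULL PRIMARY KEY,
  field_a VARCHAR(255) DEFAULT NULL
  -- ... first half of the fields ...
);

CREATE TABLE search_api_db_products_part2 (
  item_id VARCHAR(150) NOT NULL PRIMARY KEY,
  field_b VARCHAR(255) DEFAULT NULL
  -- ... remaining fields ...
);

-- At query time the parts are joined back into one logical row:
SELECT p1.item_id, p1.field_a, p2.field_b
FROM search_api_db_products_part1 p1
JOIN search_api_db_products_part2 p2 USING (item_id)
WHERE p1.field_a = 'some value';
```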

    Generally, when you have such a large index, investing in a dedicated search server like Solr seems like a good idea anyway.
    Also, please make sure you really need all 88 of those fields indexed: only fields you want to filter or sort on, or search in, need to be indexed.
