- Issue created by @jseltzer
- 🇩🇪Germany mkalkbrenner 🇩🇪
Everything you describe in steps 1-4 is an update of existing data. That doesn't work. And if you mean a Search API "server" in step 6, that doesn't work either.
You need to delete and re-create the Solr core! (Or the collection in case of Solr Cloud)
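For a self-hosted Solr, that step might look roughly like the following. This is only a sketch: the core name "drupal" and the config-set path are placeholders for your own setup, and on Solr Cloud you would use the Collections API instead.
bin/solr delete -c drupal
bin/solr create -c drupal -d /path/to/new/config-set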
- 🇩🇪Germany mkalkbrenner 🇩🇪
Did you read the release notes?
How to upgrade from 4.2.x or earlier
In order to support advanced highlighting and other features, the Solr schema provided by Search API Solr 4.3 received some fundamental changes, for example the use of StandardTokenizer and storeOffsetsWithPositions.
When upgrading the module from 4.2.x or earlier to 4.3.x you can still read an existing Solr index and perform searches.
But if you write (index) data you'll run into errors like "cannot change field 'xyz' from index options=DOCS_AND_FREQS_AND_POSITIONS to inconsistent index options=DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS".
There are different ways to deal with the required update, but all require Solr knowledge.
The safest way is to delete the existing core (or collection in case of Solr Cloud), create a new core (or collection) using a new Solr config-set generated by Search API Solr 4.3.x, and re-index your data afterwards.
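The new config-set can usually be downloaded as a config.zip from the Search API server page. Assuming Drush is available and your search_api_solr release ships the corresponding command, a rough sketch with a placeholder server ID and file name could be:
drush search-api-solr:get-server-config my_solr_server config.zip
# after re-creating the core from the unpacked config-set:
drush search-api:index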
If you want to avoid downtime of your service, you can also create a new core (or collection), clone and adjust the Search API server and index configs, and index all your data twice while users can still search via the old core (or collection). Afterwards you can use Solr's rename or alias capabilities to switch the two cores (or collections) and delete the cloned Search API server and index configs.
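On Solr Cloud, that final switch can be done with the Collections API's CREATEALIAS action, which re-points an alias atomically. A minimal sketch, assuming Solr listens on localhost:8983, the hypothetical collections are named drupal_old and drupal_new, and the Search API server points at the alias "search":
curl "http://localhost:8983/solr/admin/collections?action=CREATEALIAS&name=search&collections=drupal_old"
# once drupal_new is fully indexed, re-point the alias:
curl "http://localhost:8983/solr/admin/collections?action=CREATEALIAS&name=search&collections=drupal_new"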
- 🇺🇸United States jseltzer
I did read the release notes. With Pantheon we have no option to delete and re-create the core. Those were the steps they gave me as their version of the instructions in the README, which sounds like it's not going to work. You can close this, as it appears to be entirely a platform issue.
- 🇵🇭Philippines danreb
The steps given above actually work on Pantheon, because disabling and re-enabling the Search API Solr server removes all the data on disk in the Pantheon Solr backend. This holds for a Drupal CMS setup with a single Search API Solr server and a single index ID.
In the case of a complex setup where the site uses multiple Search API Solr servers (multiple Solr server IDs), each with multiple indexes assigned, you need to install the search_api_solr_devel module and use the Drush command that comes with it. For example, if the server ID is named pantheon_solr:
drush search-api-solr:devel-delete-all pantheon_solr
This needs to be executed before step 3, the reloading of the Solr core.
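For a setup with several servers, the delete command would simply be repeated per server ID before reloading the core, followed by re-indexing. A rough sketch, with pantheon_solr_2 as a hypothetical second server ID:
drush search-api-solr:devel-delete-all pantheon_solr
drush search-api-solr:devel-delete-all pantheon_solr_2
# after reloading the core(s):
drush search-api:rebuild-tracker
drush search-api:index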
- Status changed to Fixed
8:36am 21 November 2023
Automatically closed - issue fixed for 2 weeks with no activity.
- Status changed to Fixed
8:26pm 11 April 2024
- 🇺🇸United States sjhuskey
I'm sorry for reopening this, but I could use some clarification before I create a bunch of work for myself. I have a collection with four indexes, each customized in its own way. If I delete the collection as recommended, do I need to redo all of those customizations? I just want to be sure to budget my time appropriately before I start breaking things.
I did a test without deleting the collection. I created some new content and queued everything for reindexing. The reindexing was successful without the error mentioned here.