Thanks for the patch. But as you mentioned, the log entry is generated in the Server class, which then no longer calls the method on the backend. The function is not meant to be called directly.
If we want to harden the code against direct calls, we need to do it for all methods, for example deleteItems().
No further feedback
Tests are still failing.
4.2 is not supported anymore and there will be no further release. But thanks for the patch; people can apply it locally!
It isn't a breaking change in this module. It is a bug fix to get things working with supported Solr versions. Older Solr 9 versions are EOL.
But Solr made a breaking change and mentioned that in their release notes. I'm sure you read it ;-)
Anyway, I would appreciate a contribution to the README.
Your Solr 9 seems to run in cloud mode.
OK, the test fails ;-)
If you require this functionality or a generic sequences service, you can use the sequences module:
https://www.drupal.org/project/sequences
I already thought about exclude lists.
But I suggest committing this one first, as it contains bug fixes and performance improvements without breaking any existing functionality. And maybe another beta release.
Then we could think about the next steps.
I did some more tests and found a minor issue: counting revisions per language needs to be done on the data table.
I noticed two critical issues during testing:
- The keep parameter had no effect.
- The performance is really bad on big databases.
The first one is a bug introduced in the dev branch.
The second is caused by the fact that all entities of a requested type or bundle are queued, regardless of how many revisions exist in the database. With one million nodes, it takes a significant amount of time, CPU and memory to load all of them just to count their revisions.
It is way better to only queue those entities that have more revisions than the number to keep.
I fixed both issues within the MR instead of opening new ones because the code changed too much.
BTW, I implemented getting the entity IDs as plain SQL. Loading the entities would quickly run out of memory.
The issue was with profile entities. That had to be handled in the issue "Support all entity types". So I decided to combine both merge requests here.
I get an error during queue:run. I'll check that ...
I added this to the product page:
Search API Solr supports any Solr version from 3.6 to 9.x. Solr 7.x to 9.x are directly supported by the module itself, earlier versions from 3.6 to 6.x require enabling the included search_api_solr_legacy sub-module.
Solr 10.x support will require some work and will be added sooner or later (sponsors are welcome).
In general, maintaining detailed information and testing all versions requires a lot of time. But since there are no major sponsors anymore, it isn't easy to do all these small tasks.
Since people are using default_content_deploy 2.2.0 beta now, they run into this issue and lose data.
I put a big warning on
https://www.drupal.org/project/default_content_deploy
Since the issue is obvious, can't we commit the fix here?
I think that this is rather a solarium issue.
Redis used the request time. That explains why the tests are still failing.
So I leave it to you to improve the test.
For us, the patch is good enough to fix the critical issue in our production environment.
Writing to Redis sometimes takes more than 2s in the tests :-(
Expire is `time() + 2`, but:
`'created' => 1746536865.132`
`'expire' => '1746536864'`
In other words, the entry had already expired before it was created, because the expire timestamp was computed from the request time.
I agree that "core compatibility" should be configurable.
But if I understand the issue correctly, "core compatibility" currently only exists because of a bug in the calculation?
Never edit composer.lock!
Edit the top-level composer.json.
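For illustration, a hedged sketch of the safe route: let Composer rewrite both files for you instead of touching the lock file by hand (the version constraint here is just an example):

```bash
# Changes the constraint in composer.json AND regenerates composer.lock.
composer require 'drupal/search_api_solr:^4.3'
```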
I believe I have not installed solarium/solarium explicitly, by itself, or in a specific version.
You must have done that. A contrib module can't do it.
For whatever reason you must have installed solarium 6.3.6 explicitly.
Simply remove solarium from composer.json.
I don't think that this is an issue with search_api_solr, but with your installation. Run
`composer why-not drupal/search_api_solr:4.3.8`
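And, as a minimal sketch of the cleanup described above (assuming solarium/solarium really sits in your top-level composer.json):

```bash
# Drop the explicit top-level solarium requirement ...
composer remove solarium/solarium
# ... and let search_api_solr pull in the solarium version it needs.
composer update drupal/search_api_solr --with-dependencies
```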
I was able to reproduce the issue if a module is installed that reacts to media entity updates. In our case, image_replace isn't fault tolerant. So default_content_deploy can't import the entity or fix the file ID later.
A quick fix is to handle files at the beginning of the import.
BTW, that issue was hidden in previous versions. Meaning, it existed as well but didn't hurt because of the more verbose export format.
The format changed. Most of _links at the top level is not needed for our use case.
So from the JSON code above, it's missing this outside of _embedded:
No, this is correct. Entity Reference fields are only in _embedded.
It seems to be an import related issue.
Sorry, but these patches aren't readable because they reformat entire files.
I ended up using this:
`sudo su - solr -c "/opt/solr/bin/solr create -c IAB -n data_driven_schema_configs"`
This is totally wrong. It doesn't use the Drupal schema. That's why the file is missing.
Also of note, I was not able to use this command from the README.md file to create a core:
`$ sudo -u solr $SOLR/bin/solr create_core -c $CORE -d $CONF -n $CORE`
Did you replace $CORE and $CONF with the required values?
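For illustration, a filled-in variant of the README command; the core name and config-set path below are placeholders, not values from this issue:

```bash
# -c/-n: the core name; -d: the config-set directory generated by
# the search_api_solr module (both values are placeholders).
sudo -u solr /opt/solr/bin/solr create_core -c my_core -d /path/to/drupal-configset -n my_core
```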
BTW why don't you start Solr in cloud mode and let Drupal create the collection for you?
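Starting in cloud mode is a one-liner (standard Solr CLI; this gives you a single-node SolrCloud with embedded ZooKeeper):

```bash
# Start Solr in SolrCloud mode; search_api_solr_admin can then create
# the collection from the generated config-set for you.
sudo -u solr /opt/solr/bin/solr start -c
```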
In 2.2.x we still use JSON, but the format becomes much more readable because information that isn't required gets removed now.
Nevertheless, YAML would be interesting as an alternative.
We import users, webform submissions and newsletter subscriptions from different sites into one backend using default_content_deploy. So we need a UUID field to avoid ID collisions.
This works well for everything except the SubscriberHistory. With that patch applied, SubscriberHistory works as well.
But I agree that the update hook requires a batch.
There's already an incremental import based on metadata this module adds to the JSON file.
I don't use Paragraphs at all. Feel free to provide a patch for that issue.
For sure you can commit that patch in the Drupal 7 branch. But the 7.x module is already marked as unsupported, just like Drupal 7 itself.
BTW it is a bit strange that people want to update to newer Solr versions but not to newer Drupal versions ;-)
I went the "events way" in all the contrib modules I'm involved in. And a s far as I know, Core still does events as well.
And other big contrib modules like commerce are using events as well.
I don't think that one thing is right and the other is not. And there's not the one Drupal way.
My understanding is that hooks get an OOP replacement. But it is totally valid to use events.
BTW I would have preferred to replace hooks in core by events, following a PSR standard more closely, instead of introducing something new that is Drupal-specific.
Thanks for reporting. Can you provide a patch?
Thank you!
Thank you!
The _dcd_metadata is essential for the incremental import and for tracking entity references that don't use a UUID.
Patches are welcome.
Thanks for this MR. Can you adjust it to the current code base? I'll merge it immediately afterwards.
I prefer to stay with events as well. The reason is that third-party libs like solarium and others that integrate search backends are using events, based on a PSR standard.
Now it is possible to bundle search-related customizations that together build one feature in one event subscriber that subscribes to events of search_api, search_api_solr, solarium and facets.
Within the Solarium library tests, we also went for `SOLR_MODULES=extraction`.
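For context, a sketch of how that looks when starting Solr 9 (the install path is an assumption):

```bash
# SOLR_MODULES enables optional Solr 9 modules; 'extraction' provides
# the content-extraction handler used for attachments.
SOLR_MODULES=extraction /opt/solr/bin/solr start
```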
Thanks, but it is fixed in the dev branch on github already.
It is way easier to install search_api_solr_admin and to use the button for that use case. Or the drush command (see the sketch below).
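From memory, and hedged accordingly (verify the exact name and options with `drush list`), the command looks roughly like this; SERVER_ID stands for the machine name of your Search API server:

```bash
# Upload the generated config-set to Solr Cloud and create/update
# the collection (command provided by search_api_solr_admin).
drush search-api-solr-admin:upload-configset --numShards=1 SERVER_ID
```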
The issue on github is fixed
The problem is that I can't get the tests to work on gitlab. I asked for help, but obviously the interest isn't that high.
If you had followed the rules and added a PR on our GitHub repo, where the full test suite runs, you would have noticed that some fixes are already in the code.
What do you mean by "post the schema"?
This module creates a config-set zip. Using search_api_solr_admin you can "deploy" that zip to a Solr server that is running in cloud mode and doesn't block the config-set API.
Can you debug what actually is in the array?
Can you find out which search triggers that?
Turn on search_api_solr_devel to see the resulting Solr queries and why they match a document.
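Assuming Drush is available, enabling the sub-module is a one-liner:

```bash
drush en search_api_solr_devel
```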