Account created on 12 March 2007, about 18 years ago



Recent comments

🇩🇪 Germany mkalkbrenner

Thanks for the patch. But as you mentioned, the log entry is generated in the Server class, which then no longer calls the method on the backend. The function is not meant to be called directly.

If we want to harden the code against direct calls, we need to do it for all methods, for example deleteItems().

🇩🇪 Germany mkalkbrenner

No further feedback

🇩🇪 Germany mkalkbrenner

4.2 is not supported anymore and there will be no further release. But thanks for the patch, which people can apply locally!

🇩🇪 Germany mkalkbrenner

It isn't a breaking change in this module. It is a bug fix to get things working with supported Solr versions. Older Solr 9 versions are EOL.
But Solr made a breaking change and mentioned that in their release notes. I'm sure you read it ;-)

Anyway, I would appreciate a contribution to the README.

🇩🇪 Germany mkalkbrenner

Your Solr 9 seems to run in cloud mode

🇩🇪 Germany mkalkbrenner

If you require this functionality or a generic sequences service, you can use the sequences module:
https://www.drupal.org/project/sequences

🇩🇪 Germany mkalkbrenner

I already thought about exclude lists.
But I suggest committing this one first, as it contains bug fixes and performance improvements without breaking any existing functionality. And maybe another beta release.
Then we could think about the next steps.

🇩🇪 Germany mkalkbrenner

I did some more tests and found a minor issue: counting revisions per language needs to be done on the data table.

🇩🇪 Germany mkalkbrenner

I noticed two critical issues during testing:

  1. The keep parameter had no effect.
  2. The performance is really bad on big databases.

The first one is a bug introduced in the dev branch.

The second is caused by the fact that all entities of a requested type or bundle are queued regardless of how many revisions exist in the database. In the case of one million nodes, it takes a significant amount of time, CPU and memory to load all of them just to count their revisions.
It is way better to only queue those that have more revisions than the number to keep.

I fixed both issues within the MR instead of opening new ones because the code changed too much.

BTW, I implemented getting the entity IDs in SQL. Loading entities would quickly run out of memory.
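
The idea described above can be sketched as a single aggregate query. The table and column names here are simplified stand-ins, not the exact Drupal schema:

```python
import sqlite3

# Illustrative stand-in for a revision *data* table, keyed by entity id,
# revision id and language code (names simplified for this sketch).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE node_field_revision (nid INTEGER, vid INTEGER, langcode TEXT)")
con.executemany(
    "INSERT INTO node_field_revision VALUES (?, ?, ?)",
    [(1, 1, "en"), (1, 2, "en"), (1, 3, "en"), (2, 1, "en")],
)

keep = 2  # number of revisions to keep per entity and language
# Queue only entities that actually have more revisions than we keep,
# instead of loading every entity just to count its revisions.
rows = con.execute(
    "SELECT nid FROM node_field_revision GROUP BY nid, langcode HAVING COUNT(vid) > ?",
    (keep,),
).fetchall()
print([nid for (nid,) in rows])  # only nid 1 has more than 2 revisions
```

Counting in SQL keeps memory flat even with a million nodes, because no entity objects are ever loaded.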

🇩🇪 Germany mkalkbrenner

The issue was with profile entities. That had to be handled in the "Support all entity types" issue.
So I decided to combine both merge requests here.

🇩🇪 Germany mkalkbrenner

I get an error during queue:run. I'll check that ...

🇩🇪 Germany mkalkbrenner

I added this to the product page:

Search API Solr supports any Solr version from 3.6 to 9.x. Solr 7.x to 9.x are directly supported by the module itself; earlier versions from 3.6 to 6.x require enabling the included search_api_solr_legacy sub-module.
Solr 10.x support will require some work and will be added sooner or later (sponsors are welcome).

In general, maintaining detailed information and testing all versions requires a lot of time. But since there are no major sponsors anymore, it isn't easy to do all these small tasks.

🇩🇪 Germany mkalkbrenner

Since people are using default_content_deploy 2.2.0 beta now, they run into this issue and lose data.
I put a big warning on https://www.drupal.org/project/default_content_deploy

Since the issue is obvious, can't we commit the fix here?

🇩🇪 Germany mkalkbrenner

I think that this is rather a solarium issue.

🇩🇪 Germany mkalkbrenner

> Redis used request time

That explains why the tests are still failing.
So I leave it to you to improve the test.
For us, the patch is good enough to fix the critical issue in our production environment.

🇩🇪 Germany mkalkbrenner

Writing to Redis sometimes takes more than 2s in the tests :-(
Expire is time() + 2, but

    'created' => 1746536865.132
    'expire' => '1746536864'
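
The two timestamps above already demonstrate the race. As a minimal sketch (values copied from the test output, variable names mine):

```python
# Timestamps taken from the failing test run quoted above.
created = 1746536865.132  # when the write actually reached Redis
expire = 1746536864       # request start time (time()) + 2s TTL

# Because the slow write took more than 2 seconds, the item is
# already past its expiry at the moment it is created.
already_expired = created > expire
print(already_expired)  # True
```

Computing the expiry from the request start time rather than the actual write time is what makes the test flaky under load.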
🇩🇪 Germany mkalkbrenner

I agree that "core compatibility" should be configurable.

But if I understand the issue correctly, "core compatibility" currently only exists because of a bug in the calculation?

🇩🇪 Germany mkalkbrenner

Never edit composer.lock!

Edit the top-level composer.json.

> I believe I have not installed solarium/solarium explicitly, by itself, or in a specific version.

You must have done that. A contrib module can't do it.

🇩🇪 Germany mkalkbrenner

For whatever reason you must have installed solarium 6.3.6 explicitly.
Simply remove solarium from composer.json.
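
For illustration, the top-level composer.json then only needs to require the module itself (version constraint illustrative, based on the 4.3.8 release mentioned in this thread); Composer pulls in solarium transitively:

```json
{
    "require": {
        "drupal/search_api_solr": "^4.3"
    }
}
```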

🇩🇪 Germany mkalkbrenner

I don't think that this is an issue with search_api_solr, but with your installation. Run
`composer why-not drupal/search_api_solr:4.3.8`

🇩🇪 Germany mkalkbrenner

I was able to reproduce the issue if a module is installed that reacts to media entity updates. In our case, image_replace isn't fault tolerant. So default_content_deploy can't import the entity or fix the file ID later.

A quick fix is to handle files at the beginning of the import.

BTW that issue was hidden in previous versions. Meaning, it existed as well but didn't hurt because of the more verbose export format.

🇩🇪 Germany mkalkbrenner

The format changed. Most of _links at the top level is not needed for our use-case.

> So from the JSON code above, it's missing this outside of _embedded:

No, this is correct. Entity Reference fields are only in _embedded.

It seems to be an import-related issue.

🇩🇪 Germany mkalkbrenner

Sorry, but these patches aren't readable because they reformat entire files.

🇩🇪 Germany mkalkbrenner

> I ended up using this:
> `sudo su - solr -c "/opt/solr/bin/solr create -c IAB -n data_driven_schema_configs"`

This is totally wrong. It doesn't use the Drupal schema. That's why the file is missing.

> Also of note, I was not able to use this command from the README.md file to create a core:
> `$ sudo -u solr $SOLR/bin/solr create_core -c $CORE -d $CONF -n $CORE`

Did you replace $CORE and $CONF with the required values?

BTW, why don't you start Solr in cloud mode and let Drupal create the collection for you?
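
To make the placeholder substitution concrete, here is the README command with hypothetical example values filled in. The paths and names are assumptions and must be adapted to your setup; the `echo` prints the expanded command instead of running it:

```shell
# Hypothetical substitutions for the README placeholders (adjust to your setup):
SOLR=/opt/solr                 # Solr installation directory
CORE=drupal                    # name of the core to create
CONF=/var/solr/drupal-conf     # directory holding the config-set generated by this module
# Print the fully substituted command before running it for real:
echo sudo -u solr "$SOLR/bin/solr" create_core -c "$CORE" -d "$CONF" -n "$CORE"
```

The key point is that $CONF must point at the config-set this module generates, not at a stock Solr config-set like data_driven_schema_configs.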

🇩🇪 Germany mkalkbrenner

In 2.2.x we still use JSON, but the format has become much more readable because information that isn't required is now removed.

Nevertheless, YAML would be interesting as an alternative.

🇩🇪 Germany mkalkbrenner

We import users, webform submissions and newsletter subscriptions from different sites into one backend using default_content_deploy. So we need a UUID field to avoid id collisions.

This works well for everything except the SubscriberHistory. With that patch applied, SubscriberHistory works as well.

But I agree that the update hook requires a batch.

🇩🇪 Germany mkalkbrenner

There's already an incremental import based on metadata this module adds to the JSON file.
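
As a purely hypothetical sketch of such an incremental import (the real `_dcd_metadata` keys may well differ from the `export_timestamp` field invented here), the importer can skip entries that aren't newer than the last import:

```python
# Hypothetical exported entries; the real _dcd_metadata structure may differ.
entities = [
    {"uuid": "a", "_dcd_metadata": {"export_timestamp": 100}},
    {"uuid": "b", "_dcd_metadata": {"export_timestamp": 300}},
]
last_import = 200  # timestamp of the previous import run

# Import only entities exported after the last import.
to_import = [e["uuid"] for e in entities
             if e["_dcd_metadata"]["export_timestamp"] > last_import]
print(to_import)  # ['b']
```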

🇩🇪 Germany mkalkbrenner

I don't use Paragraphs at all. Feel free to provide a patch for that issue.

🇩🇪 Germany mkalkbrenner

For sure you can commit that patch in the Drupal 7 branch. But the 7.x module is already marked as unsupported, just like Drupal 7 itself.

BTW it is a bit strange that people want to update to newer Solr versions but not to newer Drupal versions ;-)

🇩🇪 Germany mkalkbrenner

I went the "events way" in all the contrib modules I'm involved in. And as far as I know, Core still does events as well.
And other big contrib modules like Commerce are using events as well.
I don't think that one thing is right and the other is not. And there's not the one Drupal way.
My understanding is that hooks get an OOP replacement. But it is totally valid to use events.
BTW, I would have preferred to replace hooks in core with events, following a PSR standard more closely, instead of introducing something new that is Drupal-specific.

🇩🇪 Germany mkalkbrenner

mkalkbrenner made their first commit to this issue’s fork.

🇩🇪 Germany mkalkbrenner

Thanks for reporting. Can you provide a patch?

🇩🇪 Germany mkalkbrenner

The _dcd_metadata is essential for the incremental import and for tracking entity references that don't use a UUID.

🇩🇪 Germany mkalkbrenner

Thanks for this MR. Can you adjust it to the current code base? I'll merge it immediately afterwards.

🇩🇪 Germany mkalkbrenner

I prefer to stay with events as well. The reason is that third-party libs like solarium, and others that integrate search backends, are using events based on a PSR standard.
Now it is possible to bundle search-related customizations that together build one feature in a single event subscriber that subscribes to events of search_api, search_api_solr, solarium and facets.

🇩🇪 Germany mkalkbrenner

Within the Solarium library tests, we also went for SOLR_MODULES=extraction

🇩🇪 Germany mkalkbrenner

Thanks, but it is fixed in the dev branch on github already.

🇩🇪 Germany mkalkbrenner

Todo: document search_api_solr_admin.

🇩🇪 Germany mkalkbrenner

It is way easier to install search_api_solr_admin and to use the button for that use-case. Or the drush command.

🇩🇪 Germany mkalkbrenner

The issue on GitHub is fixed.

🇩🇪 Germany mkalkbrenner

The problem is that I can't get the tests to work on gitlab. I asked for help, but obviously the interest isn't that high.

🇩🇪 Germany mkalkbrenner

If you had followed the rules and added a PR on our GitHub repo, where the full test suite runs, you would have noticed that some fixes are already in the code.

🇩🇪 Germany mkalkbrenner

What do you mean by "post the schema"?
This module creates a config-set zip. Using search_api_solr_admin you can "deploy" that zip to a Solr server that runs in cloud mode and doesn't block the config-set API.

🇩🇪 Germany mkalkbrenner

Can you debug what actually is in the array?
Can you find out which search triggers that?

🇩🇪 Germany mkalkbrenner

Turn on search_api_solr_devel to see the resulting Solr queries and why they match a document.
