🇳🇱Netherlands @Tr4nzNRG

Account created on 15 April 2019, over 5 years ago

Recent comments

🇳🇱Netherlands Tr4nzNRG

I have a feeling that the openai_embeddings module has an issue with the form. You can choose between the Milvus and Pinecone plugins.

But it seems to do a 'try' for Milvus even if you selected the Pinecone plugin, causing an error? I disabled the public function validateConfigurationForm() in both plugins (Milvus.php and Pinecone.php) to get further.

  /**
   * {@inheritdoc}
   */
  public function validateConfigurationForm(array &$form, FormStateInterface $form_state) {
    $this->setConfiguration($form_state->getValues());
    try {
      $this->listCollections();
    }
    catch (\Exception $exception) {
      $form_state->setErrorByName('hostname', $exception->getMessage());
    }
  }
🇳🇱Netherlands Tr4nzNRG

I have the same issue when trying to save the Pinecone API key and hostname:

cURL error 3: (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for /describe_index_stats

If I use curl from command line I do get a response:

curl -X GET "https://xxxxxxx.pinecone.io/describe_index_stats" -H "Api-Key: xxxxx"

{"namespaces":{},"dimension":1536,"indexFullness":0,"totalVectorCount":0}

Not sure what could be wrong.
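For what it's worth, cURL error 3 is a client-side "URL malformed" error: the request never reaches the server, which suggests the module built the request from just the path /describe_index_stats without the scheme/host (i.e. the hostname setting was empty or not passed through). A minimal sketch reproducing the error code, assuming a recent curl:

```shell
# Passing a bare path instead of a full URL reproduces cURL error 3,
# the same client-side malformed-URL failure shown in the error above.
curl --silent "/describe_index_stats"
rc=$?
echo "curl exit code: $rc"
```

Exit code 3 here confirms the URL itself is the problem, not the API key or network.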

🇳🇱Netherlands Tr4nzNRG

I noticed a missing comma in the prefilled settings. In my case .views-exposed-form wasn't working.

.overview--search .views-exposed-form, #police-search-form-popup

Should be:

.overview--search, .views-exposed-form, #police-search-form-popup

🇳🇱Netherlands Tr4nzNRG

@jeffc518 I'm also very interested in this change for the same reasons.

🇳🇱Netherlands Tr4nzNRG

We used this batch process and noticed that it slows down after processing ~1000-2000 nodes. Manually stopping the process and restarting it 'fixes' the issue and prevents an out-of-memory error (depending on the webserver's available memory).

Another note, maybe worth adding to the documentation: when processing large numbers of nodes on a high-traffic website, this process in combination with other modules could cause a cascade. For example, running this command together with Node Revision Delete, or while Search API is re-indexing, could cause an out-of-memory error depending on the webserver's available memory.

So it's wise to execute this batch/command when no other processes are running and to monitor it, for example right after a build process (release).

In short...

It might be good to adjust this solution so it 'restarts' after ~1000 nodes, or to find a fix for the OOM issue. For us this was the (only?) method right now to handle a large number of nodes on a medium/high-traffic website.

🇳🇱Netherlands Tr4nzNRG

Short update: on our TEST environment we had an incident where the webserver ran out of memory. However, we aren't sure if this is due to this fix or to too many other running tasks (Search API indexing, Node Revision Delete). Also, our TEST server has less memory than our PROD server.

🇳🇱Netherlands Tr4nzNRG

@cilefan was correct.

(for anyone else that runs in this situation)

We recently migrated from Drupal 9 to Drupal 10. For D9 we had a patch to solve the issue I'm describing (changing the public/private upload destination). That patch needs an update to fix the problem for Drupal 10.

Patch info: https://www.drupal.org/project/drupal/issues/1202074#comment-14887638 ("Toggleable public/private upload", status: Needs work)

This issue can be closed.

🇳🇱Netherlands Tr4nzNRG

I tested this on real data in my DDEV environment and it seems to work as intended. Could this be merged into dev for now? @thomasdegraaff

However, I think it still needs a minor improvement. I noticed that the batch process seems to slow down after 2000-5000 nodes. I don't yet know why, as the website I'm using has some complexity with other modules and content (direct indexing with Search API/Solr, cache invalidation?).

When I stop the process and restart the command, it's back to its original speed. So maybe this could be changed in the future? For now this is already a good improvement and allows us to use this module in production. Thanks for all the effort so far ;)

🇳🇱Netherlands Tr4nzNRG

I tested this in my dev environment and noticed it processes around 50 nodes/min. So it seems that the batch process works?

For the user it might seem that 'nothing' happens, as the command line doesn't give any feedback about the running process. I only saw it myself by looking into the database, where the radioactivity table grew by around 50 rows/min.

For my website I need to process around ~22,000 nodes in total. I didn't notice any slowness in the dev environment while this process took place. At 50 nodes/min that works out to roughly 7-8 hours (22,000 / 50 ≈ 440 minutes) before all the nodes are processed.

I can also give feedback if it ever gets applied to a production environment. But first I need to test and apply this patch: https://www.drupal.org/project/drupal/issues/2329253#comment-14830297 ("Allow ChangedItem to skip updating the entity's "changed" timestamp when synchronizing", status: Fixed)

This is to solve another issue where the 'changed' date gets updated. That is unwanted, but already resolved for the radioactivity module, just not in Drupal core: https://www.drupal.org/project/radioactivity/issues/3348337 ("Set syncing when updating reference fields", status: Needs review)

🇳🇱Netherlands Tr4nzNRG

Currently having the same issue, where it needs to update about ~12,000 nodes. The process seems to take about 20-50 min (with no feedback). I would love to see this as a batch processor, so we can run this process in the background on a production server without causing a service interruption or slowing the website to a halt.

🇳🇱Netherlands Tr4nzNRG

It would be nice if it also took workflow states (draft, archived) into account?

🇳🇱Netherlands Tr4nzNRG

This is a reason why we aren't updating to the latest version. A while ago this was reported in another issue [1], but for some reason that issue is no longer accessible (403). The middleware [2] seems very suspicious for now.

[1] https://www.drupal.org/project/mailchimp/issues/3348453 ("Why API key is deprecated and for what we need oauth_middleware_url?", status: Active)
[2] https://git.drupalcode.org/project/mailchimp/-/blob/2.2.0/mailchimp.inst...

I hope someone can clarify why this has been added.

🇳🇱Netherlands Tr4nzNRG

I had a similar situation where the module was enabled in a different development environment (most likely with the authenticated role present in the database).

Then, when I was setting up a new development environment (with a recent database from production, without the authenticated role), it caused an error on drush deploy.

Error: Call to a member function grantPermission() on null in /var/www/html/htdocs/modules/contrib/google_analytics/google_analytics.install on line 17 #0 [internal function]: google_analytics_install(true)

I think you can replicate the situation by first removing the authenticated role and then installing the module.

🇳🇱Netherlands Tr4nzNRG

Currently I have the same issue with 6.0.3 when using it as a block.
