Delete huge data

Created on 26 June 2020, over 4 years ago
Updated 25 January 2023, almost 2 years ago

Hello there,
We are trying to delete a huge number of revisions. For example, for one node type we have around 10K nodes, each with between 500 and 1000 revisions.
There is a problem with the node revision delete command: in getCandidatesRevisions(), the method runs a foreach with a query to build a big array. We have too many revisions, so this query is very slow and nothing happens for a long time. Would it be possible to rewrite this part? Perhaps it would also be good to batch the query?
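
The batching idea suggested above can be sketched as follows. This is a hedged illustration in Python (the actual module is PHP), and the function name and parameters are hypothetical, not the module's real API: instead of building one huge array of candidate revisions up front, yield them in fixed-size chunks so each chunk can be processed within a request.

```python
def candidate_revisions(revision_ids, keep_latest, batch_size):
    """Yield deletable revision IDs in fixed-size batches.

    Assumes revision_ids is ordered newest-first; the newest
    keep_latest revisions are preserved, the rest are candidates.
    """
    deletable = revision_ids[keep_latest:]  # skip the revisions to keep
    for i in range(0, len(deletable), batch_size):
        # Each yielded chunk can be handled in its own batch operation,
        # so no single request has to hold the entire candidate set.
        yield deletable[i:i + batch_size]
```

In the real module the slicing would be done in SQL (e.g. a ranged query per batch) rather than in memory, but the control flow is the same: process a bounded chunk, then move on.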

🐛 Bug report
Status

Closed: duplicate

Component

Code

Created by

🇫🇷 musa.thomas (France)


Comments & Activities

Not all content is available!

It's likely this issue predates Contrib.social: some issue and comment data are missing.

  • 🇳🇱 seanB (Netherlands)

    This seems to be a duplicate of 🐛 Timeout with a lot of revisions (Needs review), which has a bit more information. Closing this one as a duplicate.

    We also just added a 2.x version of the module. This is a complete rewrite based on plugins to determine which revisions can be deleted. It also uses a queue worker to actually delete revisions. Could you please test if this solves the timeout issues you currently experience?
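
The queue-worker approach mentioned in the comment above can be illustrated with a small sketch. This is a hedged Python illustration, not the 2.x module's real API: candidate revision IDs are enqueued, and a worker (e.g. triggered by cron) claims and deletes a bounded number of items per run, so no single request can time out.

```python
from collections import deque

def enqueue_candidates(queue, revision_ids):
    """Add candidate revision IDs to the work queue."""
    queue.extend(revision_ids)

def process_queue(queue, delete_revision, max_items):
    """Delete at most max_items revisions per run, like a cron-driven worker.

    delete_revision is a hypothetical callback that removes one revision;
    remaining items stay queued for the next run.
    """
    processed = 0
    while queue and processed < max_items:
        vid = queue.popleft()
        delete_revision(vid)
        processed += 1
    return processed
```

The key design point is that deletion work is decoupled from the request that discovers the candidates: each worker run does a bounded amount of work, which is why this pattern avoids the timeouts described in the original report.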
