Account created on 29 April 2010, about 14 years ago

Recent comments

🇨🇦Canada smulvih2 Canada 🍁

@miteshmap thanks for catching this! I assume this issue is due to updates from the drupal/purge module. Which version of purge is causing this to fail?

🇨🇦Canada smulvih2 Canada 🍁

@joseph.olstad the issue is with block caching. The removal from the template file in your patch does fix the issue with the "Did you find" block, but this needs to be fixed globally for the two feedback form blocks. There are changes in the block preprocess file in wxt_bootstrap that need to be reviewed as well. The change/ticket you reference toggles the two webform blocks; I think we should remove the old webform block (Report a Problem) from the block layout and just keep the "Did you find" webform block.

🇨🇦Canada smulvih2 Canada 🍁

Changes are now available in 1.1.4-rc1

🇨🇦Canada smulvih2 Canada 🍁

Changes are now available in 1.1.4-rc1

🇨🇦Canada smulvih2 Canada 🍁

Changes are now available in 1.1.4-rc1

🇨🇦Canada smulvih2 Canada 🍁

Fixed the merge conflict and pushed this change to 1.1.x

🇨🇦Canada smulvih2 Canada 🍁

Looks good to me, merged into 1.1.x

🇨🇦Canada smulvih2 Canada 🍁

I have tested 1.x-dev with MR#10 and it works great! There was a bit of confusion around ckeditor_templates_ui, but after I uninstalled and removed that module it was pretty straightforward. I would say get an alpha version going, close this ticket, open new tickets for any changes that are needed (uninstall ckeditor_templates_ui, update docs, etc...).

🇨🇦Canada smulvih2 Canada 🍁

@joseph this is awesome! I have a few projects that are pinned to wet-boew 4.0.43.1 for this issue. Will have to give this a try, thanks!

🇨🇦Canada smulvih2 Canada 🍁

Agree with @jacobupal, better to get a first release for CKE5, like a beta version, so people can start pulling into their projects and testing. I have multiple projects looking for this upgrade to CKE5.

🇨🇦Canada smulvih2 Canada 🍁

Removed D8/9 and added D11 in the info file.

🇨🇦Canada smulvih2 Canada 🍁

Thank you both for the fixes and help testing. I have tested this patch on a project that uses the module and it looks good to me! Committed and contributed.

🇨🇦Canada smulvih2 Canada 🍁

@john.jimenez I think this is a great idea! If you have time to make a patch I can test/review.

🇨🇦Canada smulvih2 Canada 🍁

MR looks good, it has been merged and included in 1.0.3. Thanks @aman_lnwebworks!

🇨🇦Canada smulvih2 Canada 🍁

@aman_lnwebworks this is great, really appreciate your help with this! I changed the name of the module before contributing to drupal.org so I must have missed this. Will review and merge shortly today.

🇨🇦Canada smulvih2 Canada 🍁

Added error handling when export with references is used and the selected options produce no results.

🇨🇦Canada smulvih2 Canada 🍁

I created the related issue for file_entity. Batch export would fail if a file entity references an image/file that was removed from the file system. I was seeing this on a full site export with old test data. Suggest updating file_entity to the latest 2.0-rc6 to avoid this issue.

🇨🇦Canada smulvih2 Canada 🍁

I added output for the exporter class, similar to the importer class. It outputs how many entities of each type were exported. While doing this, I found a performance issue with export with references. I had 229 taxonomy_terms in total, but the count at the end of the export process was showing thousands. To fix this, I added the already-processed entities to the batch $context, so I could check whether an entity had already been exported and skip it. This prevented thousands of writes to already-written files. I also added batch output for export so when running in Drush you get an idea of what is happening, same as the importer. Updated patch attached.
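
For reference, a minimal sketch of the skip logic described above (the callback signature and helper are illustrative, not the module's exact API):

// Sketch: track exported entities in the batch $context so the same
// referenced entity is not serialized and written more than once.
public static function exportBatchWithReferences($entity_type, $entity_id, array &$context) {
  $key = $entity_type . ':' . $entity_id;
  if (isset($context['results']['processed'][$key])) {
    // Already written by an earlier batch operation, skip it.
    return;
  }
  $context['results']['processed'][$key] = TRUE;
  // ... serialize the entity and write its JSON file here ...
  $context['results']['counts'][$entity_type] = ($context['results']['counts'][$entity_type] ?? 0) + 1;
}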

🇨🇦Canada smulvih2 Canada 🍁

I've made some major improvements to the importer class. Using a dataset of 2k nodes (12k+ entities), it was initially taking me 80 min to import the full 12k entities and 60 min to re-import them when all entities had already been imported (skipped). These numbers were even larger once I hooked up the methods for internal link replacement, updateInternalLinks() and updateTargetRevisionId().

I did a full review of the importer class to fix all performance-related issues. Please find below a list of improvements:

  • Minimize the data being stored in the queue table for each batch operation. I originally passed data like processed UUIDs and entity_id => UUID mappings through the class itself, which was causing a delay in starting the batch process and made the database grow significantly. Then this information was being passed using KeyValueStorage on each batch operation. The final solution now uses $context to pass this data to each subsequent batch operation, significantly speeding up the import process (see the sketch after this list).
  • Added a setting called "Support old content", which enables the updateInternalLinks() method. This option loops over each JSON file to look for uri field names and does a str_starts_with to see if internal: or entity: exist in the JSON file. This is not needed with newer sites since links now use UUIDs and entities can be embedded with <drupal-entity> elements. Disabling this option sped up the import process.
  • There was duplication of $this->serializer->decode($this->exporter->getSerializedContent($entity), 'hal_json');. It was first being called in importEntity() to determine if there is a diff, then if there was a diff the preAddToImport() method was called and performed the comparison again. With this new patch, the comparison is only done once, which significantly improved performance.
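
A rough sketch of the $context hand-off from the first bullet (the operation and key names are illustrative):

// Sketch: read lookup data produced by earlier batch operations from
// $context instead of persisting it in the queued Importer object or in
// KeyValueStorage, then write it back for the next operation.
public static function importOperation($file, array &$context) {
  $id_lookup = $context['results']['entity_id_lookup'] ?? [];
  $processed = $context['results']['processed_uuids'] ?? [];

  // ... import the entity, resolving references through $id_lookup ...

  $context['results']['entity_id_lookup'] = $id_lookup;
  $context['results']['processed_uuids'] = $processed;
}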

I also made changes to support path_alias entities. To accommodate this without needing to loop over the entities multiple times, I ensure path_alias entities are processed last, so that the entities referenced in the path field already exist and their entity_ids are available in $context.

I did a complete review of the importer class and fixed things like doc comments, inline comments, and general coding practices.

🇨🇦Canada smulvih2 Canada 🍁

@kevin.bruner currently WxT 5.2.x requires ^1.2 for the drupal/autosave_form module - https://github.com/drupalwxt/wxt/blob/a1337ab5895df15b3c06171006c040a4d1f8f059/composer.json#L15. This means if you want the latest version you can just run composer update drupal/autosave_form and you will get the latest 1.x release (1.4). Then you can add the patch from the autosave_form ticket you referenced to your project-level composer.json file to resolve it.

Adding a WxT patch here in case it takes a while to get the autosave_form patch included in a new release.

🇨🇦Canada smulvih2 Canada 🍁

Ok, another major improvement to the "export with references". I ran into a node that has 208 entity references (lots of media and files). This was causing memory issues; it was hitting 800MB and then the process would die. This is because of the current approach to exporting with references.

The current approach is to get all referenced entities recursively into an array, then loop over that array and serialize into a second array, then loop over the second array and write the JSON files. With entities that have large numbers of references this starts to kill the memory.

My new approach is to get all referenced entities recursively into an array, then loop over that array to serialize and write to the file system at the same time. This means there is no second large array storing all entities and their serialized content. I tested this against the node with 804 references and it works great; memory usage doesn't go over 50MB. This node even has a 28MB PDF file being serialized with better_normalizers enabled.

The patch attached provides this update to the exporter class. It removes a method, and simplifies the code, which is always nice :)
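
A rough sketch of the single-pass approach (writeJsonFile() is a hypothetical helper; the serialization call stands in for the exporter's own getSerializedContent()):

// Sketch: serialize and write each referenced entity inside one loop so a
// second array of serialized content is never held in memory.
foreach ($this->getEntityReferencesRecursive($entity) as $referenced_entity) {
  $serialized = $this->getSerializedContent($referenced_entity);
  $this->writeJsonFile($referenced_entity, $serialized);  // hypothetical helper
  unset($serialized);  // memory stays flat regardless of the number of references
}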

🇨🇦Canada smulvih2 Canada 🍁

I looked into this in more detail. The skip_entity_type option is only available in Drush for export of the entire site (drush dcdes). If you pass node to this command (drush dcdes --skip_entity_type=node) it will export all entity types except nodes. I think this is a good option for excluding things like url_alias entities from a full site export.

The patch in #2 adds a UI to exclude entity types as references. This only comes into play with drush dcder, or when exporting with references through the UI. So I could run drush dcder node, and it will export all nodes with references, but will not include nodes as part of the references from nodes. This granularity is particularly important with the batch API, since including nodes as references from nodes can lead to one batch operation including hundreds of nodes and timing out. Now that #3349952 (Include dependencies from processed text fields) has been merged, this granularity becomes even more important when using the "Export processed text dependencies" option.

I think what is missing from patch #2 is the ability to pass this list of allowed reference types to Drush, as I know @mkalkbrenner requested this for another one of my patches. This would allow Drush to override what's in the UI if needed, which provides more flexibility.

🇨🇦Canada smulvih2 Canada 🍁

This patch is definitely needed. If users add a media item to French content that is unique to that language (not a translation of the English media item), that media item would not be exported as a reference.

Updating patch slightly to remove $language which is not used in this method.

🇨🇦Canada smulvih2 Canada 🍁

Coming back to the export, I originally exported 118k entities using the "All content" option, which worked perfectly. Also tested the "Content of the entity type" option and it worked well. I am now testing "Content with references" and running into a problem. When exporting with references, the queue database table would get populated as expected, then during the first batch operation it would spin until it timed out. The issue ends up being getEntityReferencesRecursive(); it looks at references in the body field and spiders out to include hundreds of nodes.

This is where the patch in #3357503 (Allow users to configure which referenced entities to export) comes in handy. I can exclude nodes from reference calculations. For example, I want to export all nodes of type page, with references. If I include nodes as part of the reference calculation, the first batch operation could be massive and time out. With nodes excluded, I would still get all pages, but they would be spread out across all batch operations and not all included in one.

I have updated the patch to include #3357503, and also apply this filter directly to getEntityProcessedTextDependencies() to avoid loading entities that are later filtered out anyway.

Also need #3435979 (Export misses translated reference fields) included to make sure media items on French translations are included in the export.

🇨🇦Canada smulvih2 Canada 🍁

Update: With the latest patch #33 I was able to trigger an import of 118k entities through the UI, and within 10 seconds it started to process the entities, where before it would take too long and time out. I am now importing the entities using nohup and drush. Definitely need to think of a few ways to optimize the export/import process; will take a look at igbinary for this!

🇨🇦Canada smulvih2 Canada 🍁

@mkalkbrenner thanks for the info! Will refer to this when fixing for path_aliases.

So I found the major source of my database exploding on import. DCD does a filesystem scan and stores a pointer to each file in the Importer object. This object is then added to each batch item in the database. This means each batch item in the database would have all 118k pointers. I was able to use KeyValueStorage to store this data outside of the object, and now the queue entries look reasonable for each batch item:

a:2:{i:0;a:2:{i:0;O:38:"Drupal\\default_content_deploy\\Importer":10:{s:46:"\000Drupal\\default_content_deploy\\Importer\000folder";s:14:"../content/dcd";s:52:"\000Drupal\\default_content_deploy\\Importer\000dataToImport";a:0:{}s:16:"\000*\000forceOverride";b:0;s:23:"\000*\000discoveredReferences";a:0:{}s:20:"\000*\000oldEntityIdLookup";a:0:{}s:17:"\000*\000entityIdLookup";a:0:{}s:11:"\000*\000newUuids";a:0:{}s:10:"\000*\000context";a:1:{s:7:"sandbox";a:2:{s:8:"progress";i:0;s:5:"total";i:22;}}s:14:"\000*\000_serviceIds";a:11:{s:13:"deployManager";s:30:"default_content_deploy.manager";s:16:"tempStoreFactory";s:17:"tempstore.private";s:16:"entityRepository";s:17:"entity.repository";s:5:"cache";s:13:"cache.default";s:10:"serializer";s:10:"serializer";s:17:"entityTypeManager";s:19:"entity_type.manager";s:11:"linkManager";s:16:"hal.link_manager";s:15:"accountSwitcher";s:16:"account_switcher";s:8:"exporter";s:31:"default_content_deploy.exporter";s:8:"database";s:8:"database";s:15:"eventDispatcher";s:16:"event_dispatcher";}s:18:"\000*\000_entityStorages";a:0:{}}i:1;s:11:"processFile";}i:1;a:4:{i:0;s:41:"cc1b90d7-7d81-47ca-b0b6-fd5c068a55e4.json";i:1;s:61:"../content/dcd/node/cc1b90d7-7d81-47ca-b0b6-fd5c068a55e4.json";i:2;i:22;i:3;i:22;}}
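
For reference, the key/value hand-off is roughly this (the collection and key names are illustrative):

// Sketch: store the scanned file list once via the key/value factory instead
// of carrying it inside the serialized Importer object in every queue item.
$store = \Drupal::keyValue('default_content_deploy.import');
$store->set('files', $files);

// Later, inside a batch operation, read it back on demand:
$files = \Drupal::keyValue('default_content_deploy.import')->get('files', []);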

With this new approach, I combined both batch operation callbacks into one callback, so we have 50% fewer batch operations with this method.

Rough numbers, if I have 118,000 entities to import, then all 118,000 entities would be referenced in each of the 118,000 batch operations. Since we had 2 operations per batch that would be x2. So 118k x 118k x2 = 27.87 billion entries. This is one such entry:

0 => [
  'name' => 'cc1b90d7-7d81-47ca-b0b6-fd5c068a55e4.json',
  'uri' => '../content/dcd/node/cc1b90d7-7d81-47ca-b0b6-fd5c068a55e4.json'
],

If we say this string is 92 bytes, then this would be approx. 2.3 TB of storage required for this.

Now we only have 118,000 batch operations (not times two), and the total data that is stored per operation in the DB is about 1234 bytes. So 1234 bytes times 118,000 is approx. 145MB.

So the import of 118k entities would increase the DB size by about 145MB instead of 2+ TB. This should also significantly reduce the time it takes to write the batch operations to the database when batch_set() is called (before the progress bar is shown).

Uploading patch here and will test the import against the 118k entities and report back.

🇨🇦Canada smulvih2 Canada 🍁

@mkalkbrenner, yes I can reproduce the issue with path_alias entities. I was not running into this before as path_aliases are not exported with references, since they have a relationship from

path_alias -> node

instead of node -> path_alias. Now that I specifically export path_alias entities as well, I can reproduce it. I will make sure to account for this in the batch import in subsequent patches, but for now I am looking at how to make the export/import process scalable.

I was able to successfully export all 118,000 entities with the batch export, but I am having issues with the import at this scale. The issue is how I pass the $file to each batch operation, which is the contents of the JSON file in question. Passing the file to the batch operation means its contents are written to the queue table in the database. Of the 118k entities, 50k are serialized files like images, PDF files, etc. Writing the actual file contents to the database exploded the database size and would time out before even starting the batch operation, or run out of disk space.

I will need to rewrite the importer class for this to work; instead of passing the file contents to the batch operations, I will just pass a pointer to the file in the filesystem. Then each batch operation can load the file from the pointer. Then I will combine the decodeFile() and importEntity() batch operations into one method, reducing the number of batches by 50% (currently 2 per JSON file). If I ensure path_alias entities are imported last, then I can probably get them working by just swapping the entity_ids.
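
The combined operation would look roughly like this (a sketch; the method name and argument handling are illustrative):

// Sketch: the queue stores only the file path; the JSON is read, decoded,
// and imported inside a single batch operation.
public static function processFile($path, array &$context) {
  $json = file_get_contents($path);
  $decoded = \Drupal::service('serializer')->decode($json, 'hal_json');
  // ... create or update the entity from $decoded, then record progress ...
  $context['sandbox']['progress']++;
}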

See my comments from Slack below for the record (2024-04-16):

Need to make an update to my DCD patch: instead of passing the file contents to the batch operation, and subsequently storing the JSON files in the database, I am going to pass a pointer to the file in the filesystem, then load the file directly in the batch operation. This is going to be needed in a production scenario to prevent the database from swelling when imports are run.

And look at combining the decodeFile() and importEntity() batch operations into one operation that does both, which will reduce the number of batches by 50%

This was the full rewrite of the importer class I was hoping to avoid, but now it looks like it's needed

🇨🇦Canada smulvih2 Canada 🍁

I'm running a large export of > 16k nodes, with references, for a total of > 118k entities. Reviewing patch #28 I found a redundant call to load the entity for a second time in the exportBatchDefault() method. New patch attached removes this second call and hopefully will speed things up a bit.

🇨🇦Canada smulvih2 Canada 🍁

I removed this patch from a project already, and as long as you upgrade to password_policy ^4.0.1 then the fix should be included. Thanks for the patch!

🇨🇦Canada smulvih2 Canada 🍁

Patch attached removes the submit logic and now I can successfully add an empty alt value, and it gets carried over to the node view as expected.

🇨🇦Canada smulvih2 Canada 🍁

PR #12 worked for me, no longer seeing the deprecation error.

🇨🇦Canada smulvih2 Canada 🍁

@joseph you are correct, the GCWeb PR attached shows that they have removed h1.gc-thickline and now the style applies to the plain H1.

🇨🇦Canada smulvih2 Canada 🍁

No, this is currently how to set the gc-thickline if not using lead title. To make gc-thickline the default used, he will need to change the page-title.html.twig file in wxt_bootstrap, plus review wxt_bootstrap.theme for any gc-thickline logic.

🇨🇦Canada smulvih2 Canada 🍁

Agreed, we will need to support CKE4 for a while. Seeing some issues particularly with migrated content coming from Canada.ca, and some existing data on sites being upgraded. CKE5 with its new plugins is too restrictive. For sites with just new content authored in CKE5 it seems to work pretty well though.

🇨🇦Canada smulvih2 Canada 🍁

Patch to fix the patch.

🇨🇦Canada smulvih2 Canada 🍁

After updating my containers to PHP8.3 (from PHP8.1) I am getting this error when importing content using this patch:

Deprecated function: Creation of dynamic property Drupal\default_content_deploy\Importer::$context is deprecated in Drupal\default_content_deploy\Importer->import() (line 255 of modules/contrib/default_content_deploy/src/Importer.php).

Adding this to the top of the Importer class fixes the deprecation error:

  /**
   * The batch context.
   *
   * @var array
   */
  protected $context;

New patch tested on PHP8.3 and fixes the issue.

🇨🇦Canada smulvih2 Canada 🍁

I disagree with the change of approach starting in #6. This new approach is causing tests in ContentTranslationTest to fail, since the data-langcode value doesn't match the embedded entity's language.

I think the approach in #2 was correct, since this actually fixes the data. If we use a filter to fix this, then the data on non-English languages will still use an incorrect data-langcode value, but the problem is masked while viewing the content. This would cause issues for anyone needing to manipulate the data programmatically and relying on the language code to be correct. I re-rolled #2 to apply against 1.5.0.

I also modified the ContentTranslationTest test as follows:

  • Do not create a French translation right after the initial entity creation.
  • Save the English node after adding an embedded entity.
  • Go to the add translation route and ensure an element exists with data-langcode set to "fr".
  • Delete the embedded entity and continue tests with French embedding.

I'm not very good with tests, so this might need to be adjusted.

🇨🇦Canada smulvih2 Canada 🍁

I am running into a similar issue using a managed MySQL database service in Azure. When I try to save my index, or try to index content, I get this error:

Drupal\search_api\SearchApiException: Cannot add primary key to table 'search_api_db_INDEX_NAME_search_api_datasource_2': primary key already exists. in Drupal\search_api_db\Plugin\search_api\backend\Database->fieldsUpdated() (line 1187 of /var/www/html/modules/contrib/search_api/modules/search_api_db/src/Plugin/search_api/backend/Database.php).

I notice that it creates the table but then fails on the primary key part. The next time I try to index content or save the index, it creates new tables with _X appended (1, 2, 3... n), like search_api_db_INDEX_NAME_search_api_datasource_2.

Then I tried applying the patch in #11. With this I get the following error:

Drupal\search_api\SearchApiException: SQLSTATE[42000]: Syntax error or access violation: 1235 This version of MySQL doesn't yet support 'existing primary key drop without adding a new primary key. In @@sql_generate_invisible_primary_key=ON mode table should have a primary key. Please add a new primary key to be able to drop existing primary key.': ALTER TABLE &quot;search_api_db_INDEX_NAME_search_api_datasource&quot; DROP PRIMARY KEY; Array
(
)
 in Drupal\search_api_db\Plugin\search_api\backend\Database->fieldsUpdated() (line 1188 of /var/www/html/modules/contrib/search_api/modules/search_api_db/src/Plugin/search_api/backend/Database.php).

So I reverted the patch and tried adding the workaround in settings.php from #8. Then when I try to clear cache with drush cr I get:

SQLSTATE[42000]: Syntax error or access violation: 1227 Access denied; you need (at least one of) the SUPER, SYSTEM_VARIABLES_ADMIN or SESSION_VARIABLES_ADMIN privilege(s) for this operation

I get the same error when I try to run SET SESSION sql_require_primary_key=0 directly in mysql.
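
For reference, the settings.php workaround follows the usual init_commands pattern (a sketch of what #8 suggests; the exact snippet is in that comment):

// Sketch: relax the managed-MySQL primary key requirement for Drupal's
// connections (fails here because the Azure DB user lacks the privilege).
$databases['default']['default']['init_commands'] = [
  'sql_require_primary_key' => 'SET SESSION sql_require_primary_key = 0',
];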

🇨🇦Canada smulvih2 Canada 🍁

Have to wrap the code in an isset() check on #page_variant so it doesn't complain on non-panel pages.

🇨🇦Canada smulvih2 Canada 🍁

I have run into this issue on a few different projects. I wanted to override the template file for only one specific variant. The patch attached adds this basic theme suggestion.
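
The added suggestion is roughly along these lines (the hook placement and exact variable path are assumptions; adjust to the actual render array):

// Sketch: add a page template suggestion per panels variant, guarded so
// non-panel pages are unaffected (see the isset() note above).
function mymodule_theme_suggestions_page_alter(array &$suggestions, array $variables) {
  if (isset($variables['page']['#page_variant'])) {
    // Assumes #page_variant exposes the variant machine name.
    $suggestions[] = 'page__panels__' . $variables['page']['#page_variant'];
  }
}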

🇨🇦Canada smulvih2 Canada 🍁

#10 works for me on core 10.1.8; I am also using it on a few other projects. Need this patch in order to create new panels pages.

🇨🇦Canada smulvih2 Canada 🍁

I had a similar issue with basic_auth. Resetting the user's password fixed the issue for me. It happened after updating core from 10.0.x to 10.2.x.

🇨🇦Canada smulvih2 Canada 🍁

I am having the same issue in Drupal 10 with version 1.0.0-beta4. I can get it working on a view with a page display, but not for a panel page with a search view block display. I can see the block is printing (somewhat), but nothing is visible:

<form id="search-api-sorts-widget-widget--2" method="post" action="/en/guidance" data-drupal-selector="search-api-sorts-widget-widget-2" data-once="form-updated" data-drupal-form-fields="">
  
<input value="form-tnZMjD-CtIC1IwKZaVwCH6tueNmkNMW8VTC7Nj_oEYY" name="form_build_id" type="hidden" data-drupal-selector="form-tnzmjd-ctic1iwkzavwch6tuenmknmw8vtc7nj-oeyy">

<input value="YIWYPIUNpHQjlX3AqvQlPuqIUwKhYkiW65umIzR5Mt8" name="form_token" type="hidden" class="form-control" data-drupal-selector="edit-search-api-sorts-widget-widget-form-token-2">

<input value="search_api_sorts_widget_widget" name="form_id" type="hidden" data-drupal-selector="edit-search-api-sorts-widget-widget-2">

</form>
🇨🇦Canada smulvih2 Canada 🍁

Re-rolled patch against latest 2.0.x-dev branch. Tested all 3 export modes through the UI as well as Drush. Also tested import through both UI and Drush. Seems to work well. Made a slight adjustment to the --text_dependencies Drush flag so it takes the UI config value if not specified in the Drush command.

🇨🇦Canada smulvih2 Canada 🍁

+1 for #5! Fixes issue for me on PHP8.2.

🇨🇦Canada smulvih2 Canada 🍁

Added filter_var() to the user input for the Drush command to account for users passing any of these to the --text_dependencies option: [1, TRUE, true, 0, FALSE, false].
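
A minimal sketch of that normalization (variable names are illustrative):

// Sketch: reduce 1/TRUE/true/0/FALSE/false from --text_dependencies to a
// real boolean before handing it to the exporter.
$text_dependencies = $options['text_dependencies'];
if ($text_dependencies !== NULL) {
  $text_dependencies = filter_var($text_dependencies, FILTER_VALIDATE_BOOLEAN);
}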

🇨🇦Canada smulvih2 Canada 🍁

@mkalkbrenner thanks for the feedback, these are valid points! I have included a new patch here to address your feedback.

1. I have changed this to iterate over the entity translations (see the sketch after this list):
$entity->getTranslationLanguages();

2. I have removed this logic as it is not needed as you pointed out.

3. I have added an option to the drush dcder command. You can now pass --text_dependencies=TRUE/FALSE to override the configuration option in the UI.
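
A minimal sketch of point 1 (core's translation API; the surrounding export call is assumed):

// Sketch: walk every translation of the entity so references that only
// exist on non-English translations are also picked up.
foreach ($entity->getTranslationLanguages() as $langcode => $language) {
  $translation = $entity->getTranslation($langcode);
  // ... collect referenced entities from $translation as well ...
}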

🇨🇦Canada smulvih2 Canada 🍁

Re-rolled patch to apply against latest 2.0.x-dev branch.

🇨🇦Canada smulvih2 Canada 🍁

@mkalkbrenner thanks for the feedback! I have attached a new patch that does the following:

  • Made the xpath logic generic so it works for all embedded entities. I needed this in my project as we have multiple entity types being embedded beyond the original list provided in the ticket description (see the sketch after this list).
  • Made the export of processed text dependencies configurable, with a checkbox on the settings page.
  • Added support for translations, so if a non-English language has an embedded entity it will also be exported. Found this issue with media items, since users are not translating the English images, but are instead adding a unique French image and embedding it.
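
A rough sketch of the generic lookup from the first bullet (the DOM handling is illustrative; the attributes are the data-entity-* markers Drupal's editor/entity_embed markup uses):

// Sketch: find any embedded entity in processed text via its data-entity-*
// attributes instead of a fixed list of element names.
$dom = new \DOMDocument();
@$dom->loadHTML($processed_text);
$xpath = new \DOMXPath($dom);
foreach ($xpath->query('//*[@data-entity-type and @data-entity-uuid]') as $element) {
  $type = $element->getAttribute('data-entity-type');
  $uuid = $element->getAttribute('data-entity-uuid');
  // ... load the entity by UUID and add it to the export with references ...
}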

As for your point 2, when exporting the entire site using prepareToExportAllContent(), the getEntityReferencesRecursive() method is not called, so no issue with performance here.

🇨🇦Canada smulvih2 Canada 🍁

This patch needs a re-roll after recent changes pushed to 2.0.x-dev branch. Will work on this over the next few days.

🇨🇦Canada smulvih2 Canada 🍁

Ok, got the import() process working with batch, it's solid now. Updated patch attached, will update PR shortly after and add comments.

🇨🇦Canada smulvih2 Canada 🍁

One issue with my last patch: passing data from the first batch process, prepareForImport(), to importBatch(). Will need to figure out a different solution before this is ready for review.

🇨🇦Canada smulvih2 Canada 🍁

Created a PR to make it easier to review the changes, in line with #15.

🇨🇦Canada smulvih2 Canada 🍁

Removed a line used for testing.

🇨🇦Canada smulvih2 Canada 🍁

As I suspected in #8, the prepareForImport() method will cause timeouts on large data sets due to the amount of processing that occurs per JSON file. The new patch moves prepareForImport() into its own batch process, which then passes the data to the existing batch process that imports the entities. Tested this on a large data set and it works well.

🇨🇦Canada smulvih2 Canada 🍁

@liam thanks for reviewing this fix in MySQL! This has been merged with the 1.1.x-dev branch.

🇨🇦Canada smulvih2 Canada 🍁

This has been merged with the 1.1.x-dev branch. Thanks guys!

🇨🇦Canada smulvih2 Canada 🍁

This has been merged with the 1.1.x-dev branch.

🇨🇦Canada smulvih2 Canada 🍁

Tried an upgrade from 5.0.1 to 5.1.0 and ran into this issue with drush updb:

[notice] Update started: wxt_core_update_8502
>  [error]  Drupal\Core\Config\Config::setData(): Argument #1 ($data) must be of type array, Drupal\Core\Config\Config given, called in /var/www/public_html/profiles/wxt/modules/custom/wxt_core/wxt_core.install on line 695 
>  [error]  Update failed: wxt_core_update_8502 
 [error]  Update aborted by: wxt_core_update_8502

Realized this fix was pushed to the 5.1.x-dev branch (not the 5.1.0 tag), and upgrading to this branch allowed me to complete the upgrade successfully. We should cut a new 5.1.1 release with this change to prevent upgrade issues for other projects.

🇨🇦Canada smulvih2 Canada 🍁

@danrod, when you run composer update, dependencies are calculated before patches are applied. This means you can't patch a composer.json file to change the module version like this. You would have to do this in your root composer.json file, for example:

"require": {
  "drupal/group": "2.0.0 as 1.6.0"
}
🇨🇦Canada smulvih2 Canada 🍁

Pushed changes for 5.1.x to drupalcode repo.

🇨🇦Canada smulvih2 Canada 🍁

@joseph, I merged 5.0.x into 5.1.x and pushed to GitHub for testing - https://github.com/drupalwxt/wxt/tree/5.1.x

I ran a fresh install and it works for me. Please test this and report back. Thanks!

🇨🇦Canada smulvih2 Canada 🍁

Got batch export working for all three modes (default, reference, all). Adding patch here to capture changes, but will push changes to a PR to make it easier to review.

Remaining tasks:

  • Performance - exporting with references can load the same entity multiple times. Each batch item exports a single entity with any of its referenced entities. If the same term is referenced on multiple nodes, the JSON file for the term is updated multiple times. We can add an array to store processed entities to avoid duplication.
  • There is some duplication of code between exportBatchDefault and exportBatchWithReferences; this could be extracted into new method(s).
  • Add progress indicator to drush command, so batch output shows in CLI
  • Test export/import with complex data
🇨🇦Canada smulvih2 Canada 🍁

Agreed, although so far it seems to be working well for things like entity references, links to other nodes, etc. Will make sure to test this with path aliases like you suggest. Also need to implement batch for export since currently I am using a custom drush command to get past the export limitation.

🇨🇦Canada smulvih2 Canada 🍁

Special thanks to Craig Clark for creating the SVG icons for this.

🇨🇦Canada smulvih2 Canada 🍁

Patch attached adds a CKE5 compatible plugin for redaction.

If you select text and click the redact button, your text will be wrapped within a <drupal-redact> element. If you have the redaction selected and click the button, the text will be removed from the <drupal-redact> element. The filter works the same as CKE4/D9.

CKEditor5 screenshots (not reproduced here):

  • Admin view (can see redacted text)

  • Anonymous view (cannot see redacted text)

🇨🇦Canada smulvih2 Canada 🍁

I have used this patch on two projects (both missing page content type) and was able to deploy without issue, including running database update scripts (drush updb).

🇨🇦Canada smulvih2 Canada 🍁

With patch #6 I was getting warnings in the dblog. New patch fixes this and now batch import works without any dblog errors/warnings.

🇨🇦Canada smulvih2 Canada 🍁

After testing the batch import for DCD a bit more, I think we need to add a batch process for the prepareForImport() method as well. With a lot of JSON files in the export directory, decoding all of these files can cause timeouts.

🇨🇦Canada smulvih2 Canada 🍁

Just tested #6 on a migration I am running to see how the new batch process would handle a real-life scenario. It seems to work as expected.

Here is the contents of my export:

  • file: 1
  • media: 1
  • node: 90
  • taxonomy_term: 88
  • Total JSON files: 180

I am calling the new importBatch() method programmatically on a custom form submit handler, like this:

$import_dir = \Drupal::service('extension.list.module')->getPath('cra_default_content') . '/content';
$importer = \Drupal::service('default_content_deploy.importer');

$importer->setForceOverride(TRUE);
$importer->setFolder($import_dir);
$importer->prepareForImport();
$importer->importBatch();

This gives me the batch progress bar and correctly shows 180 items being processed. After the import, I get all 90 nodes with translations. The term reference fields all work as expected. Even the links to other imported nodes within the body field work as expected.

Here are the patches I have in my project:

"drupal/default_content_deploy": {
    "3302464 - remove entities not supporting UUIDs": "https://www.drupal.org/files/issues/2022-08-08/dcd-remove-entities-without-uuid-support-3302464-2.patch",
    "3349952 - Export processed text references": "https://www.drupal.org/files/issues/2023-03-23/dcd-export-processed-text-references-3349952-2.patch",
    "3357503 - Allow users to configure which referenced entities to export": "https://www.drupal.org/files/issues/2023-05-01/default_content_deploy-referenced_entities_config-3357503-2.patch",
    "3102222 - batch import": "https://www.drupal.org/files/issues/2023-09-18/dcd-import-batch-api-3102222-6.patch"
},