Imported config doesn't match sync directory for default insert conditions

Created: 26 August 2020
Updated: 6 March 2024

Problem/Motivation

When we import config during our continuous deployment process, we also verify that config imported successfully by running an export to see if any changes are identified. After importing a container, the verification export shows differences even though no changes have been made to configuration.

For insert conditions, the initial config export explicitly defines all values when exported from a local environment. Importing the config into a higher environment works as expected, but an export then identifies differences even though no changes have been made since the import. Specifically, any default values for the insert conditions are no longer included in the exported config: if no insert conditions have been changed, the conditions key is simply an empty object in the verification export.

Steps to reproduce

1. Create a new GTM container with default insert conditions.
2. Export the configuration and note that the default values are explicitly defined in the conditions of the YAML file.
3. Import the configuration into another environment and note, via the UI, that the values are imported correctly.
4. Export the configuration from the second environment and note that there are now differences from existing exported config: the default values are no longer explicitly defined.

Example output

Below is the output of drush cex --diff, run immediately after the config import; no changes were made to configuration after the import.

$ drush cex --diff            
 [notice] Differences of the active config to the export directory:
diff --git a/tmp/drush_tmp_1599002631_5f4ed8070ca20/google_tag.container.test_container.yml b/tmp/drush_tmp_1599002631_5f4ed8073ad4b/google_tag.container.test_container.yml
index 61896a077c..2dd5d31060 100644
--- a/tmp/drush_tmp_1599002631_5f4ed8070ca20/google_tag.container.test_container.yml
+++ b/tmp/drush_tmp_1599002631_5f4ed8073ad4b/google_tag.container.test_container.yml
@@ -21,10 +21,4 @@ role_toggle: 'exclude listed'
 role_list: {  }
 status_toggle: 'exclude listed'
 status_list: "403\n404"
-conditions:
-  'entity_bundle:node':
-    id: 'entity_bundle:node'
-    bundles: {  }
-    negate: false
-    context_mapping:
-      node: '@node.node_route_context:node'
+conditions: {  }

Proposed resolution

The database config values resulting from a config import should match the YAML files. An export run immediately after an import with no config changes in between should show no differences.
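The round-trip check described above can be sketched generically: export the active configuration to a second directory and diff it against the sync directory. A minimal sketch follows; the file contents are illustrative stand-ins for the mismatch shown in the diff above, and in a real deployment the second directory would come from a fresh `drush config:export` rather than a hand-written file.

```shell
# Sketch of the import/export round-trip verification step.
# The two sample files below simulate the mismatch from this report.
SYNC_DIR=$(mktemp -d)   # stands in for the config sync directory
EXPORT_DIR=$(mktemp -d) # stands in for a fresh post-import export

# What the original export contains (defaults written out explicitly):
printf 'conditions:\n  entity_bundle:node:\n    negate: false\n' \
  > "$SYNC_DIR/google_tag.container.test_container.yml"
# What the post-import export contains (defaults stripped):
printf 'conditions: {  }\n' \
  > "$EXPORT_DIR/google_tag.container.test_container.yml"

if diff -ru "$SYNC_DIR" "$EXPORT_DIR" > /dev/null 2>&1; then
  result="config matches"
else
  result="config drift detected"
fi
echo "$result"
```

A deployment pipeline would fail the build on the "config drift detected" branch, which is exactly the false positive this issue describes.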

πŸ’¬ Support request
Status

Active

Version

1.0

Component

Code

Created by

πŸ‡ΊπŸ‡ΈUnited States andycarlberg


Comments & Activities

Not all content is available!

It's likely this issue predates Contrib.social: some issue and comment data are missing.

  • πŸ‡ΊπŸ‡ΈUnited States hershy.k

I've encountered the same issue as the OP, as described in issue comment 3113799.

I can confirm that by exporting the empty conditions object in the config file (when the condition plugins are not in use here and only appear because other modules provide condition plugins) and applying either the core patch from issue 2815829 ("Adding or editing a block through the UI saves the entity twice", since fixed) or the patch from #13, this config mismatch no longer occurs.
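The workaround described above amounts to committing the post-import form of the key, an empty conditions mapping, to the sync directory so that subsequent exports produce no diff. A minimal sketch, with the file name taken from the diff in this report:

```yaml
# google_tag.container.test_container.yml (in the config sync directory)
# Workaround: store the conditions key the way a post-import export
# writes it, rather than with the defaults expanded.
conditions: {  }
```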

  • Status changed to RTBC over 1 year ago
  • πŸ‡ΊπŸ‡ΈUnited States joshf

It wasn't clear to me what the previous comments meant by "exporting the empty conditions object". What I did was manually edit the config YAML to match what the second environment was outputting; after applying the patch, the exports matched. +1 RTBC from me; thanks for the patch!

    Also hi @jds1 <3

  • Status changed to Active 10 months ago
  • πŸ‡ΊπŸ‡ΈUnited States solotandem

As mentioned early on, this is a bug in the core condition plugin code. It may have been resolved by last October's commit for issue 2815829. Either update to a core release that includes that commit, or patch an earlier core release, then check whether this issue is resolved. Regardless, the fix needs to land in core, not in this module (as has been consistently indicated on numerous duplicate issues over the years).
