🇺🇸United States @effulgentsia

Account created on 2 September 2006, over 17 years ago
  • Software Engineer at Acquia 

Recent comments

🇺🇸United States effulgentsia

I suspect I'll end up creating a mysql57 contrib module

Done: https://www.drupal.org/project/mysql57

🇺🇸United States effulgentsia

I think the CR ( https://www.drupal.org/node/3444548 ) is ready to be published, but deferring to release managers to decide when to do that.

🇺🇸United States effulgentsia

I haven't read all the discussion here, but I noticed some discussion about the DX of JSON schema, and about certain decisions we might want to make about the details of the JSON schema we want to use. So I want to quickly mention that https://github.com/vega/ts-json-schema-generator is a pretty great library that converts TypeScript definitions to JSON schema. In other words, if we want to optimize for DX, we might want to consider a DX where people write TypeScript definitions instead of writing JSON schema by hand. The JSON schema creation can then be automated, which would allow us to optimize the JSON schema to suit SDC's needs rather than optimizing it for manual authoring.
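
For illustration (the interface name, properties, and component are made up, and real generator output contains more detail): a TypeScript definition like `interface CardProps { title: string; count?: number }` would yield roughly the schema below, which downstream code can then consume like any hand-written schema:

```python
# Roughly what a TypeScript-to-JSON-schema generator would emit for a
# hypothetical definition: interface CardProps { title: string; count?: number }
card_props_schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "count": {"type": "number"},
    },
    "required": ["title"],  # "count" is optional because of the "?" in TypeScript
    "additionalProperties": False,
}

def check(value, schema):
    """Minimal structural check, standing in for a real JSON schema validator."""
    if schema["type"] != "object" or not isinstance(value, dict):
        return False
    if any(key not in value for key in schema.get("required", [])):
        return False
    if schema.get("additionalProperties") is False:
        if any(key not in schema["properties"] for key in value):
            return False
    return True

assert check({"title": "Hello"}, card_props_schema)
assert not check({"count": 3}, card_props_schema)  # missing required "title"
```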

🇺🇸United States effulgentsia

Per #33, I still plan on making a sqlite326 (or better name if anyone has ideas) contrib module for D11; I just haven't had the time for it these last few months. If someone wants to beat me to it, go for it! I agree that a 3.45 requirement will be inconvenient for enough people to warrant making the contrib module for them.

🇺🇸United States effulgentsia

This is the same 'problem' for normal relational SQL field storage like the body field though.

In theory, yes, having a 20kb value for your body field would require the same storage, and would scale with languages and revisions the same way, as having 50 paragraphs of 400 bytes each. In practice, I've never had a coworker tell me about a huge database they encountered for a customer where the cause of that size was many nodes/revisions of large values for the body field (or any other text field). It's almost always due to paragraphs. Either sites in general build big pages out of many paragraphs much more often than out of a long body field, or perhaps that kind of page building simply iterates through more revisions than longform articles do.

🇺🇸United States effulgentsia

That's 10MB per node

That was off by 10x. I had only scaled by the number of languages instead of by its square. Languages affect both the number of records per revision and the total number of revisions (since often when you update the content for one language, you want to translate that update into your other languages, which requires a new revision for each language). I updated #8 accordingly.

🇺🇸United States effulgentsia

is it really that bad if the component field is stored as a large JSON string?

We have customers at Acquia with >50GB databases, mostly filled with duplicate (or nearly duplicate) paragraph revisions.

Part of the problem is that every revision stores records for every language, even though any given revision only edits one translation. That's not unique to paragraphs, it's just that with paragraphs you have so many more entities than you do with nodes alone.

For example, say for one node you have:

  • 50 components
  • 50 revisions per language
  • 10 languages
  • 400 bytes per component

That's 10MB per node. So if your site has 1,000 nodes like that, that would be 10GB.

Whereas, if we can reduce that last number to 20 bytes per component (an 8-byte short hash plus 12 bytes of overhead: quotes, commas, and the component instance ID that also needs to be stored), that can reduce the storage by 20x.
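
Reading the example numbers as a straight product, as a quick sanity check of the figures above:

```python
components = 50
revisions_per_language = 50
languages = 10
bytes_per_component = 400

per_node = components * revisions_per_language * languages * bytes_per_component
assert per_node == 10_000_000  # 10MB per node; 10GB across 1,000 such nodes

# With a 20-byte hashed reference (8-byte short hash + ~12 bytes of overhead):
reduction = bytes_per_component / (8 + 12)
assert reduction == 20.0  # the 20x storage reduction mentioned above
```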

I think we should use a full SHA1 or equivalent here.

Yeah, given a 12 byte (or who knows, maybe even more) overhead, the difference between an 8 character short hash and a 28 character (base64 160bit) SHA1 hash ends up being only a 2x difference (20 vs 40 total bytes), so perhaps that's worth it if we think SHA1 has sufficient collision resistance for this purpose.

However, the other thing we can do is use a dynamic length hash. For example, start with just the first 8 characters, check the lookup table, and if that key isn't already there, or is there and has the same JSON value that we want to store, then great just use that. But if the lookup table already contains a different JSON for that same short hash, then for the new JSON, try again with more characters (e.g., increase from 8 to 16 and repeat, or maybe just go immediately all the way to a full SHA1 or an even longer and more collision-resistant hash). That way, >99% of the data can be stored with just the short hash, and only occasional data ends up needing the longer hash.
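
A sketch of that dynamic-length scheme (the class and its names are illustrative; a real implementation would back the lookup table with a database table rather than a dict):

```python
import hashlib

class DedupStore:
    """Toy content-addressed lookup table keyed by the shortest
    collision-free SHA-1 prefix."""

    def __init__(self, start_len=8):
        self.start_len = start_len
        self.table = {}  # hash prefix -> JSON string

    def put(self, json_str):
        full = hashlib.sha1(json_str.encode("utf-8")).hexdigest()
        length = self.start_len
        while True:
            key = full[:length]
            existing = self.table.get(key)
            if existing is None or existing == json_str:
                # Prefix is free, or already maps to this exact value: use it.
                self.table[key] = json_str
                return key
            if length >= len(full):
                # Two different values with the same full SHA-1: astronomically
                # unlikely; an even longer hash could be substituted here.
                raise RuntimeError("full-length hash collision")
            # A different JSON value already uses this prefix: lengthen the key
            # (or jump straight to the full hash, as suggested above).
            length = min(length * 2, len(full))

store = DedupStore()
key_a = store.put('{"type":"hero","text":"Hello"}')
key_b = store.put('{"type":"hero","text":"Hello"}')  # duplicate -> same key
assert key_a == key_b and len(key_a) == 8
```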

🇺🇸United States effulgentsia

I opened JSON-based data storage proposal for component-based page building Active which I think is general enough to be able to support any UI we might end up wanting for this.

🇺🇸United States effulgentsia

Upon further reflection, I think we should switch from prop-types to TypeScript. One reason is that TypeScript has become so widely adopted in the JS ecosystem. Another reason is I think it'll be easier to have a single set of TypeScript definitions that can then work for React, Vue, or other frameworks.

When I first opened this issue, I didn't know how we could generate .template-info.json files from TypeScript definitions. Since then I learned about https://github.com/vega/ts-json-schema-generator, which generates JSON schema from TypeScript. That makes me realize we should change the JSX theme engine to read JSON schema (which is a standard) instead of .template-info.json (which is just something I made up because JSON schema didn't occur to me at the time).

I'll add some more detailed comments here later about how to go about implementing the above.

🇺🇸United States effulgentsia

https://github.com/php-tuf/composer-stager/issues/350 just got completed, making Composer Stager compatible with both Symfony 6 and 7, so that it's compatible with Drupal 10 and 11. So next steps for this issue are:

  • Release Composer Stager 2.0.0-beta5, so the above fix is in a tagged release. Note that Composer Stager 2.0 is only in beta status because it hasn't been sufficiently reviewed yet. But as far as we know, it's complete, so we can release RC or stable at any time.
  • Open an MR for this issue that adds Composer Stager ^2.0.0-beta5 to Drupal core's dependencies. See #50 for the latest patch that did this with an earlier version, but we don't do patches anymore.
  • Update the issue summary of this issue for accuracy (e.g., remove the info about the PHP file syncer and check the rest of the issue summary for whether it accurately reflects the current state of Composer Stager).
  • For anyone following this issue, and especially core committers, please review the Composer Stager codebase and open GitHub issues for any problems/concerns that you find.
🇺🇸United States effulgentsia

https://github.com/php-tuf/composer-stager/issues/350 got completed so now Composer Stager is compatible with both Symfony 6 and 7, so I think the MR in this issue should include it so that tests run properly.

🇺🇸United States effulgentsia

I think the best next step to move this issue forward is to get reviews of the code in https://github.com/php-tuf/composer-stager. @TravisCarden just recently (2 weeks ago) released 2.0.0-beta4. It's only in beta status because it hasn't been reviewed much by people other than those working on Package Manager. As far as we know though, it's feature complete and can be marked RC or stable once it's adequately reviewed.

The most noteworthy recent change is that per 🌱 [policy] Require rsync for automatic updates in Drupal core and punt other syncers to contrib RTBC we removed the PHP file syncer, so it now only works if people have rsync installed. This makes the package easier to review and easier to maintain. If there's a significant population who are on systems that don't already have rsync installed and can't install it, then a separate GitHub repo/package could be created (if someone were willing to maintain it) containing alternate file syncers (e.g., one that uses the Windows robocopy command for people on Windows), but those would not be "part of core" (i.e., maintained by Drupal core committers).

The Drupal issue for discussing stuff related to Composer Stager is 📌 Add php-tuf/composer-stager to core dependencies and governance — for experimental Automatic Updates & Project Browser modules Needs work . Issues and PRs could also be filed in the GitHub repo.

Beyond the above, @tedbow: it would be good to get an updated MR of Package Manager into this issue.

🇺🇸United States effulgentsia

Given #18, this issue is essentially "Fixed", but moving it to RTBC first for visibility. I'll mark it Fixed if/when a few days go by without anyone raising compelling objections.

🇺🇸United States effulgentsia

Given #9, I think there's nothing to document. It sounds like we vetted the status quo plan (both targeting Automatic Updates for the core repository, and working within existing backport policy guidelines while being more likely to actually backport the things that are backport-eligible) and concluded that it's good enough.

🇺🇸United States effulgentsia

Umami JSX predates some of the exploration we did in https://github.com/drupal-jsx/starter-react. There, we used Vite, so could use Vite's glob function: https://github.com/drupal-jsx/starter-react/blob/main/src/index.js#L5.

However, there the components are all flat in the src/components directory. I'm not sure that's ideal; we might want to prefer Umami's / Olivero's way of organizing templates by category directories.

Also, lately I've been second-guessing trying to use PascalCase for the component filenames, and I wonder if it would be better to just name the files exactly like the Twig files would be named. For example, perhaps layout--threecol-section.jsx would be a better name than DrupalLayout--ThreecolSection.jsx? My original reason for wanting PascalCase was to make the JSX components feel more like regular React components and less like a Drupal-quirky thing. The thing is, though, they're already Drupal-quirky things, because the shape of the props comes from Drupal. So perhaps by naming them with Drupal's standard kebab-case, we would actually simplify the mental model by even more clearly distinguishing the JSX files that represent Drupal templates from the app's other JSX components.
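
For concreteness, here are the two naming schemes side by side (the conversion rules are my guess at what the PascalCase convention would look like in general):

```python
def kebab_name(template):
    """Name the JSX file exactly like the Twig template would be named."""
    return template + ".jsx"

def pascal_name(template):
    """PascalCase each hyphen-separated word, keep the '--' separator,
    and add a 'Drupal' prefix to distinguish it from app components."""
    def pascal(part):
        return "".join(word.capitalize() for word in part.split("-"))
    return "Drupal" + "--".join(pascal(p) for p in template.split("--")) + ".jsx"

assert kebab_name("layout--threecol-section") == "layout--threecol-section.jsx"
assert pascal_name("layout--threecol-section") == "DrupalLayout--ThreecolSection.jsx"
```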

🇺🇸United States effulgentsia

Good idea. I think the ones we should publish are the ones in https://github.com/orgs/drupal-jsx/repositories, which are scoped. I'm curious whether, when a GitHub repo is published to JSR, it automatically stays up to date with pushes to the GitHub repo. That would make it more convenient than figuring out the right workflow for automatically publishing updates to npm, though with as many people using npm as there are, maybe there's already a well-documented way to set up the GitHub repos to do this?

Even before those are published to npm/jsr, they can be installed via their GitHub URIs (e.g., `npm install github:drupal-jsx/hyperscriptify`). There might be some divergence between the packages in the `drupal-jsx` repositories and what's in the Umami demo apps that would be good to reconcile at some point.

🇺🇸United States effulgentsia

I only reviewed this quickly, but at least on first glance, it looks great!

I'm curious why core_recommended_themes is a single recipe rather than splitting front-end theme (Olivero) and admin theme (Claro) into separate recipes. For example, might people want to reuse the Claro recipe with a different front-end theme, or reuse the Olivero recipe with Gin or some other admin theme? Is there discussion that led to combining them that I can read up on?

🇺🇸United States effulgentsia

Yes, there's not much conceptually new in HTMX that we didn't have in Drupal 6's AHAH which then became Drupal 7's AJAX. But the HTMX community did a really nice job in creating a more refined API and documenting it well.

🇺🇸United States effulgentsia

Just doing some issue queue organization. Raising the priority to Major because #3396649: Add a demo Next.js app is critical and could benefit from having at least some examples of use client components, and because #3399991: Add a demo Fresh app and subtheme is major and could benefit from having at least some examples of island components.

Note that the current MR in #3396649: Add a demo Next.js app might already have this, so potentially this issue can get closed as part of that one if that MR stays that way.

🇺🇸United States effulgentsia

Just doing some issue queue organization. Raising the priority to Major because while it isn't strictly necessary for an MVP, I think it would add a lot of value. For most websites, Next.js is overkill, and being able to show that this theme engine also works well for a lightweight and no-build-step framework like Fresh would open this up to a wider range of websites.

🇺🇸United States effulgentsia

Just doing some issue queue organization. Raising the priority to Critical just to add transparency that it's a key feature / bug-fix that's needed for a successful MVP.

🇺🇸United States effulgentsia

Just doing some issue queue organization. Raising the priority to Critical just to add transparency that it's a key feature for an MVP.

🇺🇸United States effulgentsia

Closing this because I think the child issues stand on their own and we don't need this separate Plan issue.

🇺🇸United States effulgentsia

Having thought about it more, I don't think we should do this, because I think we should focus the JSX theme engine primarily on exclusively server-side rendering (except for explicit client components). Given that both Next.js and Deno Fresh are two different takes on how to do exclusive server-side rendering, I don't think we need to also demonstrate yet another framework (Remix), especially since it doesn't currently support exclusive server-side rendering. People interested in Remix integration can write that themselves without us needing to demonstrate it in this project. We're also free to change our minds about this in the future and reopen this issue then if we want to.

🇺🇸United States effulgentsia

This was completed in https://github.com/drupal-jsx/drupal-utils-dev a couple months ago, I'm just late updating this issue. The solution was based on using prop-types declarations rather than TypeScript declarations, but the spirit is the same.

🇺🇸United States effulgentsia

Fixed incorrect "Drupal 10" references to "Drupal 11".

🇺🇸United States effulgentsia

The ajaxSubmit method is massive

But with modern browsers we shouldn't need 95% of it. The submitForm() function in #26 conceptually should be a drop-in replacement for ajaxSubmit(). It didn't seem to work, and I haven't had time to look into it further yet, but my hunch is that we shouldn't need to add all that much more code to make it work.

I therefore am thinking, at least to get us onto jQuery 4, we should just fork jQuery form and fix the four jQuery 4 issues in it

Yes, I agree with doing that so as not to block our jQuery 4 upgrade on this issue.

🇺🇸United States effulgentsia

Debian trixie would be fine in the short term, but I don't think its release is any time soon, so it will drift to later SQLite versions when they're released. I think we'll want to keep testing on SQLite 3.45 for the lifetime of Drupal 11, so either Ubuntu Noble or a way to pin SQLite to the desired version regardless of the distro's default would be preferable I think.

🇺🇸United States effulgentsia

create another docker image using Ubuntu/noble?

Right, the goal is to be able to run Drupal 11 tests on SQLite 3.45 while still also being able to run Drupal 10 tests on SQLite 3.26. I think one way to achieve that would be with an Ubuntu Noble docker image. Not sure if we should just go with that or if it makes sense to consider any alternative ways of getting SQLite 3.45 onto any of our other existing docker images.

🇺🇸United States effulgentsia

#33 affects the timing of when we commit the change in the 11.x branch, but back to RTBC as far as the policy itself is concerned.

🇺🇸United States effulgentsia

Looks like Ubuntu got it in already: https://packages.ubuntu.com/source/noble/sqlite3.

I think the next step here then is to not make things more difficult than necessary for developers/testers of 11.x prior to when Ubuntu 24.04 and the next MacOS are released. I'll try to find some time in the next week or two to create a sqlite contrib module and make sure that it's as easy as `composer require` and that things like the quickstart command and running tests work smoothly.

🇺🇸United States effulgentsia

Re #28, cool! Glad they released that ahead of their initial projection for end of Jan. If someone knows Ubuntu's process (e.g., if it needs to land in Debian trunk first) and can open an issue in the correct queue (Ubuntu's or Debian's) making the case for why they should try to get this in before Ubuntu 24.04's feature freeze, that would help! With only 6 weeks till Ubuntu's feature freeze, we shouldn't assume that this particular SQLite upgrade is on their radar or something they're prioritizing without someone filing an issue in their queue.

🇺🇸United States effulgentsia

Turns out, we might not actually need a subtheme for this.

🇺🇸United States effulgentsia

Thanks for all the feedback. I discussed this with @tedbow and we agreed that as long as sufficiently important bug fixes are backported to the production branch(es), then that resolves the original concern here, so closing this as "works as designed" (even though "will work as designed" would be technically more accurate).

When I have a chance I'll open a followup to discuss some more details about what else might be needed to facilitate testing: both manual and automated, both pre-commit and post-commit.

🇺🇸United States effulgentsia

Something to think about here is upgrade path:

  1. Sites might have active config with keys that this MR makes invalid when status=FALSE. Do we want a post_update function to remove those?
  2. There might be contrib/custom modules/profiles with default config that includes these now-invalid keys. Do we want something that runs on module-enable and/or config-import that removes them?
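
Both cleanups amount to the same transformation; here's a generic sketch (the key names mirror the editor image_upload example; the update-hook and config-import plumbing is omitted):

```python
def clean_disabled_image_upload(config):
    """When image uploads are disabled, drop the keys that are only valid
    while uploads are enabled (scheme, directory, etc.)."""
    if not config.get("image_upload", {}).get("status", False):
        config["image_upload"] = {"status": False}
    return config

before = {
    "image_upload": {
        "status": False,
        "scheme": "public",
        "directory": "inline-images",
    },
}
assert clean_disabled_image_upload(before) == {"image_upload": {"status": False}}
```
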
🇺🇸United States effulgentsia

I committed the blocker issue, so this isn't postponed anymore.

🇺🇸United States effulgentsia

Pushed to 11.x, and published the CR. The CR doesn't currently mention that requiredKey can be set to false, so that might be a good thing to add to it.

Also tagging this (not just this issue, but all the stricter validation features) for 10.3.0 release highlights.

🇺🇸United States effulgentsia

This MR looks really great. My only hesitation with committing it is the various changes in here related to enforcing the lack of any other keys when an editor's image_upload.status is FALSE. For example, the need to remove those other keys from Umami's editor.editor.basic_html.yml. I think that should all be removed from this MR and moved to a follow-up that gets its own commit and its own change record.

I don't think separating that out needs to affect this MR too much though. For example, could we accomplish that by adding an editor.image_upload_settings.0 config schema object that doesn't add FullyValidatable? We can still leave FullyValidatable in place on editor.image_upload_settings.1.

The reason I want to punt on enforcing FullyValidatable on editor.image_upload_settings.0 is that I think it could be useful in some situations to have default config that adds the upload configuration even with a status=false, so that someone could change the status to true and have the other stuff already properly configured. Maybe I'm wrong and that's not useful, but in either case, I'd like us to discuss that in its own issue without holding up this one.

Other than that, I think this is good to go.

🇺🇸United States effulgentsia

If we end up setting MariaDB's minimum to 10.11, the next question to figure out is when to start enforcing that in HEAD. While we should communicate the requirements ~6 months before the first 11.0 release window (in other words, now-ish), actually setting that requirement in HEAD now would affect developers/testers on Ubuntu 22.04. So I'd recommend not setting that requirement until just before beta (which might still be before Ubuntu 24.04 is released, which would be inconvenient, but I think we kind of have to pull that trigger before beta regardless).

🇺🇸United States effulgentsia

I don't think we have previous examples. The MariaDB release schedule has only existed since 10.6 (July 2021), so it hasn't been in place long enough for us to know how it'll affect us.

Is there any reason to think that it will affect us, though? What changed is that they added a bunch of short-term releases, but has anything significant changed about their long-term releases? One piece of anecdotal evidence: Drupal 9 raised the MariaDB minimum to 10.3, and https://www.drupal.org/project/mysql56 exists, which nominally supports MariaDB 10.0. In the 3.5 years since Drupal 9.0 was released, there are no significant code differences between that module and core's driver, and no one has opened an issue with any problems on MariaDB 10.0, 10.1, or 10.2.

However, I'm not necessarily opposed to the idea that we simply want core to be cautious about the promises it makes, and that the most cautious thing to do would be to set MariaDB's minimum to 10.11, so that core doesn't have to take on the risk of dealing with MariaDB 10.6 bugs in 2027/2028. For MySQL I think we should support the last 2 LTS releases of Ubuntu (because people don't upgrade to the newest Ubuntu immediately), but given MariaDB's lower marketshare, I think it's okay to set a more aggressive minimum for MariaDB.

Changing the title to say 10.11 for MariaDB since it seems like that's what we're recommending, so wanting that to have visibility to people scanning the issue queue who might not already be following this discussion.

🇺🇸United States effulgentsia

What would be the benefit of setting MariaDB's minimum to 10.11 vs. 10.6? Are there any features or syntax in MariaDB 10.11 that both:

  • aren't in MariaDB 10.6, and
  • are in MySQL 8.0

If not, then Drupal core's mysql driver would literally have no code in it for the lifetime of Drupal 11 that won't work on MariaDB 10.6. At which point, we'd be making the adoption of D11 harder for people (e.g., people on Ubuntu 22.04 who use MariaDB) for no reason.

Or, is it not about MariaDB features, and more about MariaDB support that's available once it's EOL? Are we concerned, for example, that in 2027 or 2028 we'll still be supporting D11 and we might find a MariaDB 10.6 bug that doesn't exist in MySQL 8.0 and that MariaDB won't fix and Drupal's core mysql driver will find challenging to work around? This would be a reasonable benefit to setting 10.11 as our minimum, but do we have any prior experience of this actually happening with prior MariaDB EOL versions?

Because MariaDB has low marketshare (not sure how reliable these are, but source1, source2), I think it's acceptable to set the minimum to 10.11. For example, people on Ubuntu 22.04 would still have options, such as switching to MySQL, using a contrib mysql driver per #27, or upgrading to Ubuntu 24.04. However, if we decide to set the minimum this high, I think we should have a clear reason for doing so, such as features we actually intend to use, or actual experience with EOL MariaDB versions giving us problems during the last couple years of supporting a major Drupal version.

It looks like no one from Pantheon has commented on this issue.

This hasn't happened yet, but I found https://docs.pantheon.io/pantheon-yml#specify-a-version-of-mariadb, so it looks like Pantheon customers have an easy way to choose their MariaDB version as long as Pantheon offers it. That doc page says it was last reviewed in March 2022 and lists 10.6 in its example, so we could perhaps extrapolate that 10.11 will be available this year.

🇺🇸United States effulgentsia

Setting the title to what I think is RTBC here per #19.

🇺🇸United States effulgentsia

I can also now confirm that I'm able to reproduce this with the same steps as in the issue summary

That's because I followed the exact steps, including installing Drupal 10.1.5. I think this got fixed in 10.1.7 and 10.2.0 in 🐛 LibraryDependencyResolver::getMinimalRepresentativeSubset() calculates dependencies incorrectly Fixed . Please re-open this if there's still a bug related to this in those newer versions.

🇺🇸United States effulgentsia

Actually, when I wrote #5, I misunderstood what my coworkers were seeing. In fact, they're seeing the same as the original issue summary, where each response is a 200 (not a 404), but it's a PHP request every time, because the hash in the src attribute of the script tag on the page doesn't match the hash that gets generated in AssetControllerBase::deliver, so in this case each page view keeps requesting the former hash, and each request hits PHP instead of a file on disk.

I can also now confirm that I'm able to reproduce this with the same steps as in the issue summary and for the same reason as explained in #2.

Retitling for clarity (I missed that this was the pattern when I first read the issue, despite the issue summary actually explaining it quite clearly), and bumping to Major (frankly, I think Critical could even be justified since this can be awful for high traffic sites).

🇺🇸United States effulgentsia

I mostly agree with #7, but with a few tweaks...

For MySQL, I think we should be the most conservative, which means I would prefer instead of:

'the MySQL LTS available before the first major beta window'

something more like 'the MySQL LTS that's been in an Ubuntu LTS for at least a year before the expected Drupal major release date'.

Given the alignment of Drupal's cycle with Ubuntu's cycle, in practice I think that'll often be the same thing, but in cases where MySQL releases an LTS after Ubuntu's feature freeze, or if Drupal changes its cycle to be less aligned with Ubuntu's, my version is a bit more conservative.

For PostgreSQL:

the most recent PostgreSQL major version that is available prior to the first beta window for a Drupal major version

Except we want to set our requirements 3 months before beta. But given PostgreSQL's schedule of Sep/Oct, I think we could change this to 'the most recent PostgreSQL major version that is available 6 months before the expected Drupal major release date' without that changing which version it corresponds to.

For MariaDB, I think we should get Pantheon's input into what their constraints are around adopting MariaDB versions.

For SQLite, I think we could potentially be as aggressive as "the version that's in the latest Ubuntu LTS before Drupal's beta", or as conservative as "the version that's in the latest Ubuntu LTS, Debian, and MacOS before Drupal's beta", or in between the two (e.g., Ubuntu and MacOS but not Debian). For any given Drupal major version, I think we should base what we choose within that range on how important the features in the newer SQLite release are.

For Drupal 11, SQLite 3.45 has jsonb, which I think is important, so if that makes it into Ubuntu 24.04, I'd support that minimum even if it means early testers/users on Debian and Mac need to manually upgrade their sqlite or install a contrib driver. However, if Ubuntu's latest ends up being 3.44, then I don't think there's anything between 3.40 and 3.44 that's useful (I don't think json5 in db insert/update queries is useful to Drupal), so I'd go down to 3.40, or even to 3.39 in order to pick up the 2nd-latest MacOS.

Basically, where it costs us nothing to make things a bit easier for more people, I think we should do that where we can; but where core development/maintenance gets a tangible benefit from a higher minimum, we should choose that higher minimum within reason.

🇺🇸United States effulgentsia

I support requiring MySQL 8.0 and MariaDB 10.5 (at least). I suspect I'll end up creating a mysql57 contrib module, like I did with https://www.drupal.org/project/mysql56 for Drupal 9 and 10, but I don't know for sure if I will yet.

I also support raising the MariaDB minimum to 10.6 if we decide to do that. RHEL's next minor release might include 10.11. Even if it doesn't, most RHEL users probably use MySQL over MariaDB, and the few that need/prefer MariaDB could use the contrib driver if I create one (which I'll be more likely to do if both Drupal requires 10.6 and RHEL remains on 10.5).

It looks like no one from Pantheon has commented on this issue. Might be good to reach out to them for comment. They're probably the largest Drupal host that uses MariaDB and I don't know their platform constraints with respect to versions that are available to them. (I work for Acquia and we use MySQL.)

🇺🇸United States effulgentsia

The biggest problem with 3.45.0 is that we aim to set requirements six months before the first release window for the major version

I forget if we documented our "set platform requirements 6 months before major release" policy, but if so, then FWIW, I think we should change that policy to explicitly mention SQLite as an exception to that (not just for 11.0 but ongoing), and allow SQLite's minimum to float up until beta. While I think SQLite is a fantastic db, and I would actually love to see more Drupal sites using it in production, especially if https://sqlite.org/hctree/doc/hctree/doc/hctree/index.html ever makes it beyond its currently experimental state, I don't think there's currently enough Drupal sites using SQLite in production to warrant needing 6 months prior to major release notice.

🇺🇸United States effulgentsia

Debian and Ubuntu both have releases scheduled prior to Drupal 11's earliest release date which will support sqlite 3.42.

This is correct for Ubuntu, but I don't think it is for Debian. Debian's latest stable release is Debian 12, released in June 2023, which has SQLite 3.40. Debian has not yet announced a release date for Debian 13, but they typically follow a ~2 year cycle, so mid-2025 is likely. So if we want Drupal 11 with sqlite to work on the latest release of Debian that's available when 11.0 is released, we should set the minimum to 3.40. However, if there's nothing we need in 3.40, I'd suggest lowering it to 3.39, which is the version in MacOS Ventura. If there is something we need in 3.40, then it's okay to set above Ventura's minimum since MacOS Sonoma has 3.43, but some people are slow to update their MacOS (including if they're on old hardware), so if there's no difference to us between 3.40 and 3.39, then we might as well be nice to them.

The argument for 3.42 over 3.40 is json5 support, but I don't envision us having any use case in core for json5. Especially if MySQL doesn't support json5 yet (does it?). json5 is great for human-readable (e.g., adding comments) json, but what would be the use case for us needing to send json5 into the db? Also, sqlite doesn't even retain the json5 format (e.g., comments would be stripped out before storing into the db file).

I think we should not be constrained by RHEL here

I can accept this for core, but I don't think it's good to drop RHEL support entirely. In other words, if Drupal 11 requires anything more than SQLite 3.26, I will create a contrib module with a sqlite driver that works on 3.26. It makes sense for core's sqlite driver to use the -> and ->> operators (which requires 3.38), but I don't expect there to be any problem with making contrib's sqlite driver change those to the equivalent json functions that are available in 3.26.
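
A sketch of that rewrite (the regex and table are simplified for illustration; a real contrib driver would hook into the query compiler rather than string-munging SQL, and would handle the `->` operator and quoting edge cases too):

```python
import re
import sqlite3

def downgrade_json_operators(sql):
    """Rewrite the SQLite 3.38+ ->> operator into the json_extract() function,
    which much older SQLite builds provide. This sketch handles only the
    simple "column ->> 'path'" form."""
    return re.sub(r"(\w+)\s*->>\s*('[^']*')", r"json_extract(\1, \2)", sql)

modern = "SELECT data ->> '$.title' FROM node_components"
legacy = downgrade_json_operators(modern)
assert legacy == "SELECT json_extract(data, '$.title') FROM node_components"

# The rewritten query runs on SQLite builds that predate the JSON operators:
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE node_components (data TEXT)")
con.execute("""INSERT INTO node_components VALUES ('{"title": "Hi"}')""")
assert con.execute(legacy).fetchone()[0] == "Hi"
```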

It would be great to just bump the minimum to 3.45.0

I agree that core being able to use 3.45's jsonb_*() functions would be nice. Given that I'll create a contrib sqlite driver for old sqlite versions, there might be a pathway to us choosing to be this aggressive with core's requirement. I think the key question here is whether 3.45 will make it into Ubuntu 24.04. If SQLite releases 3.45 end of January and Ubuntu's feature freeze is end of February, this is possible, but not guaranteed, especially since I think Ubuntu's process is that it would need to land in Debian's trunk first.

I think the other question is whether Drupal 11.0 will be released in June or July, because the June window means a beta end of March (before Ubuntu 24.04's release) and the July window means a beta end of April (just after Ubuntu's release).

But, let's assume hypothetically that 3.45 makes it into Ubuntu 24.04 and Drupal 11's beta is right after Ubuntu's release. Would it then be reasonable to say that core only needs to support that, and people on the non-latest Ubuntu or on any other OS (including macOS until a new version of that is released later in 2024) need to install the contrib driver? How would this affect the quick-start command (e.g., could we add a check in that script that tells people if their sqlite is too old and give them the composer command they need to run to install the contrib driver)?
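Such a preflight check would be cheap to add; PHP can get the linked version via \SQLite3::version(). A sketch of the idea in Python (the 3.45 threshold and the contrib module name are assumptions from this discussion, not anything decided):

```python
import sqlite3

ASSUMED_MINIMUM = (3, 45, 0)  # hypothetical core minimum discussed above

def sqlite_is_new_enough(version=None):
    """Return True if the given (or linked) SQLite meets the assumed minimum."""
    return (version or sqlite3.sqlite_version_info) >= ASSUMED_MINIMUM

# What a quick-start preflight message might look like; the contrib
# module name below is the placeholder floated elsewhere in this thread.
if not sqlite_is_new_enough():
    print("Your SQLite is too old for core's driver; try:")
    print("  composer require drupal/sqlite326")
```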

Finally, if any of the work in [PP-3] Add "json" as core data type to Schema and Database API Postponed and related issues actually wants to use SQLite 3.45 syntax (the jsonb_*() functions) or even 3.38 syntax (the -> and ->> operators), are we then saying that that work will only be committed to 11.0 and not to 10.3?

In summary, my opinion is that we should choose either 3.39 or 3.45 (or the former now and consider the latter later), but that anything in between the two doesn't make as much sense (unless there's important features between those two other than json5).

🇺🇸United States effulgentsia

FrankenPHP has a known issue about crashes when certain PHP constructs and functions are used within Fibers. I think it would be great to open a separate Drupal core issue that tries to find out if Drupal makes any of those calls within fibers. That could happen independently of this issue, which affects all persistent app servers, not just FrankenPHP. However, I don't have any ideas at the moment on what the best way would be to determine if Drupal triggers any of those FrankenPHP crashes without first solving this issue.

🇺🇸United States effulgentsia

I set #95 to NR just to queue tests, but now back to NW to keep this out of the review queue.

Question from the issue summary this is a workaround for drupalCi but what about gitlab?

It's a good question whether, with GitLab CI, it's now possible to move this issue to the https://www.drupal.org/project/mysql56 queue while still being able to put some custom configuration into that project's .gitlab-ci setup that would make it run all of Drupal core's tests. I'll need to look into that when I can carve out some time to do so.

🇺🇸United States effulgentsia

@bkosborne: I've heard reports from coworkers seeing something very similar to #3, but haven't found steps to reproduce it. If you (or anyone else seeing this) can figure out a way to get from a newly installed Standard Profile site into a configuration (which magic combo of contrib/custom modules is needed?) that puts you into that 404-every-time state, that would be super helpful to debugging and fixing it!

🇺🇸United States effulgentsia

It's that time of year :)

🇺🇸United States effulgentsia

The current MR looks really good to me and addresses all my prior concerns.

+    // All values are optional by default (meaning they can be NULL), except for
+    // mappings and sequences. A sequence can only be NULL when `nullable: true`
+    // is set on the config schema type definition. This is unintuitive and
+    // contradicts Drupal core's documentation.
+    // @see https://www.drupal.org/node/2264179
+    // @see https://www.drupal.org/node/1978714
+    // To gradually evolve configuration schemas in the Drupal ecosystem to be
+    // validatable, this needs to be clarified in a non-disruptive way. Any
+    // config schema type definition — that is, a top-level entry in a
+    // *.schema.yml file — can opt into stricter behavior, whereby a property
+    // cannot be NULL unless it specifies `nullable: true`, by adding
+    // `FullyValidatable` as a top-level validation constraint.

I'd like to take an attempt at wordsmithing this a bit more, but not right now. I'll either do that before committing this myself, or if someone else ends up committing it, I'm fine with opening a follow-up for improving code comments.

This issue is tagged "Needs subsystem maintainer review", so would be great to get @alexpott's review if possible. However, if he doesn't have bandwidth for it, I won't block commit of this on that.

       $root_type_has_opted_in = FALSE;
+      foreach ($parent->getRoot()->getConstraints() as $constraint) {
+        if ($constraint instanceof FullyValidatableConstraint) {
+          $root_type_has_opted_in = TRUE;

Since this MR only adds FullyValidatableConstraint to menus and shortcut sets, neither of which has dynamic types (plugins or third party settings), I think this is fine for now. However, before adding FullyValidatableConstraint to config roots that include any descendants with dynamic types, I think we need to work out how to handle those boundaries. It's impossible for a config root to know that all of the plugins within it are fully validatable, so I think the semantics of FullyValidatableConstraint must be limited to only its descendants with static types. For an element that has an ancestor with a dynamic type, we need to use that ancestor's presence or absence of FullyValidatableConstraint instead of checking ->getRoot(). But again, we can punt that to a follow-up since that's not surfaced for menus and shortcuts.

🇺🇸United States effulgentsia

Thanks for finding these. I think the proper place to fix it would be in https://github.com/drupal-jsx/propsify/blob/main/strict.js. Also in the petite.js file in that same repo if the same change is needed for Preact/Solid, but not if it's only for React.

Umami JSX isn't using the above module yet, so until it does, fixing it in this PHP branch is a reasonable stopgap.

🇺🇸United States effulgentsia

After opening this issue I learned about https://www.npmjs.com/package/prop-types, which is a better fit than TypeScript for this use case. This is now demonstrated in https://github.com/drupal-jsx/starter-react/blob/main/src/components/Dru....

🇺🇸United States effulgentsia

Hey @cosmicdreams! Great to see you here :)

Are you more interested in HTMX as it relates to Drupal core? Or how it relates to the (still in very early stage development) JSX theme engine (this issue)? If the former, I opened 🌱 [Plan] Gradually replace Drupal's AJAX system with HTMX Active . For that issue, I think a possible next step would be to pick one place that currently uses AJAX in core (I'd probably suggest taking a straightforward example like ConfigSingleExportForm rather than starting with Field UI or Views UI) and trying to convert it to HTMX and see what you end up needing to add to such an MR (besides the HTMX library itself) to make it work.

For this issue, I currently have it postponed on #3397049: Integrate with AJAX , because the first step is to make the JSX theme engine work with ajax at all. Once we figure out how to do that, it will inform how to make it work with htmx as well. I expect the solution with respect to JSX theme engine to be very similar between ajax and htmx, since the underlying problem is the same: where to intercept the response before it's added to the DOM so that we can manipulate it first (for example, render it with React or whichever JSX renderer is being used).

🇺🇸United States effulgentsia

Re #20, since converting to HTMX is too large scope for this issue, I opened 🌱 [Plan] Gradually replace Drupal's AJAX system with HTMX Active as a separate Ideas issue.

🇺🇸United States effulgentsia

I haven't tested this at all, so this might fail hard, but something along these lines is what I recommend.

🇺🇸United States effulgentsia

Re #23, I don't think Views / Views UI actually use jQuery Form directly; they only use it via Drupal.ajax. I think the only reason they declare a dependency on the plugin is that they instantiate Drupal.ajax() objects from JS code rather than always going through PHP's #ajax, so they don't get the dependency brought in automatically.

In other words, to remove it from Views / Views UI, we need to remove it from misc/ajax.js first. Changing this issue's title and summary to focus on that.

🇺🇸United States effulgentsia

#11 is a good summary of my concerns. To summarize even more, I'm concerned that it's not at all clear to module authors who write config schemas that the semantics of the schema is that every key that's in it is considered required-by-default unless it explicitly says nullable: true.
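To make the semantics concrete, here's a hypothetical *.schema.yml fragment (the type name and keys are invented for illustration): under the documented semantics, description may be NULL only because it opts out explicitly, while label may not.

```yaml
# Hypothetical config schema illustrating required-by-default semantics.
mymodule.settings:
  type: config_object
  mapping:
    label:
      type: label        # required by default: NULL is a violation
    description:
      type: text
      nullable: true     # explicit opt-out: NULL is allowed
```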

On the one hand:

On the other hand:

So, my biggest concerns with the MR in 📌 Configuration schema & required values: add test coverage for `nullable: true` validation support Fixed are that:

  1. It's adding setRequired() to every non-nullable element, which means anything that calls isRequired() for those definition objects will start seeing TRUE where it used to see FALSE. Is that fixing a bug (by making the code consistent with what https://www.drupal.org/node/1905070 has said for the last 7 years) or is it adding a BC break to a reasonable expectation that things are not required by default? It's hard to say.
  2. By virtue of adding the setRequired(), it's also adding the NotNull constraint to enforce that, which makes sense if we agree that these should all be required. But for people who did not know that all config elements are required by default, it means that not only will calls to isRequired() return a different value than before, but also that new violations will get returned that weren't returned before.

The FullyValidatable constraint proposed in #11 addresses my 2nd concern in the above list.

I'm not sure what we want to do about the first concern.

Should we make the setRequired() change regardless of the FullyValidatable constraint? If we name the constraint "fully validatable", then I think we should, since I think it would be weird for the presence of a constraint to change whether or not something is semantically required; it should only affect whether we run other constraints or how we handle their violations (or whether we perform validation at all).

Or, in addition to the FullyValidatable constraint, should we also add a new schema property to mappings, such as allDescendantsRequiredByDefault (which I earlier proposed simply calling strict)? That way, we don't change the return value of isRequired() except within structures that people opt into. The downside, though, is that we'd be introducing a new schema property to do what https://www.drupal.org/node/1905070 (but not our code) has already said we've been doing for 7 years.

I think it makes sense to proceed with FullyValidatable and not with strict for now and discuss whether or not we need/want a new strict property in a followup, but I think splitting it up that way would mean committing all this to 10.3.x only (and not to 10.2) so that we don't end up releasing a partial behavior change without having had time to work out the full impact of it.

🇺🇸United States effulgentsia

The two, active, LTS minors are 10.6 and 10.11

What's your definition of "active" and why does it include 10.6, but not 10.5? From https://mariadb.org/download:

The current maintained versions are: 10.4, 10.5, 10.6, 10.11 (maintained for 5 years)

🇺🇸United States effulgentsia

#3392970: Move preprocess functions to JS code is now encountering this issue when converting block.html.twig to Block.jsx, because the language switcher block is placeholdered and rendered later with BigPipe (which uses Ajax), and so the <drupal-block> element fails to get mapped to the Block JSX component. The workaround in that issue for now is to disable BigPipe module.

🇺🇸United States effulgentsia

What would actually be great would be a script to generate an SVG given an input array of versions and dates.

I've been meaning to try out something like https://apexcharts.com/javascript-chart-demos/timeline-charts/advanced/ but haven't found the time yet.
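In the meantime, such a script doesn't need a charting library at all. A hedged, dependency-free Python sketch (the input shape, (label, start, end) tuples in fractional years, is my own invention):

```python
def timeline_svg(releases, x_scale=40, bar_height=20, pad=4):
    """Render a bare-bones SVG timeline: one horizontal bar per release.

    `releases` is a list of (label, start, end) tuples, with start/end
    as fractional years (e.g. 2024.5 for mid-2024).
    """
    if not releases:
        return '<svg xmlns="http://www.w3.org/2000/svg"/>'
    origin = min(start for _, start, _ in releases)
    bars = []
    for i, (label, start, end) in enumerate(releases):
        x = (start - origin) * x_scale
        y = i * (bar_height + pad)
        bars.append(
            f'<rect x="{x:.1f}" y="{y}" width="{(end - start) * x_scale:.1f}" '
            f'height="{bar_height}" fill="#2a6" opacity="0.6"/>'
            f'<text x="{x:.1f}" y="{y + bar_height - 6}" font-size="12">{label}</text>'
        )
    total_w = max((end - origin) * x_scale for _, _, end in releases)
    total_h = len(releases) * (bar_height + pad)
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{total_w:.0f}" height="{total_h}">'
            + ''.join(bars) + '</svg>')

print(timeline_svg([("Drupal 10.2", 2023.9, 2024.9), ("Drupal 10.3", 2024.4, 2025.4)]))
```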

🇺🇸United States effulgentsia

Is the intended functionality that any place that references the my-button--primary component gets this one instead of the main one?

Oh I see. I think the answer is yes because in its my-button--primary.component.yml file, it says replaces: 'sdc_examples:my-button--primary'.
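For that README, it might be worth showing the key line. A hypothetical trimmed *.component.yml (only the replaces line is quoted from the actual file; the rest is invented for illustration):

```yaml
# my-button--primary.component.yml (abridged, illustrative)
name: My button (primary)
replaces: 'sdc_examples:my-button--primary'
```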

Leaving this issue open because I think adding a README to this submodule (or adding info about this submodule to the main module's README) would still be helpful.

🇺🇸United States effulgentsia

I think this will take a bunch of work, but it will be pretty amazing once done! I think it'll be helpful to do this in steps. Here's the first few steps I propose (for each step after the first, I'd be happy to merge a working MR to the main branch):

  1. Go through the Next.js tutorial (all 16 pages). It's got some good information in there specifically targeted to Next.js 13 (when support for RSC was added via their new "app" router) and 14. Or at least, I feel like I learned a bunch by going through it.
  2. Create an initial app within demo/themes/umami_nextjs/app (no need to create the Drupal theme yet, at this point let's just start with the next.js app only). Give it a root layout (for the <html> and <body> tags but not yet anything in the body other than {children}) and a catch-all route that all URLs will go to. At this step, just have this route render <div>{url}</div> where url is the URL that you're on, which you can reconstruct with the params and searchParams props.
  3. Next, change the route to instead of displaying the URL, to fetch that URL (adjusted for host and port) from Drupal. For example, if the Next.js app is on localhost:3000 and the Drupal site is on drupal.dev, then if in the browser you visit localhost:3000/a/b/c, the route should fetch the content of drupal.dev/a/b/c. At this step, it's better for the Drupal site to be using the regular Umami theme, not the Umami JSX theme. Once the route has that content, use happy-dom to get the innerHTML of just the div with the data-off-canvas-main-canvas attribute. Have the route render that innerHTML into a div via the dangerouslySetInnerHTML prop. Optionally, to make this look like Umami, copy the <link> tags from a "View source" of the Drupal site into the <head> section of the app's root layout. The hrefs of these link tags would need to be adjusted to use the Drupal site's URL, since at this point the next.js app can't yet serve these CSS files. At this point, it's now hopefully possible to go to localhost:3000 and click around to navigate to different pages and see their content.
  4. Next, replace dangerouslySetInnerHTML with using hyperscriptify to convert the HTML into a React element.
  5. Next, switch the Drupal site to use the Umami JSX theme, and wire up the call to hyperscriptify() to pass in the components map.
  6. Once we get this far, let's regroup to figure out the rest. For example, we'll need to figure out how best to get all the CSS served by Next.js (including both the global CSS still managed by Drupal and the Linaria CSS). But one feature I'm really hoping we can try and have "just work" is mapping the 'a' element to Next.js's Link component in order to enable single-page-app behavior instead of full page refreshes.
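The extraction in step 3 is the only non-obvious part. The real route would use happy-dom, but the logic can be sketched with nothing more than a standard HTML parser; here it is illustrated in Python (not the actual implementation), pulling the innerHTML of the data-off-canvas-main-canvas div:

```python
from html.parser import HTMLParser

class MainCanvasExtractor(HTMLParser):
    """Collect the inner HTML of the first div carrying the
    data-off-canvas-main-canvas attribute (step 3 above).
    Void/self-closing elements are only approximated."""

    def __init__(self):
        super().__init__()
        self.depth = 0    # div-nesting depth inside the target div
        self.parts = []

    def handle_starttag(self, tag, attrs):
        if self.depth:
            self.parts.append(self.get_starttag_text())
            if tag == "div":
                self.depth += 1
        elif tag == "div" and any(k == "data-off-canvas-main-canvas" for k, _ in attrs):
            self.depth = 1  # enter the target div without emitting the div itself

    def handle_endtag(self, tag):
        if self.depth:
            if tag == "div":
                self.depth -= 1
            if self.depth:  # still inside the target: keep the closing tag
                self.parts.append(f"</{tag}>")

    def handle_data(self, data):
        if self.depth:
            self.parts.append(data)

    def inner_html(self):
        return "".join(self.parts)

p = MainCanvasExtractor()
p.feed('<body><div data-off-canvas-main-canvas><p>Hi <b>there</b></p></div></body>')
print(p.inner_html())  # -> '<p>Hi <b>there</b></p>'
```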
🇺🇸United States effulgentsia

I added this as a target issue that we'll allow in post-alpha (but before beta) in 🌱 [meta] Release preparation steps for 10.2.0 Active if it's ready in time.

🇺🇸United States effulgentsia

@longwave and I agreed that we would like to get [PP-3] Add "json" as core data type to Schema and Database API Postponed in before beta (allow it in between alpha and beta) if it's ready before beta, so adding it to the issue summary's target issues list.

🇺🇸United States effulgentsia

Folks here might already know this, but I only recently learned (via a Bun.js release announcement) that npm supports an overrides field and yarn supports a resolutions field that do what the composer-require-lenient plugin does. I think this adds to the argument that Composer itself should add similar functionality upstream, but if they don't want to, I'd support #74's suggestion of adding it to Drupal core.
