- 🇫🇷France nod_ Lille
One idea that works only with JS enabled: when submitting the form, add disabled to all unchecked checkboxes, which will exclude them from the POSTed values. Not sure how well it'd work when unchecking a permission…
- 🇬🇧United Kingdom catch
I think #53 is worth looking into. We've already had this bug for years with or without JS, and making it work JS-only would be a big improvement over not at all.
- 🇫🇷France nod_ Lille
I'm not sure we can help more on the frontend side, browsers only submit checked checkboxes already. So on my 4000 checkbox pages, there are only the 143 checked checkboxes sent in the POST request.
browsers may have gotten smarter about all this since then.
- 🇺🇸United States tedfordgif
Aren't there "shadow" hidden inputs as well? In order to get rid of those you'd need to e.g. round-trip a hash of the current state to detect if another user saved permissions in the meantime. Then the front-end could submit only the checkbox transitions. That would require JS as well.
While we're talking JS, if we send the max_input_vars value as a setting, we can stop the submission if we would exceed it.
Or turn it into a progressive enhancement exercise, where the no-JS form can only look at one role at a time, and only once you enable JS can you edit all roles at once.
- 🇺🇸United States tedfordgif
Hmmm, another terrible idea: I wonder if you could implement the transition idea without JS using radio buttons. If previously checked, make the "checked" input have an empty value, and the "unchecked" input have a value attr that means "remove this permission from the role". You could even use CSS to get close to the current interface.
I'm guessing the radio value always gets submitted, though, even with a blank value.
And there are likely to be a11y concerns.
- 🇩🇪Germany Anybody Porta Westfalica
Reviving this as I just ran into the issue once again (and think it happened to clients > 30 times in the last years).
Maybe a good first shot would be to ensure a viable minimum for max_input_vars for Drupal installations, both when installing Drupal and as a warning in the status report? Just having a mid-size Drupal project, for example the admin menu administration page (/admin/structure/menu/manage/admin) or the permissions administration page, typically grows to more than 1000 input elements very fast.
So before going too much into details here, adding a hook_requirements() check for max_input_vars being at least 10000, otherwise showing a warning (not an error) to raise attention, would be super helpful I think. While discussing this for a long time, I think many people are hurt by a too-low default hosting provider value (typically 1000). What do you think? Everything else as follow-ups perhaps?
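For illustration only, a minimal sketch of what such a check could look like in a hypothetical module's .install file; the module name, the 10000 threshold and the wording are placeholders, not anything decided here:

```php
<?php

/**
 * Implements hook_requirements().
 *
 * Hypothetical example only: warn when max_input_vars looks too low for the
 * large admin forms mentioned above.
 */
function mymodule_requirements($phase) {
  $requirements = [];
  if ($phase === 'install' || $phase === 'runtime') {
    $max_input_vars = (int) ini_get('max_input_vars');
    if ($max_input_vars < 10000) {
      $requirements['mymodule_max_input_vars'] = [
        'title' => t('PHP max_input_vars'),
        'value' => $max_input_vars,
        'severity' => REQUIREMENT_WARNING,
        'description' => t('max_input_vars is @value. Large forms such as the permissions or menu administration pages can silently lose submitted values; consider raising the limit in php.ini.', ['@value' => $max_input_vars]),
      ];
    }
  }
  return $requirements;
}
```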
- 🇬🇧United Kingdom catch
@anybody while it only improves the permissions page, I think "Improve performance of the user.permissions.js script running in /admin/people/permissions" (Needs review) may have improved things there - do you have a site you could confirm that on?
- 🇩🇪Germany Anybody Porta Westfalica
Hi @catch and thanks for the fast reply! Maybe there's a misunderstanding. My comment is about forms with many inputs not being saved at all, getting people into trouble, not only on the permissions page, but also for example on the menu administration and other pages with many form elements.
It's not about performance, but a risky (mis)configuration in php.ini for Drupal.
- 🇬🇧United Kingdom catch
@anybody that issue at least theoretically also reduced the number of checkboxes that the permissions page submits, so it would be good to see what if any effect it has on a real site affected by this issue.
Agreed the permissions page isn't the only problem here though.
- 🇩🇪Germany geek-merlin Freiburg, Germany
This can and should be solved generically in the DB driver.
We can steal code from ye olde SQLite 999-limit issue: "Make SQLite faster by combining multiple inserts and updates in a single query" (Needs work).
Note that there have been several approaches in that issue. The current approach, which sends multiple values via JSON, is really elegant but likely too SQLite-specific. But a former approach had a parameter chunker that likely every DB driver can use. See #2031261-86: Make SQLite faster by combining multiple inserts and updates in a single query → ff.
- 🇩🇪Germany Anybody Porta Westfalica
@geek-merlin: Are you sure this landed in the right issue?
It's database-related, yes, but what we see here can also happen without any database interaction, whatever is done with the form values. (Of course in most cases they will be stored in a database.)
This issue is about forms with so many inputs that the sent request exceeds max_input_vars and values are dropped without a warning. I'm not sure this belongs in the database system. I guess you may have worked on several related issues? (Those clearly exist.)
- 🇫🇷France nod_ Lille
The main issue is at the HTTP/PHP level.
@tedfordgif we have done all we can at the HTML level, no more cruft to remove. The shadow inputs are not submitted.
- 🇩🇪Germany Anybody Porta Westfalica
@nod_ thanks! Could you maybe give a short comment on my suggestions in #60? That could be a quick win to at least introduce a warning if the limit is low?
Additionally, we could show a warning if the number of values in a submitted form exactly matches max_input_vars, and allow opt-out via settings.php? With a high max_input_vars value this warning would then become improbable. Once we have a plan, I think it would make sense to split this into sub-issues for the different points to implement. But it first needs a decision on which way to follow.
Thanks!
- 🇫🇷France nod_ Lille
hook_requirements() makes sense. I would prevent the form from being submitted, with a warning, if we know it will go over.
- 🇩🇪Germany Anybody Porta Westfalica
@nod_ thanks! I don't think we can detect that the form is going over, because we simply won't receive the values over the limit, but we should elaborate on that!
Let's add a first subtask then to implement a hook_requirements() check for a max_input_vars limit < 10000, shown on installation and in the status report. I'll ask our team. From there we should work out how to detect a form submit that exceeds the limit.
- 🇫🇷France nod_ Lille
We can get the max_input_vars value and check with JS at submit, for example. It's a pretty dangerous situation, so special-casing makes sense.
- 🇬🇧United Kingdom catch
When max_input_vars is exceeded an E_WARNING is triggered according to https://www.php.net/manual/en/info.configuration.php#ini.max-input-vars - could we special-case this in the error handler and then do something with it?
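A rough sketch of that idea (an editor's illustration, not the approach from this issue), assuming the warning text contains "Input variables exceeded" and that a userland handler actually gets to see this startup-time warning; both assumptions need verifying, and in core the special case would belong in the existing error handler rather than a new one:

```php
<?php

// Hypothetical special case. When the limit is hit, PHP warns with something
// like "Input variables exceeded 1000. To increase the limit change
// max_input_vars in php.ini."; whether that warning ever reaches a userland
// handler (or only error_get_last()) still needs to be verified.
set_error_handler(function (int $severity, string $message): bool {
  if ($severity === E_WARNING && str_contains($message, 'Input variables exceeded')) {
    // Abort instead of processing (and saving) a silently truncated form.
    throw new \RuntimeException('Form submission exceeded max_input_vars; aborting to avoid data loss.');
  }
  // Returning FALSE lets PHP's default handling deal with everything else.
  return FALSE;
});
```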
- 🇪🇸Spain tunic Madrid
The idea from #71 would be great. Using an arbitrarily big number doesn't ensure the issue is not going to happen.
Another option is adding the check in the Form API layer: during form rendering the number of variables needed is known, and a warning could be raised to the watchdog. The #71 idea is easier to implement, I guess.
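A sketch of that Form API idea (the helper name and the list of input types are assumptions for illustration, not an agreed approach): walk the built form array, count the elements that will become input variables, and warn when the total approaches max_input_vars.

```php
<?php

use Drupal\Core\Render\Element;

/**
 * Hypothetical helper: roughly count the input variables a form will submit.
 *
 * The list of input #types is incomplete, and compound elements (checkboxes,
 * date, managed_file, ...) would need special handling.
 */
function _example_count_input_elements(array $form): int {
  $input_types = ['checkbox', 'radio', 'textfield', 'textarea', 'select', 'hidden', 'number', 'email'];
  $count = 0;
  foreach (Element::children($form) as $key) {
    $element = $form[$key];
    if (!is_array($element)) {
      continue;
    }
    if (isset($element['#type']) && in_array($element['#type'], $input_types, TRUE)) {
      $count++;
    }
    $count += _example_count_input_elements($element);
  }
  return $count;
}

// For example, from a hook_form_alter() implementation:
// if (_example_count_input_elements($form) > (int) ini_get('max_input_vars')) {
//   \Drupal::logger('example')->warning('Form @id may exceed max_input_vars.', ['@id' => $form['#form_id'] ?? 'unknown']);
// }
```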
- 🇩🇪Germany Anybody Porta Westfalica
> Another option is adding the check in the Form API layer: during form rendering the number of variables needed is known, and a warning could be raised to the watchdog. The #71 idea is easier to implement, I guess.
Indeed, I think that would be better than doing it in JS, and it's server-side. On the other hand, it could also get complicated with compound elements, but definitely solvable.
I hope I'll find the time in the next few days to check #71; that would be best! Adding a warning to the status report for a very low limit still makes sense to me anyway.
- 🇺🇸United States douggreen Winchester, VA
I've been looking into the suggestion by @catch in #71. Here's a patch (no MR because I'm not sure we want to go this direction).
- 🇺🇸United States smustgrave
Have not reviewed
Fixes should be in MRs and will also need test coverage
- 🇺🇸United States smustgrave
Appears to have pipeline issues.
Also appears to be missing test coverage, tagging for such.
- 🇬🇧United Kingdom catch
I don't think this can be tested unless we're able to add it to .htaccess within a test, otherwise it would depend on the environment.
- 🇺🇸United States jenlampton
If we detect the E_WARNING on submission, it needs to stop the form from processing immediately to prevent data loss.
If a form does get processed with more than max_input_vars values, that can sometimes cause things to be deleted: for menus it causes menu items to be removed, and for permissions it causes those permissions to be un-granted and removed from the config files (not always the database, as discussed above).
> I don't think this can be tested unless we're able to add it to .htaccess within a test, otherwise it would depend on the environment.
For tests, we could use a test module with 10,008 checkboxes where the checkboxes are created by a loop.
- 🇬🇧United Kingdom catch
> For tests, we could use a test module with 10,008 checkboxes where the checkboxes are created by a loop.
We can't guarantee that the test environment has max_input_vars set to 10,000.
- 🇺🇸United States azinck
> We can't guarantee that the test environment has max_input_vars set to 10,000.
Any reason not to read the max_input_vars value of the environment and just create more checkboxes than whatever the value is?
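Something along those lines could look like this hypothetical test form (the module, class and IDs are made up, not the test module from the MR): it reads the environment's limit and renders slightly more checkboxes, so a submission always exceeds max_input_vars.

```php
<?php

namespace Drupal\max_input_vars_test\Form;

use Drupal\Core\Form\FormBase;
use Drupal\Core\Form\FormStateInterface;

/**
 * Hypothetical test form that always exceeds the environment's limit.
 */
class ManyCheckboxesTestForm extends FormBase {

  public function getFormId() {
    return 'max_input_vars_test_many_checkboxes';
  }

  public function buildForm(array $form, FormStateInterface $form_state) {
    // Render a few more checkboxes than PHP will accept as input variables.
    $limit = (int) ini_get('max_input_vars');
    for ($i = 0; $i < $limit + 10; $i++) {
      $form['checkbox_' . $i] = [
        '#type' => 'checkbox',
        '#title' => $this->t('Checkbox @i', ['@i' => $i]),
        '#default_value' => 1,
      ];
    }
    $form['submit'] = [
      '#type' => 'submit',
      '#value' => $this->t('Save'),
    ];
    return $form;
  }

  public function submitForm(array &$form, FormStateInterface $form_state) {
    // When the limit is exceeded, PHP drops the overflow before Drupal sees
    // it, so this count comes back lower than the number of rendered boxes.
    $this->messenger()->addStatus($this->t('Received @count values.', [
      '@count' => count($form_state->getValues()),
    ]));
  }

}
```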
- 🇺🇸United States douggreen Winchester, VA
I agree with @azinck, I believe we can create a test.
- 🇺🇸United States douggreen Winchester, VA
This is ready for review again, complete with a test :)
- First commit to issue fork.
- heddn Nicaragua
This can't be easily tested with the current approach. Perhaps the testbot CI can handle things, but running the test locally (5000 is my PHP's default max_input_vars), the test never runs to completion. I tried to set the value lower via an ini_set() in the test's setUp() method, but that still left the test form using 5000 from max_input_vars. Instead I manually tested things by artificially setting my max_input_vars to 5. That let things run to completion in a very happy way. Moving this back to NW because we either need to be OK with manual testing (and get rid of the tests) or figure out a way to set max_input_vars to a lower value at runtime.
- heddn Nicaragua
As I expected, I tracked down in https://www.php.net/manual/en/ini.core.php#ini.sect.file-uploads that we can't set that value at run time (it is PHP_INI_PERDIR, so ini_set() has no effect).
- 🇬🇧United Kingdom catch
If it works on gitlab, could we check max_input_vars first, and if it's over a certain number, skip the test? That would mean the test doesn't time out locally and give people a hint how they can run it locally if it's likely to fail.
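That guard could be as simple as the following sketch at the top of the test class's setUp() (the 2000 cut-off and the message are placeholders, not part of the MR):

```php
  protected function setUp(): void {
    // Hypothetical guard: skip the slow test unless the environment has a
    // deliberately low max_input_vars, and tell people how to run it locally.
    $max_input_vars = (int) ini_get('max_input_vars');
    if ($max_input_vars > 2000) {
      $this->markTestSkipped(sprintf(
        'max_input_vars is %d; rendering that many form elements is too slow. Set it to e.g. 100 in php.ini (or via the CLI -d flag) to run this test.',
        $max_input_vars
      ));
    }
    parent::setUp();
  }
```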
- heddn Nicaragua
It runs on the testbot, but at a huge cost, 36 minutes: https://git.drupalcode.org/issue/drupal-1565704/-/pipelines/397114/test_...
- 🇬🇧United Kingdom catch
> It runs on the testbot, but at a huge cost, 36 minutes
OMG definitely not, let's remove the test from the MR.
> Could the max_input_vars error get hidden by a subsequent error?
That seems possible, but I don't think we know exactly when it will get triggered, so we would need to hard-code support for it in core's error handler probably - maybe that would be OK though.