Core interfaces can go over max_input_vars

Created on 7 May 2012, over 12 years ago
Updated 2 September 2024, 5 months ago

Problem/Motivation

Core interfaces can easily exceed common configurations for max_input_vars, e.g.:

  • editing a menu with a lot of links
  • the permissions page

Steps to reproduce

Set max_input_vars to a low value
Try to submit changes to the permissions form

Proposed resolution

Add a hook_requirements() check that multiplies the number of permissions by the number of roles and compares the result to 80% of max_input_vars. If it's higher, show a requirements error advising the administrator to increase the ini setting.
Show the same message on the permissions page when we know the form won't be able to submit.
Show a similar message on the menu link edit page, where the count would be the number of links x 3 (one input each for enabled, parent, and weight), again compared against 80% of max_input_vars to leave headroom for form alters etc.
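As a rough illustration of the arithmetic above (a sketch only: the function names and the 0.8 margin constant come from this proposal, not from an actual patch):

```javascript
// Sketch of the proposed threshold check: compare the expected number of
// form inputs against 80% of max_input_vars, leaving headroom for form
// alters. Function names are illustrative, not from core.
function exceedsInputVarLimit(expectedInputs, maxInputVars, margin = 0.8) {
  return expectedInputs > Math.floor(maxInputVars * margin);
}

// Permissions page: one checkbox per permission per role.
function permissionsInputCount(permissionCount, roleCount) {
  return permissionCount * roleCount;
}

// Menu link edit page: enabled, parent and weight inputs per link.
function menuInputCount(linkCount) {
  return linkCount * 3;
}
```

For example, 250 permissions across 5 roles yields 1250 inputs, which exceeds 80% of the common max_input_vars default of 1000, so the warning would be shown.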

Remaining tasks

All of the above

User interface changes

API changes

Data model changes

Release notes snippet

I've set up a content type with several field groups configured as horizontal tabs, with field collections inside them (several fields including text, dropdowns, date, etc.), one field collection for each field group (tab).

I'm using:
Field collection 7.x-1.0-beta3+4-dev
Fieldgroup 7.x-1.1
File entity 7.x-2.0-unstable3+21-dev
Entity API 7.x-1.0-rc2+0-dev
Field Collection Table 7.x-1.0-beta1

Everything is working fine except for the issue of one empty row being added each time I save the node. I know there's a bug report about that and it's in progress.

When I hit the remove button on each row to remove those empty lines (from the bottom up), this happens:

1st row: deleted OK

2nd row: deleted OK, but I got the following in the Firefox JavaScript console:

Error: $("#" + base, context).once is not a function
Source File: (obscured)/node/123/edit
Line: 16

3rd row: the request looks like it goes to the server, but when the "working" animated GIF disappears, nothing happens, and I get this in the JavaScript console:

Error: $("fieldset.path-form", context).drupalSetSummary is not a function
Source File: (obscured)/sites/all/modules/pathauto/pathauto.js?m3nlsr
Line: 5

I'm adding this before that line in the other modules' scripts:

// Bail out early when drupalSetSummary is unavailable (this guard sits
// inside the script's closure, so the bare return is legal there).
if (typeof jQuery.fn.drupalSetSummary == 'undefined') {
  return;
}

and it works, but then another module (like Metatag) that uses drupalSetSummary raises the error, and so on.

It looks to me like jQuery is breaking and then everything that depends on it ceases to work.

Plus, if I try to save the node at this point (when the remove button has stopped working), I'm getting "An illegal choice has been detected. Please contact the site administrator."

I cannot find $("#" + base, context).once in the module's code to see what's going on. Any help?

Edit:
Additional data: http://drupal.org/node/445130#comment-2851276
/misc/form.js is loaded on my edit page when I view the source code, but maybe when this module makes AJAX calls the drupalSetSummary() function is somehow removed.

Thanks for reading this.

πŸ› Bug report
Status

Needs work

Version

11.0 🔥

Component
Form

Last updated about 18 hours ago

Created by

🇨🇦 Canada seaarg


Merge Requests

Comments & Activities


  • 🇫🇷 France nod_ Lille

    One idea that works only with JS enabled: when submitting the form, add the disabled attribute to all unchecked checkboxes, which will exclude them from the POSTed values. Not sure how well it'd work when unchecking a permission…
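    A minimal sketch of this idea (plain JS with illustrative names; real code would run in a submit handler over the form's checkbox inputs):

    ```javascript
    // Sketch: before the form is submitted, disable every unchecked
    // checkbox. Disabled controls are never included in the POSTed
    // values, so only the checked boxes reach the server.
    // Works on any objects exposing `checked`/`disabled` (DOM inputs do).
    function disableUncheckedCheckboxes(checkboxes) {
      for (const checkbox of checkboxes) {
        if (!checkbox.checked) {
          checkbox.disabled = true;
        }
      }
    }
    ```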

  • 🇬🇧 United Kingdom catch

    I think #53 is worth looking into; we've already had this bug for years with or without JS, and making it work JS-only would be a big improvement over not working at all.

  • Merge request !9392: Resolve #1565704 "Remove empty checkboxes in JS" (Closed) created by nod_
  • 🇫🇷 France nod_ Lille

    I'm not sure we can help more on the frontend side; browsers only submit checked checkboxes already. So on my 4000-checkbox pages, only the 143 checked checkboxes are sent in the POST request.

    Browsers may have gotten smarter about all this since then.

  • 🇺🇸 United States tedfordgif

    Aren't there "shadow" hidden inputs as well? In order to get rid of those you'd need to e.g. round-trip a hash of the current state to detect if another user saved permissions in the meantime. Then the front-end could submit only the checkbox transitions. That would require JS as well.

    While we're talking JS, if we send the max_input_vars value as a setting, we can stop the submission if we would exceed it.

    Or turn it into a progressive enhancement exercise, where the no-JS form can only look at one role at a time, and only once you enable JS can you edit all roles at once.

  • 🇺🇸 United States tedfordgif

    Hmmm, another terrible idea: I wonder if you could implement the transition idea without JS using radio buttons. If previously checked, make the "checked" input have an empty value, and the "unchecked" input have a value attr that means "remove this permission from the role". You could even use CSS to get close to the current interface.

    I'm guessing the radio value always gets submitted, though, even with a blank value.

    And there are likely to be a11y concerns.

  • 🇩🇪 Germany Anybody Porta Westfalica

    Reviving this as I just ran into the issue once again (and I think it has happened to clients more than 30 times over the last few years).

    Maybe a good first step would be to ensure a viable minimum for max_input_vars when installing Drupal, and to show a warning in the status report?

    Even in a mid-size Drupal project, for example, the admin menu administration page (/admin/structure/menu/manage/admin) or the permissions administration page typically grows beyond 1000 input elements very fast.
    So before going too much into detail here, adding a hook_requirements() check for max_input_vars being at least 10000, and otherwise showing a warning (not an error) to raise attention, would be super helpful I think.

    Having discussed this for a long time, I think many people are hurt by hosting providers' too-low default value (typically 1000). What do you think? Everything else as follow-ups, perhaps?

  • 🇬🇧 United Kingdom catch

    @anybody, while it only improves the permissions page, I think 📌 Improve performance of the user.permissions.js script running in /admin/people/permissions (Needs review) may have improved things there - do you have a site you could confirm that on?

  • 🇩🇪 Germany Anybody Porta Westfalica

    Hi @catch, and thanks for the fast reply! Maybe there's a misunderstanding. My comment is about forms with many inputs not being saved at all, getting people into trouble - not only on the permissions page, but also, for example, on the menu administration page and other pages with many form elements.

    It's not about performance, but about a risky (mis)configuration in php.ini for Drupal.

  • 🇬🇧 United Kingdom catch

    @anybody that issue at least theoretically also reduces the number of checkboxes that the permissions page submits, so it would be good to see what effect, if any, it has on a real site affected by this issue.

    Agreed the permissions page isn't the only problem here though.

  • 🇩🇪 Germany geek-merlin Freiburg, Germany

    This can and should be solved generically in the DB driver.

    We can steal code from ye olde SQLite 999-limit issue: 📌 Make SQLite faster by combining multiple inserts and updates in a single query (Needs work)

    Note that there have been several approaches in that issue. The current approach, which sends multiple values via JSON, is really elegant, but likely too SQLite-specific. But a former approach had a parameter chunker that likely every DB driver can use. See #2031261-86: Make SQLite faster by combining multiple inserts and updates in a single query, ff.

  • 🇩🇪 Germany Anybody Porta Westfalica

    @geek-merlin: Are you sure this landed in the right issue?

    It's database-related, yes, but what we see here can also happen without any database interaction, whatever is done with the form values. (Of course, in most cases they will be stored in a database.)

    This issue is about forms with so many inputs that the sent request exceeds max_input_vars and values are dropped without a warning.

    I'm not sure this belongs in the database system. I guess you may have been working in several related issues (which clearly exist)?

  • 🇫🇷 France nod_ Lille

    The main issue is at the HTTP/PHP level.

    @tedfordgif we have done all we can at the HTML level. No more cruft to remove. The shadow inputs are not submitted.

  • 🇩🇪 Germany Anybody Porta Westfalica

    @nod_ thanks! Could you maybe give a short comment on my suggestions in #60? That could be a quick win to at least introduce a warning if the limit is low?

    Additionally, we could show a warning if the number of submitted form values exactly matches max_input_vars, and allow opt-out via settings.php? With a high max_input_vars value this warning would rarely trigger.

    Once we have a plan, I think it would make sense to split this into sub-issues for the different points to implement. But it first needs a decision which way to follow.

    Thanks!

  • 🇫🇷 France nod_ Lille

    hook_requirements makes sense. I would prevent the form from being submitted, with a warning, if we know it will go over.

  • 🇩🇪 Germany Anybody Porta Westfalica

    @nod_ thanks! I don't think we can detect that the form is going over, because we simply won't receive the values over the limit - but we should elaborate on that!

    Let's add a first subtask, then, to implement a hook_requirements() check that shows on installation and in the status report when max_input_vars < 10,000. I'll ask our team.

    Then we should look into how to detect a form submission that exceeds the limit.

  • 🇫🇷 France nod_ Lille

    We can get the max_input_vars value and check it with JS at submit time, for example. It's a pretty dangerous situation, so special-casing makes sense.
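    That submit-time check could be sketched like this (assumption: the PHP-side max_input_vars value is somehow exposed to JS, e.g. via a setting; that wiring is not shown, and the names are illustrative):

    ```javascript
    // Sketch: count the controls the browser would actually submit
    // (named, enabled, and - for checkboxes and radios - checked),
    // then compare against the server's max_input_vars limit.
    function countSubmittedControls(controls) {
      return controls.filter(function (c) {
        const needsChecked = c.type === 'checkbox' || c.type === 'radio';
        return c.name && !c.disabled && (!needsChecked || c.checked);
      }).length;
    }

    function shouldBlockSubmit(controls, maxInputVars) {
      return countSubmittedControls(controls) >= maxInputVars;
    }
    ```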

  • 🇬🇧 United Kingdom catch

    When max_input_vars is exceeded an E_WARNING is triggered according to https://www.php.net/manual/en/info.configuration.php#ini.max-input-vars - could we special-case this in the error handler and then do something with it?

  • 🇪🇸 Spain tunic Madrid

    The idea from #71 would be great. Using an arbitrarily big number doesn't ensure the issue won't happen.

    Another option is adding the check in the Form API layer. During form rendering the number of variables needed is known, and we could raise a warning to the watchdog. The #71 idea is easier to implement, I guess.

  • 🇩🇪 Germany Anybody Porta Westfalica

    > Another option is adding the check in the Form API layer. During form rendering the number of variables needed is known, and we could raise a warning to the watchdog. The #71 idea is easier to implement, I guess.

    Indeed, I think that would be better than JS, and it's server-side. On the other hand, it could also get complicated with compound elements, but that's definitely solvable.

    I hope I'll find the time in the next few days to check #71 - that would be best! Adding a warning to the status report for a very low limit still makes sense to me anyway.

  • 🇺🇸 United States douggreen Winchester, VA

    I've been looking into the suggestion by @catch in #71. Here's a patch (no MR, because I'm not sure we want to go in this direction).

  • 🇺🇸 United States douggreen Winchester, VA
  • 🇺🇸 United States smustgrave

    I have not reviewed this yet.

    Fixes should be in MRs, and this will also need test coverage.

  • 🇺🇸 United States douggreen Winchester, VA
  • Pipeline finished with Failed
    16 days ago
    Total: 149s
    #387473
  • 🇺🇸 United States smustgrave

    Appears to have pipeline issues.

    Also appears to be missing test coverage; tagging it for that.

  • 🇬🇧 United Kingdom catch

    I don't think this can be tested unless we're able to add it to .htaccess within a test, otherwise it would depend on the environment.

  • Pipeline finished with Failed
    14 days ago
    Total: 518s
    #388996
  • Pipeline finished with Failed
    14 days ago
    Total: 214s
    #389016
  • 🇺🇸 United States jenlampton

    If we detect the E_WARNING on submission, it needs to stop the form from processing immediately to prevent data loss.

    If a form does get processed with more than max_input_vars values, that can sometimes cause things to be deleted: for menus it causes menu items to be removed, and for permissions it causes those permissions to be un-granted and removed from the config files (not always the database, as discussed above).

    > I don't think this can be tested unless we're able to add it to .htaccess within a test, otherwise it would depend on the environment.

    For tests, we could use a test module with 10,008 checkboxes where the checkboxes are created by a loop.

  • 🇬🇧 United Kingdom catch

    > For tests, we could use a test module with 10,008 checkboxes where the checkboxes are created by a loop.

    We can't guarantee that the test environment has max_input_vars set to 10,000.

  • 🇺🇸 United States azinck

    > We can't guarantee that the test environment has max_input_vars set to 10,000.

    Any reason not to read the max_input_vars value of the environment and just create more checkboxes than whatever the value is?

  • 🇺🇸 United States douggreen Winchester, VA

    I agree with @azinck, I believe we can create a test.

  • 🇬🇧 United Kingdom catch

    Yeah max_input_vars +1 is a great idea.

  • Pipeline finished with Failed
    10 days ago
    Total: 180s
    #393096
  • Pipeline finished with Failed
    9 days ago
    Total: 243s
    #393871
  • 🇺🇸 United States douggreen Winchester, VA

    This is ready for review again, complete with a test :)

  • Pipeline finished with Failed
    8 days ago
    Total: 178s
    #394678
  • Pipeline finished with Failed
    8 days ago
    Total: 148s
    #394742
  • Pipeline finished with Failed
    8 days ago
    Total: 144s
    #394746
  • Pipeline finished with Failed
    8 days ago
    Total: 149s
    #394748
  • First commit to issue fork.
  • Pipeline finished with Failed
    6 days ago
    Total: 675s
    #397086
  • Pipeline finished with Failed
    6 days ago
    Total: 604s
    #397097
  • heddn Nicaragua

    This can't be easily tested with the current approach. Perhaps the testbot CI can handle it, but running the test locally (5000 is my PHP's default max_input_vars), the test never runs to completion. I tried to set the value lower via ini_set() in the test's setUp() method, but the test form still used the 5000 from max_input_vars.

    Instead, I manually tested things with the test by artificially setting my max_input_vars to 5. That let things run to completion in a very happy way. Moving this back to NW, because we either need to be OK with manual testing (and get rid of the tests) or figure out a way to set max_input_vars to a lower value at runtime.

  • heddn Nicaragua

    As I expected, I tracked down at https://www.php.net/manual/en/ini.core.php#ini.sect.file-uploads that we can't set that value at runtime.

  • Pipeline finished with Failed
    6 days ago
    Total: 1663s
    #397112
  • 🇬🇧 United Kingdom catch

    If it works on gitlab, could we check max_input_vars first, and if it's over a certain number, skip the test? That would mean the test doesn't time out locally and give people a hint how they can run it locally if it's likely to fail.

  • 🇬🇧 United Kingdom catch

    > It runs on the testbot, but at a huge cost, 36 minutes

    OMG definitely not, let's remove the test from the MR.

    > Could the max_input_vars error get hidden by a subsequent error?

    That seems possible, but I don't think we know exactly when it will be triggered, so we would probably need to hard-code support for it in core's error handler - maybe that would be OK though.
