Add PerformanceTestBase for allowing browser performance assertions within FunctionalJavaScriptTests

Created on 8 March 2023
Updated 21 June 2023

Problem/Motivation

Drupal core doesn't have any automated performance testing, but it's been sorely needed since at least 2009, when I opened 🌱 Automated performance testing for core (Active).

Since then, a lot has changed - at that time we were just in the process of adding SimpleTest coverage to core; now we're using PHPUnit. However, the underlying problem remains: uncaught performance regressions like 🐛 Performance regression introduced by container serialization solution (Fixed), and performance improvements and fixes that still have to be tested manually.

In general, unless a change is very likely to be performance-sensitive, we don't require manual performance testing before commit. Even with Lighthouse/DevTools/XHProf it's quite a lot of work to set up a test case, and when we do test manually, people often get it wrong (for example, comparing two requests where one hits cold caches and the other hits warm caches).

We sometimes do require manual performance testing for improvements and fixes, but this only validates the improvement; it rarely prevents regressions. In fact, regressions are often introduced by changes that look 'innocent' from a performance perspective.

There is a lot of potential performance data we can collect, but broadly it falls into two categories:

1. Absolute performance data - how many database queries are executed during a particular request, how many HTTP requests a page triggers, etc. For these we can potentially set a fixed number and fail the test on any deviation - although tests will then need updating whenever a change legitimately alters the number without being a regression (like an extra article with an image on the Umami front page), and even these numbers might differ between database engines.

2. Relative performance data - time to first byte, PHP execution time, largest contentful paint, memory usage. These all vary depending on the environment the test runs on, and also across runs in the same environment.

Ideally, we'd collect as much data as possible, potentially with hard fails for things in the 'absolute' category, and then graphing or much wider thresholds for the 'relative' category.
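
As a rough illustration of how the two categories translate into assertions ($metrics and its property names below are hypothetical, not an existing API):

// Absolute metric: an exact count that should only change when a patch
// deliberately changes behaviour, so a hard equality assertion works.
$this->assertSame(38, $metrics->queryCount);

// Relative metric: timings vary per environment and per run, so only a
// generous threshold (or external graphing) makes sense as a hard fail.
$this->assertLessThan(2.0, $metrics->timeToFirstByte);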

Steps to reproduce

Proposed resolution

There are various options for driving a browser and collecting data, but we already have one in core: PHPUnit with the Mink/Selenium WebDriver setup, which gives us access to a full browser. PHPUnit also lets everyone maintain any tests we add using an API and language they're familiar with, and potentially run them locally. Additionally, PHPUnit lets us add basic sanity checking to the test via non-performance assertions, so that functionality changes which might break the test are picked up. Even if we eventually move to something else, it's an easy start.

Current plan:

Add a base class PerformanceTestBase extending WebDriverTestBase.

This enables chromedriver performance logging, and then overrides ::drupalGet() to collect the performance log after each page request (see the sketch below).
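
A minimal sketch of that wiring, assuming chromedriver's goog:loggingPrefs capability and the getWebDriverSession() accessor on Mink's Selenium2Driver; processPerformanceLog() is a hypothetical helper filled in under the next step:

namespace Drupal\FunctionalJavascriptTests;

abstract class PerformanceTestBase extends WebDriverTestBase {

  /**
   * {@inheritdoc}
   */
  protected function getMinkDriverArgs() {
    // Ask chromedriver (via its goog:loggingPrefs capability) to record a
    // performance log so it can be fetched over the WebDriver protocol.
    $driver_args = json_decode(parent::getMinkDriverArgs(), TRUE);
    $driver_args[1]['goog:loggingPrefs'] = [
      'browser' => 'ALL',
      'performance' => 'ALL',
    ];
    return json_encode($driver_args);
  }

  /**
   * {@inheritdoc}
   */
  protected function drupalGet($path, array $options = [], array $headers = []) {
    $return = parent::drupalGet($path, $options, $headers);
    // Retrieve (and clear) the performance log accumulated for this request.
    $log = $this->getSession()->getDriver()->getWebDriverSession()->log('performance');
    $this->processPerformanceLog($log);
    return $return;
  }

}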

In this issue, collect the number of CSS, JavaScript and image requests, and add a basic test of Umami asserting how many are made on the front page (sketched below).
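
Each performance log entry wraps a DevTools protocol event as a JSON-encoded 'message'. A sketch of the counting, where the counter properties and processPerformanceLog() itself are illustrative, not the committed implementation:

  protected int $stylesheetCount = 0;
  protected int $scriptCount = 0;
  protected int $imageCount = 0;

  /**
   * Tallies CSS, JavaScript and image responses from the performance log.
   */
  protected function processPerformanceLog(array $log): void {
    foreach ($log as $entry) {
      // Each entry's 'message' is a JSON-encoded DevTools protocol event.
      $message = json_decode($entry['message'], TRUE)['message'];
      if ($message['method'] !== 'Network.responseReceived') {
        continue;
      }
      $response = $message['params']['response'];
      if ($response['mimeType'] === 'text/css') {
        $this->stylesheetCount++;
      }
      elseif (str_contains($response['mimeType'], 'javascript')) {
        $this->scriptCount++;
      }
      elseif (str_starts_with($response['mimeType'], 'image/')) {
        $this->imageCount++;
      }
    }
  }

A basic Umami test could then combine a functional sanity check with hard assertions on the counts (the expected numbers here are placeholders, not real values):

  protected $profile = 'demo_umami';

  public function testUmamiFrontPage(): void {
    $this->drupalGet('<front>');
    // A non-performance assertion doubles as a sanity check: if the page
    // breaks functionally, the asset counts are meaningless anyway.
    $this->assertSession()->pageTextContains('Umami');
    $this->assertSame(6, $this->stylesheetCount);
    $this->assertSame(2, $this->scriptCount);
  }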

Once this issue lands, there are multiple other types of performance data we can begin to collect.

Remaining tasks

User interface changes

API changes

Data model changes

Release notes snippet

Feature request

Status: Fixed
Version: 11.0
Component: PHPUnit
Created by: catch (🇬🇧 United Kingdom)

Issue tags: Performance
