Allow using the Drupal core robots.txt file

Created on 15 January 2019, almost 6 years ago
Updated 17 June 2024, 5 months ago

Problem/Motivation

It would be useful to be able to configure robotstxt to use the default robots.txt file from Drupal core (core/assets/scaffold/files/robots.txt). Use cases:

  1. Start with the default and then use hooks provided by robotstxt to make adjustments.
  2. On development servers, use a custom file and on production servers use the default, controlled by different configuration.

Using the actual file from Drupal core means that it gets updated as core updates the file. This would not happen if one is relying on the default value created upon install.

Proposed resolution

Add a configuration variable to control whether robotstxt serves the default file or its own value.
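As a rough illustration of the proposed behaviour, the serving logic could branch on such a setting. This is only a sketch; the config key name `use_default` and the surrounding code are assumptions based on this issue, not the module's actual implementation:

```php
<?php

// Hypothetical sketch (not the module's real controller): pick the
// source of robots.txt based on an assumed boolean setting
// 'use_default' in robotstxt.settings.
$config = \Drupal::config('robotstxt.settings');
if ($config->get('use_default')) {
  // Serve the file shipped with Drupal core, so updates to the
  // scaffold file are picked up automatically.
  $content = file_get_contents(DRUPAL_ROOT . '/core/assets/scaffold/files/robots.txt');
}
else {
  // Current behaviour: serve the content stored in configuration.
  $content = $config->get('content');
}
$response = new \Symfony\Component\HttpFoundation\Response($content, 200, [
  'Content-Type' => 'text/plain',
]);
```

Either branch would still pass through the module's existing hook system so other modules (e.g. xmlsitemap integration) can append their lines.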

Remaining tasks

Decide what to do. Implement.

User interface changes

Addition of a config control.

API changes

None.

Data model changes

Addition of a config control.

Original report

That is, when the robotstxt content is empty, use the DRUPAL_ROOT . '/sites/default/default.robots.txt' file.

Feature request
Status

Needs work

Version

1.0

Component

Code

Created by

🇨🇭Switzerland drikc


Comments & Activities


  • 🇨🇦Canada Liam Morland Ontario, CA 🇨🇦
  • 🇪🇸Spain penyaskito Seville 💃, Spain 🇪🇸, UTC+2 🇪🇺

    I also think this would be a useful feature.

    Our use case is that we have the default robots.txt, but we want to alter it with the xmlsitemap integration that is already built in here.
    If there are core/scaffolding changes to it, we would like to pick those up.
    As it stands, that requires custom code instead of this module, which I'm guessing others are doing too.

  • 🇪🇨Ecuador LeonelEnriquez98

    To address this issue, I added a new option in the settings form to allow using the default robots.txt file from Drupal core.

  • Test run: Core 10.2.x, PHP 8.1 & MySQL 5.7
    last update 5 months ago
    5 pass
  • Status changed to Needs work 5 months ago
  • 🇪🇸Spain penyaskito Seville 💃, Spain 🇪🇸, UTC+2 🇪🇺

    The patch at #6 looks good to me but:
    - Needs an update hook for setting the value while keeping BC (default should be the textbox value for existing installations)
    - Needs to modify default config? (IMHO default should be the existing scaffolding robots.txt)
    - Needs tests
    - Needs upgrade tests?
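
    The first two review points could look roughly like the following. The hook number, config key, and default are assumptions drawn from this discussion, not the committed patch:

    ```php
    <?php

    /**
     * Hypothetical update hook: default the assumed 'use_default' key to
     * FALSE on existing sites, so the stored textbox value keeps being
     * served and behaviour is unchanged (BC).
     */
    function robotstxt_update_8101() {
      \Drupal::configFactory()->getEditable('robotstxt.settings')
        ->set('use_default', FALSE)
        ->save();
    }
    ```

    New installations would instead get their default from the module's shipped config/install file, which per the second point could set the key to TRUE.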

  • 🇨🇦Canada Liam Morland Ontario, CA 🇨🇦

    Does it need an update hook? After this change, use_default will be null on existing sites. This behaves the same as false so existing behaviour is unchanged.

  • 🇪🇸Spain penyaskito Seville 💃, Spain 🇪🇸, UTC+2 🇪🇺

    Might not be required at the moment, but it will make things easier when/if config schema validation is strictly applied (in the config schema it is defined as a boolean, not as nullable).
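
    For illustration, the concern is that a schema entry like the following (key name and labels are assumptions) declares a plain boolean, so a NULL value left over on existing sites would fail once strict config schema validation applies:

    ```yaml
    # config/schema/robotstxt.schema.yml (hypothetical excerpt)
    robotstxt.settings:
      type: config_object
      label: 'RobotsTxt settings'
      mapping:
        use_default:
          type: boolean
          label: 'Serve the Drupal core default robots.txt'
    ```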
