Add cache context url.site to robots.txt response

Created on 2 June 2022, over 2 years ago
Updated 27 August 2024, 4 months ago

Problem/Motivation

I have a website with language negotiation based on domain.

In a hook I add a link to the sitemap like this:

/**
 * Implements hook_robotstxt().
 */
function itr_global_robotstxt() {
  return [
    '# XML sitemap',
    'Sitemap: ' . \Drupal::request()->getSchemeAndHttpHost() . '/sitemap.xml',
  ];
}

Because the robots.txt response is cached without any per-domain cache context, both domains serve the same sitemap URL: whichever domain primed the cache wins.

Adding the url.site cache context (which varies by scheme and host) makes Drupal keep a separate cache record per domain, so each domain gets the correct sitemap URL.
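A minimal sketch of the proposed fix, assuming the module builds its output in a controller that returns a cacheable response (the $content variable and the surrounding controller are illustrative, not the module's actual code):

/**
 * Illustrative sketch: attach the url.site cache context to the response.
 */
use Drupal\Core\Cache\CacheableMetadata;
use Drupal\Core\Cache\CacheableResponse;

// Build the robots.txt response ($content would be assembled from
// configuration and hook_robotstxt() implementations).
$response = new CacheableResponse($content, 200, [
  'Content-Type' => 'text/plain',
]);

// Vary the cached response by scheme + host so that each domain
// gets its own cache record and, therefore, its own sitemap URL.
$metadata = new CacheableMetadata();
$metadata->addCacheContexts(['url.site']);
$response->addCacheableDependency($metadata);

return $response;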

Steps to reproduce

Proposed resolution

Remaining tasks

User interface changes

API changes

Data model changes

πŸ“Œ Task
Status

Fixed

Version

1.0

Component

Code

Created by

πŸ‡§πŸ‡ͺBelgium JeroenT πŸ‡§πŸ‡ͺ


Merge Requests

Comments & Activities

