If built on-demand, this could take a very long time to build, so it would need a batch process. Imagine a multi-million-node site.
If built during filter processing, the index would only be updated when nodes are actually rendered, and could not reasonably be kept in sync, because the viewing times of the various nodes could be wildly different.
It seems this could/should work something like search indexing, using a cron-based batch process. Or it could leverage the fact that search is already doing this crawling, and implement hook_search_preprocess() to refresh its own index (see the sketch below).
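To make that second option concrete, here is a minimal sketch of the hook_search_preprocess() approach, assuming a module named mymodule and a hypothetical mymodule_refresh_index() helper that would update the module's own index:

<?php
/**
 * Implements hook_search_preprocess().
 *
 * Search invokes this for each chunk of text it is about to index, so we
 * can piggy-back on its existing crawl instead of scanning every node
 * ourselves. mymodule_refresh_index() is a hypothetical helper that would
 * rebuild this module's own index entries from the text.
 */
function mymodule_search_preprocess($text) {
  // Refresh our own index from the content search is already processing.
  mymodule_refresh_index($text);

  // Return the text unmodified: we only observe it, we do not alter what
  // gets stored in the search index.
  return $text;
}
?>

This avoids a second crawl of all content, but it ties the freshness of the module's index to the search cron schedule. Note also that this hook may run on user-entered search keywords as well as on indexed content, so a real implementation would need a way to tell the two cases apart.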
This is still complex, but manageable. The value is questionable relative to the complexity, though. If you are reading this issue and are interested, please comment on it.