Publish old versions of docs#1178

Open
alexdewar wants to merge 5 commits into main from versioned-docs

Conversation

@alexdewar
Collaborator

@alexdewar alexdewar commented Mar 5, 2026

Description

I fancied doing something a bit different, so I had a go at including docs for previous releases in our GitHub Pages deployment.

My implementation treats the docs in the currently checked-out version of the repo as the main ones and generates a TOC in a versions.md file, which is linked to from the main documentation. I've added a Python script that checks out each previous version of the docs in turn and runs just build-docs on it. The previous versions of the docs are not included when you run just build-docs by default, partly because this would be unnecessary and slow (you likely don't care about previous versions of the docs when developing locally) and partly to avoid infinite recursion when calling just build-docs for previous versions. Instead, you have to run just build-docs::all_with_old.
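
A rough sketch of what the generated versions page amounts to (the function name and exact page layout here are illustrative; the real script renders a Jinja template from docs/templates/versions.md.jinja):

```python
def render_versions_page(tags: list[str]) -> str:
    """Build a Markdown TOC linking to per-release docs builds.

    Illustrative sketch only: the actual script renders a Jinja template,
    but the output shape is similar, with each old version deployed
    under release/<tag>/.
    """
    lines = ["# Documentation for old versions", ""]
    lines += [f"- [{tag}](release/{tag}/index.html)" for tag in tags]
    return "\n".join(lines) + "\n"
```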

Some of the old versions of the docs may have problems of one kind or another that prevent them from being built with this script without modification. For example, the book.toml config file in v2.0.0 is incompatible with the latest version of mdbook. To work around these sorts of problems, I've added a feature where we can add patch files to docs/release/patches and these are applied (using git) before the documentation is built. You can generate these files with the git format-patch command. For v2.0.0, we just need @Aurashk's fix in 3129c56. In this particular case, we could just cherry-pick the commit instead, as it is in main, but note that this may not be true of every change we need to make to old broken documentation. We don't want things to depend on commits in random unmerged branches, so it seemed cleaner to just allow for adding arbitrary patches. Let me know if you think this process should be documented, because I haven't done so yet 😄.
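
For reference, creating and applying such a patch might look like this (the paths mirror the layout described above; adjust the commit hash and checkout path as needed):

```shell
# Export the fix commit as a patch file under the per-release patches dir
# (3129c56 is the v2.0.0 book.toml fix mentioned above)
git format-patch -1 3129c56 -o docs/release/patches/v2.0.0/

# At build time, the patches for a release can be applied to the old
# checkout with git, e.g.:
git -C path/to/old-checkout am docs/release/patches/v2.0.0/*.patch
```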

I've tested this locally and by deploying directly from this branch and it all seems to work, at least at first glance...

Shortcomings/future work:

  • Hyperlinks in old docs aren't currently checked by CI. We could easily do this, but I figured it was unnecessary and would slow things down.
  • It isn't obvious from an old version's site that it isn't the main documentation page. For example, once v2.1.0 is released, nothing on the v2.0.0 site will indicate that it documents an old version.
  • It would be nicer to have the "main" docs listed as just another "version" ("development"?), but that was harder to do, so I've left it for now.
  • We could list all versions of the documentation in the TOC in the left-hand pane; currently there's just a link to a page listing all of them. Again, this would have been a bit of a faff to do.

I figure this is probably good enough to merge for now. We can always polish in future.

Let me know what you think!

Closes #1072.

Type of change

  • Bug fix (non-breaking change to fix an issue)
  • New feature (non-breaking change to add functionality)
  • Refactoring (non-breaking, non-functional change to improve maintainability)
  • Optimization (non-breaking change to speed up the code)
  • Breaking change (whatever its nature)
  • Documentation (improve or add documentation)

Key checklist

  • All tests pass: $ cargo test
  • The documentation builds and looks OK: $ cargo doc
  • Update release notes for the latest release if this PR adds a new feature or fixes a bug
    present in the previous release

Further checks

  • Code is commented, particularly in hard-to-understand areas
  • Tests added that prove fix is effective or that feature works

@codecov

codecov bot commented Mar 5, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 87.30%. Comparing base (7bb8e2e) to head (8866664).

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #1178   +/-   ##
=======================================
  Coverage   87.30%   87.30%           
=======================================
  Files          57       57           
  Lines        7706     7706           
  Branches     7706     7706           
=======================================
  Hits         6728     6728           
  Misses        673      673           
  Partials      305      305           


@alexdewar alexdewar force-pushed the versioned-docs branch 2 times, most recently from db1970a to d57bcd8 on March 6, 2026 07:55
This will improve performance on successive runs. Note that we store the cache on GitHub Actions, so it will make things much faster on CI.
@alexdewar alexdewar marked this pull request as ready for review March 6, 2026 12:25
Copilot AI review requested due to automatic review settings March 6, 2026 12:25
Contributor

Copilot AI left a comment


Pull request overview

Adds multi-version documentation publishing to the GitHub Pages deployment by generating a versions TOC and building docs for historical release tags (optionally patched) into the deployed book/ output.

Changes:

  • Generate a versions.md page (from a Jinja template) listing links to per-release docs builds.
  • Add tooling to iterate git release tags, optionally apply per-release patch files, and build/copy old docs into book/release/<tag>/.
  • Update Just/CI wiring so the deploy workflow builds all_with_old (current + old versions) for GitHub Pages.

Reviewed changes

Copilot reviewed 10 out of 10 changed files in this pull request and generated 8 comments.

Show a summary per file
File Description
docs/templates/versions.md.jinja Template for a generated page linking to historical docs versions.
docs/release/patches/v2.0.0/0001-remove-unrecognised-parameter.patch Patch applied to v2.0.0 docs so it can build with current tooling.
docs/release/__init__.py Utilities to detect and list release tags.
docs/generate_versions_docs.py Script to render versions.md from tags via Jinja.
docs/build_old_docs.py Script to clone, checkout tags, apply patches, and build/copy old docs.
docs/SUMMARY.md Adds the “Documentation for old versions” page to the book TOC.
docs/.gitignore Ignores generated versions.md.
build-docs.just Adds versions generation to default docs build and all_with_old target for old docs builds.
.github/workflows/deploy-docs.yml Fetches tags and runs all_with_old for deployments; configures git identity for patch application.
.github/actions/generate-docs/action.yml Adds an input to select which build-docs target to run.


apply_patches_for_release(release, repo_path)

# Build docs
sp.run(("just", f"{repo_path!s}/build-docs"), check=True)

Copilot AI Mar 6, 2026


The just invocation here doesn't set the working directory to the cloned repo and passes a path-like argument ("{repo_path}/build-docs") that doesn't match the repo's mod build-docs usage. This is likely to fail to build old docs. Run just with --working-directory (or -d) pointing at repo_path and explicitly invoke the module recipe (e.g., build-docs::all) to match how docs are built elsewhere.

Suggested change
sp.run(("just", f"{repo_path!s}/build-docs"), check=True)
sp.run(
    ("just", "--working-directory", str(repo_path), "build-docs::all"),
    check=True,
)

Comment on lines +24 to +25
# Add a symlink to cargo cache dir
os.symlink(REPO_ROOT / "target", dest / "target")

Copilot AI Mar 6, 2026


os.symlink can fail on platforms/filesystems that don't allow symlinks (notably Windows without Developer Mode/admin), which would break just build-docs::old for some developers. Consider making this best-effort (skip if it fails or if dest/target already exists) since it's only an optimization for cargo caching.

Suggested change
# Add a symlink to cargo cache dir
os.symlink(REPO_ROOT / "target", dest / "target")
# Add a symlink to cargo cache dir (best-effort; it's only an optimization)
src = REPO_ROOT / "target"
dst = dest / "target"
if not dst.exists():
    try:
        os.symlink(src, dst)
    except (OSError, NotImplementedError):
        # Symlinks may not be supported on this platform/filesystem; ignore.
        pass

Collaborator Author


Not a bad idea... It does mean these particular users will have to endure rebuilding the cargo cache every time they rerun the command though, which will be hideously slow.


DOCS_DIR = Path(__file__).parent.absolute()

sys.path.append(str(DOCS_DIR / "release"))

Copilot AI Mar 6, 2026


As in docs/build_old_docs.py, using sys.path.append(...) can cause from release import ... to resolve to an unrelated installed module named release. Prefer sys.path.insert(0, ...) so the local docs/release package is imported reliably.

Suggested change
sys.path.append(str(DOCS_DIR / "release"))
sys.path.insert(0, str(DOCS_DIR / "release"))


def get_releases() -> list[str]:
"""Get all release tags for this repo."""
ret = sp.run(("git", "tag"), capture_output=True, check=True, encoding="utf-8")

Copilot AI Mar 6, 2026


git tag output order is not guaranteed to be semver-correct (e.g., v2.10.0 vs v2.9.0) and may change across environments. Consider using git tag --sort=version:refname (and optionally reversing) so versions.md and the build order are deterministic and correctly ordered by version.

Suggested change
ret = sp.run(("git", "tag"), capture_output=True, check=True, encoding="utf-8")
ret = sp.run(("git", "tag", "--sort=version:refname"), capture_output=True, check=True, encoding="utf-8")
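
Alternatively, the sort could be done on the Python side after collecting the tags. A sketch, assuming plain vX.Y.Z tag names (the function name is illustrative):

```python
def sort_tags(tags: list[str]) -> list[str]:
    """Sort v-prefixed semver tags numerically rather than lexically."""
    def key(tag: str) -> tuple[int, ...]:
        # "v2.10.0" -> (2, 10, 0), so v2.10.0 correctly sorts after v2.9.0
        return tuple(int(part) for part in tag.lstrip("v").split("."))
    return sorted(tags, key=key)
```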

Comment on lines +5 to +8
all: cli-help file-format examples versions book api

# Build all documentation, including old docs
all_with_old: all old

Copilot AI Mar 6, 2026


Including versions in the default all build means CI jobs that run the default docs build (e.g. the hyperlink check workflow) will now generate docs/versions.md containing links to release/<tag>/index.html, but those files won’t exist unless old docs are built too. This is likely to make the link checker fail. Either build all_with_old in the link-checking workflow, or adjust the link checker to ignore the old-docs links/page when old docs aren’t generated.

Suggested change
all: cli-help file-format examples versions book api
# Build all documentation, including old docs
all_with_old: all old
all: cli-help file-format examples book api
# Build all documentation, including old docs
all_with_old: all old versions

Collaborator Author


The link checker is passing, but I have realised that old versions of the docs that include the "previous versions of the documentation" section (not v2.0.0, but the v2.1.0 release will include it) will link to their local copy of that page, which will contain broken links. I guess the fix is to use an absolute URL pointing to the version of this page in the main docs.

- name: Checkout repository
uses: actions/checkout@v6
with:
fetch-tags: true

Copilot AI Mar 6, 2026


Building old docs checks out historical tags; with a shallow checkout, the tag targets may not be present even if tags are fetched. To make git checkout <tag> reliable, set fetch-depth: 0 (full history) in the checkout step in addition to fetch-tags: true.

Suggested change
fetch-tags: true
fetch-tags: true
fetch-depth: 0

REPO_ROOT = Path(__file__).parent.parent.absolute()
DOCS_DIR = REPO_ROOT / "docs"

sys.path.append(str(DOCS_DIR / "release"))

Copilot AI Mar 6, 2026


sys.path.append(...) adds the local docs/release module to the end of the import search path, which can accidentally import an unrelated third-party release module if one is installed. Use sys.path.insert(0, ...) (or a different import approach) so the local module is unambiguously preferred.

Suggested change
sys.path.append(str(DOCS_DIR / "release"))
sys.path.insert(0, str(DOCS_DIR / "release"))

# Build TOC for old versions
versions:
@echo Building TOC for old versions of documentation
@uv run docs/generate_versions_docs.py

Copilot AI Mar 6, 2026


The versions target invokes uv run docs/generate_versions_docs.py, which declares an unpinned jinja2 dependency in its # /// script dependencies header; uv will fetch and execute the latest jinja2 from PyPI at docs-build time, creating a supply-chain risk. If an attacker compromises the jinja2 package or the PyPI registry, they could run arbitrary code in the docs build environment (e.g. in CI with access to repository tokens or the deployed docs). To mitigate this, pin jinja2 (and any other script dependencies) to immutable versions via a lockfile or explicit version specifiers, or vendor these dependencies so the build does not perform on-the-fly installs from the public registry.
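
For example, the script's inline metadata block could pin the dependency exactly (the version shown is illustrative, not a recommendation):

```python
# /// script
# dependencies = [
#     "jinja2==3.1.4",  # pinned exactly; pick the version you've vetted
# ]
# ///
```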

@alexdewar alexdewar requested review from Aurashk and tsmbland March 6, 2026 12:34

Development

Successfully merging this pull request may close these issues.

Documentation page for each release

2 participants