Conversation
Codecov Report

✅ All modified and coverable lines are covered by tests.

```
@@           Coverage Diff           @@
##             main    #1178   +/- ##
=======================================
  Coverage   87.30%   87.30%
=======================================
  Files          57       57
  Lines        7706     7706
  Branches     7706     7706
=======================================
  Hits         6728     6728
  Misses        673      673
  Partials      305      305
```

☔ View full report in Codecov by Sentry.
This will improve performance on successive runs. Note that we store the cache on GitHub Actions, so it will make things much faster on CI.
Pull request overview
Adds multi-version documentation publishing to the GitHub Pages deployment by generating a versions TOC and building docs for historical release tags (optionally patched) into the deployed book/ output.
Changes:
- Generate a `versions.md` page (from a Jinja template) listing links to per-release docs builds.
- Add tooling to iterate git release tags, optionally apply per-release patch files, and build/copy old docs into `book/release/<tag>/`.
- Update Just/CI wiring so the deploy workflow builds `all_with_old` (current + old versions) for GitHub Pages.
Reviewed changes
Copilot reviewed 10 out of 10 changed files in this pull request and generated 8 comments.
| File | Description |
|---|---|
| docs/templates/versions.md.jinja | Template for a generated page linking to historical docs versions. |
| docs/release/patches/v2.0.0/0001-remove-unrecognised-parameter.patch | Patch applied to v2.0.0 docs so it can build with current tooling. |
| docs/release/__init__.py | Utilities to detect and list release tags. |
| docs/generate_versions_docs.py | Script to render versions.md from tags via Jinja. |
| docs/build_old_docs.py | Script to clone, checkout tags, apply patches, and build/copy old docs. |
| docs/SUMMARY.md | Adds the “Documentation for old versions” page to the book TOC. |
| docs/.gitignore | Ignores generated versions.md. |
| build-docs.just | Adds versions generation to default docs build and all_with_old target for old docs builds. |
| .github/workflows/deploy-docs.yml | Fetches tags and runs all_with_old for deployments; configures git identity for patch application. |
| .github/actions/generate-docs/action.yml | Adds an input to select which build-docs target to run. |
```python
apply_patches_for_release(release, repo_path)

# Build docs
sp.run(("just", f"{repo_path!s}/build-docs"), check=True)
```
The `just` invocation here doesn't set the working directory to the cloned repo and passes a path-like argument (`"{repo_path}/build-docs"`) that doesn't match the repo's `mod build-docs` usage. This is likely to fail to build old docs. Run `just` with `--working-directory` (or `-d`) pointing at `repo_path` and explicitly invoke the module recipe (e.g., `build-docs::all`) to match how docs are built elsewhere.
```suggestion
sp.run(
    ("just", "--working-directory", str(repo_path), "build-docs::all"),
    check=True,
)
```
```python
# Add a symlink to cargo cache dir
os.symlink(REPO_ROOT / "target", dest / "target")
```
`os.symlink` can fail on platforms/filesystems that don't allow symlinks (notably Windows without Developer Mode/admin), which would break `just build-docs::old` for some developers. Consider making this best-effort (skip if it fails or if `dest/target` already exists), since it's only an optimization for cargo caching.
```suggestion
# Add a symlink to cargo cache dir (best-effort; it's only an optimization)
src = REPO_ROOT / "target"
dst = dest / "target"
if not dst.exists():
    try:
        os.symlink(src, dst)
    except (OSError, NotImplementedError):
        # Symlinks may not be supported on this platform/filesystem; ignore.
        pass
```
Not a bad idea... It does mean these particular users will have to endure rebuilding the cargo cache every time they rerun the command though, which will be hideously slow.
```python
DOCS_DIR = Path(__file__).parent.absolute()

sys.path.append(str(DOCS_DIR / "release"))
```
As in docs/build_old_docs.py, using `sys.path.append(...)` can cause `from release import ...` to resolve to an unrelated installed module named `release`. Prefer `sys.path.insert(0, ...)` so the local `docs/release` package is imported reliably.
```suggestion
sys.path.insert(0, str(DOCS_DIR / "release"))
```
```python
def get_releases() -> list[str]:
    """Get all release tags for this repo."""
    ret = sp.run(("git", "tag"), capture_output=True, check=True, encoding="utf-8")
```
`git tag` output order is not guaranteed to be semver-correct (e.g., v2.10.0 vs v2.9.0) and may change across environments. Consider using `git tag --sort=version:refname` (and optionally reversing) so `versions.md` and the build order are deterministic and correctly ordered by version.
```suggestion
ret = sp.run(("git", "tag", "--sort=version:refname"), capture_output=True, check=True, encoding="utf-8")
```
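If the ordering ever needs handling in Python instead (for example, to filter out pre-release tags), a small semver sort key does the job. A minimal sketch, assuming `v`-prefixed `MAJOR.MINOR.PATCH` tags; the helper name is illustrative, not code from this PR:

```python
def semver_key(tag: str) -> tuple[int, ...]:
    # "v2.10.0" -> (2, 10, 0), so v2.10.0 sorts after v2.9.0 numerically
    return tuple(int(part) for part in tag.removeprefix("v").split("."))

tags = ["v2.9.0", "v2.10.0", "v2.0.0"]
print(sorted(tags, key=semver_key))  # → ['v2.0.0', 'v2.9.0', 'v2.10.0']
```

Plain lexicographic sorting would put `v2.10.0` before `v2.9.0`, which is exactly the failure mode the review comment describes.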
```just
all: cli-help file-format examples versions book api

# Build all documentation, including old docs
all_with_old: all old
```
Including `versions` in the default `all` build means CI jobs that run the default docs build (e.g. the hyperlink check workflow) will now generate `docs/versions.md` containing links to `release/<tag>/index.html`, but those files won't exist unless old docs are built too. This is likely to make the link checker fail. Either build `all_with_old` in the link-checking workflow, or adjust the link checker to ignore the old-docs links/page when old docs aren't generated.
```suggestion
all: cli-help file-format examples book api

# Build all documentation, including old docs
all_with_old: all old versions
```
The link checker is passing, but I have realised that old versions of the docs that include the "previous versions of the documentation" section (not v2.0.0, but the v2.1.0 release will include it) will link to their own local copy of that page, which will contain broken links. I guess the fix is to use an absolute URL pointing to the version of this page in the main docs.
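That fix could be as simple as rendering the version links from a hard-coded site root instead of relative paths. A minimal sketch; the `BASE_URL` value is a placeholder, not the project's real Pages URL, and the helper is illustrative rather than the template's actual code:

```python
# Emit absolute links so copies of the versions page bundled into old-docs
# snapshots don't point at files that only exist in the main deployment.
BASE_URL = "https://example.github.io/project"  # placeholder site root

def version_link(tag: str) -> str:
    return f"[{tag}]({BASE_URL}/release/{tag}/index.html)"

print(version_link("v2.0.0"))
# → [v2.0.0](https://example.github.io/project/release/v2.0.0/index.html)
```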
```yaml
- name: Checkout repository
  uses: actions/checkout@v6
  with:
    fetch-tags: true
```
Building old docs checks out historical tags; with a shallow checkout, the tag targets may not be present even if tags are fetched. To make `git checkout <tag>` reliable, set `fetch-depth: 0` (full history) in the checkout step in addition to `fetch-tags: true`.
```suggestion
    fetch-tags: true
    fetch-depth: 0
```
```python
REPO_ROOT = Path(__file__).parent.parent.absolute()
DOCS_DIR = REPO_ROOT / "docs"

sys.path.append(str(DOCS_DIR / "release"))
```
`sys.path.append(...)` adds the local `docs/release` module to the end of the import search path, which can accidentally import an unrelated third-party `release` module if one is installed. Use `sys.path.insert(0, ...)` (or a different import approach) so the local module is unambiguously preferred.
```suggestion
sys.path.insert(0, str(DOCS_DIR / "release"))
```
```just
# Build TOC for old versions
versions:
    @echo Building TOC for old versions of documentation
    @uv run docs/generate_versions_docs.py
```
The `versions` target invokes `uv run docs/generate_versions_docs.py`, which declares an unpinned `jinja2` dependency in its `# /// script` dependencies header. This means `uv` will fetch and execute the latest `jinja2` from PyPI at docs-build time, creating a supply-chain risk: if an attacker compromises the `jinja2` package or the PyPI registry, they could run arbitrary code in the docs build environment (e.g. in CI with access to repository tokens or the deployed docs). To mitigate this, pin `jinja2` (and any other script dependencies) to immutable versions via a lockfile or explicit version specifiers, or vendor these dependencies so the build does not perform on-the-fly installs from the public registry.
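For illustration, a pinned PEP 723 script header might look like the following; the version numbers here are placeholders, not a vetted choice:

```python
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "jinja2==3.1.4",  # pinned exactly rather than left floating
# ]
# ///
```

Recent versions of `uv` can also generate a per-script lockfile for PEP 723 scripts (`uv lock --script`), which gives the same immutability with hash checking.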
Description
I fancied doing something a bit different, so I had a go at including docs for previous releases in our GitHub Pages deployment.
My implementation treats the version of the docs in the current checked-out version of the repo as the main one and generates a TOC in a `versions.md` file, which is linked to from the main documentation. I've added a Python script to check out each previous version of the docs in turn and run `just build-docs` on it. The previous versions of the docs are not included when you run `just build-docs` by default, partly because this would be unnecessary and slow (you likely don't care about previous versions of the docs when developing locally) and also to avoid infinite recursion when calling `just build-docs` for previous versions. Instead, you have to run `just build-docs::all_with_old`.

Some of the old versions of the docs may have problems of one kind or another that prevent them from being built with this script without modification. For example, the `book.toml` config file in v2.0.0 is incompatible with the latest version of `mdbook`. To work around these sorts of problems, I've added a feature where we can add patch files to `docs/release/patches` and these are applied (using git) before the documentation is built. You can generate these files with the `git format-patch` command. For v2.0.0, we just need @Aurashk's fix in 3129c56. In this particular case, we could just cherry-pick the commit instead, as it is in `main`, but note that this may not be true of every change we need to make to old broken documentation. We don't want things to depend on commits in random unmerged branches, so it seemed cleaner to just allow for adding arbitrary patches. Let me know if you think this process should be documented, because I haven't done so yet 😄.

I've tested this locally and by deploying directly from this branch and it all seems to work, at least at first glance...
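The patch-application step described above could be factored roughly like this. A sketch under assumptions: the function name, argument layout, and injectable `run` hook are illustrative, not the PR's actual implementation:

```python
import subprocess as sp
from pathlib import Path


def apply_patches_for_release(release: str, repo_path: Path,
                              patches_dir: Path, run=sp.run) -> list[Path]:
    """Apply any <patches_dir>/<release>/*.patch files to the clone with git am.

    Patches are applied in sorted order, which matches git format-patch's
    numbered file names (0001-..., 0002-...). The `run` parameter exists so
    the git call can be stubbed out in tests.
    """
    patches = sorted((patches_dir / release).glob("*.patch"))
    for patch in patches:
        # git am creates commits, so it needs a configured git identity --
        # hence the git config step added to the deploy workflow.
        run(("git", "-C", str(repo_path), "am", str(patch)), check=True)
    return patches
```

If a release has no patches directory, `glob` simply yields nothing and the clone is built unmodified, so no special-casing is needed.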
Shortcomings/future work:
I figure this is probably good enough to merge for now. We can always polish in future.
Let me know what you think!
Closes #1072.
Type of change
Key checklist
- `$ cargo test`
- `$ cargo doc`
- present in the previous release
Further checks