This package provides multiple Python runtime environments for Platforma Backend, supporting different Python versions with shared build configuration and version-specific overrides.
- Python 3.12.10 - Latest stable version with newest package compatibility
- Python 3.12.10-atls - A variant of 3.12.10 with custom ATLS packages.
This project uses a monorepo structure similar to the Java Corretto setup:
```
runenv-python-3/
├── shared-config.json       # Shared configuration for all versions
├── python-3.12.10/          # Python 3.12.10 specific package
│   ├── config.json          # Version-specific overrides
│   └── package.json         # Package metadata
├── catalogue/               # Main package referencing all versions
├── scripts/                 # Build and publish scripts
│   └── config-merger.js     # Configuration merger utility
└── package.json             # Root package with all entrypoints
```
The shared-config.json file contains common settings for all Python versions:
- Registries: PyPI.org as default with NVIDIA PyPI as additional source
- Build settings: Timeouts, logging, parallel downloads
- Platform-specific rules: Skip/force source rules for different platforms
- Common dependencies: Base package list
Each Python version can override shared settings:
- Package overrides: Version-specific package versions
- Additional skip rules: Version-specific exclusions
- Custom dependencies: Version-specific package lists
The config-merger.js utility:
- Loads shared configuration
- Loads version-specific configuration
- Merges settings with version-specific overrides taking precedence
- Validates the final configuration
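The merge step can be sketched as follows. Note that `mergeConfigs` is a hypothetical name and the real logic lives in scripts/config-merger.js, which may differ in structure:

```javascript
// Sketch of the merge step: version-specific values win over shared ones.
// mergeConfigs is a hypothetical name; see scripts/config-merger.js for the real logic.
function mergeConfigs(shared, versionSpecific) {
  const isObject = (v) => v !== null && typeof v === 'object' && !Array.isArray(v);
  const result = { ...shared };
  for (const [key, value] of Object.entries(versionSpecific)) {
    if (isObject(value) && isObject(result[key])) {
      result[key] = mergeConfigs(result[key], value); // recurse into nested objects
    } else {
      result[key] = value; // arrays and scalars are replaced wholesale
    }
  }
  return result;
}

const shared = {
  packages: { dependencies: ['numpy==2.2.6'], skip: {} },
  build: { timeout: 300 },
};
const version = { packages: { dependencies: ['numpy==2.1.0'] } };

const merged = mergeConfigs(shared, version);
console.log(merged.packages.dependencies); // version list replaces the shared one
console.log(merged.build.timeout);         // untouched shared setting is kept
```

Replacing arrays wholesale (rather than concatenating them) is what lets a version drop a shared dependency entirely, not just add to the list.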
- shared-config.json: Contains common settings for all Python versions (registries with PyPI.org as default and NVIDIA PyPI as additional, dependencies, platform rules, build options).
- python-<version>/config.json: Contains version-specific overrides (dependencies, overrides, skip/force rules for that version).
Example shared-config.json:

```json
{
  "registries": {
    "additional": ["https://pypi.nvidia.com"]
  },
  "packages": {
    "dependencies": [
      "pandas==2.2.3",
      "numpy==2.2.6",
      "scipy==1.15.3"
    ],
    "skip": { ... },
    "forceSource": { ... }
  },
  "build": {
    "enableLogging": true,
    "parallelDownloads": false,
    "timeout": 300
  }
}
```

Example version-specific config.json:

```json
{
  "packages": {
    "dependencies": [ ... ],
    "overrides": { ... },
    "skip": { ... },
    "forceSource": { ... }
  }
}
```

- You only need to specify the fields you want to override for a specific version.
- If you omit `dependencies`, the shared ones are used.
- If you provide `overrides`, only those packages are version-overridden.
The build system supports intelligent package handling with two types of exceptions:
Packages that should be completely skipped for specific platforms:
```json
{
  "packages": {
    "skip": {
      "cudf-cu12": {
        "macosx-x64": "CUDA packages not supported on macOS",
        "macosx-aarch64": "CUDA packages not supported on macOS",
        "windows-x64": "CUDA packages not supported on Windows"
      }
    }
  }
}
```

Packages that should always be built from source for specific platforms:
```json
{
  "packages": {
    "forceSource": {
      "parasail": {
        "linux-aarch64": "parasail has no binary wheels for Linux ARM64",
        "macosx-aarch64": "parasail has no binary wheels for macOS ARM64"
      }
    }
  }
}
```

Platform keys follow the format `{os}-{arch}`:

- `linux-x64` - Linux AMD64
- `linux-aarch64` - Linux ARM64
- `macosx-x64` - macOS Intel
- `macosx-aarch64` - macOS Apple Silicon
- `windows-x64` - Windows AMD64
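As a sketch, applying the skip and forceSource maps for a given package and platform could look like this. `resolveAction` is a hypothetical helper name, not the actual build script API:

```javascript
// Decide how to handle a package on a given platform, using the skip and
// forceSource maps from the merged config. resolveAction is a hypothetical name.
function resolveAction(config, pkg, platformKey) {
  const { skip = {}, forceSource = {} } = config.packages;
  if (skip[pkg] && skip[pkg][platformKey]) {
    return { action: 'skip', reason: skip[pkg][platformKey] };
  }
  if (forceSource[pkg] && forceSource[pkg][platformKey]) {
    return { action: 'source', reason: forceSource[pkg][platformKey] };
  }
  return { action: 'binary' }; // default: try a binary wheel first
}

const config = {
  packages: {
    skip: { 'cudf-cu12': { 'windows-x64': 'CUDA packages not supported on Windows' } },
    forceSource: { parasail: { 'linux-aarch64': 'no binary wheels for Linux ARM64' } },
  },
};

console.log(resolveAction(config, 'cudf-cu12', 'windows-x64').action); // skip
console.log(resolveAction(config, 'parasail', 'linux-aarch64').action); // source
console.log(resolveAction(config, 'parasail', 'linux-x64').action);    // binary
```

Skip rules are checked before forceSource rules, so a package excluded on a platform is never built there at all.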
```json
{
  "build": {
    "enableLogging": true,
    "parallelDownloads": false,
    "timeout": 300
  }
}
```

- `enableLogging`: Enable detailed build logging
- `parallelDownloads`: Enable parallel package downloads (experimental)
- `timeout`: Build timeout in seconds
```shell
# Build all packages
pnpm build

# Using Turbo filter
pnpm build --filter=@platforma-open/milaboratories.runenv-python-3.12.10
pnpm build --filter=@platforma-open/milaboratories.runenv-python-3.12.10-atls

# Direct script usage
node scripts/build.js 3.12.10
node scripts/build.js 3.12.10-atls
```

```shell
# Publish all packages
pnpm postbuild-publish

# Publish specific version (from version directory)
cd python-3.12.10 && pnpm postbuild-publish
cd python-3.12.10-atls && pnpm postbuild-publish
```

To remove build artifacts:

```shell
pnpm cleanup
```

The build script will automatically merge shared-config.json and python-<version>/config.json.
For advanced use cases, such as creating version variants with pre-compiled binaries or custom modules, the build system supports a `copyFiles` directive in `config.json`.
This allows you to copy files and directories from your package source into the final Python environment during the build.
The `copyFiles` directive is an array of objects, where each object specifies a `from` and a `to` path.
- `from`: The source path, relative to the package directory (e.g., `python-3.12.10-atls/`).
- `to`: The destination path, relative to the root of the installed Python environment (e.g., `pydist/linux-x64/`).
The system provides a special placeholder, `{site-packages}`, which automatically resolves to the correct site-packages directory for the current Python version and OS. This is the recommended way to install custom Python modules.
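As a sketch, the placeholder could resolve along these lines. The exact layout the build uses is an assumption here; portable CPython conventionally uses `lib/python3.X/site-packages` on Linux/macOS and `Lib/site-packages` on Windows:

```javascript
// Hypothetical resolution of the {site-packages} placeholder.
// The directory layout inside the installed environment is an assumption.
function resolveSitePackages(platformKey, pythonVersion) {
  const [major, minor] = pythonVersion.split('.');
  if (platformKey.startsWith('windows')) {
    return 'Lib/site-packages'; // conventional Windows CPython layout
  }
  return `lib/python${major}.${minor}/site-packages`; // Linux/macOS layout
}

function substitutePlaceholders(toPath, platformKey, pythonVersion) {
  return toPath.replace('{site-packages}', resolveSitePackages(platformKey, pythonVersion));
}

console.log(substitutePlaceholders('{site-packages}/', 'linux-x64', '3.12.10'));
// lib/python3.12/site-packages/
console.log(substitutePlaceholders('{site-packages}/', 'windows-x64', '3.12.10'));
// Lib/site-packages/
```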
This configuration copies custom binaries to the bin/ directory and Python modules to the site-packages directory for each platform.
```json
{
  "packages": {
    "dependencies": [
      "torch==2.7.0",
      "ImmuneBuilder==1.2"
    ],
    "platformSpecific": {
      "linux-x64": {
        "copyFiles": [
          { "from": "linux-x64/bin/", "to": "bin/" },
          { "from": "linux-x64/site-packages/", "to": "{site-packages}/" }
        ]
      },
      "macosx-aarch64": {
        "copyFiles": [
          { "from": "macosx-aarch64/bin/", "to": "bin/" },
          { "from": "macosx-aarch64/site-packages/", "to": "{site-packages}/" }
        ]
      }
    }
  }
}
```

The source files for this example would be structured as follows:
```
python-3.12.10-atls/
├── linux-x64/
│   ├── bin/
│   │   └── custom_tool
│   └── site-packages/
│       └── custom_module/
│           └── __init__.py
├── macosx-aarch64/
│   ├── bin/
│   └── site-packages/
└── config.json
```
To add a new standard Python version or a custom variant, follow these steps.

1. Create the package directory. The directory name defines the version string; use a suffix for variants.

   ```shell
   # For a standard version
   mkdir python-3.13.0
   # For a custom variant
   mkdir python-3.13.0-custom
   ```

2. Create a version-specific `config.json` inside the new directory.
   - For a standard version, you can start with an empty config or specify overrides.
   - For a variant, this is where you define its unique dependencies or `copyFiles` directives.

3. Create `package.json`. Copy an existing `package.json` and update the following:
   - `name`: should include the full version string (e.g., `@platforma-open/milaboratories.runenv-python-3.13.0-custom`).
   - `description`: update with the correct version.
   - `scripts.build`: ensure the script calls `build.js` with the correct full version string.

   Crucial point for variants: the `build` script must pass the full version string so the build system can locate the correct configuration.

   ```json
   "scripts": { "build": "node ../scripts/build.js 3.13.0-custom" }
   ```

   The build script is smart: it will use `3.13.0-custom` to find the config but `3.13.0` to download the base portable Python.

4. Update the workspace. Add the new package directory to `pnpm-workspace.yaml`:

   ```yaml
   packages:
     - 'python-3.12.10'
     - 'python-3.12.10-atls'
     - 'python-3.13.0-custom' # Add new version here
   ```

5. Update the catalogue. Add the new package as a dependency and an entrypoint in `catalogue/package.json`.

6. Test the build:

   ```shell
   pnpm build --filter=@platforma-open/milaboratories.runenv-python-3.13.0-custom
   ```
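The base-version derivation described above could look like the following sketch; `parseVersionString` is a hypothetical name, and the actual parsing in build.js may differ:

```javascript
// Split a full version string like "3.13.0-custom" into the base Python
// version (used to download portable Python) and the variant suffix
// (used to locate the package's config). A sketch of build.js behavior.
function parseVersionString(full) {
  const match = full.match(/^(\d+\.\d+\.\d+)(?:-(.+))?$/);
  if (!match) throw new Error(`Unrecognized version string: ${full}`);
  return { base: match[1], variant: match[2] ?? null, packageDir: `python-${full}` };
}

console.log(parseVersionString('3.13.0-custom'));
// { base: '3.13.0', variant: 'custom', packageDir: 'python-3.13.0-custom' }
console.log(parseVersionString('3.12.10').base); // 3.12.10
```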
Each Python version has different package compatibility:
- Latest package versions (uses shared configuration)
- Full CUDA support with platform-specific exclusions
- Experimental TensorFlow ARM64 builds
- Dependencies: pandas 2.2.3, numpy 2.2.6, scipy 1.15.3, scikit-learn 1.6.1, etc.
All versions support:
- Linux: x64, aarch64
- macOS: x64, aarch64
- Windows: x64
- Node.js >= 20
- pnpm >= 9.14.4
```shell
pnpm install
```

- Make changes to shared or version-specific configs
- Test with `pnpm build --filter=<package-name>`
- Run full build with `pnpm build`
- Publish with `pnpm postbuild-publish`
The project uses Turbo for build orchestration with a simplified configuration:
```json
{
  "tasks": {
    "build": {
      "inputs": ["$TURBO_DEFAULT$"],
      "outputs": ["./dist/**"]
    },
    "postbuild-publish": {
      "dependsOn": ["build"],
      "passThroughEnv": [...]
    }
  }
}
```

- Independent builds: Each Python version builds independently
- Environment passthrough: Proper AWS and registry credentials handling
- Cleanup scripts: Comprehensive cleanup including build directories
- Be specific: Only add exceptions when absolutely necessary
- Document reasons: Always provide clear explanations for why exceptions exist
- Test thoroughly: Verify that exceptions work on all affected platforms
- Keep updated: Remove exceptions when packages add support for new platforms
- Use simple configs: Prefer `skip` and `forceSource` over `platformSpecific` when possible
- Version Management: Keep supported versions up to date
- Registry Selection: Only include registries you trust and need
- Package Exceptions: Document clear reasons for all exceptions
- Configuration Reuse: Create version-specific configs for different needs
- Validation: Test configurations on all target platforms
The build will warn if you try to build an unsupported Python version.
If packages fail to download, check:
- Registry URLs are accessible
- Registry URLs are in the correct format
- Network connectivity to all registries
If the build fails to load the exceptions configuration:
- Check that config files are valid JSON
- Verify the files are in the correct locations
- Check file permissions
- Look for console warnings during build startup
The build will continue with an empty configuration if the files cannot be loaded.
- Large packages: Some builds may produce large artifacts (200MB+) due to dependencies
- Platform-specific: Builds are optimized for each platform (Linux, macOS, Windows)
- Cleanup: Use `pnpm cleanup` to remove build artifacts and free disk space
- There is no longer a `python` section or a single `build-config.json`.
- No config override file is needed; the system is automatic.
- The build system automatically tries binary wheels first, then falls back to source builds.
- Multiple PyPI registries are supported via configuration.
- The catalogue package provides a unified interface to all Python versions.
- Build artifacts are stored in `pydist/` directories for each platform.
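The binary-first, source-fallback behavior can be sketched as below. The helper names are hypothetical, and the installers are injected so the policy itself is runnable without a network; the real build shells out to pip:

```javascript
// Sketch of the binary-first strategy: try a wheel, fall back to a source
// build on failure. installBinary/installSource are injected stand-ins for
// the real pip invocations.
async function installWithFallback(pkg, { installBinary, installSource }) {
  try {
    await installBinary(pkg); // e.g. a pip run restricted to binary wheels
    return 'binary';
  } catch {
    await installSource(pkg); // e.g. a pip run that builds from sdist
    return 'source';
  }
}

installWithFallback('parasail==1.3', {
  installBinary: async () => { throw new Error('no matching wheel'); },
  installSource: async () => {},
}).then((how) => console.log(how)); // source
```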