This repository automates the creation, versioning, and deployment of multiple Ansible Execution Environments (EE) using a dynamic GitLab CI/CD monorepo architecture and ansible-builder.
This project was developed using AI assistance. Artificial intelligence tools were used to help draft the architecture, generate functions, and write documentation. Please note that while AI accelerated the development process, every line of code has been manually audited, tested, and validated by the maintainer.
This includes the disclosure and most of this README as well.
This project is built around base images from registry.redhat.io. If you do not have access to the Red Hat-provided images, you can substitute the community image instead by setting `image: ghcr.io/ansible/community-ansible-dev-tools:latest` in your `execution-environment.yml`.
Instead of hardcoding jobs for every environment, this pipeline uses a dynamic generator. It scans the repository for EE configurations and automatically spins up parallel build jobs for each one.
The pipeline execution flows like this:
- Dynamic Pipeline Generation: A Python script (`generate_pipeline.py`) scans the `execution-environments/` directory and dynamically generates a `child-pipeline.yml` file containing a dedicated build job for every folder it finds.
- Execute Builds (Child Pipeline): GitLab triggers the child pipeline, running parallel jobs for each environment. Each isolated job performs the following:
  - Generate Context: Uses `ansible-builder` to parse `execution-environment.yml` and its dependencies (`requirements.yml`, `requirements.txt`, `bindep.txt`) into a standard Docker build context.
  - Build and Push: Builds an image based on the base image specified in `execution-environment.yml`, and tags it with a unique ID (`YYYY-MM-DD-run-ID`) as well as `latest`.
  - Update AAP (currently commented out!): A Python script (`update_ee.py`) securely authenticates with the AAP API to update the specific Execution Environment resource with the newly pushed image tag.
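The generator step above can be sketched as follows. This is a hypothetical illustration, not the repo's exact `generate_pipeline.py`: job names, stages, and build commands are assumptions.

```python
# Hypothetical sketch of generate_pipeline.py's core logic: scan
# execution-environments/ and emit one build job per subdirectory.
# Job structure and commands are illustrative, not the repo's exact output.
from pathlib import Path


def generate_child_pipeline(ee_root: str) -> str:
    """Return child-pipeline.yml text with one build job per EE folder."""
    jobs = []
    for env_dir in sorted(Path(ee_root).iterdir()):
        if not env_dir.is_dir():
            continue
        jobs.append(
            f"build-{env_dir.name}:\n"
            f"  stage: build\n"
            f"  script:\n"
            f"    - ansible-builder create -f {env_dir}/execution-environment.yml\n"
            f"    - podman build -t $EXTERNAL_REGISTRY/$REPO_PROJECT/{env_dir.name}:latest context\n"
        )
    if not jobs:  # GitLab rejects an empty child pipeline, so emit a no-op job
        jobs.append('noop:\n  script: ["echo nothing to build"]\n')
    return "\n".join(jobs)
```

The generated text would then be written to `child-pipeline.yml` and passed to GitLab as a trigger artifact.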
To support multiple execution environments, the repository is structured as a monorepo. Each environment lives in its own subdirectory under execution-environments/.
```
.
├── .gitlab-ci.yml             # Trigger pipeline
├── generate_pipeline.py       # Builds the dynamic child jobs
├── update_ee.py               # AAP API integration script
└── execution-environments/
    ├── observability/
    │   ├── aap_name.txt       # (Optional) The exact name of the EE in the AAP UI
    │   ├── execution-environment.yml
    │   ├── requirements.yml
    │   ├── requirements.txt
    │   └── bindep.txt
    ├── hypervisors/
    │   ├── aap_name.txt
    │   └── ...
```
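For reference, a minimal `execution-environment.yml` for one of these folders might look like the following. This is an example using the ansible-builder version 3 schema, not a file from this repository:

```yaml
# Example EE definition (ansible-builder v3 schema); adjust the base image
# and dependency files to match your environment.
version: 3
images:
  base_image:
    name: ghcr.io/ansible/community-ansible-dev-tools:latest
dependencies:
  galaxy: requirements.yml
  python: requirements.txt
  system: bindep.txt
```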
1. Create a new folder under `execution-environments/` (e.g., `execution-environments/networking/`).
2. Add your standard `execution-environment.yml` and requirement files.
3. Create a plain text file named `aap_name.txt` in that folder containing only the exact name of the EE as it appears in the Ansible Automation Platform UI (e.g., `EE - Networking`).
4. Commit and push. The pipeline will automatically detect the new folder and build it.
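Before pushing, you can sanity-check that a new folder has the files the pipeline expects. This helper is a hypothetical pre-commit check, not part of the repository:

```python
# Hypothetical pre-flight check for a new EE folder (not part of the repo):
# verify the required definition file exists and aap_name.txt is non-empty.
from pathlib import Path

REQUIRED = ["execution-environment.yml"]


def check_ee_folder(path: str) -> list[str]:
    """Return a list of problems; an empty list means the folder looks ready."""
    folder = Path(path)
    problems = [f"missing {name}" for name in REQUIRED
                if not (folder / name).is_file()]
    aap_name = folder / "aap_name.txt"  # optional, but must not be blank if present
    if aap_name.is_file() and not aap_name.read_text().strip():
        problems.append("aap_name.txt is empty")
    return problems
```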
The following variables need to be set up in the repository's CI/CD settings:
| Key | Description | Visibility |
|---|---|---|
| AAP_TOKEN | AAP Token for AAP auth | Masked and hidden |
| AAP_URL | AAP URL | Visible |
| EXTERNAL_REGISTRY | Image Registry URL | Visible |
| REGISTRY_TOKEN | Registry Robot account token | Masked and hidden |
| REGISTRY_USER | Registry Robot account name | Visible |
| REPO_PROJECT | Project inside image registry | Visible |
| INTERNAL_CA_CERT | Internal CA certificate; must be set up as a File-type variable | Visible |
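To show how `AAP_URL` and `AAP_TOKEN` come together, here is a sketch of how `update_ee.py` might construct its API call. The endpoint path and payload shape are assumptions based on the AAP/AWX REST API; verify them against your controller's `/api/v2/` documentation.

```python
# Illustrative sketch (not the repo's update_ee.py): build a PATCH request
# that points an AAP Execution Environment at a newly pushed image tag.
import json
import urllib.request


def build_update_request(aap_url: str, aap_token: str,
                         ee_id: int, image: str) -> urllib.request.Request:
    """Build (but do not send) a PATCH updating an EE's image reference."""
    return urllib.request.Request(
        url=f"{aap_url}/api/v2/execution_environments/{ee_id}/",
        data=json.dumps({"image": image}).encode(),
        method="PATCH",
        headers={
            "Authorization": f"Bearer {aap_token}",
            "Content-Type": "application/json",
        },
    )
```

In CI the arguments would come from the variables above (e.g., `os.environ["AAP_URL"]`), and the request would be sent with `urllib.request.urlopen` or similar.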
Because this repository functions as a monorepo containing multiple Execution Environments, the CI/CD pipeline is designed with "smart" change detection. It will only consume compute resources and build new Docker images when absolutely necessary.
Here is exactly when the pipeline runs and what it builds:
When you merge a Merge Request or push directly to the main branch, the pipeline analyzes the commit to see exactly which files were modified.
- Scenario A (EE Files Changed): If you edit files inside a specific directory (e.g., `execution-environments/observability/requirements.yml`), the pipeline will spin up a build job only for the `observability` environment. All other environments are skipped.
- Scenario B (Global Script Changed): If you modify the `update_ee.py` script, the pipeline assumes a global logic change occurred and will trigger builds for all environments.
- Scenario C (Docs/Unrelated Files Changed): If you only update `README.md`, `.gitignore`, or other root-level files, a lightweight dummy job runs just to satisfy GitLab's requirements. No images are built.
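These three scenarios can be modeled as a small pure function. This is an assumption about the detection logic, not the repo's exact implementation:

```python
# Minimal model of the change-detection rules (an assumption, not the
# repo's exact logic): map changed file paths to EE folders to rebuild.
def environments_to_build(changed_files: list[str],
                          all_envs: list[str]) -> list[str]:
    """Return the EE folder names that need rebuilding."""
    if "update_ee.py" in changed_files:
        return list(all_envs)           # Scenario B: global script changed
    targets = set()
    for path in changed_files:
        parts = path.split("/")
        if parts[0] == "execution-environments" and len(parts) > 2:
            targets.add(parts[1])       # Scenario A: one EE touched
    return sorted(targets)              # Scenario C: empty -> dummy job only
```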
You can configure a Pipeline Schedule in GitLab (e.g., every Sunday at 2:00 AM) to automatically run maintenance.
- What happens: The pipeline completely ignores the change-detection logic. It loops through the entire repository and forces a fresh build and push of every single Execution Environment.
- Why: This ensures your Ansible environments automatically inherit the latest security patches from the base `ansible-builder` image and updated `pip` packages without requiring a human to commit new code.
Use a manual run if you need to force an update immediately across the board (e.g., when a critical zero-day vulnerability has been announced and patched in the base OS).
- How to trigger: Go to Build > Pipelines in GitLab and click the Run pipeline button.
- What happens: Just like the scheduled run, this triggers a full rebuild of all Execution Environments simultaneously, pushing fresh `:latest` tags for everything to your registry.
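GitLab exposes how a pipeline was started via the predefined `CI_PIPELINE_SOURCE` variable, so the choice between a selective and a full rebuild can be reduced to a one-line check. This is a sketch of how the generator could branch, not the repo's confirmed logic:

```python
# Sketch (assumption): scheduled pipelines and manually started ("web")
# pipelines bypass change detection and rebuild every environment.
def full_rebuild(pipeline_source: str) -> bool:
    """True when CI_PIPELINE_SOURCE indicates a schedule or manual run."""
    return pipeline_source in ("schedule", "web")
```

In CI this would be read with `os.environ.get("CI_PIPELINE_SOURCE", "push")` before deciding which jobs to generate.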