dbarreda/gitlab-ansible-ee

Ansible Execution Environment CI/CD Pipeline

This repository automates the creation, versioning, and deployment of multiple Ansible Execution Environments (EE) using a dynamic GitLab CI/CD monorepo architecture and ansible-builder.

Disclosure

This project was developed using AI assistance. Artificial intelligence tools were used to help draft the architecture, generate functions, and write documentation. Please note that while AI accelerated the development process, every line of code has been manually audited, tested, and validated by the maintainer.

This includes the disclosure and most of this README as well.

Use of Red Hat repositories

This pipeline is designed to pull base images from registry.redhat.io. If you do not have access to the Red Hat-provided images, you can use the community image instead: set the base image in your execution-environment.yml to ghcr.io/ansible/community-ansible-dev-tools:latest.

🚀 How it Works

Instead of hardcoding jobs for every environment, this pipeline uses a dynamic generator. It scans the repository for EE configurations and automatically spins up parallel build jobs for each one.

The pipeline execution flows like this:

  1. Dynamic Pipeline Generation: A Python script (generate_pipeline.py) scans the execution-environments/ directory and dynamically generates a child-pipeline.yml file containing a dedicated build job for every folder it finds.
  2. Execute Builds (Child Pipeline): GitLab triggers the child pipeline, running parallel jobs for each environment. Each isolated job performs the following:
    • Generate Context: Uses ansible-builder to parse execution-environment.yml and its dependency files (requirements.yml, requirements.txt, bindep.txt) into a standard Docker build context.
    • Build and Push: Builds an image from the base image specified in execution-environment.yml, and tags it with both a unique ID (YYYY-MM-DD-run-ID) and latest.
    • Update AAP (currently commented out): A Python script (update_ee.py) securely authenticates with the AAP API and updates the matching Execution Environment resource with the newly pushed image tag.
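The generator step above can be sketched roughly as follows. This is a minimal illustration of the approach, not the actual contents of generate_pipeline.py; the job names, stage, and build commands are assumptions.

```python
# Minimal sketch of the dynamic generator: scan execution-environments/
# and emit one child-pipeline build job per subdirectory found.
import datetime
import os

EE_ROOT = "execution-environments"


def generate_pipeline() -> str:
    """Return child-pipeline YAML with one build job per EE folder."""
    # GitLab exposes CI_PIPELINE_ID in CI; fall back to "local" elsewhere.
    run_id = os.environ.get("CI_PIPELINE_ID", "local")
    tag = f"{datetime.date.today():%Y-%m-%d}-run-{run_id}"
    jobs = []
    for name in sorted(os.listdir(EE_ROOT)):
        path = os.path.join(EE_ROOT, name)
        if not os.path.isdir(path):
            continue  # skip stray files at the top level
        jobs.append(
            f"build-{name}:\n"
            "  stage: build\n"
            "  script:\n"
            f"    - ansible-builder create -f {path}/execution-environment.yml\n"
            f"    - podman build -t $EXTERNAL_REGISTRY/$REPO_PROJECT/{name}:{tag} context\n"
            f"    - podman push $EXTERNAL_REGISTRY/$REPO_PROJECT/{name}:{tag}\n"
        )
    return "stages: [build]\n\n" + "\n".join(jobs)
```

The parent pipeline would write this output to child-pipeline.yml and hand it to a `trigger` job with `include: artifact`.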

📁 Repository Structure

To support multiple execution environments, the repository is structured as a monorepo. Each environment lives in its own subdirectory under execution-environments/.

.gitlab-ci.yml               # Trigger pipeline
generate_pipeline.py         # Builds the dynamic child jobs
update_ee.py                 # AAP API integration script
execution-environments/
├── observability/
│   ├── aap_name.txt         # (Optional) The exact name of the EE in the AAP UI
│   ├── execution-environment.yml
│   ├── requirements.yml
│   ├── requirements.txt
│   └── bindep.txt
├── hypervisors/
│   ├── aap_name.txt
│   └── ...

➕ How to Add a New Execution Environment

  1. Create a new folder under execution-environments/ (e.g., execution-environments/networking/).
  2. Add your standard execution-environment.yml and requirement files.
  3. Create a plain text file named aap_name.txt in that folder containing only the exact name of the EE as it appears in the Ansible Automation Platform UI (e.g., EE - Networking).
  4. Commit and push. The pipeline will automatically detect the new folder and build it.
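A minimal execution-environment.yml for a new folder could look like the following. This is an illustrative example using the ansible-builder version 3 schema; the base image and dependency file names are assumptions, not this repository's actual configuration.

```yaml
# Illustrative minimal EE definition (ansible-builder v3 schema).
version: 3
images:
  base_image:
    name: ghcr.io/ansible/community-ansible-dev-tools:latest
dependencies:
  galaxy: requirements.yml   # Ansible collections
  python: requirements.txt   # pip packages
  system: bindep.txt         # OS packages
```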

📋 Variables

The following variables need to be set up in the repository's CI/CD settings:

| Key | Description | Visibility |
| --- | --- | --- |
| AAP_TOKEN | AAP token for AAP authentication | Masked and hidden |
| AAP_URL | AAP URL | Visible |
| EXTERNAL_REGISTRY | Image registry URL | Visible |
| REGISTRY_TOKEN | Registry robot account token | Masked and hidden |
| REGISTRY_USER | Registry robot account name | Visible |
| REPO_PROJECT | Project inside the image registry | Visible |
| INTERNAL_CA_CERT | CA certificate; must be set up as a file-type variable | Visible |
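The AAP_URL and AAP_TOKEN variables feed the (currently commented out) update_ee.py step. A sketch of that update, using the AWX/AAP Controller v2 API, might look like this; the helper names are assumptions and the real script may differ:

```python
# Sketch of updating an AAP Execution Environment's image via the
# Controller v2 API, using stdlib urllib only. Illustrative, not the
# actual update_ee.py.
import json
import os
import urllib.parse
import urllib.request


def ee_endpoint(base: str, ee_id=None) -> str:
    """Build a Controller v2 execution-environments URL."""
    url = f"{base.rstrip('/')}/api/v2/execution_environments/"
    return f"{url}{ee_id}/" if ee_id is not None else url


def _call(url, token, method="GET", payload=None):
    """Issue an authenticated JSON request against the AAP API."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode() if payload else None,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method=method,
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def update_ee_image(ee_name: str, new_image: str) -> None:
    base, token = os.environ["AAP_URL"], os.environ["AAP_TOKEN"]
    # Look up the EE by the exact UI name stored in aap_name.txt.
    query = ee_endpoint(base) + "?name=" + urllib.parse.quote(ee_name)
    found = _call(query, token)
    if not found["results"]:
        raise SystemExit(f"No execution environment named {ee_name!r}")
    # PATCH only the image field of the matched resource.
    _call(ee_endpoint(base, found["results"][0]["id"]), token,
          method="PATCH", payload={"image": new_image})
```

When INTERNAL_CA_CERT is in play, the request would additionally need an SSL context built from that file (e.g., via `ssl.create_default_context(cafile=...)`).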

🚦 Pipeline Triggers & Build Scenarios

Because this repository functions as a monorepo containing multiple Execution Environments, the CI/CD pipeline is designed with "smart" change detection. It will only consume compute resources and build new Docker images when absolutely necessary.

Here is exactly when the pipeline runs and what it builds:

1. Pushing Code to main (Smart Builds)

When you merge a Merge Request or push directly to the main branch, the pipeline analyzes the commit to see exactly which files were modified.

  • Scenario A (EE Files Changed): If you edit files inside a specific directory (e.g., execution-environments/observability/requirements.yml), the pipeline will spin up a build job only for the observability environment. All other environments are skipped.
  • Scenario B (Global Script Changed): If you modify the update_ee.py script, the pipeline assumes a global logic change occurred and will trigger builds for all environments.
  • Scenario C (Docs/Unrelated Files Changed): If you only update the README.md, .gitignore, or other root-level files, a lightweight dummy job runs just to satisfy GitLab's requirements. No images are built.
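The three scenarios amount to a small selection rule, sketched below. The function and variable names are assumptions for illustration, not the actual logic in generate_pipeline.py:

```python
# Sketch of the change-detection rules: given the files changed in a
# push, decide which execution environments to rebuild.
GLOBAL_FILES = {"update_ee.py", "generate_pipeline.py", ".gitlab-ci.yml"}
EE_ROOT = "execution-environments"


def environments_to_build(changed_files, all_envs, pipeline_source="push"):
    """Return the EE folders that need a rebuild for this pipeline run."""
    # Scheduled and manual web runs skip change detection entirely.
    if pipeline_source in ("schedule", "web"):
        return sorted(all_envs)
    # Scenario B: a global script changed, so rebuild everything.
    if any(f in GLOBAL_FILES for f in changed_files):
        return sorted(all_envs)
    # Scenario A: rebuild only the environments whose files were touched.
    touched = {
        f.split("/")[1]
        for f in changed_files
        if f.startswith(EE_ROOT + "/") and f.count("/") >= 2
    }
    # Scenario C: nothing relevant changed -> empty list -> dummy job only.
    return sorted(touched & set(all_envs))
```

In a real pipeline, `pipeline_source` would come from GitLab's predefined CI_PIPELINE_SOURCE variable and `changed_files` from a `git diff` against the previous commit.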

2. Scheduled Pipeline (Full Rebuild)

You can configure a Pipeline Schedule in GitLab (e.g., every Sunday at 2:00 AM) to automatically run maintenance.

  • What happens: The pipeline completely ignores the change-detection logic. It loops through the entire repository and forces a fresh build and push of every single Execution Environment.
  • Why: This ensures your Ansible environments automatically inherit the latest security patches from the base ansible-builder image and updated pip packages without requiring a human to commit new code.

3. Manual Web Trigger (Emergency/Force Rebuild)

Use this when you need to force an update immediately across the board (e.g., a critical zero-day vulnerability was announced and patched in the base OS).

  • How to trigger: Go to Build > Pipelines in GitLab and click the Run pipeline button.
  • What happens: Just like the scheduled run, this triggers a full rebuild of all Execution Environments simultaneously, pushing fresh :latest tags for everything to your registry.

About

Gitlab CI/CD pipelines for creating Ansible EE