# OcotilloAPI

**Geospatial Sample Data Management System**
New Mexico Bureau of Geology and Mineral Resources
OcotilloAPI is a FastAPI-based backend service designed to manage geospatial sample location data across New Mexico. It supports research, field operations, and public data delivery for the Bureau of Geology and Mineral Resources.
## Features

- RESTful API for managing sample location data
- Native GeoJSON support via PostGIS
- Filtering by location, date, type, and more
- PostgreSQL + PostGIS database backend
- Optional authentication and role-based access
- Interactive API documentation via OpenAPI and ReDoc
## OGC API - Features

The API exposes OGC API - Features endpoints under `/ogcapi` using pygeoapi. In App Engine deployments, `/admin` and `/ogcapi` are served from the same application as the primary API. The service is intended to scale to zero outside business hours and to be kept warm during the workday with Cloud Scheduler hits to `/_ah/warmup`.
```shell
# Landing page, conformance, and collections
curl http://localhost:8000/ogcapi
curl http://localhost:8000/ogcapi/conformance
curl http://localhost:8000/ogcapi/collections
curl http://localhost:8000/ogcapi/collections/locations

# Paged feature items
curl "http://localhost:8000/ogcapi/collections/locations/items?limit=10&offset=0"
curl "http://localhost:8000/ogcapi/collections/wells/items?limit=5"
curl "http://localhost:8000/ogcapi/collections/springs/items?limit=5"

# Single feature by ID
curl "http://localhost:8000/ogcapi/collections/locations/items/123"

# Bounding-box and temporal filters
curl "http://localhost:8000/ogcapi/collections/locations/items?bbox=-107.9,33.8,-107.8,33.9"
curl "http://localhost:8000/ogcapi/collections/wells/items?datetime=2020-01-01/2024-01-01"
```

For polygon queries, use `filter` + `filter-lang=cql2-text` with `WITHIN(...)`:

```shell
curl "http://localhost:8000/ogcapi/collections/locations/items?filter=WITHIN(geometry,POLYGON((-107.9 33.8,-107.8 33.8,-107.8 33.9,-107.9 33.9,-107.9 33.8)))&filter-lang=cql2-text"
```

The OGC API's interactive documentation is available as well:

```shell
curl "http://localhost:8000/ogcapi/openapi?ui=swagger"
```

## Requirements

- Python 3.11+
- `uv` package manager
- Docker Desktop 4+ if hosting the server/database locally with containers
- PostgreSQL with the PostGIS extension if hosting the server/database locally without containers
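The query parameters in the curl examples above can also be built programmatically. A minimal Python sketch using only the standard library (the host and collection names simply mirror the examples; `items_url` is a hypothetical helper, not part of the project):

```python
from urllib.parse import urlencode

BASE = "http://localhost:8000/ogcapi"

def items_url(collection: str, **params: str) -> str:
    """Build an OGC API - Features items URL with an encoded query string.

    safe="," keeps commas literal so bbox values match the curl examples.
    """
    query = urlencode(params, safe=",")
    return f"{BASE}/collections/{collection}/items" + (f"?{query}" if query else "")

# Paged request
print(items_url("locations", limit="10", offset="0"))
# → http://localhost:8000/ogcapi/collections/locations/items?limit=10&offset=0

# Bounding-box filter
print(items_url("locations", bbox="-107.9,33.8,-107.8,33.9"))
# → http://localhost:8000/ogcapi/collections/locations/items?bbox=-107.9,33.8,-107.8,33.9
```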
## Installation

```shell
git clone https://github.com/DataIntegrationGroup/OcotilloAPI.git
cd OcotilloAPI
```

Create a virtual environment and install dependencies:

| Mac/Linux | Windows |
| --- | --- |
| `uv venv`<br>`source .venv/bin/activate`<br>`uv sync --locked` | `uv venv`<br>`source .venv/Scripts/activate`<br>`uv sync --locked` |
Install the pre-commit hooks:

```shell
pre-commit install
```

Copy the example environment file, then edit `.env` to configure the database connection and app settings:

```shell
cp .env.example .env
```

Notes:

- Create the file `gcs_credentials.json` in the root directory of the project, and obtain its contents from a teammate.
- PostgreSQL uses the default port 5432.

Minimum vars to set in `.env` for local development:

- `POSTGRES_USER`
- `POSTGRES_PASSWORD`
- `POSTGRES_DB` (`ocotilloapi_dev` when using Docker Compose dev)
- `POSTGRES_HOST` (`localhost` for local psql/pytest against the mapped Docker port)
- `POSTGRES_PORT` (`5432`)
- `MODE` (`development` recommended locally)
- `SESSION_SECRET_KEY` (required if you want to use `/admin`)
Auth-related vars (required when auth is enabled, optional when `AUTHENTIK_DISABLE_AUTHENTICATION=1`):

- `AUTHENTIK_DISABLE_AUTHENTICATION`
- `AUTHENTIK_URL`
- `AUTHENTIK_CLIENT_ID`
- `AUTHENTIK_AUTHORIZE_URL`
- `AUTHENTIK_TOKEN_URL`

pygeoapi vars:

- `PYGEOAPI_MOUNT_PATH` (default `/ogcapi`)
- `PYGEOAPI_RUNTIME_DIR` (default `/tmp/pygeoapi`)
- `PYGEOAPI_POSTGRES_HOST`
- `PYGEOAPI_POSTGRES_PORT`
- `PYGEOAPI_POSTGRES_DB`
- `PYGEOAPI_POSTGRES_USER`
- `PYGEOAPI_POSTGRES_PASSWORD`

Optional telemetry vars:

- `SENTRY_DSN`
- `APITALLY_CLIENT_ID`
- `ENVIRONMENT`
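Putting the minimum local-development variables together, a `.env` might look like this (values are illustrative placeholders, not real credentials):

```
POSTGRES_USER=postgres
POSTGRES_PASSWORD=change-me
POSTGRES_DB=ocotilloapi_dev
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
MODE=development
SESSION_SECRET_KEY=generate-a-long-random-string
AUTHENTIK_DISABLE_AUTHENTICATION=1
```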
In development, set `MODE=development` to allow lexicon enums to be populated. When `MODE=development`, the app attempts to seed the database with 10 example records via `transfers/seed.py`; if a contact record already exists, the seed step is skipped.
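The seed guard described above amounts to a check like the following (a sketch only; `maybe_seed` and its return strings are hypothetical stand-ins for the logic in `transfers/seed.py`):

```python
def maybe_seed(mode: str, contact_exists: bool) -> str:
    """Seed example records only in development and only on a fresh DB.

    Hypothetical sketch of the guard; the real code queries the database
    for an existing contact record before seeding.
    """
    if mode != "development":
        return "skipped: not development"
    if contact_exists:
        return "skipped: contact record already present"
    return "seeded 10 example records"

print(maybe_seed("development", contact_exists=False))  # → seeded 10 example records
print(maybe_seed("development", contact_exists=True))   # → skipped: contact record already present
```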
## Running the application

Choose one of the following:

### Option A: Local PostgreSQL + PostGIS

```shell
# run database migrations
alembic upgrade head

# start development server
uvicorn app.main:app --reload
```

Notes:

- Requires PostgreSQL with PostGIS installed locally.
- Use the `POSTGRES_*` settings in `.env` for your local instance.
### Option B: Docker Compose (dev)

```shell
# include -d flag for silent/detached build
docker compose up --build
```

Notes:

- Requires Docker Desktop.
- By default, spins up two containers:
  - `db` for PostGIS/PostgreSQL
  - `app` for the primary API, admin UI, and OGC API on `http://localhost:8000`
- `db` initializes both application databases in the same Postgres service: `ocotilloapi_dev` and `ocotilloapi_test`.
- `alembic upgrade head` runs in the `app` container on startup.
- Compose uses hardcoded DB names:
  - dev: `ocotilloapi_dev`
  - test: `ocotilloapi_test` (created by init SQL in `docker/db/init/01-create-test-db.sql`)
- The database listens on port `5432` both inside the container and on your host. Ensure `POSTGRES_PORT=5432` and `POSTGRES_DB=ocotilloapi_dev` in your `.env` to run local commands against the Docker dev DB (e.g., `uv run pytest`, `uv run python -m transfers.transfer`).
- To restore a local or GCS-backed SQL dump into your local target DB, run `source .venv/bin/activate && python -m cli.cli restore-local-db path/to/dump.sql` or `source .venv/bin/activate && python -m cli.cli restore-local-db gs://ocotillo/sql-exports/latest.sql.gz`.
- `SESSION_SECRET_KEY` only needs to be set in `.env` if you plan to use `/admin`; without it, the API and `/ogcapi` still boot, but `/admin` will be unavailable.
To get staging data into the database, run `python -m transfers.transfer` from the root directory of the project.
## Project structure

```
app/
├── .env                     # Environment variables
├── .pre-commit-config.yaml  # pre-commit hook configuration file
├── constants.py             # Static variables used throughout the code
├── docker-compose.yml       # Docker compose file to build database and start server
├── entrypoint.sh            # Used by Docker to run database migrations and start server
├── main.py                  # FastAPI entry point
│
├── alembic/                 # Alembic configuration and migration scripts
├── api/                     # Route declarations
├── core/                    # Settings, application config, and dependencies
├── db/                      # Database models, sessions, and engine
├── docker/                  # Custom Docker files
├── schemas/                 # Pydantic schemas and validations
├── services/                # Reusable business logic, helpers, and database interactions
├── tests/                   # Code tests
└── transfers/               # Scripts to transfer data from NM_Aquifer to current db schema
```
## Modifying models and schemas

- Revise models in the `db/` directory
- Revise schemas in the `schemas/` directory
  - Add validators for both fields and models as necessary
  - Validations on incoming data only are handled by Pydantic, which raises 422 errors (default Pydantic behavior)
  - Validations against values in the database are handled at the endpoint with custom checks, which raise 409 errors
- Revise tests
  - Revise fixtures in `tests/conftest.py`
  - Revise fields in POST test payloads and asserts
  - Revise fields in PATCH test payloads and asserts
  - Revise fields in GET all and GET by ID test asserts
  - Add tests for validations as necessary

Bonus:

- Update transfer scripts by revising fields and delineating where they come from in `NM_Aquifer`

Notes:

- All `Create` schema fields are defined as `<type>` if non-nullable and `<type> | None = None` if nullable
- All `Update` schema fields are optional and default to `None`
- All `Response` schema fields are defined as `<type>` if non-nullable and `<type> | None` if nullable
- All raised exceptions should use the `PydanticStyleException` as defined in `services/exceptions_helper.py`
- Errors handled by the database should be enumerated and handled in a `database_error_handler` in each router's file

---
## CLI (`oco`)

The `oco` command exposes project automation and bulk data utilities.

```shell
# Display available commands
oco --help

# Bulk import water level data from a CSV
oco water-levels bulk-upload --file water_levels.csv --output json
```

The bulk upload command parses and validates each row, creates the corresponding field events/samples/observations, and prints a JSON summary (matching the API response shape) so the workflow can be automated or scripted.
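The row-by-row parse/validate/summarize flow can be sketched in Python (a sketch only; the column names `site_id`, `date`, and `depth_to_water_ft` are hypothetical, and the real CSV layout is defined by the CLI):

```python
import csv
import io
import json

def summarize_rows(csv_text: str) -> str:
    """Parse each row, validate it, and return a JSON summary.

    A sketch of the bulk-upload flow, not the actual CLI code:
    rows with a numeric depth count as created, others as failed.
    """
    ok, errors = 0, []
    for i, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=1):
        try:
            float(row["depth_to_water_ft"])  # basic numeric validation
            ok += 1
        except (KeyError, ValueError):
            errors.append(i)
    return json.dumps({"created": ok, "failed_rows": errors})

sample = "site_id,date,depth_to_water_ft\nWL-001,2024-01-01,12.5\nWL-002,2024-01-02,bad\n"
print(summarize_rows(sample))  # → {"created": 1, "failed_rows": [2]}
```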
## Tests

```shell
# Run unit tests
pytest

# Run Behave BDD specs
behave tests/features
```

Tests require a local Postgres/PostGIS instance. Set `POSTGRES_*` values in `.env`, run migrations, and ensure the database is reachable before running the suites.
## Data transfers

Legacy or staging datasets can be imported using the transfer utilities:

```shell
python -m transfers.transfer
```

Configure the `.env` file with the appropriate credentials before running transfers.

If contact transfers fail with OwnerKey normalization collisions, add or update `transfers/data/owners_ownerkey_mapper.json` to map inconsistent OwnerKey values to a single canonical spelling before re-running the transfer.

To drop the existing schema and rebuild from migrations before transferring data, set:

```shell
export DROP_AND_REBUILD_DB=true
```
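A sketch of how the OwnerKey mapper file could be applied during normalization (the JSON structure shown, a flat `{"variant": "canonical"}` mapping, is an assumption; check the transfer scripts for the actual format, and the owner names below are invented):

```python
import json

# Assumed mapper shape: {"variant spelling": "Canonical Spelling"}.
# The real file lives at transfers/data/owners_ownerkey_mapper.json.
MAPPER_JSON = '{"ACME WATER CO": "Acme Water Co.", "Acme Water Company": "Acme Water Co."}'

def canonical_owner_key(raw: str, mapper: dict[str, str]) -> str:
    """Collapse inconsistent OwnerKey spellings to one canonical value;
    keys absent from the mapper pass through unchanged."""
    return mapper.get(raw, raw)

mapper = json.loads(MAPPER_JSON)
keys = ["ACME WATER CO", "Acme Water Company", "Unmapped Owner"]
print(sorted({canonical_owner_key(k, mapper) for k in keys}))
# → ['Acme Water Co.', 'Unmapped Owner']
```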