Sanguine is a web-based platform for exploring hospital blood usage, patient blood management (PBM) metrics, and surgery-level outcomes. It is intended for clinicians, researchers, and administrators who need to understand transfusion patterns, identify opportunities to improve care, and reduce blood product costs.
For deployment or partnership inquiries, contact contact@intelvia.io.
- Overview
- Architecture
- Development Setup
- Common Development Tasks
- Testing
- Deployment
- Derived Data Pipeline
- Security and Monitoring
## Overview

Sanguine combines a React frontend, a Django backend, a MariaDB source database, and cached parquet artifacts to make large-scale PBM exploration practical in the browser. The backend prepares derived datasets and parquet caches, and the frontend uses DuckDB WASM to query those parquet files client-side for interactive analysis.
We currently support multiple deployments, including the University of Utah and partner institutions.
## Architecture

```mermaid
flowchart TD
    User[Clinician / Researcher / Admin]
    Nginx[VM Nginx: HTTPS terminator]
    Backend[Backend Container: Django]
    Frontend[Frontend Container: Nginx serving React app]
    Cache[(Backend parquet_cache)]
    Schema[(Intelvia schema source + derived tables)]
    Epic[(Client EPIC-derived data)]

    User --> Nginx
    Nginx --> Backend
    Nginx --> Frontend
    Backend --> Cache
    Cache --> Schema
    Schema --> Epic
```
## Development Setup

For local development, run the backend and MariaDB in Docker and run the frontend on your host for fast HMR.

- Copy `.env.default` to `.env` in the project root.
- Start the backend and MariaDB:

  ```shell
  docker compose -f docker-compose.dev.yml up
  ```

- In another terminal, start the frontend:

  ```shell
  cd frontend
  yarn install
  yarn serve
  ```

- Open http://localhost:8080.
Notes:
- The frontend uses relative `/api/...` paths, and Vite proxies them to `http://localhost:8000`.
- The backend test runner automatically creates the derived artifact tables after the test database is created.
## Common Development Tasks

Recreate the data in one step:

```shell
docker compose -f docker-compose.dev.yml exec -it backend bash
poetry run python manage.py recreatedata
```

Or run the individual steps manually:

```shell
docker compose -f docker-compose.dev.yml exec -it backend bash
poetry run python manage.py destroydata
poetry run python manage.py migrate
poetry run python manage.py migrate_derived_tables
poetry run python manage.py mock50million
poetry run python manage.py refresh_derived_tables
poetry run python manage.py generate_parquets
```

Regenerate the parquet cache:

```shell
poetry run python manage.py generate_parquets
```

Refresh the derived tables:

```shell
poetry run python manage.py refresh_derived_tables
```

Regenerate individual parquet artifacts:

```shell
poetry run python manage.py generate_parquets --generate visit_attributes
poetry run python manage.py generate_parquets --generate procedure_hierarchy
poetry run python manage.py generate_parquets --generate surgery_cases
```

## Testing

Run the backend suite from the backend container:

```shell
docker compose -f docker-compose.dev.yml exec -it backend bash
poetry run python manage.py test api.tests --verbosity 2 --parallel 8
```

The custom Django test runner at `backend/api/tests/runner.py` runs `migrate_derived_tables` after the test database is created, so `GuidelineAdherence`, `VisitAttributes`, and `SurgeryCaseAttributes` exist before fixtures are populated and refreshed.
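The ordering guarantee the test runner provides can be sketched without Django: a minimal stand-in for overriding `DiscoverRunner.setup_databases()` that creates the derived tables immediately after the test database exists. Here `sqlite3` stands in for the test database, and the class and table names are illustrative, not Sanguine's actual code.

```python
import sqlite3


class DerivedTableTestRunner:
    """Stand-in for a Django test runner that overrides setup_databases()."""

    def setup_databases(self) -> sqlite3.Connection:
        con = sqlite3.connect(":memory:")  # "test database created"
        self.migrate_derived_tables(con)   # runs before any fixtures load
        return con

    @staticmethod
    def migrate_derived_tables(con: sqlite3.Connection) -> None:
        # Stand-ins for the GuidelineAdherence / VisitAttributes /
        # SurgeryCaseAttributes schema SQL.
        for table in ("guideline_adherence", "visit_attributes",
                      "surgery_case_attributes"):
            con.execute(f"CREATE TABLE {table} (id INTEGER PRIMARY KEY)")


con = DerivedTableTestRunner().setup_databases()
tables = [r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)
# ['guideline_adherence', 'surgery_case_attributes', 'visit_attributes']
```

Because the tables exist before fixtures are populated, tests can refresh and query the derived artifacts without any per-test setup.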
The frontend currently exposes lint, typecheck, and build validation:

```shell
cd frontend
yarn lint
yarn typecheck
yarn build
```

## Deployment

The production deployment uses separate frontend and backend containers:
- Frontend container: Nginx serving the built React application
- Backend container: Django served by Gunicorn
- External VM nginx: SSL termination and routing to the containers
Start the production stack with:

```shell
docker-compose up
# or
podman-compose up
```

Deployment expectations:
- The VM-level nginx handles SSL termination.
- Requests are routed to the frontend container, which proxies API traffic to the backend container.
- Required environment variables must be present for Django, MariaDB, CAS auth, and any deployment-specific settings. These can be set in a `.env` file or injected through the deployment pipeline.
- After deploy, the backend should run `migrate`, `migrate_derived_tables`, and `generate_parquets` as part of the environment bootstrap.
## Derived Data Pipeline

The backend manages three SQL-owned derived artifacts:

- `GuidelineAdherence`
- `VisitAttributes`
- `SurgeryCaseAttributes`

Their schema and refresh SQL live in `backend/api/models_derived/`.
Key commands:

```shell
poetry run python manage.py migrate_derived_tables
poetry run python manage.py refresh_derived_tables
poetry run python manage.py generate_parquets
```

Responsibilities:

- `migrate_derived_tables` creates or replaces the physical derived tables from `*_schema.sql`.
- `refresh_derived_tables` truncates and repopulates the derived tables from the source MariaDB tables using `*_refresh.sql`.
- `generate_parquets` refreshes the required derived tables, reads them, normalizes values, and writes the parquet cache artifacts used by the frontend.
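The truncate-and-repopulate pattern behind `refresh_derived_tables` can be sketched with stdlib `sqlite3` standing in for MariaDB and inline SQL standing in for the `*_refresh.sql` files. The tables and columns are illustrative only, not the real Sanguine schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Source table (in Sanguine, populated from EPIC-derived data).
con.execute("CREATE TABLE visit (visit_id INTEGER, los_days INTEGER)")
con.executemany("INSERT INTO visit VALUES (?, ?)", [(1, 3), (2, 7)])

# SQL-managed derived table (in Sanguine, created by migrate_derived_tables
# from a *_schema.sql file).
con.execute("CREATE TABLE visit_attributes (visit_id INTEGER, long_stay INTEGER)")


def refresh_derived_table(con: sqlite3.Connection) -> None:
    """Truncate and repopulate the derived table from the source table."""
    con.execute("DELETE FROM visit_attributes")  # sqlite has no TRUNCATE
    con.execute(
        "INSERT INTO visit_attributes "
        "SELECT visit_id, los_days > 5 FROM visit"
    )


refresh_derived_table(con)
print(con.execute("SELECT * FROM visit_attributes ORDER BY visit_id").fetchall())
# [(1, 0), (2, 1)]
```

Because the refresh is a full truncate-and-rebuild rather than an incremental update, a derived table is always internally consistent with its source tables as of the last refresh.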
The derived artifacts are intentionally not represented as Django models. They are treated as SQL-managed cache tables whose correctness is validated by integration tests and parquet generation tests.
## Security and Monitoring

Security controls in Sanguine include:
- Limited firewall and VPN access
- CAS / SSO authentication
- Role-based access control in Django
- Service accounts with limited DB permissions
- VM patching and monitoring handled by hospital IT
- Encryption in transit with SSL
The backend supports Sentry for deployment-specific error monitoring.
Set these backend environment variables:
- `SENTRY_DSN`
- `SENTRY_ENVIRONMENT`
- `SENTRY_TRACES_SAMPLE_RATE`
- `SENTRY_SEND_DEFAULT_PII`
- `SENTRY_CAPTURE_HANDLED_HTTP_ERRORS`
When `SENTRY_DSN` is not set, Sentry is disabled.

Unhandled backend exceptions are sent to Sentry when configured and are also written to container logs. Handled 4xx/5xx responses can also be reported when `SENTRY_CAPTURE_HANDLED_HTTP_ERRORS=True`.
