A small CLI utility to fetch game data from the Eve Frontier API and store it locally (SQLite) or in PostgreSQL.
- Downloads datasets (characters, tribes, killmails, assemblies, types, fuels, solarsystems).
- Persists them into a local SQLite DB by default, or into PostgreSQL when requested.
- Supports incremental updates, full reloads, optional static dataset fetches, and fetching SSU inventories and assembly locations.
- Python 3.8+
- Packages:
  `requests`, `tqdm`, `psycopg2-binary` (for PostgreSQL)
Install prerequisites:

```
pip install requests tqdm psycopg2-binary
```

`ef_data_fetcher.py` — main script (run from this folder)
- Default behavior: SQLite local file (no extra configuration required).
- To use PostgreSQL, pass `-db postgres` on the CLI (or change `DB_TYPE` in the code) and ensure the database exists and the credentials are correct.
PostgreSQL credentials can be provided via environment variables (recommended) or kept in the script.
Note: The script does not auto-create the PostgreSQL database — create it beforehand (for example with `createdb efdata`).
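One way to keep the credentials in environment variables, as recommended above. This is a sketch only; the variable names below are hypothetical, not the script's actual configuration keys:

```python
import os

# Hypothetical variable names -- the script's real configuration keys may differ.
def postgres_config_from_env() -> dict:
    """Build a psycopg2-style connection config from environment variables,
    falling back to local defaults when a variable is unset."""
    return {
        "host": os.environ.get("EF_PG_HOST", "localhost"),
        "port": int(os.environ.get("EF_PG_PORT", "5432")),
        "dbname": os.environ.get("EF_PG_DBNAME", "efdata"),
        "user": os.environ.get("EF_PG_USER", "postgres"),
        "password": os.environ.get("EF_PG_PASSWORD", ""),
    }

# The resulting dict can be passed straight to psycopg2:
#   conn = psycopg2.connect(**postgres_config_from_env())
```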
Run `-h` to see all options:

```
python ef_data_fetcher.py -h
```

Key arguments:
- `-reload`: Force a full refetch instead of incremental updates.
- `-static`: Fetch static datasets (solarsystems, types, fuels). Useful for the first run.
- `-locations`: Fetch missing assembly locations (may be slow).
- `-ssu [all|online|anchored]`: Fetch Smart Storage Unit inventories (optional value, default `all`).
- `-data [default|skip|characters|tribes|killmails|assemblies]`: Run a specific dataset fetch, or skip the data fetch step.
- `-db [sqlite|postgres]`: Choose the DB backend.
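The flag surface above could be declared with `argparse` roughly as follows. This is a hypothetical reconstruction; the real script may define its arguments differently:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Hypothetical reconstruction of the documented CLI flags."""
    p = argparse.ArgumentParser(description="Eve Frontier data fetcher")
    p.add_argument("-reload", action="store_true",
                   help="force full refetch instead of incremental updates")
    p.add_argument("-static", action="store_true",
                   help="fetch static datasets (solarsystems, types, fuels)")
    p.add_argument("-locations", action="store_true",
                   help="fetch missing assembly locations (may be slow)")
    # Optional value: bare "-ssu" means "all"
    p.add_argument("-ssu", nargs="?", const="all",
                   choices=["all", "online", "anchored"],
                   help="fetch Smart Storage Unit inventories")
    p.add_argument("-data", default="default",
                   choices=["default", "skip", "characters", "tribes",
                            "killmails", "assemblies"],
                   help="run a specific dataset fetch or skip it")
    p.add_argument("-db", default="sqlite", choices=["sqlite", "postgres"],
                   help="choose DB backend")
    return p
```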
Examples:

- Default run (SQLite):

  ```
  python ef_data_fetcher.py
  ```

- Default run, also fetching static datasets:

  ```
  python ef_data_fetcher.py -static
  ```

- Only fetch SSU inventories (online) and skip the normal data fetch:

  ```
  python ef_data_fetcher.py -data skip -ssu online
  ```
- The script attempts incremental updates by comparing API metadata counts to the local DB counts; use `-reload` to force a full re-download.
- SSU inventory fetching hits the per-assembly endpoint and can be slow; use `-ssu online` or `-ssu anchored` to limit the scope.
- If the PostgreSQL connection fails, verify the credentials and that the DB server is reachable. Common psycopg2 errors are usually caused by incorrect credentials or server settings.
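The incremental-update decision described above could look something like this. It is a hypothetical helper for illustration; the script's real comparison logic may differ:

```python
def should_full_reload(api_count: int, local_count: int,
                       force_reload: bool = False) -> bool:
    """Decide between a full refetch and an incremental update.

    A full reload happens when -reload is passed, or when the local table
    is empty or somehow ahead of the API (a stale or inconsistent state).
    Otherwise only the missing rows need to be fetched.
    """
    if force_reload:
        return True
    if local_count == 0 or local_count > api_count:
        return True
    return False
```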
Quick checks:
- Confirm the environment variables are set (see above), or that the `DB_POSTGRES_CONFIG` values in the script are correct.
- Ensure the PostgreSQL user has permission to connect to the `efdata` database.
- Run `psql -h <host> -U <user> -d <dbname>` to test connectivity from the same machine.
- If you see Unicode/decoding errors from `psycopg2.connect`, check for stray non-UTF8 bytes in credential files or environment values.
- Use `python ef_data_fetcher.py -h` to confirm CLI flags and correct usage.
- The code centralizes paginated fetch+insert logic — follow that pattern (helper functions) when adding new datasets.
- If you run the script against PostgreSQL regularly, consider moving DB credentials to environment variables or a separate config file.
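The centralized paginated fetch+insert pattern mentioned above might be sketched like this. The helper below is hypothetical (the script's actual function names and page-size handling may differ), but it shows the general shape: request pages until a short page signals the end, yielding rows so the caller can insert them in batches:

```python
from typing import Callable, Iterable, List

def fetch_all_pages(fetch_page: Callable[[int, int], List[dict]],
                    page_size: int = 100) -> Iterable[dict]:
    """Generic paginated fetch: call fetch_page(offset, limit) repeatedly,
    yielding rows as they arrive. A page shorter than page_size marks the
    end of the dataset."""
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        yield from page
        if len(page) < page_size:
            break
        offset += page_size
```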
The project requires the following Python packages. You can install them with:

```
pip install -r requirements.txt
```

This repository contains private project code — follow the original project's licensing decisions if you intend to reuse or redistribute.
data_viewer.py — Web viewer
`data_viewer.py` is a small Flask web application that serves a dark-themed UI for browsing the `efdata.db` SQLite database created by the fetcher. It uses jQuery DataTables with server-side processing for efficient paging, searching and sorting of large tables.
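With server-side processing, DataTables sends paging and search parameters to the backend, which translates them into SQL. A minimal sketch of that translation (hypothetical helper, not the viewer's actual code; table and column names must come from a trusted whitelist, never from the request):

```python
def datatables_query(table: str, columns: list, start: int, length: int,
                     search: str = "", order_col: int = 0,
                     order_dir: str = "asc"):
    """Translate DataTables server-side parameters (start, length, search,
    ordering) into a parameterised SQLite query with LIMIT/OFFSET paging."""
    direction = "DESC" if order_dir.lower() == "desc" else "ASC"
    params: list = []
    where = ""
    if search:
        # Global search: match the term against every displayed column
        where = "WHERE " + " OR ".join(f"{c} LIKE ?" for c in columns)
        params = [f"%{search}%"] * len(columns)
    sql = (f"SELECT {', '.join(columns)} FROM {table} {where} "
           f"ORDER BY {columns[order_col]} {direction} LIMIT ? OFFSET ?")
    return sql, params + [length, start]
```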
Requirements for the web viewer
- Python 3.10+ (tested with 3.13)
- Packages:
  `flask`, `sqlite3` (stdlib), `requests` (if used), `pytz` (optional)
Run locally (development):

```
pip install flask
python data_viewer.py
```

The app listens on port 8000 by default and is accessible at http://localhost:8000.
Production notes
- Do not use the Flask development server in production. Use `gunicorn` or `uwsgi` and put a reverse proxy (nginx) in front.
- Example with gunicorn:

  ```
  pip install gunicorn
  gunicorn -w 4 -b 0.0.0.0:8000 data_viewer:app
  ```

Exposed API endpoints
- `/` — web UI
- `/api/assemblies`, `/api/characters`, `/api/tribes`, `/api/killmails`
- `/api/solarsystems`, `/api/types`, `/api/fuels`, `/api/storage` — additional tables
- `/api/query` — accepts POST JSON with a `query` (SELECT only). Be careful: this endpoint is powerful; secure or remove it before exposing it publicly.
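A SELECT-only restriction for such an endpoint can be approximated with a conservative guard like the one below. This is a sketch, not the viewer's actual check, and it is no substitute for authentication:

```python
import re

def is_safe_select(query: str) -> bool:
    """Very conservative guard for a read-only query endpoint: allow only a
    single statement that starts with SELECT and contains no statement
    separator. Anything else is rejected."""
    q = query.strip().rstrip(";")
    if ";" in q:  # more than one statement
        return False
    return bool(re.match(r"(?is)^select\b", q))
```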
Security
- Add authentication (basic auth, token, or OAuth) and enable HTTPS when exposing the viewer to the internet.
- Consider removing or protecting `/api/query` to prevent unintended access to the database.
Database & Tables
- Database file: `efdata.db` (created in the same directory as the script)
- Main tables: `assemblies`, `characters`, `tribes`, `killmails`, `solarsystems`, `types`, `fuels`, `assemblies_content`
- Note: `location_x`, `location_y`, `location_z` are stored as `TEXT` to preserve large/precise values.
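The reason for TEXT storage can be seen by round-tripping a large coordinate through `float`. The value below is purely illustrative, not real game data:

```python
from decimal import Decimal

# A coordinate with more significant digits than a 64-bit float can hold.
raw = "123456789012345678901.5"   # illustrative value, not real game data

exact = Decimal(raw)          # parsed straight from the TEXT column: lossless
lossy = Decimal(float(raw))   # via float: silently rounded

assert str(exact) == raw      # every digit preserved
assert exact != lossy         # the float round-trip changed the value
```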
Running on a server / Scheduling
- You can run this script on a VPS. For periodic updates use `cron` (Linux) or Task Scheduler (Windows).
Example cron (hourly):

```
0 * * * * cd /path/to/ef-fetcher/sqlite/requests && /path/to/venv/bin/python ef_data_fetcher.py >> ef_data_fetcher.log 2>&1
```

Security & Operational Notes
- The script issues unauthenticated API GET requests to public endpoints; ensure your server has outbound access.
- Keep an eye on API rate limits and network reliability — the script will raise exceptions on non-2xx responses.
- `assemblies_content` fetches per-assembly endpoints when `-ssu` is used; this may take longer depending on the number of SSUs.
Troubleshooting
- If you get `requests` errors, check network connectivity and API availability.
- If the DB schema changes, delete or back up `efdata.db` and run with `-init` to recreate static tables as needed.