Scrap Trawler is an open-source browser extension designed for tournament organizers using EventLink. It enables users to extract event data, including:
- Player registration
- Pairings & standings
- Penalties & infractions
This project is not affiliated with, endorsed by, or sponsored by Wizards of the Coast (WOTC). Magic: The Gathering, EventLink, and all related trademarks belong to Wizards of the Coast LLC.
The name "Scrap Trawler" is inspired by the card Scrap Trawler from Aether Revolt (a set in Magic: The Gathering, published by Wizards of the Coast). The Scrap Trawler card allows players to "retrieve artifacts from the graveyard," much like this extension retrieves event data from the EventLink system.
The name "Scrap Trawler" is not claimed as a trademark here; it is used as an homage to the spirit of data recovery and organization. This project does not claim ownership over any trademarks or intellectual property belonging to Wizards of the Coast or Magic: The Gathering.
- Extracts event data directly from EventLink using the user's session.
- Supports GraphQL API requests for efficient data retrieval.
- Saves event data for historical reference.
- Runs as a lightweight browser extension with a simple UI.
- Designed with OpenSSF security best practices.
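The session-based GraphQL extraction described above can be sketched roughly as follows. This is an illustrative TypeScript sketch, not the extension's actual code: the endpoint URL, query shape, and field names are all assumptions.

```typescript
// Illustrative sketch only: the endpoint, query shape, and field names below
// are assumptions, not Scrap Trawler's actual implementation.
const EVENTLINK_GRAPHQL = "https://eventlink.wizards.com/graphql"; // assumed endpoint

interface GraphQLRequest {
  query: string;
  variables: Record<string, unknown>;
}

// Build the POST payload for a single event (hypothetical schema).
function buildEventQuery(eventId: string): GraphQLRequest {
  return {
    query: `query Event($id: ID!) {
      event(id: $id) {
        title
        players { name }
        standings { rank points }
      }
    }`,
    variables: { id: eventId },
  };
}

// Send the request reusing the user's existing EventLink session cookies,
// so no separate login or API key is needed.
async function fetchEvent(eventId: string): Promise<unknown> {
  const res = await fetch(EVENTLINK_GRAPHQL, {
    method: "POST",
    credentials: "include", // reuse the browser session
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildEventQuery(eventId)),
  });
  if (!res.ok) throw new Error(`GraphQL request failed: ${res.status}`);
  return res.json();
}
```

The key idea is the `credentials: "include"` option, which makes the browser attach the organizer's existing EventLink cookies to the request; the actual queries the extension issues may differ.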
- Clone the repository:
git clone https://github.com/Guibod/Scrap-Trawler.git
cd Scrap-Trawler
- Install dependencies:
pnpm install
- Build the extension:
pnpm build
- Load it in your browser (for development):
- Open Chrome → Extensions → Enable Developer Mode
- Click Load Unpacked and select the dist/ directory
This project uses GitHub Actions for automated builds and linting.
- Lint & Test: Runs ESLint and unit tests on every push.
- Build: Compiles the extension for production.
- Security Checks: Uses OpenSSF best practices for secure development.
Contributions are welcome! Please follow these steps:
- Fork the repo and create a feature branch.
- Follow the code style (ESLint & Prettier configured).
- Submit a PR describing your changes.
This project follows OpenSSF Best Practices. All PRs are checked for security vulnerabilities using:
- GitHub Dependabot
- Snyk Security
- Code Scanning with CodeQL
This project is open-source under the MIT License. See LICENSE for details.
Have questions? Open an issue or join the discussion!
Happy scraping! 🚀
