- Data ingestion is hard; Airbyte makes it easier and more scalable.
- Take advantage of many data sources: files, APIs, databases, and more.
- Give users a choice beyond traditional, centralized solutions (Kafka, BigQuery, Snowflake, ...).
Pull the image:

```sh
docker pull mihthanh27/airbyte-destination-streamr
```

Then go to Airbyte > Settings > Destinations and add the connector.
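The exact settings are defined by the connector's spec; as a rough, hypothetical sketch (the field names below are assumptions, not the connector's authoritative schema), the destination needs Streamr credentials and a target stream, along these lines:

```ts
// Hypothetical shape of the destination settings entered in the Airbyte UI.
// Field names are illustrative assumptions; consult the connector's spec
// for the real schema.
interface StreamrDestinationConfig {
  privateKey: string; // key used to authenticate with the Streamr network
  streamId: string;   // stream that will receive the synced records
}

const exampleConfig: StreamrDestinationConfig = {
  privateKey: process.env.STREAMR_PRIVATE_KEY ?? '',
  streamId: '0x0000000000000000000000000000000000000000/airbyte-sync',
};
```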
- Install nvm
- Install Node.js: `nvm install 14 && nvm use 14`
- Update npm to version 7.x by running `npm install -g npm@7`
- Install lerna by running `npm install -g lerna`
- Run `npm run prepare` to install dependencies for all projects (`npm run clean` to clean all)
- Run `npm run build` to build all projects (for a single project, add a scope, e.g. `npm run build -- --scope faros-destination`)
- Run `npm run test` to test all projects (for a single project, add a scope, e.g. `npm run test -- --scope faros-destination`)
- Run `npm run lint` to apply the linter to all projects (for a single project, add a scope, e.g. `npm run lint -- --scope faros-destination`)
- Audit fix: `npm audit fix`
- Clean your project: `lerna run clean` (sometimes you also want to `rm -rf ./node_modules`)
Read more about Lerna here: https://github.com/lerna/lerna
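For orientation while developing: at its core, a Streamr destination consumes Airbyte record messages on stdin and publishes them to a stream. The following is a minimal sketch only, assuming the `streamr-client` npm package; it is not the connector's actual implementation, and the `STREAMR_PRIVATE_KEY` / `STREAMR_STREAM_ID` environment variables are placeholders:

```ts
// Minimal sketch: forward Airbyte RECORD messages from stdin to a Streamr
// stream. Assumes the `streamr-client` package; not the connector's real code.
import { StreamrClient } from 'streamr-client';
import * as readline from 'readline';

async function main(): Promise<void> {
  const client = new StreamrClient({
    auth: { privateKey: process.env.STREAMR_PRIVATE_KEY ?? '' },
  });
  const streamId = process.env.STREAMR_STREAM_ID ?? '';

  // Airbyte destinations read newline-delimited JSON messages on stdin.
  const rl = readline.createInterface({ input: process.stdin });
  for await (const line of rl) {
    if (!line.trim()) continue;
    const msg = JSON.parse(line);
    // Records arrive wrapped as {"type": "RECORD", "record": {"data": ...}}.
    if (msg.type === 'RECORD') {
      await client.publish(streamId, msg.record.data);
    }
  }
  await client.destroy(); // close network connections when the sync ends
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```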
To build a Docker image for a connector, run `docker build` and set the `path` build argument. For example, for the Streamr destination connector run:

```sh
docker build . --build-arg path=destinations/streamr-destination -t airbyte-destination-streamr
```

And then run it:

```sh
docker run airbyte-destination-streamr
```

Connector Docker images are automatically published to Docker Hub after updates
to the main branch. They are tagged with the version listed in the connector's
package.json. If the connector is updated without incrementing the version,
GitHub will NOT overwrite the existing image on Docker Hub.
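For example, if the connector's package.json declared (hypothetical values):

```json
{
  "name": "airbyte-destination-streamr",
  "version": "0.1.0"
}
```

then CI would publish the image tagged 0.1.0, and further commits would only produce a new image once that version field is bumped.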

