About the Role
Join a high-impact U.S.-based startup transforming the music industry through automation. As a Senior Data Engineer, you'll architect intelligent pipelines that help music creators get paid fairly by reconciling millions of rows of royalty data across 15+ streaming platforms. You'll build the data layer from scratch using modern tools and take full ownership of performance, quality, and auditability.
About You
● You have 5+ years of experience as a Data Engineer working on production-grade ETL pipelines.
● You're passionate about clean, reliable data and solving complex transformation problems.
● You communicate effectively with both technical and non-technical stakeholders.
● You thrive in fast-paced environments and love building systems from the ground up.
What You'll Be Doing
● Design and implement an end-to-end ETL pipeline to process millions of rows of royalty data per day.
● Develop platform-specific CSV parsers and build a robust dbt project with clear transformation layers.
● Integrate with third-party music-data APIs such as Spotify and Chartmetric for data enrichment and validation.
● Ensure high data quality using frameworks like Great Expectations.
● Build monitoring dashboards and set up alerts to maintain pipeline health.
● Normalize financial data across currencies and territories, and handle song-metadata matching.
What We're Looking For
● Proactivity and a solution-oriented mindset.
● Attention to detail in financial data handling.
● Ability to prioritize tasks and manage time effectively.
● Comfort working independently in a startup environment.
Technical Requirements
Must-Haves
● 5+ years of experience in data engineering roles.
● 3+ years of experience with dbt in production.
● Advanced SQL and Python skills.
● Experience with orchestration tools like Airflow or Dagster.
● Strong knowledge of PostgreSQL, including schema design and optimization.
● Familiarity with Git and CI/CD pipelines.
Nice-to-Haves
● Knowledge of the music industry or financial reconciliation processes.
● Experience with Great Expectations or similar tools for data quality.
● Experience integrating with 5+ third-party APIs.
● Experience handling multi-currency and multi-territory data.
● Experience with dbt Cloud, Terraform, Looker, or Metabase.