Overview
As our Tech Lead Data Engineer, you'll architect and maintain the data backbone that powers every feature across our product suite.
Key Responsibilities
- Design, implement, and optimize large-scale ETL workflows in Databricks (Apache Spark, Delta Lake, dbt).
- Develop algorithms that transform raw market data into actionable insights.
- Own data quality and lineage, instituting tests, monitoring, and alerting for mission-critical pipelines.
- Evolve our cloud data platform (AWS & GCP) for scale, performance, and cost efficiency.
- Mentor engineers, championing best practices in code reviews, documentation, and DevOps for data.
Requirements
- Expertise in Python, plus working knowledge of Scala or Java.
- Hands-on experience with Databricks, including cluster tuning, job orchestration, and Delta tables.
- Strong analytical SQL, modular data-model design, and CI/CD for transformations.
- Production experience on AWS or GCP data services (e.g., S3/GCS, EMR/Dataproc).
- Solid grasp of ETL/ELT patterns, workflow scheduling, and incremental processing.
- Experience building feature stores or inference-ready tables for ML/LLM workflows.
- Familiarity with Apache Airflow, Luigi, or Dagster for DAG orchestration.
Benefits
- Competitive compensation – $150k–$185k CAD/USD, plus benefits and performance bonuses.
- Remote flexibility – Remote-first within ±3 hrs ET; optional collaboration hubs in Toronto and NYC.
- Growth opportunities – Budget for conferences and continuous learning.
Location
United States
How to Apply
Submit your resume, GitHub profile, and a brief note on why you’re excited about building AI-driven FinTech to jobs@mitremedia.com.