Job Description
Job Title: Databricks Engineer
Location: Canada (REMOTE)
Type: Contract
Responsibilities:
Architect and implement scalable Lakehouse data platforms using Databricks and Delta Lake.
Design robust batch and streaming data pipelines leveraging Apache Spark, structured streaming, and modern ELT patterns.
Lead migration of jobs from other cloud data platforms to Databricks.
Implement secure data governance, access control, and lineage using Unity Catalog.
Architect integrations with cloud platforms such as Google Cloud Platform.
Optimize performance and manage compute costs through efficient cluster configuration and Spark workload tuning.
Collaborate with data engineers, analytics teams, and ML engineers to enable scalable data products and analytics.
Define best practices for data quality, reliability, and observability across the data platform.
Provide technical leadership, architecture guidance, and mentorship to data engineering teams.
Requirements:
10+ years of experience in data engineering or data architecture on big data platforms.
3+ years of hands-on experience with Databricks platform architecture.
Strong expertise in Apache Spark, Delta Lake, Databricks SQL, Python, SQL, and GCP services.
