Job Description
Job Title: Data Engineer – Cloud / GCP (Hybrid, Toronto)
Location: Toronto, ON – Hybrid (2–3 days in office)
Type: Contract – ASAP to April 30th, with a strong possibility of a 6-month extension
We’re looking for a skilled Data Engineer with hands-on experience in Google Cloud Platform (GCP) to design, build, and maintain scalable data pipelines for a high-impact, global analytics program. You’ll collaborate with cross-functional teams to ensure our data infrastructure is robust, secure, and optimized for performance. The role requires full bilingual fluency in English and Spanish and is ideal for engineers ready to work with large-scale structured and unstructured datasets.
What You’ll Do:
Build and maintain modern ELT/ETL data pipelines from scratch.
Integrate and orchestrate workflows using Airflow and cloud-native tools.
Apply SQL & Python to manipulate, clean, and validate large datasets.
Support CI/CD pipelines using GitHub, Bitbucket, and Terraform.
Conduct data quality checks and monitor pipeline performance.
Translate technical concepts for non-technical stakeholders.
Must-Have Skills:
3–4 years of experience building ELT/ETL data pipelines.
2–4 years with GCP and Airflow.
3+ years with CI/CD pipelines, source control (GitHub, Bitbucket), and infrastructure as code (Terraform).
2–4 years in data modeling, SQL, and Python.
Bilingual: English/Spanish mandatory.
Nice-to-Have:
Experience with Power BI or other data visualization tools.
Experience with DevOps or Agile/Scrum teams.
Banking or financial institution (FI) sector experience.
Why You’ll Love This Role:
Work at the center of a data modernization journey for a global financial program.
Make a real impact by shaping data-driven insights across multiple geographies.
Hybrid work environment with flexible collaboration.
Apply today to join a team where your expertise powers key business decisions and supports enterprise-wide data initiatives. Send your resume to tinak@corgta.com
