Job Description

  • Contractor
  • Anywhere

Role : GCP Data Engineer
Location: Toronto, Ontario, Canada
Job Type: Full Time/Contract
Work Model: Hybrid

Please find the JD below for your reference:
GCP Data Engineer: The focus is on a GCP Data Engineer with hands-on experience in data pipelines, Dataproc, Composer, Hive, BigQuery, SQL, Python, and Airflow.
10+ years of experience with data warehouses / data platforms
5+ years of experience creating ELT data pipelines from scratch, working with structured, semi-structured, and unstructured data and SQL
Experience building data pipelines with GCP services such as Dataproc, BigQuery, Cloud Spanner, Cloud Run functions, Dataflow, Pub/Sub, etc.
2+ years of experience configuring and using data ingestion tools such as dbt, Qlik, Airbyte, or others
5+ years of experience with cloud: GCP
5+ years of experience working as a data developer / data engineer on programming, ETL, ELT, and data integration processes
5+ years of experience with continuous integration and continuous deployment (CI/CD) pipelines, working with source control systems such as GitHub and Bitbucket, and infrastructure-as-code tools such as Terraform
Platform Engineer:
Experience: 8+ years in platform engineering or DevOps roles with a strong focus on Python, Kubernetes, and Docker.
Technical Skills:
Python (Expert): Advanced skills in scripting, automation, and integration with platform tools.
Kubernetes (Expert): Deep knowledge of Kubernetes architecture, deployment, and orchestration in cloud-native environments.
Docker: Proficient in containerization, Dockerfile optimization, and multi-stage builds.
CI/CD Tools: Strong experience with Jenkins and Bitbucket for automated builds and deployments.
Storage Solutions: Working knowledge of MinIO and Longhorn for data persistence and redundancy.
Soft Skills: Problem-solving, effective communication, and a collaborative mindset.