Lead Data Engineer

Job Description

  • Contractor
  • Anywhere

Hello Connections!

This is a contract position for a Lead Data Engineer based in Burnaby, Canada. The candidate must hold a valid work visa and attend the office 2-3 days a week in Burnaby. Please share your updated resume at bijaya.padhi@firstvibe.ca
Lead Data Engineer with 8+ years of experience in Telecom Supply Chain
We are seeking an experienced Lead Data Engineer to design, build, and optimize scalable data solutions for our Telecom Supply Chain analytics and operations platform. The role requires technical expertise in Apache Spark, Scala, Python, Azure, and Databricks, along with strong domain knowledge of telecom supply chain processes such as inventory management, logistics, fulfillment, and device lifecycle management.
Data Architecture & Engineering
Design and develop end-to-end data pipelines leveraging Apache Spark, Scala, and Python for large-scale processing.
Architect and maintain data lake and data warehouse solutions on Azure and Databricks.
Implement robust ETL/ELT processes for ingesting structured, semi-structured, and unstructured data from diverse telecom supply chain systems (SAP, Oracle ERP, logistics feeds, etc.).
Drive best practices in data modeling, partitioning, and schema design to support analytical workloads.
Performance Optimization
Optimize Spark jobs for performance, scalability, and cost-efficiency.
Monitor and fine-tune Azure Data Lake Storage, Synapse, and Databricks environments for reliability and throughput.
Implement CI/CD pipelines for automated testing and deployment of data solutions.
Collaboration & Cross-functional Integration
Required Technical Skills
Programming Languages: Advanced in Scala, Python, and SQL
Big Data Frameworks: Expert in Apache Spark, with experience in Delta Lake
Cloud Platforms: Hands-on with Microsoft Azure (Data Lake, Synapse, Data Factory)
Data Platforms: Databricks, with advanced experience in workspace configuration, notebooks, and clusters
ETL Tools: Strong experience with Azure Data Factory, Airflow, or equivalent orchestration tools
Version Control & CI/CD: Git, Azure DevOps, Jenkins
Database Systems: SQL Server, Oracle, Snowflake (preferred)
Data Modeling: Dimensional modeling, Data Vault, or similar methodologies
Performance Optimization: Spark tuning, cluster scaling, and data partitioning strategies

Domain Expertise
Deep understanding of Telecom Supply Chain operations, including:
Device procurement and logistics
Inventory and warehouse management
Order fulfillment and reverse logistics
Network asset lifecycle and configuration data
Familiarity with ERP systems (SAP ABAP, Oracle ERP) and their integration with data platforms.
Proven experience delivering enterprise-grade data solutions in a cloud-native, large-scale environment.
Strong analytical, problem-solving, and communication skills.
Telecom or Supply Chain project experience is mandatory.

Preferred Qualifications
Azure Data Engineer or Databricks certification.
Experience with real-time streaming (Kafka, Event Hubs).
Exposure to machine learning pipelines and MLOps integration.
Familiarity with Agile delivery and DevOps practices.