Job Description

About the job
Skills and Responsibilities:


Must-have skills:

• Excellent communicator, able to work with multiple stakeholders in a fast-paced environment

• Proficiency in scheduling data pipelines using Cloud Composer and Airflow

• Experience building and optimizing data storage solutions using GCP services such as BigQuery, Cloud Storage, and Cloud SQL

• Experience migrating data pipelines from RDBMS and big data platforms to GCP

• Experience with streaming services such as Kafka or Google Pub/Sub

• Proficiency in optimizing data processing workflows for efficiency, cost-effectiveness, and scalability

• Proven experience as a Data Engineer with hands-on GCP expertise

• Proficiency in Python, SQL, and data modeling

• Expertise in integrating streaming and batch processing using Dataflow

• Proficiency in NoSQL database systems such as Bigtable and HBase

• Experience building hybrid data pipelines between on-prem Hadoop platforms and Google Cloud Platform

• Experience building data warehouse and data mart solutions using BigQuery