GCP Data Engineer (CDF)
Experience: 3-8 years
Timings: IST
Rate: 1.25k
Primary Skills: Python, SQL, GCP, CDF (Cloud Data Fusion), CDAP
Secondary Skills: GCP (BigQuery/Dataproc) or other cloud DW/ETL tools such as IICS, AWS Glue, Azure Synapse, Snowflake, Talend
Required Experience
● Must be able to code in Python and SQL on Big Data platforms
● Must have extensive experience implementing DWH/ETL solutions involving multiple data sources (specifically SAP) and complex transformations for large enterprise customers, preferably Fortune 1000
● Must have 3+ years of experience developing and deploying data pipelines on cloud-native ETL tools, e.g., CDAP, IICS, AWS Glue, Azure Synapse, Snowflake, GCP Composer
● Must have prior experience migrating on-premises workloads to GCP CDF or other cloud-native services
● Must be proficient in troubleshooting and performance tuning of GCP services
● Must have executed projects using the Agile Scrum methodology and be familiar with all processes involved in Scrum
● Good to have: experience with cloud deployment of pipelines and orchestration tools (Airflow, Composer)
● Good to have: hands-on experience/working knowledge of JDBC, REST APIs, Hive, Java
● Should have experience designing data models that serve multiple applications on top of the same underlying model (common schemas across multiple scenarios)
● Should have extensive knowledge of large-scale data processing concepts and technologies