GCP Data Engineer_Roji_Thetechglor

Remote: Full Remote

Offer summary

Qualifications:

  • 3-8 years of experience in data engineering, specifically with GCP and ETL solutions.
  • Proficiency in Python and SQL, particularly on Big Data platforms.
  • Experience with cloud-native ETL tools and migrating workloads to GCP.
  • Familiarity with Agile Scrum methodology and data modeling concepts.

Key responsibilities:

  • Develop and deploy data pipelines using cloud native ETL tools.
  • Implement DWH/ETL solutions involving multiple data sources and complex transformations.
  • Troubleshoot and optimize performance of GCP services.
  • Collaborate with teams to design data models that support multiple applications.

CodersBrain SME https://www.codersbrain.com/
201 - 500 Employees

Job description

GCP Data Engineer (CDF)
 
Experience: 3-8 years
Timings: IST
Rate: 1.25k

Primary Skills : Python, SQL, GCP, CDF, CDAP
Secondary Skills: GCP (BigQuery/Dataproc) or other cloud DW/ETL tools: IICS, AWS Glue, Azure Synapse, Snowflake, Talend
 
Required Experience
● Must be able to code in Python and SQL on Big Data platforms
● Must have extensive experience implementing DWH/ETL solutions involving multiple data sources (SAP in particular) and complex transformations for large enterprise customers, preferably Fortune 1000
● Must have 3+ years of experience developing and deploying data pipelines on cloud-native ETL tools, e.g. CDAP, IICS, AWS Glue, Azure Synapse, Snowflake, GCP Composer, etc.
● Must have prior experience migrating on-premises workloads to GCP CDF or other cloud-native services
● Must be proficient in troubleshooting and performance tuning of GCP services
● Must have executed projects using the Agile Scrum methodology and be familiar with all processes involved in Scrum
● Good to have: experience with cloud deployment of pipelines and orchestration tools (Airflow, Composer)
● Good to have: hands-on experience/working knowledge of JDBT, REST APIs, Hive, Java
● Should have experience designing data models that serve multiple applications on a shared underlying model (common schemas across multiple scenarios)
● Should have extensive knowledge of large-scale data processing concepts and
technologies.

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Troubleshooting
  • Problem Solving
