GCP Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • Proficient in Python for data manipulation
  • Experience with GCP tools like BigQuery and Dataflow
  • Strong SQL skills for data management
  • Hands-on experience with ETL/ELT pipelines
  • Familiarity with data modeling and warehousing principles

Key responsibilities:

  • Design, develop, and maintain ETL/ELT pipelines
  • Implement data warehousing solutions on GCP
  • Optimize data pipelines for performance and scalability
  • Collaborate with stakeholders to address data needs
  • Ensure data quality and enforce governance best practices
Elfonze Technologies (Scaleup) · https://www.elfonze.com/ · 201-500 employees

Job description

This is a remote position.

Key Responsibilities:

  • Design, develop, and maintain ETL/ELT pipelines using Python and GCP data services.
  • Implement data warehousing solutions on GCP, including data ingestion, transformation, and storage strategies.
  • Optimize data pipelines and infrastructure for performance, scalability, and reliability.
  • Collaborate with data scientists, analysts, and stakeholders to understand data needs and translate requirements into technical solutions.
  • Monitor data quality, troubleshoot issues, and ensure accuracy across systems.
  • Implement best practices in data governance, security, and compliance on GCP.
  • Develop data processing workflows using GCP services such as Dataflow, BigQuery, and Pub/Sub (see the sketch after this list).
  • Continuously improve the data pipeline architecture and identify opportunities to optimize processes.
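
To give a concrete sense of the workflows named above, here is a minimal sketch of a streaming pipeline in Python using the Apache Beam SDK (the framework Dataflow runs) that reads JSON events from Pub/Sub and appends them to a BigQuery table. The project, subscription, table, and field names are illustrative placeholders, and schema handling is deliberately simplified.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message payload into a BigQuery-ready row."""
    record = json.loads(message.decode("utf-8"))
    # Field names are illustrative placeholders.
    return {
        "user_id": record["user_id"],
        "event_type": record["event_type"],
        "event_ts": record["event_ts"],
    }


def run() -> None:
    # streaming=True because Pub/Sub is an unbounded source.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

The same code runs locally on the DirectRunner for testing and deploys to Dataflow by passing `--runner=DataflowRunner` along with project, region, and temp-location options.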

Mandatory Skills and Qualifications:

  • Python Programming: Proficiency in Python for data manipulation, ETL/ELT, and automation.
  • GCP Data Engineering Services: Experience with key GCP tools such as BigQuery, Dataflow, Dataproc, Cloud Composer, and Pub/Sub.
  • SQL Proficiency: Strong SQL skills for querying, analyzing, and managing data in cloud data warehouses.
  • ETL/ELT Development: Hands-on experience designing and building ETL/ELT pipelines for large datasets.
  • Data Modeling: Familiarity with data modeling concepts and best practices for structured and semi-structured data.
  • Data Warehousing: Understanding of data warehousing principles and experience with cloud-based data warehouses, particularly BigQuery.
  • Version Control: Experience using Git or other version control systems.
  • Data Quality and Monitoring: Proficiency in ensuring data accuracy and implementing data validation/testing practices (see the sketch after this list).
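
As a basic example of the validation practice named in the last item, the snippet below uses the google-cloud-bigquery client to fail a run when a key column contains NULLs. The table and column names are hypothetical.

```python
from google.cloud import bigquery


def check_no_null_keys(table: str, key_column: str) -> None:
    """Raise if the key column contains NULLs: a simple data-quality gate."""
    client = bigquery.Client()
    query = (
        f"SELECT COUNT(*) AS null_rows "
        f"FROM `{table}` WHERE {key_column} IS NULL"
    )
    row = next(iter(client.query(query).result()))
    if row.null_rows > 0:
        raise ValueError(
            f"{row.null_rows} rows in {table} have a NULL {key_column}"
        )


if __name__ == "__main__":
    # Placeholder identifiers; in practice such checks would be scheduled,
    # e.g. from a Cloud Composer DAG.
    check_no_null_keys("my-project.analytics.events", "user_id")
```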

Preferred Skills (Not Mandatory):

  • Experience with Apache Spark or other distributed data processing frameworks.
  • Familiarity with Terraform or other Infrastructure-as-Code (IaC) tools for GCP resource provisioning.
  • Knowledge of CI/CD practices and tools for data engineering.
  • Google Cloud certifications (e.g., Professional Data Engineer).


Required profile

Spoken language(s):
English

Other Skills

  • Problem Solving
  • Collaboration
