GCP Data Engineer

Remote: 
Full Remote

Offer summary

Qualifications:

  • Proficiency in BigQuery
  • Experience with Cloud Composer
  • Knowledge of DataStage and Informatica
  • Familiarity with Python is a plus

Key responsibilities:

  • Design and maintain data pipelines on GCP
  • Integrate and transform diverse data sources
  • Manage GCP resources for availability and security
  • Implement big data solutions and optimize performance
  • Ensure data quality and compliance
Elfonze Technologies Scaleup https://www.elfonze.com/
201 - 500 Employees

Job description

This is a remote position.

Job Description
  • Data Pipeline Development: Design, implement, and maintain robust and scalable data pipelines using GCP services (such as Cloud Dataflow, Cloud Dataproc, BigQuery, Cloud Storage, Pub/Sub, and Cloud Functions).
  • Data Integration: Extract, transform, and load (ETL) data from diverse sources (e.g., on-premises databases, cloud data stores, APIs) into GCP-based storage and processing solutions.
  • Cloud Infrastructure Management: Manage GCP resources and services, ensuring high availability, scalability, and security for data-related workloads.
  • Big Data Processing: Implement big data solutions using GCP tools such as BigQuery, Cloud Dataproc, and Apache Beam to handle large volumes of structured and unstructured data.
  • Data Warehousing: Build and maintain BigQuery data models and work with large-scale data warehouses for analytics and reporting.
  • Data Quality Assurance: Ensure data integrity, consistency, and quality through validation, error handling, and data cleansing processes.
  • Performance Optimization: Optimize data workflows, queries, and infrastructure for performance and cost efficiency on GCP.
  • Collaboration: Work closely with Data Scientists, Data Analysts, and Business Intelligence teams to ensure data meets analytical needs and is optimized for insights.
  • Security and Compliance: Implement best practices for data governance, security, and compliance, ensuring that all data processes meet industry standards and regulatory requirements.
  • Automation and Monitoring: Build automated processes for data pipeline monitoring, alerting, and performance tracking to ensure smooth operation in production environments.
  • Documentation: Document data architecture, pipeline designs, and processes, ensuring clarity and knowledge sharing across teams.


  • Requirements
    Mandatory skills: BigQuery, Cloud Composer
    Nice-to-have skills: Python, DataStage, Informatica

    Required profile

    Experience

    Spoken language(s):
    English

    Other Skills

    • Collaboration
