Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • Bachelor's degree in Computer Science or related field.
  • 3–5 years of experience in data engineering roles.
  • Proficiency in BigQuery, Python, FastAPI, and PostgreSQL.
  • Experience with data ingestion techniques and workflow orchestration tools.

Key responsibilities:

  • Design and implement scalable data pipelines for increasing data loads.
  • Optimize SQL transformations and query performance in BigQuery.
  • Develop data ingestion processes from various sources, ensuring data integrity.
  • Utilize orchestration tools to schedule and monitor data workflows.

Evnek https://www.Evnek.com
11 - 50 Employees

Job description

This is a remote position.

Job Title: Data Engineer
Experience: 3–5 Years
Location: Remote
Notice Period: Immediate Joiners Only

Role Summary
We are seeking a skilled Data Engineer to design, develop, and maintain scalable and reliable data pipelines. The ideal candidate will have expertise in BigQuery, data ingestion techniques, and orchestration tools, along with a strong command of Python, FastAPI, and PostgreSQL. Experience handling end-to-end data workflows is essential.

Key Responsibilities
  • Data Pipeline Development: Design and implement scalable data pipelines that can handle increasing data loads without compromising performance.
  • BigQuery Optimization: Write complex SQL transformations and optimize query performance using best practices such as partitioning, clustering, and efficient JOIN operations (a brief sketch of this pattern follows this list).
  • Data Ingestion: Develop robust data ingestion processes from various sources, including RESTful APIs and file-based systems, ensuring data integrity and consistency.
  • Workflow Orchestration: Use orchestration tools such as Prefect or Apache Airflow to schedule and monitor data workflows, ensuring timely and reliable data processing (see the ingestion-flow sketch after this list).
  • Tech Stack Proficiency: Leverage Python and FastAPI for building data services and APIs, and manage data storage and retrieval using PostgreSQL (a minimal service sketch also follows this list).
  • End-to-End Workflow Management: Own the entire data workflow process, from ingestion and transformation to delivery, ensuring data quality and availability.
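
By way of illustration only, a minimal sketch of the partitioning and clustering pattern mentioned above, using the google-cloud-bigquery Python client; the dataset, table, and column names (analytics.events_by_day, event_ts, customer_id) are hypothetical placeholders, not details from this posting.

    # Minimal sketch: build a partitioned, clustered table and query it cheaply.
    # Dataset, table, and column names are illustrative placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()

    # Partition by event date and cluster by a frequently filtered key so
    # queries scan only the relevant partitions and blocks.
    ddl = """
    CREATE TABLE IF NOT EXISTS analytics.events_by_day
    PARTITION BY DATE(event_ts)
    CLUSTER BY customer_id AS
    SELECT * FROM analytics.raw_events
    """
    client.query(ddl).result()

    # Filtering on the partition column keeps scanned bytes (and cost) down.
    sql = """
    SELECT customer_id, COUNT(*) AS orders
    FROM analytics.events_by_day
    WHERE DATE(event_ts) BETWEEN @start AND @end
    GROUP BY customer_id
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
            bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
        ]
    )
    for row in client.query(sql, job_config=job_config).result():
        print(row.customer_id, row.orders)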
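
A similarly hedged sketch of the ingestion-plus-orchestration pattern, using Prefect (one of the two tools named above); the endpoint URL and the load step are hypothetical stand-ins.

    # Minimal Prefect sketch: pull JSON from a REST endpoint, then load it.
    # The endpoint URL and the load target are hypothetical placeholders.
    import requests
    from prefect import flow, task


    @task(retries=3, retry_delay_seconds=60)
    def extract(url: str) -> list[dict]:
        # Raise on HTTP errors so Prefect's retry policy can take over.
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        return resp.json()


    @task
    def load(rows: list[dict]) -> int:
        # Stand-in for the real load step (e.g. a BigQuery or PostgreSQL insert).
        print(f"would load {len(rows)} rows")
        return len(rows)


    @flow(name="daily-ingestion")
    def daily_ingestion(url: str = "https://api.example.com/orders") -> None:
        load(extract(url))


    if __name__ == "__main__":
        # Run ad hoc here; in production the flow would be deployed on a schedule.
        daily_ingestion()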
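
Finally, a minimal FastAPI/PostgreSQL sketch of the data-service side; the connection string, table, and route are assumptions made for illustration, not the company's actual API.

    # Minimal FastAPI sketch exposing rows from a PostgreSQL table.
    # Connection string, table, and route are illustrative assumptions.
    import os

    import psycopg2
    import psycopg2.extras
    from fastapi import FastAPI

    app = FastAPI()
    DSN = os.environ.get("DATABASE_URL", "postgresql://localhost/analytics")


    @app.get("/customers/{customer_id}/orders")
    def list_orders(customer_id: int, limit: int = 100):
        # A short-lived connection per request; a pool would be used in practice.
        with psycopg2.connect(DSN) as conn:
            with conn.cursor(cursor_factory=psycopg2.extras.RealDictCursor) as cur:
                cur.execute(
                    "SELECT id, amount, created_at FROM orders "
                    "WHERE customer_id = %s ORDER BY created_at DESC LIMIT %s",
                    (customer_id, limit),
                )
                return cur.fetchall()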

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Communication
  • Problem Solving
