Requirements:
Bachelor's degree in Computer Science or a related field.
3-5 years of experience in data engineering roles.
Proficiency in BigQuery, Python, FastAPI, and PostgreSQL.
Experience with data ingestion techniques and workflow orchestration tools.
Key responsibilities:
Design and implement scalable data pipelines for increasing data loads.
Optimize SQL transformations and query performance in BigQuery.
Develop data ingestion processes from various sources, ensuring data integrity.
Utilize orchestration tools to schedule and monitor data workflows.
Evnek Technologies strives to be the leader in cloud and analytics services. The insights and quality services we deliver help build trust and confidence in our clients. We develop outstanding relationships with our clients and deliver on our promises. Moreover, we challenge traditional technologies and look for opportunities to innovate.
Evnek Technologies specializes in Cloud and Enterprise Data Management. We offer services in data integration, business intelligence, data warehousing, architecture, modeling and analytics. We are thought leaders and work with the latest technologies in the Big Data, Cloud, and Data Science space.
Evnek Technologies is passionate about adding value. We partner with our clients and ensure a positive user experience all the while building elegant and scalable solutions that evolve with the company and remain dynamic in an ever-changing business landscape.
We are seeking a skilled Data Engineer to design, develop, and maintain scalable and reliable data pipelines. The ideal candidate will have expertise in BigQuery, data ingestion techniques, and orchestration tools, along with a strong command of Python, FastAPI, and PostgreSQL. Experience in handling end-to-end data workflows is essential.
Key Responsibilities
Data Pipeline Development: Design and implement scalable data pipelines that can handle increasing data loads without compromising performance.
BigQuery Optimization: Write complex SQL transformations and optimize query performance using best practices such as partitioning, clustering, and efficient JOIN operations (see the first sketch after this list).
Data Ingestion: Develop robust data ingestion processes from various sources, including RESTful APIs and file-based systems, ensuring data integrity and consistency (sketched below).
Workflow Orchestration: Utilize orchestration tools like Prefect or Apache Airflow to schedule and monitor data workflows, ensuring timely and reliable data processing (sketched below).
Tech Stack Proficiency: Leverage Python and FastAPI for building data services and APIs, and manage data storage and retrieval using PostgreSQL (sketched below).
End-to-End Workflow Management: Own the entire data workflow process, from ingestion and transformation to delivery, ensuring data quality and availability.
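For illustration, here is a minimal sketch of the partitioning and clustering approach named in the BigQuery Optimization item, using the google-cloud-bigquery client. The dataset, table, and column names (analytics.events_raw, event_ts, user_id) are hypothetical, not taken from the posting.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes GCP credentials are already configured

# Illustrative DDL: partition by event date and cluster by a frequently
# filtered column so queries scan only the partitions and blocks they need.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.events_partitioned
PARTITION BY DATE(event_ts)
CLUSTER BY user_id
AS
SELECT event_ts, user_id, event_name, payload
FROM analytics.events_raw
"""
client.query(ddl).result()

# A filter on the partitioning column lets BigQuery prune partitions.
sql = """
SELECT user_id, COUNT(*) AS events
FROM analytics.events_partitioned
WHERE DATE(event_ts) = @day
GROUP BY user_id
"""
job = client.query(
    sql,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("day", "DATE", "2024-01-01")]
    ),
)
for row in job.result():
    print(row.user_id, row.events)
```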
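The ingestion item could look like the following sketch: paging through a hypothetical REST endpoint with explicit failure handling so partial responses are not silently loaded. The page parameter and results key are assumptions about the source API.

```python
import requests

def fetch_records(base_url: str, api_key: str) -> list[dict]:
    """Pull every page from a paginated REST endpoint (hypothetical shape)."""
    records: list[dict] = []
    page = 1
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {api_key}"
    while True:
        resp = session.get(f"{base_url}/records", params={"page": page}, timeout=30)
        resp.raise_for_status()  # fail loudly rather than ingest partial data
        batch = resp.json().get("results", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records
```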
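For the orchestration item, a minimal Prefect 2.x flow might look like this; task bodies are placeholders, and the schedule comment reflects one possible deployment style rather than a prescribed setup.

```python
from prefect import flow, task

@task(retries=3, retry_delay_seconds=60)
def extract() -> list[dict]:
    # Placeholder: in practice this would call ingestion code like fetch_records.
    return [{"id": 1}]

@task
def load(rows: list[dict]) -> None:
    # Placeholder for a BigQuery or PostgreSQL load step.
    print(f"loaded {len(rows)} rows")

@flow(log_prints=True)
def daily_pipeline() -> None:
    load(extract())

if __name__ == "__main__":
    # Local run; a production deployment would attach a schedule,
    # e.g. daily_pipeline.serve(name="daily", cron="0 2 * * *") in Prefect 2.x.
    daily_pipeline()
```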
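Finally, a small sketch of a Python/FastAPI service backed by PostgreSQL, as in the tech-stack item. The DSN and the pipeline_runs table are illustrative assumptions; a production service would likely use a connection pool or an async driver instead of one connection per request.

```python
from fastapi import FastAPI, HTTPException
import psycopg2
import psycopg2.extras

app = FastAPI()
DSN = "postgresql://user:password@localhost:5432/analytics"  # placeholder DSN

@app.get("/pipelines/{pipeline_id}/status")
def pipeline_status(pipeline_id: int) -> dict:
    # One connection per request keeps the sketch simple.
    with psycopg2.connect(DSN) as conn:
        with conn.cursor(cursor_factory=psycopg2.extras.RealDictCursor) as cur:
            cur.execute(
                "SELECT id, name, last_run, status FROM pipeline_runs WHERE id = %s",
                (pipeline_id,),
            )
            row = cur.fetchone()
    if row is None:
        raise HTTPException(status_code=404, detail="pipeline not found")
    return dict(row)
```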
Required profile
Experience: 3-5 years in data engineering roles.
Spoken language(s): English