Offer summary
Qualifications:
- Bachelor’s degree in a quantitative/technical field
- 3+ years of data engineering experience
- Strong SQL, Python, and cloud platform knowledge
- Experience with Redshift, Snowflake, or BigQuery
- Strong analytical and troubleshooting skills
Key responsibilities:
- Develop and optimize ETL pipelines using tools like Apache Airflow
- Collaborate with data producers to design efficient data models
- Implement data governance practices and ensure scalability
- Identify and resolve data-related issues to optimize workflows
- Contribute to modern data architecture development and maintenance