Offer summary
Qualifications:
- Bachelor's degree in Computer Science or a related field
- At least 3 years of data engineering experience
- Strong expertise in Python, Snowflake, PySpark, and AWS
- Proficient in ELT/ETL tools and streaming technologies
- Advanced SQL skills for optimized data integration
Key responsibilities:
- Build and maintain data pipelines to Snowflake
- Support Data Operations with scalable solutions
- Standardize data pipeline frameworks across regions
- Ensure data reliability and quality management
- Collaborate with cross-functional teams to gather requirements