Offer summary
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 2+ years developing large-scale software and 5+ years in data engineering
- Strong experience with Spark, Scala, Python, and AWS services
- Experience with data warehousing, SQL, and Airflow; strong problem-solving skills
- Knowledge of data governance and security best practices
- High level of English proficiency
Key responsibilities:
- Work with stakeholders to prioritize data projects and align infrastructure with business goals
- Design and maintain optimal data pipeline architecture for ingesting data from various sources
- Monitor and optimize data pipeline performance and identify opportunities for improvement
- Solve complex problems, implement data privacy and security solutions
- Enhance the team's DevOps capabilities and stay current with data engineering trends