Offer summary
Qualifications:
- Bachelor’s degree or equivalent experience
- 5+ years of Data Engineering or ETL Development experience
- Strong experience with PySpark, Python, Iceberg, Hive, S3, Trino
- Hands-on experience with Hadoop, relational databases, SQL
- Familiarity with Agile methodologies, GitLab, CI/CD
Key responsibilities:
- Designs and develops applications and systems
- Estimates system work efforts for solutions
- Manages end-user requests and provides support
- Leads cross-functional initiatives and meets deadlines
- Provides training and guidance to less experienced staff