Offer summary
Qualifications:
- 5-7 years of experience
- Proficient in PySpark
- Experience with Data Lakehouse
- Knowledge of Amazon EMR and Apache Airflow

Key responsibilities:
- Develop and maintain data pipelines
- Optimize data processing and storage solutions