Offer summary
Qualifications:
- 3+ years of experience in data engineering
- Proficiency in Spark, PySpark, or Scala
- Experience with Redshift or Trino
- Familiarity with BI tools such as Looker or Tableau
- Knowledge of ETL with Apache Airflow is a plus

Key responsibilities:
- Evaluate technologies and develop POCs
- Build scalable and reliable business applications
- Design and maintain data pipelines
- Mentor junior engineers and collaborate with teams
- Drive architecture decisions and improvements