Offer summary
Qualifications:
- Bachelor's or Master's degree in Computer Science or Informatics
- Experience with programming languages and frameworks such as PySpark, Scala, or Python
- Proficiency in Apache Spark
- Experience with data processing pipelines

Key responsibilities:
- Collaborate with architects and analysts
- Develop and optimize data ingestion libraries
- Design real-time data streaming solutions
- Create end-to-end data processing pipelines