Offer summary
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- At least 7 years of industry experience with big data systems.
- 3+ years of coding experience with Spark DataFrames and PySpark.
- Strong knowledge of SQL, dimensional modeling, Hive, and Snowflake.
- Familiarity with ETL workflow tools, version control, and CI/CD.
Key responsibilities:
- Drive innovative solutions and foster technical excellence.
- Serve as a cornerstone of a dynamic team.
- Apply problem-solving skills to make sound decisions efficiently.