Offer summary
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 4-7 years of experience in big data systems, data processing, and SQL databases.
- 3+ years coding with Spark data frames, Spark SQL, & PySpark (see the illustrative sketch after this list).
- Experience with SQL, dimensional modeling, Hive, Snowflake, Airflow, and Looker.
- Experience building reports and dashboards using BI tools.
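For context, a minimal sketch of the kind of Spark DataFrame and Spark SQL work referenced above. The dataset, column names, and storage paths (orders, order_ts, amount, s3://example-bucket/...) are hypothetical and only illustrate the style of batch processing involved.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-revenue").getOrCreate()

# Read a batch of raw order events (path is hypothetical).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# DataFrame API: aggregate revenue per day.
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Equivalent aggregation expressed in Spark SQL over a temporary view.
orders.createOrReplaceTempView("orders")
daily_sql = spark.sql("""
    SELECT to_date(order_ts) AS order_date,
           SUM(amount)       AS revenue
    FROM orders
    GROUP BY to_date(order_ts)
""")

# Write the curated result back out for downstream BI dashboards.
daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_revenue/")
```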
Key responsibilities:
- Architect scalable systems based on business requirements.
- Build and scale data infrastructure for batch & real-time processing.
- Automate cloud infrastructure, CI/CD pipelines, & testing.
- Collaborate with stakeholders like data scientists and product managers.
- Provide critical insights for analytics/data-driven decision-making.