Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
Proficiency in SQL and experience with data warehousing solutions.
Familiarity with ETL processes and data pipeline construction.
Knowledge of programming languages such as Python or Java is a plus.
Key responsibilities:
Design and implement scalable data pipelines to support analytics and reporting.
Collaborate with data scientists and analysts to understand data needs.
Monitor and optimize data systems for performance and reliability.
Ensure data quality and integrity throughout the data lifecycle.
Qinshift is a global technology company with a strong European presence, solving business problems for forward-leaning companies worldwide. Our team of over 3,000 genuine tech experts designs and builds software, delivering end-to-end enterprise solutions, visionary UX and UI design, reliable managed services, and innovative product development, alongside cutting-edge tech consultancy services. We serve a diverse clientele, including large telecom and satellite operators, financial and banking institutions, manufacturing and automotive companies, and mobility and health organizations, supporting their digital transformation journeys. With an ever-present human perspective, we focus on adding real value and innovative solutions that shape long-term relationships with our customers. Our delivery model ensures seamless, multi-location delivery of solutions, even for the most challenging projects. Qinshift is part of the KKCG Technology pillar.