Data Engineer (Snowflake)

Remote: Full Remote

Offer summary

Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • Experience with Snowflake and AWS services such as S3, Glue, and Lambda.
  • Proficiency in data modeling, ETL/ELT processes, and data pipeline development.
  • Strong analytical skills and the ability to collaborate with cross-functional teams.

Key responsibilities:

  • Design, build, and maintain enterprise-grade data warehouses using Snowflake.
  • Develop scalable and secure data pipelines on AWS.
  • Collaborate with business and analytics teams to translate requirements into technical solutions.
  • Implement and maintain ETL/ELT processes for structured and semi-structured data.

Addepto (Startup) · http://www.addepto.com
51 - 200 Employees

Job description

Addepto is a leading consulting and technology company specializing in AI and Big Data, helping clients deliver innovative data projects. We partner with top-tier global enterprises and pioneering startups, including Rolls Royce, Continental, Porsche, ABB, and WGU. Our exclusive focus on AI and Big Data has earned us recognition by Forbes as one of the top 10 AI consulting companies.


As a Data Engineer, you will have the exciting opportunity to work with a team of technology experts on challenging projects across various industries, leveraging cutting-edge technologies. Here are some of the projects for which we are currently seeking talented individuals:

  • Development of an operational data warehouse for a major automotive client, supporting near real-time processing and laying the foundations for integrating AI into their business processes.

  • Development of a modern data warehouse for a client in the US retail industry, enabling them to make data-driven decisions and build the foundations for future AI features. This role requires a consultant mindset to help and guide the client through the product roadmap.

  • Development and maintenance of a large platform for processing automotive data. A significant amount of data is processed in both streaming and batch modes. The technology stack includes Spark, Cloudera, Airflow, Iceberg, Python, and AWS.
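
To give a flavour of how batch work on such a platform might be orchestrated, below is a minimal sketch assuming Apache Airflow 2.x; the DAG id, schedule, and task body are hypothetical placeholders rather than the project's actual pipeline.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def process_batch(**_):
        # Placeholder for a batch step, e.g. submitting a Spark job that
        # writes to Iceberg tables; the real logic is project-specific.
        print("processing batch partition")


    with DAG(
        dag_id="automotive_batch_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@hourly",  # Airflow 2.4+; older releases use schedule_interval
        catchup=False,
    ) as dag:
        PythonOperator(task_id="process_batch", python_callable=process_batch)

Streaming workloads would typically run outside Airflow (for example as long-lived Spark Structured Streaming jobs), with Airflow handling scheduled batch and maintenance tasks.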


🚀 Your main responsibilities:

  • Design, build, and maintain enterprise-grade data warehouses using Snowflake.

  • Develop scalable and secure data pipelines on AWS (e.g., S3, Glue, Lambda, IAM).

  • Collaborate with business and analytics teams to translate requirements into technical solutions.

  • Build data models and dashboards in Tableau to support business decision-making.

  • Implement and maintain ETL/ELT processes for structured and semi-structured data (see the sketch after this list).

  • Optimize performance, cost, and scalability of cloud-based data platforms.

  • Ensure data governance, security, and quality best practices are followed.

  • Participate in architecture decisions and recommend improvements to existing systems.
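
As a rough illustration of the ELT work described above, here is a minimal sketch assuming the snowflake-connector-python package; the connection parameters, the s3_events_stage external stage, the raw_events staging table (assumed to hold a single VARIANT column named payload), and the reporting.events target table are all hypothetical.

    import snowflake.connector

    # Hypothetical connection details; in practice these would come from a
    # secrets manager or environment variables rather than being hard-coded.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )

    try:
        cur = conn.cursor()
        # Land semi-structured JSON from a (hypothetical) external S3 stage into
        # a staging table with a single VARIANT column named "payload".
        cur.execute(
            "COPY INTO raw_events FROM @s3_events_stage "
            "FILE_FORMAT = (TYPE = 'JSON')"
        )
        # Flatten the raw documents into a typed reporting table, a common ELT
        # pattern for semi-structured data in Snowflake.
        cur.execute(
            """
            INSERT INTO reporting.events (event_id, event_ts, payload)
            SELECT payload:id::string, payload:ts::timestamp_ntz, payload
            FROM raw_events
            """
        )
    finally:
        conn.close()

In a production setup the load step would more often be automated with Snowpipe or triggered from AWS services such as Glue or Lambda, in line with the responsibilities above.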

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Collaboration
  • Problem Solving
