
Analytical Engineer

Remote: Full Remote

Offer summary

Qualifications:

Solid foundation in SQL and Python; experience with modern data frameworks.

Key responsibilities:

  • Lead project delivery and client engagement
  • Design and implement data pipelines
  • Contribute to data platform features
  • Support data modeling in a lakehouse context
  • Write and validate data quality checks
Aboitiz Data Innovation · Information Technology & Services · Scaleup · https://adi.tech · 51-200 employees

Job description


Your missions

Aboitiz Data Innovation (ADI) is one of the leading up-and-coming start-ups in the fields of Data Science and Artificial Intelligence. We believe data can drive change for a better world by advancing businesses across industries and communities.

We are seeking talented and motivated Analytical Engineers (Mid-level) to join our Data Platform team: engineers who are passionate about designing robust data models, developing complex data pipelines and data quality checks, and helping our clients navigate the complex world of data.


If you are excited to work in a fast-paced start-up environment while tackling a variety of interesting traditional business domains, this role will be a great opportunity to grow your career with us.


As a Mid-level Analytical Engineer, you will work closely with our Data Engineers, Data Architect, and Data Platform Engineers to support the design, development, and maintenance of lakehouses. You will have the opportunity to work with cutting-edge, open-source technologies and contribute to the growth of our data capabilities, while being mentored in all aspects of the data engineering life cycle. You will also gain hands-on experience in driving and leading project delivery in the data engineering field.


Responsibilities:

This role is client-facing and focused on project delivery.


Commercial Projects

  • Lead the end-to-end delivery of small to medium-sized commercial projects, and assist in the delivery of large or mega-sized projects as an individual contributor
  • Conduct or participate in requirement-gathering meetings with clients
  • Independently lead the design of data models in a lakehouse context
  • Collaborate with Data Engineers and Data Platform Engineers to develop solutions using our internal data tooling, standards, and best practices
  • Design and implement data pipelines in a config-driven approach, including but not limited to ingestion, transformation, and reverse ETL, based on client functional requirements (see the sketch after this list)
  • Collaborate with Machine Learning Engineers to design, create, and maintain upstream pipelines for ML feature engineering in a config-driven approach
  • Write data quality checks to validate the accuracy, completeness, and consistency of data at various stages of the pipelines
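
To illustrate the config-driven approach, here is a minimal sketch in Python. The YAML schema, the column names, and the silver.orders target are hypothetical illustrations, not ADI's actual tooling; a real pipeline would write to a lakehouse table instead of printing.

```python
import io

import pandas as pd
import yaml

# Hypothetical config schema; ADI's actual config format is not public.
PIPELINE_CONFIG = """
transform:
  drop_duplicates: true
  rename:
    order_ts: order_timestamp
target:
  table: silver.orders
"""

# Stand-in for a raw source file landed in the lake.
RAW_SOURCE = io.StringIO(
    "order_id,order_ts\n1,2024-01-01\n1,2024-01-01\n2,2024-01-02\n"
)

def run_pipeline(config: dict, raw) -> pd.DataFrame:
    df = pd.read_csv(raw)
    steps = config.get("transform", {})
    if steps.get("drop_duplicates"):  # apply only the steps the config requests
        df = df.drop_duplicates()
    if "rename" in steps:
        df = df.rename(columns=steps["rename"])
    # A real implementation would write df to config["target"]["table"].
    print(f"{len(df)} rows ready for {config['target']['table']}")
    return df

run_pipeline(yaml.safe_load(PIPELINE_CONFIG), RAW_SOURCE)
```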


Data Platform / Data Product

  • Assist in ideating, iterating on, designing, and implementing features, from scratch or based on existing commercial projects, that provide additional value to platform users, including but not limited to billing, alerts, query usage analysis, and storage/compute resource usage analysis (a toy usage-analysis sketch follows this list)
  • Contribute directly to, or suggest improvements for, platform tooling and internal data products by identifying the reusable parts of commercial projects, accelerating the delivery of future projects and improving our platform's efficiency and capabilities
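
As a toy illustration of the query-usage-analysis idea, the sketch below aggregates hypothetical query logs per user; real platform metrics would come from the warehouse's query history or audit tables, and the schema here is made up.

```python
import pandas as pd

# Hypothetical query-log records; a real platform would pull these
# from the warehouse's query history / audit tables.
logs = pd.DataFrame({
    "user": ["ana", "ben", "ana", "ana"],
    "bytes_scanned": [1_200, 50_000, 300, 7_500],
})

# Per-user query count and total bytes scanned: a starting point
# for billing, alerting, or resource-usage features.
usage = logs.groupby("user")["bytes_scanned"].agg(
    queries="count", total_bytes="sum"
)
print(usage)
```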


Requirements

  • Solid foundation in SQL and Python for analytical use cases
  • Solid understanding of data modeling; past hands-on experience in data modeling is a must
  • Solid understanding of modern data warehouses, lakehouses, or data lakes; past experience with at least one modern warehouse/lakehouse solution (e.g., Delta Lake, Redshift, Snowflake) is a must
  • Experience implementing data quality checks (a minimal example follows this list)
  • Experience with modern data processing frameworks (e.g., dbt)
  • Experience with modern orchestrators (e.g., Airflow, Dagster)
  • Excellent communication and collaboration abilities
  • Hands-on experience of cloud platforms (e.g., AWS, Azure, GCP)
  • Strong analytical and problem-solving skills
  • Self-motivated with a passion for learning and continuous improvement
  • Good to know:
    • Knowledge of traditional database concepts (e.g., Postgres)
    • Databricks (on either Azure or AWS) and an understanding of Spark
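
For the data-quality requirement, here is a minimal Python sketch of the kind of accuracy, completeness, and consistency checks described above; the table and column names are made up for illustration.

```python
import pandas as pd

def check_quality(df: pd.DataFrame) -> list:
    """Return human-readable descriptions of failed checks."""
    failures = []
    if df["order_id"].isna().any():        # completeness: no null keys
        failures.append("order_id contains nulls")
    if df["order_id"].duplicated().any():  # consistency: unique primary key
        failures.append("order_id is not unique")
    if (df["amount"] < 0).any():           # accuracy: non-negative amounts
        failures.append("amount contains negative values")
    return failures

demo = pd.DataFrame({"order_id": [1, 2, 2], "amount": [9.5, -1.0, 3.0]})
print(check_quality(demo))
# -> ['order_id is not unique', 'amount contains negative values']
```

In practice, such checks would run as tasks inside an orchestrator like Airflow or Dagster (both listed above) against lakehouse tables.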

Required profile

Experience

Industry: Information Technology & Services

Soft Skills

  • Problem Solving
  • Communication
  • Self-Motivation
  • Collaboration
  • Analytical Skills
