Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • 5+ years of experience in data engineering or backend data development.
  • Strong knowledge of data pipeline design, integration frameworks, and ETL tools.
  • Proficiency in SQL and at least one programming language (e.g., Python, Scala).
  • Experience with cloud or hybrid data architectures and real-time data streaming frameworks.

Key responsibilities:

  • Design, build, and maintain scalable and reliable data pipelines for ingesting data from various sources.
  • Develop and maintain ETL/ELT processes to support real-time and batch analytics.
  • Collaborate with Data Architects to design optimal data models and storage structures for analytics workloads.
  • Work closely with business and analytics teams to understand data needs and translate them into technical solutions.

Intellectsoft (Computer Software / SaaS, SME, 51-200 employees)
https://www.intellectsoft.net/

Job description

Join our team in building a modern, high-impact Analytical Platform for one of the largest integrated resort and entertainment companies in Southeast Asia. The platform will serve as a unified environment for data collection, transformation, analytics, and AI-driven insights, powering decisions across marketing, operations, gaming, and more.

You’ll work closely with Data Architects, ML Engineers, Business Analysts, and DevOps to design and implement scalable data solutions.

Requirements

  • 5+ years of experience in data engineering or backend data development.
  • Strong knowledge of data pipeline design, integration frameworks, and ETL tools.
  • Experience working with cloud or hybrid data architectures.
  • Proficiency in SQL and at least one programming language (e.g., Python, Scala).
  • Hands-on experience with distributed data processing (e.g., Spark, Flink) is a plus.
  • Familiarity with data lake, data warehouse, or lakehouse architectures.
  • Experience with real-time data streaming and ingestion frameworks is a strong advantage.
  • Understanding of data security, privacy, and compliance best practices.
  • Experience working in Agile/Scrum environments.

Nice to have skills

  • Experience with modern open-source tools (e.g., Airflow, dbt, Delta Lake, Apache Kafka).
  • Exposure to machine learning pipelines or working alongside ML teams.
  • Familiarity with BI tools and data visualization concepts.
  • Experience working in regulated industries (e.g., gaming, finance, hospitality).

Responsibilities:

  • Design, build, and maintain scalable and reliable data pipelines for ingesting data from various sources (internal systems, APIs, external platforms).
  • Work with structured, semi-structured, and unstructured data, ensuring data quality, consistency, and integrity.
  • Develop and maintain ETL/ELT processes to support real-time and batch analytics.
  • Collaborate with Data Architects to design optimal data models and storage structures for analytics workloads.
  • Implement data validation, deduplication, and transformation logic.
  • Contribute to the definition of data governance, security, and access policies.
  • Participate in platform scaling and performance optimization initiatives.
  • Work closely with business and analytics teams to understand data needs and translate them into technical solutions.

Benefits

  • 35 absence days per year for work-life balance
  • Udemy courses of your choice
  • English courses with a native speaker
  • Regular soft-skills training
  • Excellence Centers meetups
  • Online/offline team-buildings
  • Business trips

Required profile

Experience

Industry: Computer Software / SaaS
Spoken language(s): English

Other Skills

  • Hospitality
  • Collaboration
  • Problem Solving
