
Data Engineer (w/ Python)

Remote: Full Remote

Offer summary

Qualifications:

  • Proficiency in Python and ETL tools
  • Experience with SQL databases and APIs
  • Strong understanding of data processing pipelines
  • Bachelor's degree in Computer Science or a related field

Key responsibilities:

  • Design and implement data processing pipelines
  • Build components to connect with data sources
The Codest | Information Technology & Services | SME | 51-200 employees
https://thecodest.co/

Job description

🌍 Hello World!

We are The Codest, an international tech software company with tech hubs in Poland, delivering global IT solutions and projects. Our core values lie in a “Customers and People First” approach that prioritises the needs of our customers and a collaborative environment for our employees, enabling us to deliver exceptional products and services.

Our expertise centers on web development, cloud engineering, DevOps, and quality. After many years of developing our own product, Yieldbird, honored as a laureate of the prestigious Top25 Deloitte awards, we arrived at our mission: to help tech companies build impactful products and scale their IT teams by boosting IT delivery performance. Through our extensive experience with product development challenges, we have become experts in building digital products and scaling IT teams.

But our journey does not end here: we want to keep growing. If you’re goal-driven and looking for new opportunities, join our team! What awaits you is an enriching, collaborative environment that fosters your growth at every step.

We are currently looking for:

DATA ENGINEER

Here, you will have the opportunity to contribute to a banking app for one of the leading financial groups in Japan. The platform includes banking modules and data management features and is customer-facing. The Data Flow Team of 20 members is dedicated to integrating internal, vendor-hosted, and third-party systems, managing data flows triggered by specific rules or events at frequencies ranging from every ten minutes to monthly.

💡 Key Responsibilities:

  • Design and implement data processing pipelines according to requirements, including steps for data transformation, validation, and mapping (see the sketch after this list)

  • Build necessary components to connect with various data sources and destinations, such as APIs, SQL databases, S3 buckets, and SFTP servers

  • Update and modify existing data flows within the ETL tool as needed

  • Conduct comprehensive testing and validation to ensure the accuracy of data transformations, verification, and final delivery

  • Create and execute unit and regression tests

  • Provide post-deployment support and resolve any issues that arise
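
For illustration, a minimal, hypothetical Python sketch of the transform-and-validate step named in the first responsibility above; the record shape, field names, and validation rules are assumptions made for the example, not details taken from this role:

    # Hypothetical sketch: the record shape, field names, and rules below are
    # illustrative assumptions, not taken from the job description.
    from dataclasses import dataclass

    @dataclass
    class Transaction:
        account_id: str
        amount_minor: int  # amount in minor units (e.g. cents)
        currency: str

    def transform(raw: dict) -> Transaction:
        """Map raw source fields onto the target schema."""
        return Transaction(
            account_id=raw["accountId"],
            amount_minor=round(float(raw["amount"]) * 100),
            currency=raw.get("currency", "JPY"),
        )

    def validate(tx: Transaction) -> Transaction:
        """Reject records that break basic invariants before delivery."""
        if not tx.account_id:
            raise ValueError("missing account_id")
        if len(tx.currency) != 3:
            raise ValueError(f"bad currency code: {tx.currency!r}")
        return tx

    def run_pipeline(records: list[dict]) -> list[Transaction]:
        """One pipeline step: transform, then validate, each record."""
        return [validate(transform(r)) for r in records]

A unit test for such a step would feed run_pipeline a known input and assert on the resulting Transaction values, in line with the unit and regression testing responsibility above.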



Required profile

Experience

Industry:
Information Technology & Services
Spoken language(s):
English

Other Skills

  • Teamwork
  • Problem Solving
