
Senior Data Engineer (GCP)

Remote: Full Remote
Contract: B2B, UoP
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Proficiency in Python and SQL
  • Experience with GCP, messaging systems, and DWH platforms
  • Knowledge of orchestration/scheduling tools and DevOps principles
  • Familiarity with Git and version control systems
  • Ability to lead client discussions on avenues for improvement

Key responsibilities:

  • Choose technologies and tools; R&D and maintenance of the platform's components
  • Construct data intake procedures, efficient models, policies aligned to strategic plans
  • Ensure compliance with security and privacy standards, provide training
  • Craft, construct, uphold structure, tools, and procedures for data processing
  • Benefits: remote work, international projects, and continuous learning and development

Job description


Your missions

We are looking for a Senior Data Engineer to design, build, and maintain the architecture, tools, and processes an organization needs to collect, store, transform, and analyze large volumes of data.

Remote work.

Contract: B2B, UoP

Duties

  • Working alongside Platform Engineers to assess and choose suitable technologies and tools for the project
  • R&D, maintenance, and monitoring of the platform's components
  • Implementing intricate data intake procedures
  • Constructing efficient data models
  • Implementing and executing policies aligned to the strategic plans of the company concerning used technologies, work organization, etc.
  • Ensuring compliance with industry standards and regulations in terms of security and data privacy applied in the data processing layer
  • Providing training and fostering knowledge-sharing

Requirements

  • Proficiency in programming languages such as Python and SQL
  • Knowledge of the BigQuery DWH platform
  • Experience working with Spark and messaging systems
  • Experience as a programmer and knowledge of good software engineering principles, practices, and solutions
  • Familiarity with Google Cloud Platform (GCP)
  • Knowledge of at least one orchestration and scheduling tool, for example Airflow, Prefect, or Dagster
  • Familiarity with the DevOps area and tools such as GKE and Docker
  • Experience with a version control system, preferably Git
  • Ability to actively participate in and lead discussions with clients to identify and assess concrete, ambitious avenues for improvement

Offer

  • Salary: 160-200 PLN net + VAT/h on B2B (depending on knowledge and experience)
  • 100% remote work
  • Flexible working hours
  • Possibility to work from the office located in the heart of Warsaw
  • Opportunity to learn and develop with the best Big Data experts
  • International projects
  • Possibility of conducting workshops and training
  • Certifications
  • Co-financed sports card
  • Co-financed health care
  • All equipment needed for work

Required profile

Experience

Level of experience: Mid-level (2-5 years)