
GCP Data Engineer

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • 3-5 years of experience in GCP analytics projects
  • Proficiency in GCP analytics services
  • Expertise in Python, Scala, and SQL
  • Knowledge of ETL, dimensional modeling, and data governance
  • Certification in GCP or equivalent preferred

Key responsibilities:

  • Design and optimize ETL data pipelines
  • Solve complex data problems for insights
  • Source and format data from various touchpoints
  • Create data products for analytics teams
  • Prepare data for unified databases, ensuring quality
Mindcurv · Information Technology & Services · SME · https://www.mindcurvgroup.com
501-1000 employees

Job description

About Mindcurv

We help our customers rethink their digital business, experiences, and technology to navigate the new digital reality. We do this by designing sustainable and accountable solutions for humans living in a digital world. Mindcurv holistically covers the market’s need to digitalise business processes and customer experiences and to take advantage of the cloud, following DevOps and agile principles. We cater to the following six solution lines:

  1. Strategy and Advisory
  2. Creative Services and Digital Products
  3. Client Engagement Platforms
  4. Digital Experience and Solutions
  5. Data Services
  6. Cloud Platforms and Managed Services

We are made up of a team of experts from various domains who define, create, and improve digital experiences, engaging with people to deliver solutions that enable businesses to grow and scale sustainably.

Within our Cloud Platforms Solution Line, we apply an agile approach to provide true on-demand cloud platforms. We implement and operate secure cloud and hybrid global infrastructures, using automation techniques, for our clients’ business-critical application landscapes.


Your role

The ideal candidate will have a strong background in data architecture, ETL processes, and cloud technologies, with a focus on building scalable data solutions. You will translate technical and business requirements into automated, operable, scalable and robust solutions. Your responsibilities will include:

  • Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals (a brief illustrative sketch follows this list)
  • Solving complex data problems to deliver insights that help our business achieve its goals
  • Sourcing data (structured and unstructured) from various touchpoints (APIs, databases, etc.) and formatting and organizing it into an analyzable form
  • Creating data products that improve the productivity of analytics team members
  • Calling AI services (e.g. vision, translation) to generate outcomes that can be used in further steps along the pipeline
  • Designing and modeling data warehouses (especially cloud data warehouses)
  • Fostering a culture of sharing, re-use, and design and operational efficiency for data and analytical solutions
  • Preparing data to create a unified database and building tracking solutions that ensure data quality
  • Creating production-grade analytical assets deployed using the guiding principles of CI/CD
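
For illustration, here is a minimal sketch of the kind of pipeline step described above, in Python: it pulls records from a source API, calls Cloud Translation as an example AI service, and streams the results into BigQuery. The API URL, the table name, and the field names are all hypothetical, and the sketch assumes the google-cloud-bigquery and google-cloud-translate client libraries are installed with credentials configured.

    import requests
    from google.cloud import bigquery
    from google.cloud import translate_v2 as translate

    def run_pipeline():
        # Extract: pull JSON records from a (hypothetical) source API.
        records = requests.get("https://api.example.com/reviews", timeout=30).json()

        # Transform: call an AI service (Cloud Translation here) on free-text
        # fields, then keep only the columns the analytics team needs.
        translator = translate.Client()
        rows = []
        for r in records:
            result = translator.translate(r["comment"], target_language="en")
            rows.append({
                "review_id": r["id"],
                "comment_en": result["translatedText"],
                "rating": r["rating"],
            })

        # Load: stream the rows into a BigQuery table (hypothetical name).
        client = bigquery.Client()
        errors = client.insert_rows_json("my-project.analytics.reviews_raw", rows)
        if errors:
            raise RuntimeError(f"BigQuery streaming insert failed: {errors}")

    if __name__ == "__main__":
        run_pipeline()

In production, a step like this would typically run on Dataflow or as a Pub/Sub-triggered Cloud Function, be orchestrated with Cloud Composer, and be deployed through a CI/CD pipeline rather than run by hand.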


Who you are 

  • A highly skilled Data Engineer with extensive experience on the Google Cloud Platform (GCP).
  • You have 3-5 years of hands-on experience in GCP-based analytics projects as a team lead (senior role) or team member.
  • You are proficient in GCP analytics services: BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, Cloud Functions, and Cloud Composer.
  • You are an expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript, and have extensive experience in data analysis (Apache Spark environments), data libraries (e.g. Pandas, SciPy, TensorFlow, Keras), and SQL (required).
  • You are comfortable with one of the many BI tools, such as Tableau, Power BI, or Looker (not mandatory).
  • You have working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and the corresponding infrastructure needs.
  • Experience with cloud data warehouses such as BigQuery or Snowflake is an added bonus.
  • Certification in any one of the following, or equivalent:
    • GCP: Professional Data Engineer
    • Snowflake: SnowPro Core (Data Engineer)

What do we offer you?

Perks like drinks and snacks, pizza sessions, gaming nights, and sensational parties, plus a pension scheme and a great salary: we’ve got them all.

And if you crave an intellectual challenge, Mindcurv has you covered: interesting projects involving the latest, hyper-innovative tech; an agile, entrepreneurial environment with lots of freedom and no politics; work-life balance, a culture of transparency, and a management team with their ears to the ground.

As we are typically the trailblazers and innovators, we work Remote First: a hybrid way of working in which you work from home and come into one of our offices in Essen, Cologne, Düsseldorf, Munich, Frankfurt a.M., Hamburg, Jena, Utrecht, Madrid, Cochin, Coimbatore or Trivandrum when it adds value to be on-site.

Our high performers

You know who really thrives with us? Self-starters, team players, and continuous learners with an uncanny ability to handle ambiguity. We’ll equip you with everything you need to succeed, help you explore the length and breadth of your domain, and provide you with constant growth opportunities to enrich your career.

Ready for change?

Are you ready for the next step in your career? For a role in which you can be fully yourself and bring out the best? In yourself, your colleagues and your clients? Don’t wait any longer and apply for this job right now.


Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Information Technology & Services
Spoken language(s): English

Other Skills

  • Problem Reporting
