
Data/Platform Engineer (Remote)

Remote: Full Remote
Contract: 
Experience: Senior (5-10 years)
Work from: 

Offer summary

Qualifications:

  • Strong software development experience with Python
  • Knowledge of CI/CD and IaC using Terraform
  • Experience with cloud technologies (AWS)
  • Familiarity with data integration tools like Snowflake and Apache Airflow
  • Hands-on experience with data pipelines and large datasets

Key responsibilities:

  • Manage tooling integrations and access control.
  • Automate workflows via CI/CD frameworks.
  • Engineer data pipelines and ensure integrity.
  • Develop and maintain documentation for operations.
  • Collaborate with teams for access control data management.
Minutes to Seconds https://www.minutestoseconds.com
2 - 10 Employees

Job description

About the job

At Minutes to Seconds, we match people with great skills to tailor-fitted jobs so they can achieve well-deserved success. We know how to match people to the right job roles to create that perfect fit. This changes the dynamics of business success and catalyzes the growth of individuals. Our aim is to provide both our candidates and clients with great opportunities and the ideal fit every time. We have partnered with the best people and the best businesses in Australia in order to achieve success on all fronts. We’re passionate about doing an incredible job for our clients and job seekers. Our success is determined by the success of individuals in the workplace.


We would love the opportunity to work with YOU!!


Minutes to Seconds is looking for a Data/Platform Engineer in a Contract position.


Requirements
Job Overview

The primary goals for our developers are efficiency, consistency, scalability and reliability.

We are responsible for the Platform: all the tooling integrations, security, access control, data classification/management, orchestration, the self-service “lab” concept, observability and reliability, as well as data availability (data ingestion).

We are NOT responsible for Data Modeling, Data Warehousing, or Reporting (Power BI), although we do work with the Power BI team on access control from Power BI to Snowflake.

Everything we do is achieved through code. Nothing is manual (no ClickOps); everything is automated through our CI/CD framework: GitHub, GitHub Actions, Terraform and Python.

Orchestration is centrally managed using Managed Airflow (Amazon MWAA).
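
As illustration only, a minimal DAG for a managed Airflow environment might look like the sketch below; the DAG id, schedule and task name are hypothetical and not taken from this team's codebase.

# Minimal Airflow 2.x DAG sketch (hypothetical names; assumes MWAA or any Airflow 2.x deployment).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_to_snowflake():
    # Placeholder task body; a real pipeline would call ingestion/validation logic here.
    pass


with DAG(
    dag_id="example_ingestion",         # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",         # centrally managed schedule
    catchup=False,
    tags=["platform", "ingestion"],
) as dag:
    PythonOperator(
        task_id="load_to_snowflake",
        python_callable=load_to_snowflake,
    )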

We manage RBAC / access control.

We’re responsible for Tooling Integrations and all the connectivity and authentication requirements.

  • Ingestion Methods/Patterns:
    • Fivetran
    • Snowflake Snowpipe (file-based sources)
    • Snowflake Secure Data Share

  • Solid software development (full SDLC) experience with excellent coding skills:
    • Python (required)
    • Good knowledge of Git and GitHub (required)
    • Good code-management experience / best practices (required)

  • Understanding of CI/CD to automate and improve the efficiency, speed, and reliability of software delivery:
    • Best practices/principles
    • GitHub Actions
      • Automate workflows directly from GitHub repositories
      • Automate building, testing, and deploying code, including code linting, security scanning, and version management
    • Experience with testing frameworks
    • Good knowledge of IaC (Infrastructure as Code) using Terraform (required)
      • EVERYTHING we do is IaC
  • Strong verbal and written skills are a must, ideally with the ability to communicate in both technical and business language
  • A good level of experience with cloud technologies (AWS), namely S3, Lambda, SQS, SNS, API Gateway (API development), networking (VPCs), PrivateLink and Secrets Manager
  • Extensive hands-on experience engineering data pipelines and a solid understanding of the full data supply chain, from discovery & analysis, data ingestion, processing & transformation, to consumption/downstream data integration

  • A passion for continuous improvement and learning, and for optimization both in terms of cost and efficiency as well as ways of working; obsessed with data observability (aka data reconciliation), ensuring pipeline and data integrity

  • Experience working with large structured/semi-structured datasets
    • A good understanding of Parquet, Avro, JSON/XML
  • Experience with Apache Airflow / MWAA or similar orchestration tooling.
  • Experience with Snowflake as a data platform:
    • Solid understanding of Snowflake architecture – compute, storage, partitioning, etc.
    • Key features such as COPY INTO, Snowpipe, object-level tagging and masking policies (see the sketch after this list)
    • RBAC (security model) – design and administration – intermediate skill required
    • Query performance tuning and zero-copy clone – nice to have
    • Virtual warehouse (compute) sizing

  • T-SQL experience – ability to understand complex queries and think about optimisation – advantageous
  • Data Modelling experience – advantageous
  • Exposure to dbt (data build tool) for data transformations – advantageous
  • Exposure to Alation or other Enterprise Metadata Management (EMM) tooling – advantageous
  • Documentation: architectural designs, operational procedures, and platform configurations to ensure smooth onboarding and troubleshooting for team members.
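
Purely as an illustrative sketch of the COPY INTO / Snowpipe-style file loading mentioned in the Snowflake item above: the account, credentials, stage and table names below are hypothetical placeholders, and a real setup would use key-pair authentication with credentials pulled from Secrets Manager rather than hard-coded values.

# Hypothetical bulk-load sketch using the Snowflake Python connector
# (pip install snowflake-connector-python); all identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",   # hypothetical account identifier
    user="platform_svc",         # hypothetical service user
    password="********",         # in practice: key-pair auth via Secrets Manager
    warehouse="LOAD_WH",
    database="RAW",
    schema="LANDING",
)

try:
    cur = conn.cursor()
    # Bulk-load staged Parquet files into a landing table (COPY INTO pattern).
    cur.execute(
        """
        COPY INTO RAW.LANDING.EVENTS
        FROM @RAW.LANDING.EVENTS_STAGE
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        """
    )
    print(cur.fetchall())  # one row of load results per staged file
finally:
    conn.close()
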
Please send your resume to soumik.saha@minutestoseconds.com.


Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Other Skills

  • Verbal Communication Skills
