Data Software Development Engineer

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Bachelor's degree in Computer Science or related field.
  • 2+ years of experience with Python.
  • 2+ years of experience in AWS data technologies.
  • Experience with data modeling and storage techniques.

Key responsibilities:

  • Build and contribute to a data platform.
  • Work on ingestion, normalization, and enrichment of marketing data.

Sky Systems, Inc. (SkySys)
Information Technology & Services startup, 11-50 employees
https://myskysys.com/

Job description

Role: Data Software Development Engineer
Position Type: Full-Time Contract (40hrs/week)
Contract Duration: Long-term contract
Work Hours: US Time
Work Schedule: 8 hours/day (Mon-Fri)
Location: 100% Remote (Candidates can work from anywhere in Brazil)

The Data Software Development Engineer is responsible for building and contributing to our next-generation data platform. This individual will work on a development team focused on the ingestion, normalization, and enrichment of marketing data as it flows through a data lake, various databases, and data orchestration procedures. The position will also pipe that data to many internal consumers and marketing automation tools. The ideal candidate has experience with data modeling, data access, and data storage techniques using the AWS ecosystem and Snowflake.
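
For illustration only, here is a minimal PySpark sketch of the kind of normalization step described above: it reads raw JSON marketing events from a data-lake bucket, cleans a few fields, de-duplicates, and writes partitioned Parquet. The bucket paths, column names, and transformations are assumptions made for the example, not details from this posting.

```python
# Illustrative sketch only: a minimal normalization step for marketing events.
# Bucket paths, column names, and transformations are hypothetical assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("normalize-marketing-events").getOrCreate()

# Ingest raw marketing events (JSON) landed in the data lake.
raw = spark.read.json("s3://example-data-lake/raw/marketing_events/")

# Normalize: unify timestamps, clean identifiers, drop duplicate events.
normalized = (
    raw
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))
    .withColumn("campaign_id", F.lower(F.trim("campaign_id")))
    .dropDuplicates(["event_id"])
)

# Write back as partitioned Parquet for downstream consumers (e.g. Snowflake loads).
(
    normalized.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-data-lake/normalized/marketing_events/")
)
```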


Key functions:

  • Experience with the AWS data processing ecosystem (Glue, Lambda, Step Functions, EMR, API Gateway, RDS, DMS)
  • Software programming skills using Python, Scala, or PySpark in a data engineering context
  • Experience with structured and unstructured data, RDBMS and NoSQL data modeling, and data processing using JSON and Parquet files
  • Knowledge of agile software development lifecycles such as Scrum and Kanban
  • Experience with DevOps, CI/CD, AWS CloudFormation, and deployment of serverless applications
  • Experience with APIs and web services
  • Experience with databases such as Snowflake and Postgres, and knowledge of SQL for data profiling
  • Experience monitoring dataflows with CloudWatch metrics and building monitoring dashboards (see the sketch after this list)
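
As a rough illustration of the dataflow-monitoring item above, the sketch below publishes a custom CloudWatch metric with boto3 so it can back a dashboard or alarm. The namespace, metric name, and dimensions are hypothetical, chosen only for the example.

```python
# Illustrative sketch only: publish a custom dataflow metric to CloudWatch so it
# can drive a monitoring dashboard or alarm. Namespace, metric name, and
# dimensions are hypothetical assumptions.
import boto3

cloudwatch = boto3.client("cloudwatch")

def report_records_processed(stage: str, count: int) -> None:
    """Record how many records a pipeline stage processed."""
    cloudwatch.put_metric_data(
        Namespace="MarketingDataPlatform",  # hypothetical namespace
        MetricData=[
            {
                "MetricName": "RecordsProcessed",
                "Dimensions": [{"Name": "Stage", "Value": stage}],
                "Value": float(count),
                "Unit": "Count",
            }
        ],
    )

if __name__ == "__main__":
    report_records_processed("normalize_marketing_events", 12345)
```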

Minimum qualifications:

  • Bachelor's degree in Computer Science or related field.
  • 2+ years' work experience demonstrating proficiency in Python.
  • 2+ years' work experience with AWS data technologies (Glue, Kinesis, Lambda, Step Functions, EMR, ECS, SQS, SNS, DMS).

Preferred qualifications:

  • 3+ years' work experience building and maintaining a data lake using AWS data services.
  • 3+ years' work experience developing in Python, PySpark, Scala, or Rust.
  • 2+ years' experience with databases such as Snowflake, Postgres, or Redshift.
  • 2+ years' experience working in an Agile development environment.
  • 2+ years' experience with UNIX/Linux and shell scripting.

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Information Technology & Services
Spoken language(s): English

Other Skills

  • Teamwork
  • Problem Solving
