Data Engineer (roji_innowrap)

Remote: Full Remote

Offer summary

Qualifications:

  • Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.
  • 3–6 years of experience in a Data Engineer role.
  • Proficiency in SQL and experience with relational and NoSQL databases such as MySQL and MongoDB.
  • Familiarity with data pipeline management tools and programming languages such as Python, Java, or Scala.

Key responsibilities:

  • Build and optimize data pipelines, architectures, and datasets.
  • Perform root cause analysis on data to identify business opportunities and improvements.
  • Develop and enhance data architecture using AWS services and other big data technologies.
  • Collaborate with cross-functional teams to support data transformation and integrity.

CodersBrain SME https://www.codersbrain.com/
201 - 500 Employees

Job description

• Advanced working knowledge of SQL, including query authoring, and experience working with a variety of relational databases.
• Experience building and optimizing big data pipelines, architectures, and datasets.
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
• Strong analytic skills for working with unstructured datasets.
• Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management.
• A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
• Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
• Experience supporting and working with cross-functional teams in a dynamic environment.

We are looking for a candidate with 3–6 years of experience in a Data Engineer role who holds a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience with the following software and tools:

• Experience developing and improving data architecture using AWS Redshift, AWS S3, AWS Aurora (Postgres), Spark, AWS Glue, and Hadoop/EMR.
• Experience improving data ingestion models, ETL jobs, and alerting to maintain data integrity and data availability.
• Experience with relational SQL and NoSQL databases including MySQL and MongoDB.
• Experience with data pipeline and workflow management tools such as Airflow.
• Experience with stream-processing systems: Kinesis, Spark-Streaming, etc.
• Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
• Strong project management and organizational skills.

Required profile

Spoken language(s):
English

Other Skills

  • Analytical Skills
  • Organizational Skills
  • Teamwork
