Big Data Engineer | Tanvi | Tech Mahindra @ 7 Y | Bangalore and Pune

Remote: Full Remote
Contract:
Work from:

Offer summary

Qualifications:

  • Bachelor's degree in Engineering in Computer Science or Electronics & Telecommunication.
  • Minimum of 2 years of Big Data experience and 5+ years of software development experience.
  • Strong knowledge of distributed systems and hands-on engineering skills with Big Data technologies like Hadoop, AWS, and MongoDB.
  • Familiarity with programming languages such as Java, Python, and Ruby, as well as experience with real-time analytics and NoSQL data stores.

Key responsibilities:

  • Develop scalable infrastructure for collecting, analyzing, and processing large amounts of data.
  • Collaborate with various teams and organizations, including partners and customers, to deliver Cloud Native solutions.
  • Build and maintain big data environments both on-premises and in the cloud.
  • Deploy container-based applications and develop API-centric solutions.

CodersBrain SME https://www.codersbrain.com/
201 - 500 Employees

Job description


We are making substantial investments to continue our leadership position in the emerging digital business ecosystem as companies pivot to the “#NewAgeDELIVERY”.
 
A key part of this investment lies in designing systems based on Big Data Architecture to help our clients improve agility and speed to market by leveraging modern tools, techniques, and technology to deliver Cloud Native solutions.
 
The successful candidate will have deep experience with one or more Big Data frameworks, as well as with creating Cloud Native solutions through all phases of the software development lifecycle across several cloud providers, including AWS, Azure, and others.
 
 
Qualifications
 
Role
·         Develop scalable infrastructure and platforms to collect, analyze, and process large amounts of structured and unstructured data with real-time data interpretation.
·         Work closely with a variety of teams and organizations across the company, including partners and customers.
 
Basic Qualifications:
·         Bachelor's degree in Engineering in Computer Science/Electronics & Telecommunication.
·         Minimum of 2 years of Big Data experience
 
Preferred Qualifications:
·         5+ years of software development experience using multiple programming languages. Experience building large-scale distributed data processing systems/applications or large-scale internet systems (cloud computing).
·         Strong foundational knowledge and experience with distributed systems and computing systems in general. Hands-on engineering skills.
·         Should be able to design big data solutions and deliver them using big data technologies such as Hadoop/HDFS, MapReduce, Hive, AWS EMR, MongoDB, Airflow, Oozie, YARN, Ambari, ZooKeeper, Sqoop, BIRT, or other big data frameworks, covering the full lifecycle of a Hadoop solution.
·         Hands-on experience with Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning).
·         Firm understanding of major programming/scripting languages and tools such as Java, Linux, Ruby, Kafka, Camunda, Python and/or R, and shell scripting.
·         Should be able to build big data environments on-premises or in the cloud.
·         Broad understanding and experience of real-time analytics, NoSQL data stores, data modeling and data management, analytical tools, languages, or libraries (e.g. SAS, SPSS, R, Mahout).
·         Experience deploying container-based applications using tools such as Docker.
·         Experience developing API-centric solutions (REST APIs).

Required profile

Experience

Spoken language(s): English

Other Skills

  • Teamwork
  • Communication
  • Problem Solving
