AWS Big Data Developer (Capgemini)

Remote: Full Remote

Offer summary

Qualifications:

  • 4 to 6 years of experience as an AWS Developer.
  • Proficient in Hadoop architecture and its ecosystem, including HDFS and Spark SQL.
  • Strong experience with AWS Big Data tools such as EMR, Glue, and Athena.
  • AWS Certified Solutions Architect, with knowledge of Python and version control tools.

Key responsibilities:

  • Develop and manage AWS Big Data solutions using various AWS tools.
  • Write and optimize queries for data processing and analysis.
  • Collaborate with teams to implement data storage and retrieval strategies.
  • Ensure best practices in data management and version control are followed.

CodersBrain SME https://www.codersbrain.com/
201 - 500 Employees

Job description

Location: Bangalore

Mandatory:
• Minimum of 4 to 6 years of relevant work experience as an AWS Developer
• Good knowledge of Hadoop architecture and its ecosystem, with experience in data storage (HDFS), writing queries (HQL or Spark SQL), and data processing and analysis using PySpark
• Strong hands-on experience with AWS Big Data tools: EMR, Glue, Athena, MSK/Kinesis, IAM, EC2, S3
• Strong hands-on experience with AWS databases: RDS, DynamoDB, Redshift/Spectrum
• Strong Python scripting skills using Jupyter notebooks
• Experience with version control tools such as Git, TFS/Bitbucket
• Knowledge of AWS Aurora, Neptune, SNS, SQS, Redis, CloudFormation, Lambda, VPC, Glacier, EBS, EFS, CloudWatch
• Knowledge of CI/CD, Docker, Terraform, RabbitMQ/Apache Kafka
• Working knowledge of PrestoDB, Apache Spark, Apache Hive, Apache Hudi, and Delta tables
• AWS Certified Solutions Architect certification

Good to have:
• Knowledge of AWS Lake Formation
• Hive, Presto, and Flink connectors
• Knowledge of the medical domain (DICOM, HL7, FHIR)

Required profile

Spoken language(s): English