Senior Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
  • 5+ years of data engineering experience, with at least 3 years in cloud-based solutions.
  • Proficiency in programming languages like Python, Java, or Scala, and solid knowledge of SQL and NoSQL technologies.
  • Experience with big data tools and familiarity with containerization and orchestration technologies.

Key responsibilities:

  • Design, develop, and maintain scalable data infrastructure on cloud platforms like AWS, GCP, or Azure.
  • Implement cloud-native solutions and manage real-time and batch data pipelines using tools like Apache Airflow and Kafka.
  • Optimize ETL/ELT processes and ensure data privacy, security, and regulatory compliance.
  • Mentor junior engineers, evaluate emerging technologies, and drive automation in deployment and monitoring.

Tawzef for Recruitment & HR Consultancy

Job description

This is a remote position.

● Design, develop, and maintain scalable, secure, and cost-effective data infrastructure on cloud platforms such as AWS, GCP, or Azure.
● Implement cloud-native solutions like data lakes, warehouses (e.g., Snowflake, BigQuery, Redshift), and serverless computing.
● Build, deploy, and manage real-time and batch data pipelines using tools such as Apache Airflow, Apache Kafka, or cloud-native orchestration solutions.
● Optimize ETL/ELT processes for seamless data ingestion, transformation, and integration from diverse sources.
● Ensure high performance, scalability, and cost-efficiency of cloud data solutions through tuning and capacity planning.
● Implement caching strategies and data partitioning techniques for large-scale datasets.
● Enforce best practices for data privacy, security, and regulatory compliance (e.g., GDPR, PCI DSS).
● Implement identity management, encryption, and monitoring tools to safeguard sensitive data.
● Work closely with cross-functional teams to understand business requirements and deliver cloud-based data solutions.
● Mentor junior data engineers and contribute to team knowledge sharing.
● Evaluate emerging cloud and big data technologies to recommend enhancements to our data infrastructure.
● Drive automation in deployment, testing, and monitoring using CI/CD pipelines and Infrastructure-as-Code (e.g., Terraform, CloudFormation).

Requirements
● Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
● 5+ years of data engineering experience, with at least 3 years working on cloud-based solutions.
● Proven experience in the fintech domain is a significant advantage.
● Strong expertise in cloud platforms such as AWS (S3, Redshift, Glue), GCP (BigQuery, Dataflow), or Azure (Data Factory, Synapse Analytics).
● Proficiency in programming languages like Python, Java, or Scala.
● Solid knowledge of database systems and SQL, including NoSQL technologies (e.g., DynamoDB, MongoDB).
● Experience with big data tools such as Apache Spark, Hadoop, or Databricks.
● Strong familiarity with containerization (Docker) and orchestration (Kubernetes).
● Hands-on experience with Infrastructure-as-Code tools (Terraform, CloudFormation).

Required profile

Experience

Spoken language(s): English

Other Skills

  • Mentorship
  • Collaboration
