
DevOps Engineer for Data Domain AAA (REF3440K)

Remote: Full Remote

Offer summary

Qualifications:

Experience as a DevOps Engineer with Kubernetes; proficiency in automation and configuration management tools; familiarity with cloud platforms; knowledge of DevOps principles and best practices.

Key responsibilities:

  • Manage applications on the CaaS platform using Kubernetes.
  • Oversee the deployment pipeline and coordinate with development teams.
Deutsche Telekom IT Solutions HU (5001 - 10000 employees) — https://www.deutschetelekomitsolutions.hu/

Job description

Company Description

The largest ICT employer in Hungary, Deutsche Telekom IT Solutions (formerly IT-Services Hungary, ITSH) is a subsidiary of the Deutsche Telekom Group. Established in 2006, the company provides a wide portfolio of IT and telecommunications services with more than 5000 employees. ITSH was awarded the Best in Educational Cooperation prize by HIPA in 2019, was acknowledged as one of the most attractive workplaces in PwC Hungary’s independent survey in 2021, and received the title of Most Ethical Multinational Company in 2019. The company continuously develops its four sites in Budapest, Debrecen, Pécs and Szeged and is looking for skilled IT professionals to join its team.

Job Description

We are seeking a talented and experienced DevOps Engineer with expertise in Kubernetes to join our team. As a DevOps Engineer, you will play a crucial role in managing deployments, automating processes, and facilitating seamless communication with our development team based in India. Your primary responsibility will be to ensure the reliability and efficiency of our Data Integration Layer (DIL) application running on a Kubernetes platform provided as a Container-as-a-Service (CaaS).

Key Responsibilities:

  1. Kubernetes Expertise:
    • Leverage your deep knowledge of Kubernetes to effectively manage applications on the CaaS platform.
    • Collaborate with the CaaS provider to optimize cluster configurations.
  2. Deployment Management:
    • Oversee the deployment pipeline, ensuring smooth and efficient releases.
    • Coordinate closely with development teams to streamline the deployment process.
  3. Automation:
    • Develop and maintain automation scripts and tools for operational efficiency.
  4. Collaboration:
    • Work closely with our Indian development team, providing support and maintaining open lines of communication.
    • Promote a collaborative DevOps culture within the organization.
  5. Continuous Integration/Continuous Deployment (Magenta CI/CD):
    • Implement and enhance CI/CD pipelines for applications hosted on the CaaS platform.
    • Monitor and optimize the CI/CD process for reliability and speed.
  6. Monitoring and Troubleshooting:
    • Implement monitoring solutions (CaaS resources / DIL pipelines).
    • Identify and resolve issues related to application performance and reliability.
  7. Security:
    • Ensure the security of applications and data within the Kubernetes clusters.
    • Implement security best practices and conduct security assessments.
    • Apply knowledge of PSA/SoCs.

Qualifications
  • Experience as a DevOps Engineer with a focus on Kubernetes.
  • Proficiency in English, both written and verbal.
  • Knowledge of DevOps principles and best practices.
  • Experience with containerization technologies.
  • Proficiency in automation and configuration management tools.
  • Familiarity with cloud platforms.
  • Excellent problem-solving and communication skills.


If you are a proactive, self-motivated DevOps Engineer with a passion for Kubernetes and a desire to collaborate in a global environment, we would love to hear from you. Join us in optimizing our applications and ensuring their success on our Kubernetes-based CaaS platform.

Additional Information

DIL description: The Data Integration Layer is a user-interface-based data integration platform with batch and streaming data processing capacities. DIL enables any data engineer or data scientist to quickly build data pipelines (data transfers between source and target systems) via a visual exploration tool. DIL captures user inputs and generates the appropriate code to run the pipeline in the Extract, Transform & Load (ETL) engine, with advanced features such as a logs visualizer, governance, collaborative development support, and metrics (statistics) to accelerate use cases on one platform.
DIL is also used in the Data Tribe to initiate and foster the migration to GCP (Google Cloud Platform).
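The extract-transform-load pattern that DIL generates code for can be sketched in a few lines. This is a minimal illustration of the general ETL flow only; the function names and data shapes below are hypothetical and are not part of DIL's actual API.

```python
# Minimal sketch of the Extract-Transform-Load (ETL) pattern.
# All names here are illustrative assumptions, not DIL interfaces.

def extract(source):
    """Read raw records from a source system (here, an in-memory list)."""
    return list(source)

def transform(records):
    """A simple cleaning step: drop empty records, normalize key casing."""
    return [{k.lower(): v for k, v in r.items()} for r in records if r]

def load(records, target):
    """Write transformed records to a target system (here, a list)."""
    target.extend(records)
    return len(records)

source = [{"Name": "alice"}, {}, {"Name": "bob"}]
target = []
count = load(transform(extract(source)), target)
print(count)      # 2
print(target[0])  # {'name': 'alice'}
```

In DIL's case the user assembles this flow visually and the platform emits and runs the equivalent pipeline code in its ETL engine.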

* Please note that remote work is available only within Hungary due to European taxation regulations.

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Communication
  • Problem Solving
