Data Operations Engineer

  • Category
    IT
  • Contract type
    Permanent employment
  • Location
    Warszawa

Adecco Poland Sp. z o.o. is a licensed Employment Agency (no. 364)

Client Description:

As a Data Operations Engineer, you will ensure the stability, reliability, and performance of data-related services and technologies in our hybrid cloud platform. You will operate, maintain, and automate key components of our Kubernetes-based infrastructure, working closely with cross-functional teams to deliver scalable and reliable solutions.


This role offers an opportunity to make a meaningful impact by improving service availability, enhancing operational processes, and collaborating on innovative projects in a fast-paced, agile environment.


The team is building a cloud-native, hybrid platform to support the development and delivery of cutting-edge software products for the energy sector. The platform provides application teams with self-service capabilities for infrastructure, data management, CI/CD, and operational excellence.


Responsibilities:

  • Operate, maintain, and manage data technologies such as databases (e.g., PostgreSQL) and message brokers (e.g., Kafka) within Kubernetes environments.
  • Develop and enhance operational processes, focusing on reliability, scalability, and performance.
  • Implement and maintain CI/CD pipelines and Infrastructure as Code (IaC) solutions using tools such as Terraform, together with GitOps tooling (e.g., ArgoCD).
  • Monitor system health, troubleshoot issues, and ensure observability through tools like Prometheus, Grafana, and distributed tracing platforms.
  • Automate routine tasks and deployment processes to minimize manual interventions.
  • Collaborate with internal teams (Customer Success, DevOps, and Software Engineering) to improve platform services.
  • Contribute to incident, problem, change, and release management processes.
  • Support the establishment and growth of an operations team comprising internal and external members.



Candidate Profile:

  • Strong operational experience with containerized and distributed systems in production environments.
  • Proficiency in Kubernetes, including operators and custom resource management.
  • Solid understanding of relational databases (e.g., PostgreSQL) and distributed messaging/streaming technologies (e.g., Kafka).
  • Hands-on experience with CI/CD pipelines and tools like GitLab CI/CD.
  • Experience with automation, Infrastructure as Code, and GitOps tools (e.g., Terraform, ArgoCD).
  • Knowledge of observability tools such as Prometheus, Grafana, and tracing tools (e.g., Jaeger, OpenTelemetry).
  • Strong troubleshooting and problem-solving skills.
  • Fluency in English (C1 level or higher).