DataOps Engineer (f/m/d)

Deutsche Börse Group – Futurama Business Park, Sokolovska 662/136b, Praha 186 00
Are you driven by innovation just like us? Do you also have technology as part of your DNA? Join us on the journey of fostering growth and contribute to the prosperity of future generations. Rate. Code. Test. With us.

What will be your position in our team?

The Data Warehouse Luxembourg team builds, maintains, and operates applications in the Clearstream Data Warehouse area, including: the CDL application (Clearstream Data Layer, built on the latest cloud technologies), Claims and Risk Management, Risk Management, Regulatory Reporting, Customer Reporting, the Reference Data Repository, and services related to Trends and Patterns Monitoring and the credit customer rating application (Jewel).

What will your working days look like? 

  • Design, build and optimize data pipelines to facilitate the extraction of data from multiple sources
  • Act as first-level support for end users' requests concerning data pipeline issues
  • Develop ETL/ELT (batch/stream) from multiple sources using Spark and/or Kafka
  • Operate the data pipelines to ensure key SLAs are managed across a wide range of producers and consumers
  • Support various components of the data pipelines, including ingestion, validation, cleansing and curation
  • Promote data collaboration, orchestration, quality, security, access, and ease of use
  • Gather data requirements from analytics and business departments
  • Write and maintain operational and technical documentation and perform tasks following the Agile methodology

You are a good fit if you have…

   Business skills

  • Hands-on experience with cloud-native technologies (Azure/GCP)
  • Direct experience building data pipelines with tools such as Data Factory, Data Fusion, or Apache Airflow
  • Understanding of a wide range of big data technologies and data architectures
  • Familiarity with data platforms and technologies such as Kafka, Delta Lake, Spark; Databricks is a plus
  • Demonstrated experience in one or more programming languages, preferably Python
  • Good knowledge of CI/CD and version control tools such as Jenkins and GitHub Actions
  • Experience with monitoring tools like Prometheus, Grafana, or the ELK stack is a plus

   Personal skills

  • Ability to bring both engineering and operational (DevOps) mindset to the role
  • Strong team player willing to cooperate with colleagues across office locations and functions
  • Very strong English language skills are a must

What is required

Data Factory
Data Fusion
Apache Airflow
Delta Lake


  • Experience
    3 years
  • Languages
    English – conversational


  • Type of work or project
    Modern technologies, Innovation projects, Transformation projects

What are the benefits?

  • Hybrid working model
  • Fully covered public transportation card & MultiSport card
  • Meal allowance and Flexible benefit account
  • Extra Vacation days
  • Discounted company shares
  • Employee referral bonus
  • Fully covered sick leave and maternity leave
  • High-coverage life and accident insurance & Pension Fund contribution
  • Budget for personal growth
  • Company initiatives (LGBTQ+ network, ToastMasters, Women@DBG, ...)
  • Stable and multicultural company with an excellent location in Karlín

Further information about the position

  • Contract type
    Internal position (employment contract and other types)
  • Type of employment
    Full-time
  • Company type
    Large company or corporation
  • Workplace location
    Futurama Business Park, Sokolovska 662/136b, Praha 186 00