Senior DataOps Engineer (m/f/nb)
Kraków, Lesser Poland Voivodeship, Poland
Conrad Electronic sp. z o.o.
4.12.2025
Job details

Expected technologies:

  • Snowflake Data Cloud

About the project:

  • We are a reliable partner who makes the sourcing of technical supplies as easy and efficient as possible for customers. We do this with passion and innovativeness, day in, day out. Our product range comprises millions of quality items, customer-centric solutions and services, and face-to-face expert advice. We are fully committed to this mission. We see change as a catalyst for opportunities, and supply customers with everything they need to successfully complete their projects. As a family-run business, operating in a sustainable way and building long-term business relationships are paramount to us. We stand for quality and reliability.
  • Join the AI & Data Platform team to drive operational excellence for cutting-edge applied AI challenges in retail. We are seeking a skilled DataOps Engineer to design, build, and maintain the robust data infrastructure essential for our AI and Automation initiatives. This critical role requires blending deep cloud expertise with an unwavering commitment to operational rigor, applying software engineering principles (DevOps/CI/CD) directly to the data ecosystem.
  • We offer a hybrid working set-up either in Hirschau, Munich (Berlin) or Krakow.

Responsibilities:

  • Guarantee Data Reliability: Own DataOps implementation and CI/CD to ensure all production data pipelines are reliable, traceable, and scalable for AI initiatives.
  • Optimize Data Velocity: Architect high-performance Airflow and platform solutions that dramatically accelerate data delivery and time-to-value for automation projects.
  • Establish Data Trust: Drive quality and governance standards using dbt, transforming raw data into trusted, documented assets critical for robust ML applications.
  • Orchestration & Reliability: Design, implement, and rigorously monitor production ETL/ELT pipelines using Apache Airflow (DAGs, custom operators). Implement advanced monitoring, logging, and error recovery to guarantee pipeline stability and data freshness for the AI platform.
  • DataOps & CI/CD: Apply full DataOps and DevOps principles (CI/CD, version control, automated testing) across the entire data development lifecycle. Enforce rigorous code quality standards for all transformation logic (SQL/Python).
  • Infrastructure & Performance: Manage and scale core data components (DW, Airflow) using Infrastructure as Code (IaC). Proactively optimize the AI & Data platform performance (query tuning, partitioning) to support low-latency AI inference and resolve critical production issues.
  • Documentation: Maintain clear and comprehensive documentation for data pipelines and architecture.

Expected requirements:

  • 3+ years in a DataOps or closely related Data Engineering role.
  • Expertise in Apache Airflow (complex DAGs, custom operators, production management).
  • Deep hands-on experience with dbt (model development, data quality testing, package management).
  • Expert-level SQL and deep experience working with public Cloud (GCP preferred) data services (e.g., BigQuery, Snowflake, Redshift).
  • Strong Python skills for ETL/Airflow development.
  • Experience with Infrastructure as Code (IaC) tools like Terraform.
  • Practical knowledge of software engineering principles (testing, CI/CD).
  • Excellent collaboration and analytical skills.
  • Fluent English is necessary; German is a plus.

Benefits:

  • sharing the costs of sports activities
  • private medical care
  • fruit
  • corporate products and services at discounted prices
  • no dress code
  • coffee / tea
  • holiday funds
  • WellBee platform
