[Remote] Data Engineer GCP Airflow BigQuery @ DS STREAM
Katowice, Silesian Voivodeship, Poland
DS STREAM
14. 3. 2026
About the position

Data Engineer

We’re looking for an experienced Data Engineer to build, scale, optimize, and maintain reliable data platforms. You’ll work on real-time pipelines, ML infrastructure, attribution data processing, and cloud-native solutions in a high-volume production environment.

Tech Stack

  • GCP
  • Apache Spark, dbt, BigQuery
  • Python, SQL, Scala
  • Apache Airflow
  • Terraform, GitHub Actions, Docker, Kubernetes
  • Vertex AI

Must Have

  • Over 3 years of experience in Data Engineering and building production-scale data platforms.
  • Strong programming skills in Python and SQL.
  • Advanced experience with data modeling, ETL development, and multiple data formats.
  • Strong knowledge of Google Cloud Platform and cloud-native architecture design.
  • Expert knowledge of Apache Airflow for workflow orchestration and automation.
  • Hands-on experience with Apache Spark for batch and high-volume streaming workloads.
  • Good understanding of CI/CD and DevOps tools such as GitHub Actions or Kubernetes.
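As a hypothetical illustration of the workflow-orchestration idea behind the Airflow requirement above, the dependency-ordered execution an Airflow DAG expresses with `>>` operators can be sketched in plain Python (the task names are invented for this example; no Airflow API is used):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: task names are illustrative, not from the posting.
# Each entry maps a task to the set of tasks it depends on -- the same
# dependency structure an Airflow DAG encodes.
dag = {
    "extract_events": set(),
    "transform_clicks": {"extract_events"},
    "transform_impressions": {"extract_events"},
    "load_bigquery": {"transform_clicks", "transform_impressions"},
}

def run_order(graph):
    """Return one valid execution order that respects all dependencies."""
    return list(TopologicalSorter(graph).static_order())

order = run_order(dag)
print(order)  # extraction first, load last; middle tasks in either order
```

Airflow adds scheduling, retries, and distributed execution on top of this dependency model, which is why the posting asks for expert-level knowledge of it.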

Nice to Have

  • Experience with MLOps and ML deployment in production, preferably with Vertex AI.
  • Practical experience with Terraform for Infrastructure as Code.
  • Experience building scalable REST APIs for data or ML services.
  • Strong testing practices, including TDD and unit/integration tests for Spark and Airflow.
  • Knowledge of Scala for high-performance data processing.
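The testing practice mentioned above usually means keeping task logic in pure, easily testable functions and asserting on small fixtures. A minimal sketch, with an invented `dedupe_clicks` transformation standing in for Spark or Airflow task logic:

```python
# Hypothetical example: unit-testing a pure transformation function,
# the style typically used for Spark/Airflow task logic. The function
# and field names are invented for illustration.

def dedupe_clicks(clicks):
    """Keep only the first occurrence of each (user, url) click pair."""
    seen = set()
    out = []
    for c in clicks:
        key = (c["user"], c["url"])
        if key not in seen:
            seen.add(key)
            out.append(c)
    return out

def test_dedupe_clicks():
    clicks = [
        {"user": "u1", "url": "/a"},
        {"user": "u1", "url": "/a"},  # duplicate event, should be dropped
        {"user": "u1", "url": "/b"},
    ]
    assert dedupe_clicks(clicks) == [
        {"user": "u1", "url": "/a"},
        {"user": "u1", "url": "/b"},
    ]

test_dedupe_clicks()
```

Keeping the transformation free of Spark or Airflow imports is what makes it unit-testable without a cluster or scheduler.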

Responsibilities

  • Build and maintain large-scale advertising data pipelines for clicks, impressions, attribution, and event processing.
  • Improve attribution modeling logic to support better optimization and business decisions.
  • Ensure high availability, performance, stability, and scalability of real-time data workflows.
  • Design and support production-grade ML model serving infrastructure.
  • Develop and manage dbt datasets for model training, experimentation, and Feature Store integration.

Additionally

  • Sport subscription
  • Training budget
  • Private healthcare
  • Flat structure
  • Small teams
  • International projects
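The attribution-modeling responsibility can be illustrated with a minimal last-touch attribution pass over click events. This is only a sketch: the event schema (`user`, `channel`, `ts`) is an assumption for the example, not part of the job description.

```python
from datetime import datetime

# Hypothetical click events; the (user, channel, ts) schema is assumed
# purely for illustration.
clicks = [
    {"user": "u1", "channel": "search", "ts": datetime(2026, 3, 1, 9, 0)},
    {"user": "u1", "channel": "display", "ts": datetime(2026, 3, 2, 14, 30)},
    {"user": "u2", "channel": "social", "ts": datetime(2026, 3, 1, 11, 15)},
]

def last_touch(events):
    """Attribute each user's conversion to their most recent click channel."""
    latest = {}
    for e in sorted(events, key=lambda e: e["ts"]):
        latest[e["user"]] = e["channel"]
    return latest

print(last_touch(clicks))  # {'u1': 'display', 'u2': 'social'}
```

Production attribution logic is far richer (multi-touch models, lookback windows, streaming state), but the core idea of reducing an event stream to per-user credit assignments is the same.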


