Mid-Level Data Engineer – Cloud Data Pipelines
  • Kraków
Kraków, Lesser Poland Voivodeship, Poland
ITDS Polska Sp. z o.o.
26. 2. 2026
About the position

technologies-expected:

  • Python
  • Java
  • Google Cloud Platform
  • BigQuery
  • Dataflow
  • JSON
  • Parquet
  • SQL
  • NoSQL

about-project:

  • As a Mid-Level Data Engineer – Cloud Data Pipelines, you will be working for our client, a leading organization in data-driven solutions. You’ll play a pivotal role in designing, building, and maintaining scalable and robust data pipelines, enabling the organization to harness the full potential of their data assets and accelerate digital transformation.
  • Unleash the power of data — drive innovation through seamless cloud pipelines!
  • Krakow-based opportunity with hybrid work model (up to 3 remote days per week).
  • Only candidates with an existing legal right to work in the European Union will be considered for this role.

responsibilities:

  • Develop and implement efficient data pipelines for collecting, transforming, and storing data across various platforms, ensuring reliable data flow.
  • Integrate data from a range of sources including cloud platforms, databases, APIs, and external services.
  • Troubleshoot and optimize existing pipelines for performance and scalability.
  • Implement ETL processes to convert raw data into valuable insights for analytics and reporting.
  • Collaborate with cross-functional teams to understand data needs and support application requirements, including weekend or non-office hours support.
  • Build scalable, automated workflows capable of handling large data volumes with high reliability and low latency.
  • Set up monitoring and alert systems to minimize downtime and maximize pipeline performance.
  • Document data flows, architecture, and processing logic to ensure maintainability and transparency.

requirements-expected:

  • 4+ years of experience as a Dataflow Engineer, Data Engineer, or similar, working with large datasets and distributed systems.
  • Proficiency in programming languages such as Python and Java.
  • Hands-on experience with data pipeline orchestration tools, especially Google Dataflow.
  • Experience working with cloud data platforms like Google Cloud (BigQuery, Dataflow).
  • Strong expertise in ETL frameworks, real-time data streaming, and processing.
  • Familiarity with data formats like JSON and Parquet.
  • Knowledge of SQL and NoSQL databases, along with best practices in data governance, quality, and security.
  • Excellent troubleshooting skills for complex data issues.
  • Strong communication skills to effectively collaborate with both technical and non-technical stakeholders.

offered:

  • Stable, long-term cooperation on very good terms
  • Enhance your skills and develop your expertise in the financial industry
  • Work on the most strategic projects available on the market
  • Define your career roadmap and grow quickly by delivering strategic projects for a range of ITDS clients over several years
  • Participate in social events and training, and work in an international environment
  • Access to an attractive medical package
  • Access to the Multisport program
  • Access to Pluralsight
  • Flexible hours

benefits:

  • sharing the costs of sports activities
  • private medical care
  • flexible working time
  • fruits
  • integration events
  • corporate gym
  • saving & investment scheme
  • no dress code
  • coffee / tea
  • drinks
  • Christmas gifts
  • birthday celebration
  • sharing the costs of a streaming platform subscription
  • access to 100+ projects
  • access to Pluralsight
