DataFlow Engineer
Kraków, Lesser Poland Voivodeship, Poland
Mindbox Sp. z o.o.
21 February 2026
About the position

Technologies expected:

  • Python
  • Java
  • Google Cloud Platform
  • BigQuery
  • Dataflow
  • JSON
  • Parquet
  • SQL

About the project:

  • Ready to Shape the Future of Data Engineering? Join Us as a Dataflow Engineer!
  • We are looking for a skilled Dataflow Engineer to join our team and play a key role in designing, building, and maintaining robust data pipelines and workflows. In this role, you will collaborate closely with data architects, analysts, and engineers to ensure seamless data flow across systems and efficient processing at scale.
  • Does this sound like your kind of challenge?

Responsibilities:

  • Design and Build Data Pipelines: Develop and implement efficient data pipelines for collecting, transforming, and storing data across different platforms (see the illustrative sketch after this list).
  • Data Integration: Integrate data from various sources, including cloud platforms (Google Cloud), databases (SQL/NoSQL), APIs, and external services.
  • Optimize Data Flow: Troubleshoot and fine-tune existing pipelines for optimal performance.
  • Data Transformation: Implement ETL processes to transform raw data into usable formats for analytics and reporting.
  • Collaboration: Work with cross-functional teams to understand data needs and implement solutions; provide occasional support during weekends or non-office hours.
  • Automation & Scalability: Build scalable, automated workflows to handle large volumes of data with high reliability and low latency.
  • Monitoring & Maintenance: Set up monitoring and alerting systems for data pipelines to ensure minimal downtime and maximum performance.
  • Documentation: Document data flows, pipeline configurations, and processing logic for maintainability and transparency.
  • Note: Detailed project information will be shared during the recruitment process.
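
For illustration only, and not part of the posting itself: a minimal sketch of the kind of pipeline this role describes, written with the Apache Beam Python SDK (the programming model that Google Dataflow executes). The runner, bucket, project, dataset, and table names are hypothetical placeholders.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(line: str) -> dict:
        """Parse one newline-delimited JSON record into a BigQuery row."""
        record = json.loads(line)
        return {"user_id": record["user_id"], "amount": float(record["amount"])}


    def run() -> None:
        # DirectRunner executes locally; on Google Cloud this would be DataflowRunner.
        options = PipelineOptions(runner="DirectRunner")
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                # Read newline-delimited JSON files from a (hypothetical) bucket.
                | "ReadJSON" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
                # Transform raw lines into typed rows.
                | "Parse" >> beam.Map(parse_event)
                # Append rows to a (hypothetical) BigQuery table.
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:analytics.events",
                    schema="user_id:STRING,amount:FLOAT",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )


    if __name__ == "__main__":
        run()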

Expected requirements:

  • Experience: 4+ years as a Dataflow Engineer, Data Engineer, or in a similar role, working with large datasets and distributed systems.
  • Proficiency in Python and Java.
  • Hands-on experience with Google Dataflow and orchestration tools.
  • Expertise in Google Cloud Platform (BigQuery, Dataflow).
  • Strong experience with ETL frameworks, real-time data streaming, and processing.
  • Familiarity with data formats such as JSON and Parquet.
  • Knowledge of SQL and NoSQL databases, data governance, quality, and security best practices.
  • Problem-Solving: Ability to troubleshoot complex data integration and processing issues.
  • Communication: Strong written and verbal communication skills for collaboration with technical and non-technical stakeholders.

What we offer:

  • Flexible cooperation model – choose the form that suits you best (B2B, employment contract, etc.)
  • Hybrid work setup – remote days available depending on the client’s arrangements
  • Collaborative team culture – work alongside experienced professionals eager to share knowledge
  • Continuous development – access to training platforms and growth opportunities
  • Comprehensive benefits – including Interpolska Health Care, Multisport card, Warta Insurance, and more
  • High-quality equipment – laptop and essential software provided

Benefits:

  • sharing the costs of sports activities
  • private medical care
  • sharing the costs of professional training & courses
  • life insurance
