Dataflow Engineer @ Mindbox Sp. z o.o.
Kraków, Lesser Poland Voivodeship, Poland
8.3.2026
About the position

At Mindbox we connect top IT talents with technology projects for leading enterprises across Europe. 

Are you passionate about building robust, scalable data pipelines that power analytics and decision-making? We are looking for a Dataflow Engineer to design, develop, and maintain efficient data workflows that ensure seamless data movement across systems.

Sounds like your kind of challenge? 

#LI-Hybrid – Hybrid work model: 2 days per week from the office in Kraków

What you get in return

  • Flexible cooperation model – choose the form that suits you best
    (B2B, employment contract, etc.)
  • Hybrid work setup – remote days available depending on the client’s arrangements
  • Collaborative team culture – work alongside experienced professionals eager to share knowledge
  • Continuous development – access to training platforms and growth opportunities
  • Comprehensive benefits – including Interpolska Health Care, Multisport card, Warta Insurance, and more
  • High quality equipment – laptop and essential software provided

Experience:

  • 4+ years as a Dataflow Engineer, Data Engineer, or similar role working with large datasets and distributed systems.

Technical Skills:

  • Proficiency in Python and/or Java.
  • Hands-on experience with data pipeline orchestration tools (e.g., Google Dataflow).
  • Strong knowledge of Google Cloud services (BigQuery, Dataflow).
  • Expertise in ETL frameworks, real-time data streaming, and processing.
  • Familiarity with data formats like JSON, Parquet.
  • Knowledge of SQL and NoSQL databases, data governance, and security best practices.

Other Skills:

  • Strong problem-solving abilities for complex data integration issues.
  • Excellent communication skills for collaboration with technical and non-technical stakeholders.

Joining this project you’ll become part of Mindbox – a tech-driven company where consulting, engineering, and talent meet to build meaningful digital solutions. We’ll back you up every step of the way, accelerate your development, and ensure your skills make a difference. 

Your responsibilities

  • Design and Build Data Pipelines: Develop and implement efficient pipelines for collecting, transforming, and storing data across multiple platforms.
  • Data Integration: Integrate data from diverse sources, including cloud platforms (Google Cloud), databases (SQL/NoSQL), APIs, and external services.
  • Optimize Data Flow: Troubleshoot and fine-tune existing pipelines for optimal performance.
  • Data Transformation: Implement ETL processes to convert raw data into analytics-ready formats.
  • Collaboration: Work with cross-functional teams to understand data needs and provide solutions; support applications during weekends or non-office hours when needed.
  • Automation & Scalability: Build scalable, automated workflows to handle large data volumes with high reliability and low latency.
  • Monitoring & Maintenance: Set up monitoring and alerting systems to ensure minimal downtime and maximum performance.
  • Documentation: Maintain clear documentation of data flows, pipeline configurations, and processing logic.

Requirements: Python, Java, Google Cloud, BigQuery, ETL, JSON, SQL, NoSQL, Security

Additionally: Sport subscription, Private healthcare, Life insurance, Training budget, Small teams, Free coffee, Free snacks, In-house trainings, Modern office, No dress code.
