Dataflow Engineer @ Infolet sp. z o.o.
Kraków, Lesser Poland Voivodeship, Poland
8. 3. 2026
About the position

  • Relocation package (4,500 PLN total), paid in three monthly installments of 1,500 PLN, if permanent presence in the office is mandatory and you need to relocate from another city.
  • Benefits: extended medical care (over 2,000 medical facilities in Poland, 80 in Kraków) for you and your family; Multisport benefit card; life insurance.

Must have

  • 5-10 years of experience as a Data Engineer, Dataflow Engineer, or in a similar role working with large-scale data systems
  • Strong programming skills in Python and Java
  • Hands-on experience with data pipeline orchestration tools (e.g., Google Dataflow)
  • Solid experience with Google Cloud Platform, including BigQuery and Dataflow
  • Strong background in ETL frameworks, real-time data streaming, and data processing
  • Experience with SQL and NoSQL databases
  • Knowledge of data governance, data quality, and data security best practices
  • Strong problem-solving and troubleshooting skills
  • Fluent English and Polish

Nice to have

  • Experience with additional cloud services or multi-cloud environments
  • Familiarity with data formats such as JSON, Parquet, or similar
  • Experience building highly available, low-latency data pipelines
  • Exposure to on-call or out-of-hours support models
  • Strong communication skills and experience working with both technical and non-technical stakeholders


Responsibilities

  • Design, build, and maintain efficient data pipelines for collecting, transforming, and storing data
  • Integrate data from various sources, including cloud platforms (Google Cloud), SQL/NoSQL databases, APIs, and external services
  • Optimize and troubleshoot existing pipelines to ensure high performance and reliability
  • Implement ETL processes to transform raw data into analytics-ready datasets
  • Collaborate with cross-functional teams (data engineering, operations) to understand data requirements and deliver solutions
  • Support data applications when required during weekends or non-office hours
  • Build scalable, automated workflows capable of processing large data volumes with low latency
  • Set up monitoring and alerting for data pipelines to minimize downtime
  • Create and maintain technical documentation for data flows and pipeline configurations

Requirements: Dataflow, Python, Java

Additionally: sport subscription, private healthcare, life & group insurance, free coffee, bike parking, modern office, no dress code.


