Big Data Engineer
Allegro
Warsaw, Masovian Voivodeship, Poland
14.12.2025
Job details

Place of work: Warsaw

Technologies we use

Expected

  • Scala
  • Java
  • Spark
  • Apache Beam
  • Google Cloud Platform
  • Azure
  • AWS
  • Unix
  • Linux

About the project

Flexible working hours in an office-first model (4/1) that depend on you and your team. Starting later or finishing earlier? No problem! Work hours keep pace with our lifestyles and can start between 7 a.m. and 10 a.m.

The salary range for this position depending on the skill set is as follows (contract of employment, tax-deductible cost):

- Data Engineer: PLN 14 200 - 20 200

- Senior Data Engineer: PLN 18 400 - 25 450

- Annual bonus (depending on your annual assessment and the company's results)

Our team is based in Warsaw.

Your responsibilities

As part of the Data & AI area, we implement projects based on practical data science and artificial intelligence applications on a scale unprecedented in Poland. Data & AI is a group of over 150 experienced engineers organized into more than a dozen teams with various specializations. Some of them build dedicated tools for creating and launching Big Data processes or deploying ML models for the entire organization. Others work closer to the client and are responsible for implementing the search engine, creating recommendations, building a buyer profile, or developing an experimentation platform. The area also includes research teams whose aim is to find solutions to non-trivial problems requiring the use of machine learning.

We are looking for Big Data engineers who want to build a highly scalable and fault-tolerant data ingestion platform for millions of Allegro customers. The platform collects 5 billion clickstream events every day (up to 150k/sec) from all Allegro sites and Allegro mobile applications. This is a hybrid solution using a mix of on-premise and Google Cloud Platform (GCP) services such as Spark, Kafka, Beam, BigQuery, Pub/Sub, and Dataflow.
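The quoted figures can be sanity-checked with simple arithmetic. The two constants below come from the posting; the average rate and the peak-to-average ratio are derived here for illustration:

```python
# Back-of-the-envelope throughput check for the ingestion platform figures.
# Inputs quoted in the posting:
EVENTS_PER_DAY = 5_000_000_000   # clickstream events per day
PEAK_EVENTS_PER_SEC = 150_000    # stated peak rate

SECONDS_PER_DAY = 24 * 60 * 60   # 86_400

# Sustained average rate implied by the daily volume (~58k events/sec).
average_rate = EVENTS_PER_DAY / SECONDS_PER_DAY

# How much headroom the peak requires over the average (~2.6x).
peak_to_average = PEAK_EVENTS_PER_SEC / average_rate

print(f"average: {average_rate:,.0f} events/sec")
print(f"peak/average: {peak_to_average:.1f}x")
```

So the platform must sustain roughly 58k events/sec on average while absorbing peaks about 2.6 times higher, which is why horizontally scalable components like Kafka and Dataflow feature in the stack.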

Our requirements

  • Program in languages such as Scala, Java, or Python
  • Have a strong understanding of distributed systems, data storage, and processing frameworks such as dbt, Spark, or Apache Beam
  • Have knowledge of GCP (especially Dataflow and Composer) or other public cloud environments such as Azure or AWS
  • Use good practices (clean code, code review, TDD, CI/CD)
  • Navigate efficiently within Unix/Linux systems
  • Possess a positive attitude and team-working skills
  • Are eager to develop and keep your knowledge up to date
  • Know English at B2 level

What we offer

  • Possibility to learn and work with backend (Spring, Kotlin) and AI technologies within the team.
  • Well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
  • A wide selection of varied benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)
  • English classes, paid for by us, related to the specific nature of your job
  • MacBook Pro / Air (depending on the role) or Dell with Windows (if you don't like Macs) and other gadgets that you may need
  • Working in a team you can always count on — we have on board top-class specialists and experts in their areas of expertise
  • A high degree of autonomy in terms of organizing your team’s work; we encourage you to develop continuously and try out new things
  • Hackathons, team tourism, training budget and an internal educational platform (including training courses on work organization, means of communications, motivation to work and various technologies and subject-matter issues)

Benefits

  • sharing the costs of sports activities
  • private medical care
  • sharing the costs of foreign language classes
  • sharing the costs of professional training & courses
  • life insurance
  • flexible working time
  • integration events
  • no dress code
  • leisure zone
  • extra social benefits

Why is it worth working with us

  • At Allegro, you will be responsible for processing petabytes of data and billions of events daily
  • You will become a participant in one of the largest projects of building a data platform in GCP
  • Your development will align with the latest technological trends based on open source principles (data mesh, data streaming)
  • You will have a real impact on the direction of product development and technology choices. We utilize the latest and best available technologies, as we select them according to our own needs
  • You will have the opportunity to work within a team of experienced engineers and big data specialists who are eager to share their knowledge, including publicly through allegro.tech
  • Once a year, you can take advantage of the opportunity to work in a different team or more often if there’s an internal business need (known as team tourism)
