Data Engineer - Allegro Pay
Warsaw, Masovian Voivodeship, Poland
Allegro Pay
16.05.2025
Information about the position

technologies-expected:


  • Python
  • SQL
  • SQL Server
  • Oracle
  • Spark
  • BigQuery
  • Snowflake

technologies-optional:


  • Java
  • Scala
  • C#
  • Terraform
  • Pulumi

about-project:


  • Allegro Pay is the largest fintech in Central Europe. We are growing fast and need engineers who want to learn and develop while solving problems related to serving thousands of requests per second. If, like us, you enjoy flexing your mental muscles on complex problems and would be happy to co-create the infrastructure which underpins our solutions, make sure you apply!
  • In this role, you will be a contributor, helping us expand our modern cloud-based analytical solutions. We embrace challenging and interesting projects and take quality very seriously. Depending on your preference, your position may be more business-oriented or platform-oriented.

responsibilities:


  • Design, monitor and improve data flow processes implemented in Python, SQL, Airflow and Snowpark (see the sketch after this list)
  • Implement and maintain Data Mesh processes collecting data from many microservices and cloud sources
  • Work with various data formats and sources, utilizing novel storage solutions
  • Optimize the costs associated with cloud operations in Snowflake, Azure Cloud and GCP
  • Work with the latest technologies, such as Snowflake, Airflow, dbt, .NET, Azure, GCP and GitHub Actions
  • Play an active role in decision-making processes regarding the selection and implementation of data frameworks
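
Below is a minimal, hypothetical sketch of the kind of data flow process described above, assuming the Airflow 2.x Python API; the DAG id, task names and callables are illustrative placeholders, not part of Allegro Pay's actual pipelines.

# Minimal, hypothetical Airflow DAG sketch (Airflow 2.x API assumed);
# dag_id, task names and callables are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_payments():
    # Placeholder: pull a daily batch of payment events from a source system.
    print("extracting payments batch")


def load_to_warehouse():
    # Placeholder: write the batch into the analytical warehouse.
    print("loading batch into the warehouse")


with DAG(
    dag_id="daily_payments_load",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow >= 2.4; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_payments)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    extract >> load      # load runs only after extract succeeds

In practice such a DAG would orchestrate SQL or Snowpark jobs rather than simple print calls; the sketch only shows the task-dependency structure.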

requirements-expected:


  • Have 2+ years of experience in building data-driven solutions using Python
  • Have practical knowledge of creating efficient data processing applications
  • Simply like data to be processed efficiently and feel satisfied when you spin up a lot of cores to quickly process terabytes of data
  • Can optimize SQL queries in traditional engines (SQL Server, Oracle), Big Data engines (Spark) or cloud engines (BigQuery, Snowflake)
  • Have experience in working with large data sets and understand database algorithms and data structures (e.g. you know the difference between a merge join and a hash join; see the sketch after this list)
  • Can independently make decisions in the areas entrusted to you and take responsibility for the code you create
  • Are not afraid of new technologies and want to expand their range of skills
  • Know how to build and deploy containerized applications on the cloud
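
As a toy illustration of the merge join vs. hash join distinction mentioned above, here is a simplified in-memory Python sketch; real engines implement these operators far more efficiently, and the sample rows are made up.

# Toy, in-memory illustration of two classic join strategies.

def hash_join(left, right, key):
    # Build a hash table on one side, then probe it with the other:
    # roughly O(n + m), no ordering required, but the build side must fit in memory.
    table = {}
    for row in right:
        table.setdefault(row[key], []).append(row)
    for row in left:
        for match in table.get(row[key], []):
            yield {**row, **match}


def merge_join(left, right, key):
    # Requires both inputs to be sorted on the join key; two cursors advance
    # in lockstep, so the join streams with constant extra memory.
    i, j = 0, 0
    while i < len(left) and j < len(right):
        lk, rk = left[i][key], right[j][key]
        if lk < rk:
            i += 1
        elif lk > rk:
            j += 1
        else:
            run_start = j
            while i < len(left) and left[i][key] == lk:
                j = run_start
                while j < len(right) and right[j][key] == lk:
                    yield {**left[i], **right[j]}
                    j += 1
                i += 1


# Made-up sample data, already sorted on the join key.
orders = [{"cust_id": 1, "amount": 30}, {"cust_id": 2, "amount": 50}]
customers = [{"cust_id": 1, "name": "Anna"}, {"cust_id": 2, "name": "Piotr"}]
print(list(hash_join(orders, customers, "cust_id")))
print(list(merge_join(orders, customers, "cust_id")))  # same result as above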

offered:


  • We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
  • A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)
  • 16" or 14" MacBook Pro with an M1 processor and 32 GB RAM, or a corresponding Dell with Windows (if you don't like Macs), and other gadgets that you may need
  • Hackathons, team trips, a training budget and an internal educational platform, MindUp (including training courses on work organization, means of communication, work motivation and various technologies and subject-matter issues)
  • English classes related to the specific nature of your job, paid for by us

benefits:


  • sharing the costs of sports activities
  • private medical care
  • sharing the costs of foreign language classes
  • sharing the costs of professional training & courses
  • life insurance
  • flexible working time
  • integration events
  • no dress code
  • leisure zone
  • extra social benefits
