Senior Data Engineer
Warsaw, Masovian Voivodeship, Poland
Exadel Poland sp. z o.o.
15. 7. 2025
Job information

Technologies expected:


  • Degree
  • Python
  • PySpark
  • AWS
  • Apache Airflow
  • SQL

Technologies optional:


  • CI/CD
  • Hadoop
  • BI tools

About the project:


  • The team builds platforms that provide insights to internal and external clients in auto property damage and repair, medical claims, and telematics data. Our data engineers use big data technology to create best-in-industry analytics capabilities. This position is an opportunity to use Hadoop and Spark ecosystem tools and technologies for micro-batch and streaming analytics. The role is responsible for understanding, preparing, processing, and analyzing data to drive operational, analytical, and strategic business decisions.

Responsibilities:


  • Own and be responsible for data engineering projects from start to finish
  • Build end-to-end data flows from sources to fully curated and enhanced data sets. This can include locating and analyzing source data; creating data flows to extract, profile, and store ingested data; defining and building data cleansing and imputation; mapping to a common data model; transforming data to satisfy business rules and statistical computations; and validating data content (a minimal PySpark sketch of such a flow follows this list)
  • Modify, maintain, and support existing data pipelines to provide business continuity and fulfill product enhancement requests
  • Provide technical expertise to diagnose errors from production support teams
  • Participate as both a leader and a learner in team tasks for architecture, design, and analysis
  • Mentor and support the growth of other engineers
  • Coordinate within the European team as well as work seamlessly with the US team
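
A minimal, hedged PySpark sketch of the kind of end-to-end flow described above (extract, profile, cleanse, validate, publish); the bucket, paths, and column names are hypothetical examples, not part of this posting:

    # Minimal PySpark sketch: ingest -> profile -> cleanse -> validate -> publish.
    # Bucket, paths, and column names are hypothetical examples.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("claims-ingest").getOrCreate()

    # Extract: read raw claim records from S3 (hypothetical path).
    raw = spark.read.json("s3://example-bucket/raw/claims/")

    # Profile: count nulls per column to assess data quality.
    raw.select(
        [F.count(F.when(F.col(c).isNull(), c)).alias(c) for c in raw.columns]
    ).show()

    # Cleanse and impute: drop rows missing the key, default repair_cost to 0.
    curated = (
        raw.dropna(subset=["claim_id"])
           .fillna({"repair_cost": 0.0})
           .withColumn("ingest_date", F.current_date())
    )

    # Validate: enforce a simple business rule before publishing.
    assert curated.filter(F.col("repair_cost") < 0).count() == 0

    # Load: write the curated data set back to S3, partitioned by ingest date.
    curated.write.mode("overwrite").partitionBy("ingest_date").parquet(
        "s3://example-bucket/curated/claims/"
    )

In practice such a job would typically run on EMR and be scheduled by Airflow, as noted in the requirements below.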

Requirements expected:


  • Master’s or bachelor’s degree with an Engineering/Programming/Analytics specialization
  • 4+ years of experience building, maintaining, and supporting complex data flows with structured and unstructured data
  • Proficiency in Python and PySpark
  • Hands-on experience with AWS components such as EMR and S3
  • Experience with Apache Airflow for orchestrating and scheduling complex data flows (see the Airflow sketch after this list)
  • Ability to use SQL for data profiling and data validation
  • Experience in Unix commands and scripting
  • English level: Upper-Intermediate or higher
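
A minimal Apache Airflow sketch (assuming Airflow 2.4 or later) of orchestrating a daily extract-transform-validate chain; the DAG id, schedule, and task callables are hypothetical examples:

    # Minimal Airflow DAG sketch: a daily extract -> transform -> validate chain.
    # DAG id, schedule, and task functions are hypothetical examples.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw files from the source systems")

    def transform():
        print("submit the PySpark cleansing and transformation job")

    def validate():
        print("check row counts and business rules on the curated data")

    with DAG(
        dag_id="claims_daily_pipeline",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_validate = PythonOperator(task_id="validate", python_callable=validate)

        # Run the tasks in sequence once per day.
        t_extract >> t_transform >> t_validate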

Offered:


  • International projects
  • In-office, hybrid, or remote mode
  • Medical healthcare
  • Recognition program
  • Professional & personal development opportunities
  • Foreign language classes
  • Well-being program
  • Corporate events
  • Sports compensation
  • Referral program
  • Equipment provision
  • Paid vacation & sick days

Benefits:


  • Sharing the costs of sports activities
  • Private medical care
  • Sharing the costs of foreign language classes
  • Sharing the costs of professional training & courses
  • Remote work opportunities
  • Flexible working time
  • Fruits
  • Corporate products and services at discounted prices
  • Integration events
