Senior Data Engineer
Warsaw, Masovian Voivodeship, Poland
WINGED IT SP Z O O
1.11.2025
About the position

Technologies expected:

  • SQL
  • PySpark
  • Python
  • Apache Airflow
  • AWS Glue
  • Kafka
  • Redshift
  • AWS S3
  • AWS Lambda
  • AWS CloudWatch
  • AWS SNS
  • AWS SQS
  • AWS Kinesis
  • Terraform
  • Git
  • CI/CD

About the project:

  • Do you want to play a key role in revolutionizing the future of finance? Join Klarna - a global leader in deferred payments, with over 85 million active users and 2.5 million transactions processed daily! As a pioneer in modern payment solutions, Klarna is developing innovative methods that streamline the shopping experience, enhance transaction security, and expand the availability of purchasing options.
  • We're seeking individuals eager to achieve remarkable results and share their bold vision to redefine the future of payments and fintech. Send us your CV - we can't wait to meet you!
  • Our client: Klarna
  • Location: Warsaw (hybrid work: 2-3 days in the office)
  • Start date: ASAP
  • Duration: long-term agreement

Responsibilities:

  • To own the global UW tables (canonical facts/dimensions for applications, decisions, features, repayments, delinquency) with clear SLAs for freshness, completeness, accuracy, and data lineage;
  • To design for AI-agents and humans: consistent IDs, canonical events, explicit metric definitions, rich metadata (schemas, data dictionaries), and machine-readable data contracts;
  • To build & run pipelines (batch + streaming) that feed UW scoring, real-time decisioning, monitoring, and underwriting optimization;
  • To ensure quality & observability (alerts, audits, reconciliation, backfills) and drive incident/root-cause reviews;
  • To partner closely with Credit Portfolio Management, Policy, Modeling, Treasury, and Finance teams to land features for RUE and consumer-centric models, plus regulatory and management reporting.

Requirements expected:

  • 6+ years of experience with SQL, PySpark, Python;
  • Framework knowledge: Apache Airflow, AWS Glue, Kafka, Redshift;
  • Cloud & DevOps: AWS (S3, Lambda, CloudWatch, SNS/SQS, Kinesis), Terraform; Git; CI/CD;
  • Proven ownership of mission-critical data products (batch + streaming);
  • Data modeling, schema evolution, data contracts, and strong observability chops;
  • Familiarity with AI/agent patterns (agent-friendly schemas/endpoints, embeddings/vector search);
  • Strong educational background in Computer Science, Information Technology, or a related field;
  • Language proficiency: Advanced English (minimum B2 level).

Offered:

  • Great opportunities for personal development in a stable, friendly, large multinational company
  • Start-up mentality, small agile teams
  • Global Reach: Impact millions with seamless shopping and payments
  • Career growth and additional education
