Senior Data Engineer (Python & Databricks)
Warsaw, Masovian Voivodeship, Poland
Cyclad
21.03.2026
Job information

Technologies (expected):


  • Python
  • PySpark
  • SQL
  • Databricks
  • Delta Lake

Technologies (optional):


  • Azure Data Factory
  • Azure DevOps
  • Git
  • Microsoft Power BI

About the project:


  • At Cyclad we work with top international IT companies to boost their potential in delivering outstanding, cutting-edge technologies that shape the world of the future. We are seeking an experienced Senior Data Engineer with strong Python and Databricks skills.
  • This role supports a large-scale transformation from SQL Server–based systems to a Databricks / Delta Lake platform. The focus is on enterprise-grade data engineering and software development, not analytics or reporting. The project is a SQL-to-Databricks migration involving 3,500-4,000 SQL Server databases (about 2 TB in total), replicating data in different shapes/schemas to Databricks.
  • Type of project: IT Services
  • Office location: Poland
  • Work model: Remote from Poland
  • Budget: 140-160 PLN net/h (B2B)
  • Project length: until the end of 2026, with possible extension
  • Only candidates with EU citizenship and residence in Poland
  • Start date: ASAP

Responsibilities:


  • Support a large-scale transformation from SQL Server–based systems to a Databricks / Delta Lake platform
  • Transform complex, business-critical SQL logic (stored procedures) into clean, maintainable, and scalable Python / PySpark code
  • Redesign and implement this logic in Python / PySpark within Databricks
  • Contribute to a large, long-running data engineering codebase used by multiple teams
  • Develop production-grade transformation code (packages, modules, reusable components)
  • Design and evolve data models within a Medallion Architecture (Bronze / Silver / Gold) across multiple data layers
  • Ensure software engineering quality, reusability, and long-term maintainability
  • Apply software engineering best practices (clean code, OOP, modularization, refactoring)
  • Work with very large data volumes and highly parallel, event-driven transformations
  • Actively participate in code reviews and technical design discussions
  • Support orchestration workflows (e.g., Azure Data Factory)

Requirements (expected):


  • Very strong Python and PySpark skills; proven experience with Databricks and Delta Lake
  • Experience working in large, shared codebases (beyond notebooks)
  • Strong SQL skills, especially reading and understanding complex logic
  • Solid object-oriented programming experience, clean code principles
  • Strong data modelling background (transactional and analytical)
  • Experience in redesigning models during platform migrations
  • Familiarity with layered data architectures (Bronze / Silver / Gold)
  • Very good English skills

Offered:


  • Remote working model
  • Full-time engagement based on a B2B or employment contract
  • Private medical care with dental care (covering 70% of costs) + rehabilitation package. Family package option possible
  • Multisport card (also for an accompanying person)
  • Life insurance

Benefits:


  • sharing the costs of sports activities
  • private medical care
  • life insurance
  • remote work opportunities
  • flexible working time
  • integration events
  • dental care
  • no dress code
