Senior Azure Data Engineer @ ITDS
Warsaw, Masovian Voivodeship, Poland
ITDS
21.10.2025
About the role

Senior Azure Data Engineer

Warsaw-based opportunity with a hybrid work model: 2-3 days/week in the office.

As a Senior Azure Data Engineer, you will work for our client, a global financial technology leader that provides infrastructure and services for capital markets, payments, and financial information management. You will join a high-impact cloud modernization program that migrates legacy Cloudera and Hadoop-based data environments to a modern Azure ecosystem. The project focuses on rebuilding data architectures to enable scalable BI analytics, regulatory reporting, and metadata-driven automation across multiple geographies, while maintaining strict data governance and performance standards.


You’re ideal for this role if you have:

  • Proven experience in large-scale data platform migration and modernization projects
  • Deep expertise in Azure Databricks, Delta Lake, and Azure Data Factory
  • Strong programming skills in Python, PySpark, and SQL
  • Hands-on experience with CI/CD pipelines using Azure DevOps, GitHub, or Jenkins
  • Solid understanding of medallion architecture and metadata-driven data engineering
  • Familiarity with orchestration tools such as Airflow or Azure Data Factory
  • Strong knowledge of data governance, lineage, and regulatory compliance frameworks
  • Excellent analytical and problem-solving abilities in complex data environments
  • Ability to collaborate effectively with cross-functional and business teams
  • A degree in Computer Science, Engineering, or a related technical discipline

Your main responsibilities:

  • Lead the migration of on-premises Cloudera and Hadoop data platforms to Azure Databricks
  • Design and implement scalable ETL/ELT frameworks based on medallion architecture principles
  • Develop metadata-driven data ingestion, transformation, and quality validation pipelines
  • Automate deployment and orchestration using CI/CD pipelines and DevOps tools
  • Collaborate with business and IT stakeholders to translate data requirements into robust technical solutions
  • Optimize and refactor Spark and PySpark code to improve performance and cost efficiency
  • Ensure data security, compliance, and governance across all data layers
  • Mentor and guide data engineers in best practices for data lakehouse development
  • Support end-to-end delivery from data ingestion to BI and API publication
  • Drive innovation by identifying opportunities for automation and platform improvement

Requirements: Azure Databricks, Azure Data Factory, Python, PySpark, SQL, CI/CD, Azure DevOps, GitHub, Jenkins, Airflow, Degree

Additionally: Sport subscription, Training budget, Private healthcare, Flat structure, Small teams, International projects.
