Senior Azure Data Engineer
Warsaw-based opportunity with a hybrid work model - 2-3 days/week in the office.
As a Senior Azure Data Engineer you will be working for our client, a global financial technology leader that provides infrastructure and services for capital markets, payments, and financial information management. You will be part of a high-impact cloud modernization program aimed at migrating legacy Cloudera and Hadoop-based data environments to a cutting-edge Azure ecosystem. This project focuses on rebuilding data architectures to enable scalable BI analytics, regulatory reporting, and metadata-driven automation across multiple geographies, while maintaining strict data governance and performance standards.
Responsibilities:
- Lead the migration of on-premises Cloudera and Hadoop data platforms to Azure Databricks
- Design and implement scalable ETL/ELT frameworks based on medallion architecture principles
- Develop metadata-driven data ingestion, transformation, and quality validation pipelines
- Automate deployment and orchestration using CI/CD pipelines and DevOps tools
- Collaborate with business and IT stakeholders to translate data requirements into robust technical solutions
- Optimize and refactor Spark and PySpark code to improve performance and cost efficiency
- Ensure data security, compliance, and governance across all data layers
- Mentor and guide data engineers in best practices for data lakehouse development
- Support end-to-end delivery from data ingestion to BI and API publication
- Drive innovation by identifying opportunities for automation and platform improvement

Requirements: Azure Databricks, Azure Data Factory, Python, PySpark, SQL, CI/CD, Azure DevOps, GitHub, Jenkins, Airflow, Degree

Additionally: Sport subscription, Training budget, Private healthcare, Flat structure, Small teams, International projects.