This role offers the opportunity to drive Data Delivery as the leader of a team of Data Engineers and Analysts. The key delivery work includes Data Migrations from legacy systems into our new ecosystem via a reusable data migration framework, plus delivery of data engineering pipelines and analytical dashboards for our global Reporting & BI workstream, all on a modern data stack.
Responsibilities:
Collaborate with business stakeholders and programme leads on the development and execution of a data migration framework and strategic Reporting & BI assets.
Lead a cross-functional team of data engineers and analysts through the end-to-end delivery lifecycle within established timelines, following Agile methodologies.
Provide regular updates on Agile Sprint delivery to all stakeholders, handling technical planning and escalations.
Oversee SDLC and delivery standards across Data Engineers and Data Analysts to ensure effective delivery of pipelines, models, and dashboards on a modern data stack.
Work with Product Managers across our ecosystem to understand requirements for building automated take-on of transactional data into their systems.
Work with the Reporting & BI Product Manager to understand requirements for analytical reporting, and implement best-in-class data engineering standards within the team.
Actively contribute to Data Governance forums by overseeing delivery of Data Quality Dashboards that drive quality improvements.
Act as the single point of contact for all resource-related issues for Data roles in Poland.
Support the IT Service Delivery Lead by providing a point of escalation for Incidents, Problems, and Changes for all data workloads within the ecosystem.
Requirements expected:
8+ years of experience in Data Engineering / Analysis roles, with specialization in Data Migrations and Business Intelligence.
Experience in the insurance/financial industry and large multinational environments is a distinct plus, as is specific understanding of the Client, Policy, and Claims data domains.
Previous hands-on experience with cloud-based Lakehouse architectures on Databricks, including Python (Spark) ingestion and transformation, SQL, AWS, Power BI, and similar tools.
Strong Agile project management skills with the ability to lead cross-functional teams.
Hands-on experience with Agile practices and Scrum/Kanban management tools (Jira, Confluence, Azure DevOps, etc.). Agile certification highly desirable.
Excellent communication and interpersonal skills to collaborate effectively with stakeholders at all levels.
Strong people management skills across a diverse, matrixed organization.
Ability to work in a fast-paced environment and adapt to changing project requirements.
Previous hands-on experience in Data Engineering, Data Science (AI/ML), dashboarding, and DataOps and/or MLOps.
Azure (or AWS) cloud support experience desirable, covering IaaS and PaaS, including containerization and serverless.
Bachelor's or Master's degree in Computer Science, Information Technology, or a similar field.