This role offers the candidate the opportunity to drive our Reporting & Business Intelligence architecture strategy for our new Commercial Risk Solutions ecosystem. Responsibilities include helping to define the architecture and data models for delivering global and country-specific reports, and overseeing delivery as part of a global, strategic, multi-workstream programme.
Responsibilities:
Engage country-level Finance and Operations teams to understand the gap between local needs and global reporting capabilities, and how data is actually used in the field.
Collaborate with the Product Manager to refine the vision for a global Reporting & BI capability within the Commercial Risk Solutions ecosystem, covering financial and operational reporting (including regulatory).
Define the data flows, data models (entity relationships) and data technology stack requirements needed to deliver successfully on that vision.
Work with Business/Technical Analysts to create the required user stories, ensuring they are clearly defined, well understood and properly estimated.
Support a cross-functional Agile team (Finance SMEs, Data Analysts, Data Engineers, Business Analysts, Scrum Master, etc.) to deliver with quality.
Act as a key contributor to, and collaborator on, Aon's common data standards that support the exchange of data within and across Aon solution lines and geographies.
Ensure successful data integration with regional and global warehouses, internal financial systems and external fiduciary/regulatory bodies.
Adhere to and advance Aon’s approach to data quality improvement as it relates to reporting for the Commercial Risk Solutions ecosystem.
Collaborate with other architects within the ecosystem to ensure collective delivery of the programme.
Collaborate with internal Data & Analytics functions to advance the adoption of BI and AI within Aon.
Expected requirements:
8–10 years of experience in a Data Engineering, Data Modelling or Data Architecture role, including at least 3 years designing data flows and data models.
Strong understanding of database normalisation, dimensional (star schema) modelling and NoSQL models.
Strong proficiency in logical and physical data model design and broader data architecture patterns for batch and (near) real-time jobs.
Strong proficiency in most of the following: SQL; Python, including Spark data processing on Databricks, Jupyter or Airflow; Power BI (or Tableau/Qlik); AI/ML, including GenAI.
Perspectives and principles on how to implement and govern enterprise data standards.
Familiarity with, and some proficiency in, all data-related disciplines: data analysis and visualisation, data quality, data warehousing, (big) data engineering and financial reporting.
Experience with Azure cloud services, including IaaS, DBaaS and PaaS offerings.
Agile knowledge, including Scrum, Kanban, TDD and other practices. Certification desirable.
Experience with technical accounting solutions for insurance policy management, or with financial systems more broadly, is a plus.