Design and develop conceptual, logical, and physical data models to support data warehousing initiatives specific to insurance operations.
Collaborate with business analysts, architects, and IT teams to understand insurance business requirements and translate them into data models.
Perform system analysis to identify and extract the appropriate data from source systems, ensuring data models are populated with accurate, relevant data.
Ensure data models are optimized for performance, scalability, and maintainability, with a focus on insurance data such as policy, claims, underwriting, and customer data.
Implement data modeling standards and best practices tailored to the insurance industry.
Conduct data analysis and profiling to understand data quality and integrity, particularly in the context of insurance data.
Develop and execute tests to validate data models against business requirements and performance criteria, including verifying data accuracy, consistency, and integrity within the data warehousing environment.
Work with ETL developers to ensure data models are accurately implemented in data warehousing solutions.
Maintain and update data models as insurance business requirements evolve.
Document data models and metadata for reference and training purposes.
Participate in data governance and data management activities to ensure compliance with organizational policies and industry regulations.
Requirements:
Bachelor’s degree in Computer Science, Information Systems, or a related field. Master’s degree preferred.
Proven experience in data modeling and data warehousing, with a minimum of 2 years in a similar role, preferably within the insurance industry.
Strong understanding of data modeling techniques, including normalization, denormalization, star schema, and snowflake schema.
Experience with data modeling tools such as ERwin, Sparx Enterprise Architect, or similar.
Proficiency in SQL and database management systems (e.g., SQL Server, Oracle).
Familiarity with ETL processes and tools (e.g., Azure Data Factory, Informatica, Talend).
Knowledge of data governance and data quality principles.
Excellent analytical and problem-solving skills, with the ability to perform system analysis to identify and extract relevant data.
Experience in developing and executing test plans to validate data models.
Strong communication skills, with the ability to explain complex data concepts to non-technical stakeholders.