You will play a key role in migrating and building ETL/ELT processes in Snowflake infrastructure under the Data Sphere Program, establishing Snowflake as the primary Data Warehouse platform for Healthcare Commercial.
The project will be managed using the SCRUM methodology, ensuring iterative development and close collaboration with all stakeholders. It is conducted in 3-week sprint intervals, with sprint planning tasks assigned to the Contractor and reviewed by the Product Owner at the end of each sprint.
Responsibilities:
- Developing and optimizing data flows from source systems to warehouse structures within Snowflake using dbt Cloud
- Documenting the Snowflake / dbt Cloud ETL code in Confluence
- Estimating tasks assigned via the ticketing system and resolving them in a timely manner
- Participating in the Data team's Scrum meetings to plan work and enable review by the Product Owner
- Consulting with the project team and end users on the code you created to facilitate a proper handover
- Implementing ETL processes specified by the Architects to integrate data sources seamlessly into the Snowflake infrastructure

Requirements: dbt Cloud, ETL, Snowflake, Confluence, Jira, Azure DevOps

Additionally: Sport subscription, Training budget, Private healthcare, Flat structure, International projects.