Snowflake Data Engineer @ Spyrosoft
Łódź, Łódź Voivodeship, Poland
2.11.2025
About the role

The project is expected to start in Q1 2026.

You will play a key role in migrating and building ETL/ELT processes in Snowflake infrastructure under the Data Sphere Program, establishing Snowflake as the primary Data Warehouse platform for Healthcare Commercial.

The project will be managed using the SCRUM methodology, ensuring iterative development and close collaboration with all stakeholders. It is conducted in 3-week sprint intervals, with sprint planning tasks assigned to the Contractor and reviewed by the Product Owner at the end of each sprint.


Requirements:

  • Strong knowledge of dbt Cloud for ETL development
  • Experience with Snowflake as the target data warehouse for ETL
  • Ability to prepare clear documentation in Confluence
  • Ability to use ticketing systems such as Jira and/or Azure DevOps
  • Familiarity with the Snowflake infrastructure is an advantage
  • Ability to work in an agile BI (DevOps) team and to share skills and experience
  • Fluency in English

Responsibilities:

  • Developing and optimizing data flows from source systems to warehouse structures within Snowflake using dbt Cloud
  • Creating documentation in Confluence for the Snowflake / dbt Cloud ETL code produced
  • Estimating tasks assigned via the ticketing system and resolving them on time
  • Participating in the Data team's Scrum meetings to plan the work and allow review by the Product Owner
  • Consulting with the project team and end users on the code you created to facilitate a proper handover
  • Implementing ETL processes specified by the Architects to seamlessly integrate data sources into the Snowflake infrastructure

Requirements: dbt Cloud, ETL, Snowflake, Confluence, Jira, Azure DevOps

Additionally: sport subscription, training budget, private healthcare, flat structure, international projects.
