We are looking for a Data Engineer with expertise in Snowflake, AWS, and ETL processes, who will work closely with AI scientists and data analysts to design, develop, and maintain data pipelines and systems that support clinical and operational data use cases.
Responsibilities:
- Design and develop data architecture incorporating Snowflake, Immuta, Collibra, and cloud security
- Implement and optimize ETL/ELT processes, manage raw data, automate workflows, and ensure high performance and reliability of data processing systems
- Manage data quality and security, monitor quality, and implement metadata management strategies
- Collaborate with analytics and business teams to identify user needs and deliver comprehensive solutions supporting analytics and reporting
- Optimize and migrate data systems through data and process conversion from existing systems to Data Vault
- Define and enforce coding standards, data modeling, and ETL/ELT architecture; ensure compliance with Data Mesh and other modern approaches
- Coach and mentor the data engineering team, conduct code reviews, participate in architectural discussions, and initiate innovative technological solutions

Requirements: AWS, ETL, Snowflake, Data Vault, Docker, Python, SQL, Unix

Additionally: Sport subscription, Training budget, Private healthcare, Small teams, International projects, Free coffee, Free breakfast, No dress code, Modern office