Senior Data Engineer (Snowflake, DBT, AWS)

We are looking for an experienced Senior Data Engineer to join our team, specializing in Snowflake, DBT, and cloud data engineering. This role focuses on designing, building, and optimizing scalable data pipelines within Roche’s data ecosystem. The successful candidate will be responsible for integrating, transforming, and automating data processes, as well as enhancing performance to enable high-quality insights and seamless access to data across commercial and foundational platforms.
Your responsibilities
- Design, develop, and optimize ETL/ELT data pipelines using DBT, Python, and Snowflake.
- Implement best practices for data modeling, partitioning, and performance tuning within Snowflake.
- Ensure data quality, security, and governance by applying data masking, row-level security, and audit controls.
- Automate data workflows and orchestration processes using tools like Airflow, Prefect, or similar.
- Collaborate with data architects, engineers, and business teams to define and implement scalable, high-performance data solutions.
- Manage and optimize cloud-based data integration across AWS and Snowflake.
- Maintain high standards for data integrity, availability, and accessibility.
Our requirements
- 8+ years of experience in data engineering, cloud platforms, and large-scale data processing.
- Strong expertise in Snowflake, DBT, Python, and SQL.
- Experience with data lake architectures, batch and streaming data processing, and cloud-native services.
- Proven ability to implement data governance frameworks, manage access control, and follow security best practices.
- Expertise in building and optimizing scalable ETL/ELT pipelines with a focus on performance and automation.