Job information
We are looking for a Cloud Data Engineer (Snowflake) for our client, a global leader that partners with enterprises to help them transform and manage their business using modern technologies.
What we expect
- at least 3 years of experience in Big Data or Cloud projects in the areas of processing and visualization of large and/or unstructured datasets (including at least 1 year of hands-on Snowflake experience)
- understanding of Snowflake's pricing model and cost optimization strategies for managing resources efficiently
- experience in designing and implementing data transformation pipelines natively with Snowflake or Service Partners
- familiarity with Snowflake’s security model
- practical knowledge of at least one public cloud platform in the Storage, Compute (including Serverless), Networking, and DevOps areas, backed by commercial project experience
- at least basic knowledge of SQL and one programming language: Python, Scala, Java, or Bash
- very good command of English
Employment agency entry number 47
This job offer is intended for people over 18 years of age
What we offer
- permanent employment contract
- hybrid, flexible working model - 2 days per week in the office
- co-financing for home office equipment
- development opportunities:
- substantive support from project leaders
- a wide range of internal and external training courses (technical, language, leadership)
- certification support in various areas
- mentoring and a real impact on shaping your career path
- access to a library of over 2,000 training courses on the Pluralsight, Coursera, and Harvard platforms
- the opportunity to participate in conferences both as an attendee and as a speaker
- relocation package
- benefits as part of the social package (cafeteria system, medical care for the whole family, group insurance on preferential terms)
Your tasks
- design, develop, and maintain Snowflake data pipelines to support various business functions
- collaborate with cross-functional teams to understand data requirements and implement scalable solutions
- optimize data models and schemas for performance and efficiency
- ensure data integrity, quality, and security throughout the data lifecycle
- implement monitoring and alerting systems to proactively identify and address issues
- plan and execute migration from on-prem data warehouses to Snowflake
- develop AI, ML, and Generative AI solutions
- stay updated on Snowflake best practices and emerging technologies to drive continuous improvement
Location: Wrocław