Cloud Data Engineer (Snowflake)
Wrocław, Lower Silesian Voivodeship, Poland
Capgemini Polska
20.04.2024
Job details

Technologies expected:


  • Snowflake Data Cloud
  • Snowpark
  • Python
  • Azure
  • AWS
  • GCP
  • SQL
  • Scala
  • Java
  • Bash

About the project:


  • Our Insights & Data practice delivers cutting-edge data-centric solutions.
  • Most of our projects involve Cloud & Big Data engineering. We develop solutions to process large, often unstructured, datasets with dedicated Cloud Data services on Snowflake, AWS, Azure or GCP.
  • We are responsible for the full SDLC of the solution: apart from using data processing tools (e.g., ETL), we code a lot in Python, Scala or Java and use DevOps tools and best practices. The data is exposed to downstream systems via APIs or outbound interfaces, or visualized in reports and dashboards.
  • Within our AI CoE we deliver Data Science and Machine Learning projects with focus on NLP, Anomaly Detection and Computer Vision.
  • Additionally, we are exploring the area of Quantum Computing, searching for practical growth opportunities for both us and our clients.
  • Currently, over 250 of our Data Architects, Engineers and Scientists work on exciting projects for over 30 clients from different sectors (Financial Services, Logistics, Automotive, Telco and others).
  • Come on Board! :)

Responsibilities:


  • design, develop, and maintain Snowflake data pipelines to support various business functions;
  • collaborate with cross-functional teams to understand data requirements and implement scalable solutions;
  • optimize data models and schemas for performance and efficiency;
  • ensure data integrity, quality, and security throughout the data lifecycle;
  • implement monitoring and alerting systems to proactively identify and address issues;
  • plan and execute migration from on-prem data warehouses to Snowflake;
  • develop AI, ML and Generative AI solutions;
  • stay updated on Snowflake best practices and emerging technologies to drive continuous improvement.

Requirements:


  • at least 3 years of experience in Big Data or Cloud projects in the areas of processing and visualization of large and/or unstructured datasets (including at least 1 year of hands-on Snowflake experience);
  • understanding of Snowflake's pricing model and cost optimization strategies for managing resources efficiently;
  • experience in designing and implementing data transformation pipelines natively with Snowflake or Service Partners;
  • familiarity with Snowflake’s security model;
  • practical knowledge of at least one Public Cloud platform in the Storage, Compute (including Serverless), Networking and DevOps areas, supported by commercial project work experience;
  • at least basic knowledge of SQL and one of the following programming languages: Python/Scala/Java/Bash;
  • very good command of English.

Benefits:


  • private medical care
  • sharing the costs of foreign language classes
  • sharing the costs of professional training & courses
  • life insurance
  • flexible working time
  • integration events
  • corporate sports team
  • no dress code
  • parking space for employees
  • extra social benefits
  • sharing the costs of tickets to the movies and theater
  • redeployment package
  • Christmas gifts
  • employee referral program
  • charity initiatives
  • free chat/call with a therapist
  • Multisport card
