Wrocław, Lower Silesian Voivodeship, Poland
Capgemini Polska
22.11.2024
Job details
technologies-expected:
Snowflake
SQL
Python
Scala
Java
Bash
about-project:
Insights & Data delivers state-of-the-art Data solutions. Our expertise primarily lies in Cloud & Big Data engineering, where we develop robust systems capable of processing extensive and complex datasets, utilizing specialized Cloud Data services across platforms like AWS, Azure, and GCP. We oversee the entire Software Development Life Cycle (SDLC) of these solutions, which involves not only leveraging ETL tools but also extensive programming in languages like Python, Scala, or Java, coupled with the adoption of DevOps tools and best practices. The processed data is then made accessible to downstream systems through APIs and outbound interfaces, or is visualized via comprehensive reports and dashboards. Additionally, within our AI Center of Excellence, we undertake Data Science and Machine Learning projects with a focus on cutting-edge areas such as Generative AI, Natural Language Processing (NLP), Anomaly Detection, and Computer Vision.
responsibilities:
design, develop, and maintain Snowflake data pipelines to support various business functions;
collaborate with cross-functional teams to understand data requirements and implement scalable solutions;
optimize data models and schemas for performance and efficiency;
ensure data integrity, quality, and security throughout the data lifecycle;
implement monitoring and alerting systems to proactively identify and address issues;
plan and execute migration from on-prem data warehouses to Snowflake;
develop AI, ML, and Generative AI solutions;
stay updated on Snowflake best practices and emerging technologies to drive continuous improvement.
requirements-expected:
at least 3 years of experience in Big Data or Cloud projects in the areas of processing and visualization of large and/or unstructured datasets (including at least 1 year of hands-on Snowflake experience);
understanding of Snowflake's pricing model and cost optimization strategies for managing resources efficiently;
experience in designing and implementing data transformation pipelines natively in Snowflake or with Service Partners;
familiarity with Snowflake’s security model;
practical knowledge of at least one Public Cloud platform in the Storage, Compute (including Serverless), Networking, and DevOps areas, supported by commercial project experience;
at least basic knowledge of SQL and of one of the following programming languages: Python, Scala, Java, or Bash;
very good command of English.
offered:
Practical benefits: permanent employment contract from the first day; hybrid, flexible working model; equipment package for home office; private medical care with Medicover; life insurance; Capgemini Helpline; NAIS benefit platform;
Access to 70+ training tracks with certification opportunities; platform with free access to Pluralsight, TED Talks, Coursera, Udemy Business and SAP Learning HUB;
Community Hub that will allow you to choose from over 20 professional communities of people interested in, among others, Salesforce, Java, Cloud, IoT, Agile, and AI.
benefits:
private medical care
sharing the costs of foreign language classes
sharing the costs of professional training & courses
life insurance
flexible working time
integration events
corporate sports team
no dress code
parking space for employees
extra social benefits
sharing the costs of tickets to the movies and theater