In your role as a Cloud Engineer you will work in the Big Data Cluster, which enables data scientists by providing a large collection of data and a data science workbench in one place.
On a daily basis you will develop database-based products running on-premise or in the cloud. Your responsibilities will cover the full software development lifecycle, including analysis, architecture, testing, and support for production issues during normal working hours.
BI technology within the lake infrastructure
Establish a stable, state-of-the-art technology base with on-prem and cloud solutions
Set up the data lake as a single data and analytics hub and effectively ingest the most important data sources
Establish data quality and metadata management
Provide data marts and sandboxes for segments and functions with the most important combinations of data sources
Responsibilities:
Developing data models, designing and implementing ETL processes on cloud platforms (in particular Google Cloud), and ensuring data quality and integrity throughout the data lifecycle
Implementing new features to enable new business cases
Focusing on stability, performance tuning, and innovation of the applications
Keeping system documentation accurate and up to date
Actively contributing to knowledge sharing and to a learning culture
Working on international projects using agile methodologies
Expected requirements:
Cloud Data Development:
Design and develop cloud-based solutions using Google Cloud Platform
Optimize data infrastructure for performance, scalability, and reliability
Infrastructure as Code (IaC):
Utilize Terraform to create and manage cloud resources efficiently
Implement CI/CD pipelines for automated deployment and continuous integration
Microservices and API:
Work with microservices architecture and design APIs for seamless data integration
Scripting and Testing:
Proficiency in Python and SQL for ETL processes, schema-evolution pipeline development, and performance tuning
Develop, test, and deploy solutions using Dataproc, Dataflow, and Cloud Functions
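To give a flavor of the Python/SQL ETL work described above, here is a minimal, self-contained sketch of an extract-transform-load step. It uses an in-memory SQLite database purely for illustration; the table and column names (`raw_events`, `curated_events`, `country`, `amount`) are hypothetical, not part of this role's actual systems:

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw rows, apply simple quality rules, load into a curated table."""
    cur = conn.cursor()
    # Extract: read raw events, which may contain messy country codes
    rows = cur.execute("SELECT id, country, amount FROM raw_events").fetchall()
    # Transform: normalize country codes, drop rows with non-positive amounts
    cleaned = [(i, c.strip().upper(), a) for (i, c, a) in rows if a > 0]
    # Load: idempotent upsert into the curated table
    cur.executemany(
        "INSERT OR REPLACE INTO curated_events (id, country, amount) "
        "VALUES (?, ?, ?)",
        cleaned,
    )
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE raw_events (id INTEGER PRIMARY KEY, country TEXT, amount REAL);
        CREATE TABLE curated_events (id INTEGER PRIMARY KEY, country TEXT, amount REAL);
        INSERT INTO raw_events VALUES (1, ' pl ', 10.0), (2, 'de', -5.0), (3, 'FR', 7.5);
    """)
    print(run_etl(conn))  # 2 rows pass the quality filter
```

In production such logic would typically run on Dataproc or Dataflow rather than SQLite, but the pattern, extracting, enforcing data quality rules, and loading idempotently, is the same.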
Offered:
Friendly and multicultural environment
26 days of holiday from the very beginning
Benefits:
private medical care
sharing the costs of professional training & courses
life insurance
remote work opportunities
flexible working time
integration events
corporate sports team
retirement pension plan
preferential loans
no dress code
video games at work
coffee / tea
leisure zone
pre-paid cards
redeployment package
employee referral program
extra leave
Multisport
Skills@Work - personal & professional development program
Employee Assistance Program (psychological support)