Contribute to cutting-edge AI initiatives by applying your creativity, critical thinking, and analytical skills.
Design and implement scalable data pipelines using cloud technologies alongside generative and agentic AI.
Work in agile, cross-functional teams combining technical, business, and data science expertise.
Your accountability, responsibility, and autonomy will grow with your experience and seniority; we are looking for specialists at various experience levels.
Requirements:
Experience Requirements (Cumulative):
At least 2 years of hands-on experience working with relational or analytical databases, applying SQL for development, testing, debugging, and performance optimization in production environments.
Within that experience, at least 1 year designing or implementing ETL processes and data integration solutions using enterprise ETL tools (e.g., SAS Data Integration Studio, IBM DataStage, Informatica) for large or complex data sets.
Exposure to data modeling and architecture: practical experience creating conceptual, logical, and physical data models using dimensional, relational, or Data Vault techniques in analytical environments.
Technical Skills:
Databases & SQL:
Expert-level SQL skills (procedures, triggers, functions, packages) in at least one major RDBMS/EDW technology such as Oracle, MS SQL Server, Teradata, or DB2.
Strong ability to profile, tune, and optimize SQL code and database performance for ETL and reporting workloads.
Data Modeling & ETL:
Solid understanding of dimensional modeling, normalized/relational modeling, and Data Vault approaches; ability to translate business requirements into conceptual, logical, and physical models.
Hands-on experience with ETL/ELT toolsets (e.g., SAS Data Integration Studio, IBM DataStage).
Cloud & Big Data (Experience or Interest):
Experience with, or strong interest in, processing and integrating data on major cloud platforms (GCP, Azure, AWS). Familiarity with cloud storage, managed databases, and serverless data pipeline services is desirable.
Bonus: familiarity with Big Data technologies such as Hive, Spark, NiFi, HBase, HDFS, Kafka, or Kudu.
Programming & Analytics:
Practical scripting or programming skills (SQL-centric plus one of Python, R, or SAS) to support data transformation, automation and basic analytics.
Ability to translate complex business requirements into robust, scalable data solutions.
Soft Skills & Language:
Strong analytical thinking, problem-solving and attention to detail.
Professional working proficiency in English and effective communication with technical and non-technical stakeholders.
Willingness to travel to client locations across Europe as required.
What we offer:
Career Development
Permanent employment contract
Individual support from a dedicated People Lead
Personalized professional development path
Access to coaching sessions
Extensive training package including:
Soft skills, technical, and language training
E-learning platforms
Gallup strengths assessment
GenAI training
Cloud certification programs
Well-being & Support
Employee Assistance Program offering legal, financial, and psychological consultations
Private medical care and life insurance coverage
Access to the MyBenefit platform, including a wide range of products and services (e.g., Multisport card)
Perks & Incentives
Paid employee referral program
Eligibility for quarterly dividends through the Employee Share Purchase Plan (for employees who own company shares)
Benefits:
Sharing the costs of sports activities
Private medical care
Sharing the costs of professional training & courses