You'll be the go-to Databricks expert on the team. You'll play an important role in the migration from the legacy stack while designing and building the new platform in parallel – and "in parallel" is doing real work in that sentence.
The legacy platform runs. Not beautifully, but it runs – and it serves real business needs that can't wait for the migration to finish. You'll split your time between keeping it stable (and gradually less painful) and building its replacement. If the idea of legacy firefighting makes you want to close this tab, this probably isn't the right role. If you see it as part of the job and take quiet satisfaction in fixing things that are broken – read on.
More specifically, you will:
- Design & build the new streaming platform (Kafka → Databricks with Declarative Pipelines)
- Migrate existing batch workflows from Airflow + Docker + on‑prem Databricks to cloud‑native architecture
- Keep the current platform stable while improving its reliability, performance and operability
- Architect the serving layer
- Govern data properly – Unity Catalog, lineage, access control, data quality – not as an afterthought
- Enable data sharing across the organization with Polaris and Iceberg
- Collaborate with data scientists, ML engineers, and business teams across regions
- Use AI tools daily – we use GitHub Copilot and internal assistants/agents we build ourselves within the team; we expect you to help the team get real value from them
By joining our team, you become part of the people-centric work environment of a Danish company. We offer a competitive salary, a permanent contract after the probation period, a development package, team-building events, and an activity-based office in Poznań's city center in the prestigious new Nowy Rynek office building. The building is recognized as a building without barriers, meaning it is fully adapted to the needs of people with disabilities.
Our compensation package on employment contracts includes:
- An office-first approach: home office is available up to 2 days per week
- Adaptable Hours: start your workday anytime between 7:00 AM and 9:00 AM
- Home office subsidy
- Private Medical Care
- Multikafeteria MyBenefit
- Wellbeing program
- Extra Day Off for voluntary activities
… and while in the office you can also enjoy a modern office space with a beautiful view and high-standard furniture, bicycle parking facilities & showers, chill-out rooms with a PlayStation, football table, pool table, and board games, and a subsidized canteen with delicious food & fruit.
You will join our Data Science & Engineering Team, a group of 14 skilled professionals including the Team Leader. The team combines strong expertise in data engineering, analytics, and machine learning, and is structured into several project‑focused sub‑teams working across a variety of business areas.
Our data platform is already in the cloud, already on Databricks. But we're not here to maintain the status quo – we're rebuilding it from the ground up to jump into the exciting world of real-time data and streaming.
We will migrate from a batch-oriented Airflow + Databricks setup to a streaming-first architecture: Kafka, Databricks with new features like Declarative Pipelines, Unity Catalog, Apache Iceberg / Polaris Catalog, and a new serving layer, which you will help us select.
This is a greenfield build inside a global company – real budget, real data, real stakes. No startup chaos, but real room to make meaningful architectural decisions.
We are a global leader in stone wool solutions.
Our team of over 12,000 people across 40 countries delivers products to customers in more than 120 markets.
Our mission is to support sustainable development.
We help reduce energy consumption, noise, and fire risks - improving quality of life wherever our solutions are used.
The ROCKWOOL Global Business Services Center has been operating since 2016.
We started with 27 people, and today we are a team of over 600 - and still growing.
In Poznań, we are developing competence centers in areas such as IT, R&D, Engineering, Sourcing, and Digital Marketing.