Raiffeisen Tech is a platform that brings together IT talent from different countries. It consists of a Tech Hub in Poland, a Tech Hub in Romania, and a Tech Hub in Kosovo (to be built soon).
Our task is to create and develop banking applications that make life easier for more than 16 million customers of our group banks in 14 countries.
Raiffeisen Tech Polska is part of Raiffeisen Bank International AG (Spółka Akcyjna), Branch in Poland; however, we have already been on the market for five years. You may have known us as MIT GDC.
TODAY at Raiffeisen Tech we are creating the financial industry of the FUTURE (raiffeisen-tech.com).
Make technology Happen!
About the Job
Working with us, you will have the opportunity to join the data team responsible for data engineering, data ingestion pipelines, and maintaining the unified data sources used by other teams in the group.
Do you want to be part of the team that helps make our organization a data-driven one? Then please get in touch with us!
Your core competences
Design, build and maintain complex data flows
Develop and maintain an ecosystem of Big Data solutions
Design and develop high-performance data processing tools

Requirements
Python, AWS, ETL, Databricks

Additionally
Sport subscription, Training budget, Private healthcare, International projects, Flat structure, Free parking, Free coffee, Bike parking, No dress code, Massages, Modern office, Free beverages