Scala/Big Data Developer @ Ework Group
Gdynia, Pomeranian Voivodeship, Poland
Ework Group
15. 1. 2026
About the position

Project details:

Would you like to develop in Scala, Spark and the wider Big Data technology stack, available both on-premises and in the public cloud? We are now looking for a Scala/Spark/Hadoop/Big Data Developer to deliver a global reporting solution.

The Development Team is building a new customer account reporting engine to replace the existing legacy customer account reporting solutions across home markets over the coming years. A prerequisite is consolidating all of the bank's account and payment transaction data into Hadoop, involving dozens of data formats and integration points both upstream and downstream. The project aims to deliver new reports to all 10M customers in the Nordics, covering billions of transactions a month.
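
To give a feel for the day-to-day work, the sketch below is a rough illustration only (not part of the posting) of the kind of Spark batch aggregation, written in Scala against data consolidated in Hadoop, that such a reporting engine might run. The Hive table name, column names and output path are hypothetical.

  import org.apache.spark.sql.{SparkSession, functions => F}

  object AccountReportSketch {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("account-report-sketch")
        .enableHiveSupport() // assumes a Hive metastore is configured
        .getOrCreate()

      // Hypothetical Hive table of raw payment transactions.
      val transactions = spark.table("staging.payment_transactions")

      // Monthly totals and transaction counts per account: a simplified
      // stand-in for the actual reporting logic.
      val monthlyReport = transactions
        .groupBy(
          F.col("account_id"),
          F.date_format(F.col("booking_date"), "yyyy-MM").as("month"))
        .agg(
          F.sum("amount").as("total_amount"),
          F.count(F.lit(1)).as("transaction_count"))

      // Write partitioned output back to Hadoop for downstream consumers
      // (hypothetical HDFS path).
      monthlyReport.write
        .mode("overwrite")
        .partitionBy("month")
        .parquet("/data/reports/account_monthly")

      spark.stop()
    }
  }

In practice, a job like this would typically be orchestrated with Oozie and fed from Hive or Kafka sources, in line with the technology stack listed below.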


Required Skills:

  • 3+ years of experience with functional programming in Scala 
  • Experience with writing Spark-based applications in Scala 
  • Experience with the Hadoop technology stack: Hive, Oozie, Kafka
  • Experience with containerization technologies, including Docker and Kubernetes

Preferred Skills:

  • Deep understanding of the principles of distributed systems 
  • Experience in data engineering and building ETL/ELT pipelines 
  • Experience with performance tuning of Hadoop/Spark solutions 
  • Experience with container-based technologies: Docker, Kubernetes 
  • Experience in developing RESTful services 
  • Experience in developing web applications with Bootstrap and ReactJS 
  • Experience with Agile ways of working

Responsibilities:

  • Responsible for the successful design and implementation of a high-performing, flexible, robust, scalable and easily maintainable global reporting solution for the Bank
  • Actively cooperating with the Product Owner, Solution Architects, Analysts, Testers and Developers
  • Working in an Agile team
  • You will join the Corporate Access Account Reporting Team (CAAR), which is part of the Payments Train following the SAFe agile framework

Requirements: Scala, Big Data, Hadoop, Hive, Spark, ETL, REST API

Additionally: Sport subscription, Private healthcare, International environment, Life insurance
