Scala/Spark Developer

We are looking for an experienced Scala/Spark Developer to join a project in the banking sector. The main responsibilities include developing and optimizing Scala/Spark jobs for data transformation and aggregation within the Hadoop ecosystem, as well as implementing CI/CD processes. The candidate will work in an agile environment, collaborating with the team on the design, testing, and deployment of new solutions. Hybrid work: one day per week from the office.
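
For context, a minimal sketch of the kind of Scala/Spark transformation-and-aggregation job described above; the table and column names (transactions, account_id, amount, account_totals) are hypothetical and stand in for whatever the project actually uses:

```scala
// Minimal sketch of a batch aggregation job on Hive tables in a Hadoop cluster.
// Table and column names are illustrative, not taken from the actual project.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, count, sum}

object TransactionAggregation {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("TransactionAggregation")
      .enableHiveSupport() // read/write Hive tables in the Hadoop ecosystem
      .getOrCreate()

    // Load the source table, keep positive amounts, aggregate per account.
    val perAccount = spark.table("transactions")
      .filter(col("amount") > 0)
      .groupBy(col("account_id"))
      .agg(
        sum(col("amount")).as("total_amount"),
        count("*").as("tx_count")
      )

    // Persist the aggregated result back as a Hive table.
    perAccount.write
      .mode("overwrite")
      .saveAsTable("account_totals")

    spark.stop()
  }
}
```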
Your responsibilities
Our requirements