The project focuses on building a Transaction Monitoring solution designed to support anti-money laundering (AML) detection within the Capital Markets domain. As a greenfield initiative, it offers the opportunity to design and develop the system architecture from the ground up, with a strong emphasis on innovation, product ownership, and delivery excellence.
Hybrid working model with 3 days per week on-site
Responsibilities:
Designing, implementing, and optimizing data pipelines that support end-to-end AML detection scenarios in the Capital Markets domain, leveraging Apache Spark (PySpark) and SQL for large-scale data processing in distributed environments (an illustrative sketch of this kind of pipeline follows this list)
Developing and maintaining applications in Python and Java for transactional data processing and for implementing complex business logic
Building and maintaining CI/CD pipelines using Bamboo/Jenkins and Bitbucket to automate deployment, testing, and delivery processes
Working with distributed computing platforms and on-premise clusters, focusing on performance optimization and scalability in big data environments
Actively contributing to the design and development of a greenfield architecture, with a strong sense of product ownership, innovation, and delivery quality
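To make the core data-engineering responsibility above more concrete, the following is a minimal PySpark sketch of the kind of transaction aggregation such a pipeline might perform. All paths, table and column names, and the threshold rule are hypothetical illustrations, not part of the actual project specification.

```python
# Minimal sketch of a PySpark aggregation a transaction-monitoring pipeline
# might run. All names, paths, and thresholds below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("aml-monitoring-sketch").getOrCreate()

# Hypothetical transactions dataset: account_id, amount, booking_date
transactions = spark.read.parquet("/data/transactions")  # illustrative path

# Example rule: flag accounts whose total daily volume exceeds an arbitrary
# threshold; real AML detection scenarios would be considerably richer.
daily_totals = (
    transactions
    .groupBy("account_id", F.to_date("booking_date").alias("day"))
    .agg(F.sum("amount").alias("daily_total"), F.count("*").alias("tx_count"))
)

alerts = daily_totals.filter(F.col("daily_total") > 1_000_000)
alerts.write.mode("overwrite").parquet("/data/alerts")  # illustrative sink
```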
Expected requirements:
Strong programming skills in Python and Java.
Advanced experience with Apache Spark (PySpark) and SQL for large-scale distributed data processing.
Hands-on experience with distributed computing platforms and on-premise cluster environments.
Proficiency with CI/CD tools such as Bamboo or Jenkins, and Bitbucket.
Experience working in Agile teams, with the ability to work independently in fast-paced, evolving environments.
Strong analytical and problem-solving skills with the ability to adapt quickly.