About the role
Project description
Our program began as the migration of a legacy mainframe system for trading financial instruments to a new, highly scalable technical platform. Its success, both in the migration itself and in building a scalable platform, led to it being selected as the strategic platform providing full-scale advisory services for one of the biggest financial institutions in the world. This has brought significant further investment in legacy system migrations and technical improvements. We currently have teams working on these projects across several locations (Weehawken, Wroclaw, Toronto, Pune). The current goal is to reengineer a set of existing applications and introduce an additional application interface layer that allows third-party applications to query the underlying data.
Responsibilities
- Write clean code
- Cover own code with tests
- Participate in backlog refinement, planning and demos
- Perform code reviews
- Clarify the requirements with stakeholders
- Participate in solution architecture design and implementation
SKILLS
Must have
- 6+ years of experience in enterprise software development and data engineering
- Strong experience in ETL and Java 17 with a focus on data processing and analytics applications
- Understanding of cloud platforms, particularly Microsoft Azure, including services such as Azure Data Lake, Azure Blob Storage, and Azure Data Factory
- Spark (or another distributed big-data processing engine such as Flink or Trino)
- Experience with AKS (or another Kubernetes provider)
- Familiarity with Kafka; Delta Lake experience is a big plus
Nice to have
- Microservices architecture and Spring framework experience
- Full-stack (Java+React) experience
- Experience in the banking domain or with enterprise-scale programs