An international consulting firm that helps companies of all sizes make a positive impact on the world. Its capabilities focus on supporting private- and public-sector organisations with their people, process, and digital technology challenges.
Responsibilities:
Design and implement scalable and secure data pipelines across multi-cloud (GCP & AWS) environments
Develop integration and transformation workflows between cloud data services and on-prem Oracle databases
Work closely with trading, risk, and analytics teams to understand data requirements and deliver real-time and batch data solutions
Optimise and monitor performance of data systems to support latency-sensitive trading applications
Collaborate with cross-functional teams using Agile/Scrum methodologies to deliver business-critical data projects, and provide technical assistance to the team
Ensure robust data governance, lineage, and compliance (including MiFID II, FCA, and other regulatory standards)
Automate data workflows using Terraform, CI/CD pipelines, and containerisation tools (Docker/Kubernetes)
Requirements:
Strong experience with cloud data services on both Google Cloud Platform (GCP) and Amazon Web Services (AWS), e.g. BigQuery, Pub/Sub, and Dataflow on GCP; S3, Glue, and Redshift on AWS
Expertise in Oracle SQL, PL/SQL, and working with complex stored procedures and large datasets
Proficiency in programming languages such as Python, Java, or Scala
Experience with streaming and messaging systems (e.g. Pub/Sub), CI/CD tooling (e.g. Jenkins, GitLab), and container orchestration (Kubernetes)
Deep understanding of data modelling, data warehousing, and ETL/ELT design patterns
Familiarity with Agile development practices (Scrum, Kanban, Jira)
Exposure to financial markets, trading systems, or related high-performance environments is a strong plus