HSBC Markets and Securities Services is an emerging markets-led and financing-focused investment banking and trading business that provides tailored financial solutions to major government, corporate and institutional clients worldwide.
In IT we provide HSBC with a genuine competitive advantage across the globe. Global Business Insights (GBI) provides critical metrics and reports to Markets and Securities Services Operations, enabling them to monitor the health of their business and make data-driven decisions.
The GBI Transformation is a large and complex data integration programme spanning all of MSS Ops globally. We serve a diverse audience of users and data visualisation requirements, from ExCo down, integrating over 80 data sources across multiple time zones in Middle Office, Post-Trade and Securities Services IT and elsewhere. We are a critical enabler for the Rubix 2025 Strategy and the MSS control agenda, providing the operational KPI and KRI metrics that allow senior management to measure the return on their BAU and CTB investment.
We are looking for a GCP developer who can design, develop, test and deploy ETL/SQL pipelines connected to a variety of on-prem and cloud data sources - both data stores and files. We will mainly be using GCP technologies such as Cloud Storage, BigQuery, and Data Fusion.
You will also work with our DevOps tooling to deliver continuous integration/deployment capabilities, automated testing, security, and IT compliance.
responsibilities :
Design, build, test and deploy Google Cloud data models and transformations in a BigQuery environment (e.g. SQL, stored procedures, indexes, clusters, partitions, triggers, etc.).
Create and manage ETL/ELT data pipelines to model raw/unstructured data into a Data Vault universal model, enriching, transforming and optimising it for end-user consumption (see the BigQuery sketch after this list).
Review, refine, interpret, and implement business and technical requirements.
Ensure applications meet non-functional requirements and IT standards, including security, compliance, scalability, reliability, and cost-efficiency needs.
Monitor data pipelines for failures, performance issues, and scalability constraints, implementing fixes or optimisations as needed.
Integrate data from multiple sources, ensuring consistency and accuracy.
Manage code artefacts and CI/CD using tools like Git, Jenkins, Google Secret Manager, etc. (see the Secret Manager sketch after this list).
Address defects, deliver enhancements, and transition knowledge, code, and support responsibilities to the support team.
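For illustration, a minimal sketch of the BigQuery/Data Vault work described above, using the google-cloud-bigquery Python client. The dataset and table names (ops_vault, stg_trades, sat_trade_details) are hypothetical examples, not actual GBI objects:
    # Create a partitioned, clustered Data Vault satellite and delta-load it
    # from a raw staging table. All object names here are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()  # credentials resolved from the environment

    # Day-partitioned on load_date and clustered on the hub hash key, so
    # incremental loads and point-in-time queries scan only what they need.
    client.query("""
        CREATE TABLE IF NOT EXISTS ops_vault.sat_trade_details (
          trade_hk   STRING NOT NULL,  -- hash key of the parent hub
          load_date  DATE   NOT NULL,
          record_src STRING NOT NULL,
          hash_diff  STRING NOT NULL,  -- change-detection hash of the payload
          payload    JSON
        )
        PARTITION BY load_date
        CLUSTER BY trade_hk
    """).result()

    # Delta load: insert only rows whose payload hash is new for that hub key.
    job = client.query("""
        INSERT INTO ops_vault.sat_trade_details
        SELECT s.trade_hk, CURRENT_DATE(), s.record_src, s.hash_diff, s.payload
        FROM ops_vault.stg_trades AS s
        LEFT JOIN ops_vault.sat_trade_details AS t
          ON t.trade_hk = s.trade_hk AND t.hash_diff = s.hash_diff
        WHERE t.trade_hk IS NULL
    """)
    job.result()
    print(f"Loaded {job.num_dml_affected_rows} new satellite rows")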
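Similarly, a minimal sketch of pulling a pipeline credential from Google Secret Manager rather than hard-coding it; the project and secret ids below are hypothetical:
    # Fetch a source-system credential at pipeline start-up.
    from google.cloud import secretmanager

    def get_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
        client = secretmanager.SecretManagerServiceClient()
        name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
        response = client.access_secret_version(request={"name": name})
        return response.payload.data.decode("utf-8")

    # Hypothetical project and secret names.
    db_password = get_secret("gbi-data-platform", "gbi-source-db-password")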
requirements-expected :
Proven (3+ years) hands-on experience in SQL querying and optimisation of complex transformations in BigQuery, focusing on cost-efficiency, concurrency, and data integrity.
Proven (3+ years) hands-on experience in SQL data transformation and ETL/ELT pipeline development, testing, and implementation, ideally in GCP Data Fusion.
Experience in Data Vault modelling and implementation.
Hands-on experience in Cloud Composer/Airflow, Cloud Run, and Pub/Sub (a minimal Composer DAG sketch follows this list).
Development experience in Python and Terraform, including proficiency in Git for version control and CI/CD pipeline creation and maintenance using DevOps tools like Ansible/Jenkins for cloud-based applications (ideally GCP).
Experience working in a DataOps model and an Agile environment with corresponding toolsets.
Strong problem-solving, analytical, and organisational skills with the ability to adapt, learn, and develop technical and soft skills rapidly and independently.
Collaborative team player who embraces teamwork and mutual support.
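By way of illustration, a minimal sketch of a Cloud Composer (Airflow) DAG orchestrating a daily BigQuery refresh; the DAG id, schedule, and stored procedure are hypothetical examples, not part of the actual GBI estate:
    # Daily BigQuery transformation orchestrated from Cloud Composer.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="gbi_daily_kpi_refresh",    # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="0 5 * * *",     # daily, after upstream feeds land
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
    ) as dag:
        BigQueryInsertJobOperator(
            task_id="refresh_kpi_mart",
            configuration={
                "query": {
                    "query": "CALL ops_mart.sp_refresh_kpis()",  # hypothetical proc
                    "useLegacySql": False,
                }
            },
            location="EU",
        )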
offered :
Competitive salary
Annual performance-based bonus
Additional bonuses for recognition awards
Multisport card
Private medical care
Life insurance
One-time reimbursement of home office set-up (up to 800 PLN)
Corporate parties & events
CSR initiatives
Nursery discounts
Financial support for training and education
Social fund
Flexible working hours
Free parking