Senior Data Engineer @ HSBC Technology Poland
Kraków, Lesser Poland Voivodeship, Poland
21 May 2025

About the role

Some careers shine brighter than others.

If you’re looking for a career that will help you stand out, join HSBC, and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

Your career opportunity

HSBC Markets and Securities is an emerging markets-led and financing-focused investment banking and trading business that provides tailored financial solutions to major government, corporate and institutional clients worldwide.

In IT we provide HSBC with a genuine competitive advantage across the globe. Global Business Insights (GBI) provide critical metrics and reports to Markets and Securities Services Operations to enable them to monitor the health of their business and make data-driven decisions.

The GBI Transformation is a large and complex data integration program spanning all of MSS Ops globally. We serve a diverse audience of users and data visualisation requirements from Exco down, and over 80 data sources in multiple time-zones across Middle Office, Post-Trade and Securities Services IT and elsewhere. We are a critical enabler for the Rubix 2025 Strategy and the MSS control agenda, providing operational KPI and KRI metrics which allow senior management to measure the success of their BAU and CTB investment dollars.

We are looking for a GCP developer who can design, develop, test and deploy ETL/SQL pipelines connected to a variety of on-prem and cloud data sources, both data stores and files. We will mainly be using GCP technologies such as Cloud Storage, BigQuery and Data Fusion.

You will also need to work with our DevOps tooling to deliver continuous integration/deployment capabilities, automated testing, security, and IT compliance.

The role will also provide subject matter expertise to help the Enterprise Risk Management (ERM) Leadership Team (LT) and ERM Assurance teams discharge their responsibilities for operational risk and resilience risk steward delivery across all service areas, including the delivery of assurance activities and the embedding of assurance practices, stewardship activities and the service catalogue in the respective GB/GF/Specialist teams.

If your CV meets our criteria, you should expect the following steps in the recruitment process:

  • Online behavioural assessment
  • Telephone screen
  • Job interview with the hiring manager

What you need to have to succeed in this role

  • Proven (3+ years) hands-on experience in SQL querying and in optimising complex queries/transformations in BigQuery, with a focus on cost- and time-effective SQL and on concurrency/data integrity
  • Proven (3+ years) hands-on experience in developing, testing and implementing SQL data transformation (ETL/ELT) pipelines, ideally in GCP Data Fusion
  • Proven experience in Data Vault modelling and usage
  • Hands-on experience with Cloud Composer/Airflow, Cloud Run and Pub/Sub
  • Hands-on development in Python and Terraform
  • Proficiency in Git usage for version control and collaboration.
  • Proficiency in designing, creating and maintaining CI/CD pipelines in DevOps tools such as Ansible and Jenkins for cloud-based applications (ideally on GCP)
  • Experience working in a DataOps model
  • Experience working in an Agile environment and with its toolset
  • Strong problem-solving and analytical skills
  • Willingness to learn and to develop technical and soft skills rapidly and independently as needs require
  • Strong organisational and multi-tasking skills.
  • Good team player who embraces teamwork and mutual support.
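
Data Vault models such as the one mentioned above conventionally key their hubs, links and satellites on deterministic hashes of business keys. As a rough, stdlib-only Python sketch of that idea (the function name, delimiter and example key are illustrative assumptions, not part of this role):

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Build a deterministic Data Vault hash key from one or more
    business keys: trim, upper-case, join with a delimiter, then MD5.
    Normalisation ensures the same entity always hashes identically."""
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# Same business key, different incoming formatting -> same hub key.
print(hash_key(" hsbc-001 ") == hash_key("HSBC-001"))  # True
```

The delimiter matters: without it, the key pairs ("A", "B") and ("AB",) would collide.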

Nice to have

  • Experience designing, testing and implementing data ingestion pipelines in GCP Data Fusion, CDAP or similar tools, including ingestion, parsing and wrangling of CSV-, JSON- and XML-formatted data from RESTful and SOAP APIs, SFTP servers, etc.
  • Understanding of modern data contract best practices, with experience independently directing, negotiating and documenting best-in-class data contracts
  • Java development, testing and deployment skills (ideally custom plugins for Data Fusion)
  • Proficiency with Continuous Integration (CI), Continuous Delivery (CD) and continuous testing tools, ideally for cloud-based data solutions
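
The ingestion-and-parsing work described above amounts to normalising differently formatted payloads into one record shape. A minimal stdlib-only Python sketch (the field names and formats are invented for the example):

```python
import csv
import io
import json

def parse_payload(raw: str, fmt: str) -> list[dict]:
    """Normalise a CSV or JSON payload into a list of dicts so that
    downstream pipeline stages see one record shape regardless of source."""
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(raw)))
    if fmt == "json":
        data = json.loads(raw)
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported format: {fmt}")

csv_rows = parse_payload("trade_id,amount\nT1,100\nT2,250\n", "csv")
json_rows = parse_payload('[{"trade_id": "T1", "amount": "100"}]', "json")
print(csv_rows[0] == json_rows[0])  # True
```

An XML branch would follow the same pattern with `xml.etree.ElementTree`, mapping elements to the same dict shape.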


Your responsibilities

  • Design, build, test and deploy Google Cloud data models and transformations in the BigQuery environment (e.g. SQL, stored procedures, indexes, clusters, partitions, triggers)
  • Create and manage ETL/ELT data pipelines that model raw/unstructured data into a Data Vault universal model, and enrich, transform and optimise raw data for end-consumer usage
  • Review, refine, interpret and implement business and technical requirements
  • Deliver non-functional requirements, IT standards, and developer and support tools to ensure our applications are secure, compliant, scalable, reliable and cost-effective
  • Monitor data pipelines for failures or performance issues and implement fixes or improvements as needed
  • Optimise ETL/ELT processes for performance and scalability, ensuring they can handle large volumes of data efficiently
  • Integrate data from multiple sources, ensuring consistency and accuracy
  • Manage code artefacts and CI/CD using tools like Git, Jenkins and Google Secrets Manager
  • Fix defects and provide enhancements during the development period, and hand over knowledge, expertise, code and support responsibilities to the support team

Requirements: SQL, BigQuery, ETL, Testing, GCP, Data Vault, Cloud Composer, Cloud Run, Pub/Sub, Python, Terraform, Git, DevOps, Ansible, Jenkins, SOAP, CSV, JSON, REST API, XML

Additionally: training budget, private healthcare, flat structure, international projects, Multisport card, monthly remote work subsidy, psychological support, conferences, PPK option, annual performance-based bonus, integration budget, international environment, small teams, employee referral bonus, mentoring, workstation reimbursement, company share purchase plan, childcare support programme, bike parking, playroom, shower, canteen, free coffee, free beverages, free parking, in-house trainings, in-house hack days, no dress code, modern office, knowledge sharing, garden, massage chairs, kitchen.
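
The "integrate data from multiple sources, ensuring consistency" responsibility often reduces to choosing a deterministic conflict-resolution rule. A hedged, stdlib-only Python sketch of one common choice, last-write-wins deduplication (the `key`/`updated_at` field names and the trade feeds are illustrative assumptions):

```python
from datetime import datetime

def merge_sources(*sources: list[dict]) -> list[dict]:
    """Merge records from several feeds, keeping the latest version of
    each key ("last write wins") so consumers see exactly one consistent
    row per entity. Field names (key, updated_at) are illustrative only."""
    latest: dict[str, dict] = {}
    for source in sources:
        for record in source:
            current = latest.get(record["key"])
            if current is None or record["updated_at"] > current["updated_at"]:
                latest[record["key"]] = record
    return sorted(latest.values(), key=lambda r: r["key"])

feed_a = [{"key": "T1", "updated_at": datetime(2025, 5, 1), "status": "open"}]
feed_b = [{"key": "T1", "updated_at": datetime(2025, 5, 2), "status": "settled"},
          {"key": "T2", "updated_at": datetime(2025, 5, 2), "status": "open"}]
merged = merge_sources(feed_a, feed_b)
print([r["status"] for r in merged])  # ['settled', 'open']
```

In BigQuery the same rule is typically expressed with a `MERGE` statement or a `ROW_NUMBER() OVER (PARTITION BY key ORDER BY updated_at DESC)` window, but the resolution logic is the same.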
