ETL Data Engineer (GCP) @ ITDS
Kraków, Lesser Poland Voivodeship, Poland
ITDS
21 May 2025
About the position

Join us in transforming data into insights!

Kraków-based opportunity with hybrid work model (2 days/week in the office)

As an ETL Data Engineer, you will be working for our client – an emerging markets-led investment banking and trading business. You will be responsible for designing, developing, testing, and deploying ETL/SQL pipelines connected to various data sources, using GCP technologies such as Cloud Storage, BigQuery, and Data Fusion.

Your main responsibilities:

  • Designing, building, testing, and deploying Google Cloud data models and transformations in the BigQuery environment (e.g. SQL, stored procedures, indexes, clusters, partitions, triggers); a minimal table-definition sketch follows this list
  • Creating and managing ETL/ELT data pipelines to model raw/unstructured data into the Data Vault universal model, enriching, transforming, and optimizing raw data into formats suitable for end-user consumption
  • Reviewing, refining, interpreting, and implementing business and technical requirements
  • Delivering non-functional requirements, IT standards, and developer/support tools to ensure applications are secure, compliant, scalable, reliable, and cost-effective
  • Monitoring data pipelines for failures or performance issues and implementing fixes or improvements as needed
  • Optimizing ETL/ELT processes for performance and scalability, ensuring large volumes of data are handled efficiently
  • Integrating data from multiple sources and ensuring consistency and accuracy
  • Managing code artifacts and CI/CD using tools like Git, Jenkins, Google Secrets Manager, etc.
  • Fixing defects and providing enhancements during the development period and handing over knowledge, expertise, code, and support responsibilities to the support team
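
To make the first two bullets concrete, here is a minimal sketch of creating a day-partitioned, clustered BigQuery table with the google-cloud-bigquery Python client. The project, dataset, and field names are illustrative assumptions, not details from the posting.

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # hypothetical project ID

    # Hypothetical trade-event schema for an investment-banking feed.
    schema = [
        bigquery.SchemaField("trade_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("trade_ts", "TIMESTAMP", mode="REQUIRED"),
        bigquery.SchemaField("desk", "STRING"),
        bigquery.SchemaField("notional", "NUMERIC"),
    ]

    table = bigquery.Table("example-project.trading.trades", schema=schema)
    # Partition by day on the event timestamp and cluster on desk, so that
    # date-bounded, desk-filtered queries scan less data.
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY, field="trade_ts"
    )
    table.clustering_fields = ["desk"]
    client.create_table(table, exists_ok=True)

Partitioning plus clustering is the usual first lever for both query cost and performance in BigQuery, which is why the posting calls out clusters and partitions explicitly.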

You're ideal for the role if you have:

  • 4+ years of hands-on experience in SQL querying and optimization of complex queries/transformations in BigQuery
  • Hands-on experience in developing, testing, and implementing SQL data transformation and ETL/ELT pipelines, ideally in GCP Data Fusion
  • Proven experience in Data Vault modelling and usage
  • Hands-on experience in Cloud Composer/Airflow, Cloud Run, and Pub/Sub; a minimal DAG sketch follows this list
  • Hands-on development in Python and Terraform
  • Proficiency in Git and CI/CD processes using DevOps tools like Ansible or Jenkins
  • Experience working in the DataOps model
  • Experience working in an Agile environment and toolset
  • Strong problem-solving and analytical skills
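
As a taste of the Cloud Composer/Airflow work, here is a minimal DAG sketch that runs a Data Vault-style MERGE into a satellite table via BigQueryInsertJobOperator. It assumes Airflow 2.4+ with the Google provider installed; the DAG id, schedule, table names, and hash-key column are hypothetical, not from the posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    # Hypothetical satellite load: insert only rows whose hash key is new.
    MERGE_SQL = """
    MERGE `example-project.vault.sat_trade` AS sat
    USING `example-project.staging.trades` AS stg
    ON sat.trade_hash_key = stg.trade_hash_key
    WHEN NOT MATCHED THEN
      INSERT (trade_hash_key, load_ts, payload)
      VALUES (stg.trade_hash_key, CURRENT_TIMESTAMP(), stg.payload)
    """

    with DAG(
        dag_id="nightly_vault_load",
        start_date=datetime(2025, 1, 1),
        schedule="0 2 * * *",  # 02:00 daily; 'schedule' requires Airflow 2.4+
        catchup=False,
    ) as dag:
        BigQueryInsertJobOperator(
            task_id="merge_sat_trade",
            configuration={"query": {"query": MERGE_SQL, "useLegacySql": False}},
        )

An insert-only MERGE keyed on a hash of the business key is the standard Data Vault loading pattern, which keeps satellite loads idempotent when a pipeline is retried.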

It is a strong plus if you have:

  • Experience designing, testing, and implementing data ingestion pipelines on GCP Data Fusion, CDAP or similar tools
  • Java development, testing and deployment skills (ideally custom plugins for Data Fusion)

Requirements: ETL, SQL, GCP, BigQuery, data models, data pipelines, Data Vault, Git, Jenkins, Airflow, Python, Terraform, DevOps, Ansible, analytical skills, CSV, XML.
Additionally: private healthcare, international projects.
