Data Engineer @ Antal
Warsaw, Masovian Voivodeship, Poland
Antal
16 January 2026
About the role

The Role

As a Data Engineer, you will design, develop, and maintain robust data pipelines and platforms that support enterprise-level use cases. You will ensure high performance, reliability, and scalability while implementing modern data solutions in a cloud-first, governance-aligned environment.


Education:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Statistics, or a related field

Technical Expertise:

  • 3–5 years of hands-on experience in data engineering roles
  • Strong SQL skills for analytics and data transformations
  • Advanced Python and Apache Spark experience
  • Hands-on experience with lakehouse architectures and Delta Lake (see the Spark/Delta sketch after this list)
  • Experience with at least one major cloud platform (AWS, Azure, or GCP)
  • Solid understanding of distributed systems and large-scale data processing
  • Familiarity with modern orchestration and transformation tools (Airflow, dbt, Terraform, etc.)
  • Knowledge of enterprise data governance, access control, and data lineage management
  • Awareness of cloud cost governance and optimization practices in large, multi-workspace environments
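
A minimal sketch of the kind of Spark and Delta Lake work the items above describe: a curated-layer transformation followed by an idempotent MERGE upsert. The table paths, column names, and application name are hypothetical and only illustrate the pattern, not any Antal-specific pipeline.

```python
# Hedged PySpark + Delta Lake sketch: read a raw Delta table, standardize
# it, and upsert into a curated table. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = (
    SparkSession.builder
    .appName("curated-transactions")
    # Requires the delta-spark package on the classpath.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Raw layer -> curated layer: deduplicate and normalize types.
raw = spark.read.format("delta").load("/lake/raw/transactions")
curated = (
    raw.dropDuplicates(["transaction_id"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("ingested_at", F.current_timestamp())
)

# Idempotent MERGE keeps re-runs safe, which matters for strict-SLA pipelines.
target = DeltaTable.forPath(spark, "/lake/curated/transactions")
(
    target.alias("t")
    .merge(curated.alias("s"), "t.transaction_id = s.transaction_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```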

Domain & Delivery Experience:

  • Experience delivering data platforms in regulated financial services environments
  • Understanding of data quality, auditability, and access control standards (a quality-gate sketch follows this list)
  • Exposure to risk, finance, AML, or regulatory reporting data domains
  • Experience managing pipelines with strict SLAs and enterprise reliability expectations
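
For the data-quality and auditability points above, a lightweight quality gate is one common pattern: run checks, persist an auditable record, and fail the pipeline rather than propagate bad data. Everything concrete here (paths, column names, audit schema) is hypothetical.

```python
# Hedged sketch of a data-quality gate with an audit trail.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-gate").getOrCreate()
df = spark.read.format("delta").load("/lake/curated/transactions")

# Two illustrative checks: no null keys, no duplicate keys.
checks = {
    "no_null_ids": df.filter(F.col("transaction_id").isNull()).count() == 0,
    "unique_ids": df.count() == df.select("transaction_id").distinct().count(),
}

# Persist an auditable record of every check before acting on failures.
audit = spark.createDataFrame(
    [(name, bool(passed)) for name, passed in checks.items()],
    schema="check_name string, passed boolean",
).withColumn("checked_at", F.current_timestamp())
audit.write.format("delta").mode("append").save("/lake/audit/dq_results")

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Block downstream consumers instead of silently shipping bad data.
    raise ValueError(f"Data quality checks failed: {failed}")
```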

Responsibilities:

  • Design, build, and optimize scalable data pipelines for large-scale data processing
  • Develop and maintain cloud-based data solutions using modern architectures such as lakehouse, Apache Spark, and Delta Lake
  • Monitor, troubleshoot, and enhance ETL/ELT workflows, including error handling and recovery mechanisms (see the orchestration sketch after this list)
  • Perform performance tuning, capacity planning, and reliability testing for production pipelines
  • Collaborate with architects, analysts, and cross-functional teams to deliver end-to-end data solutions
  • Investigate data quality issues, identify root causes, and implement long-term fixes
  • Maintain technical documentation for data pipelines and platform components
  • Ensure adherence to cloud-first, security-aware, and governance-aligned principles
  • Support migration of legacy data warehouses to modern, cloud-based architectures, ensuring data consistency and minimal disruption
  • Design and operate data pipelines aligned with non-functional requirements (high availability, disaster recovery, RTO/RPO)
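
The monitoring, error-handling, and SLA responsibilities above map naturally onto an orchestrator. Below is a hedged Airflow 2.x sketch showing retries with exponential backoff and a per-task SLA; the DAG id, schedule, and task bodies are hypothetical placeholders.

```python
# Hedged Airflow 2.x sketch: a two-task ELT DAG with retries and an SLA.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    pass  # pull a batch from the source system (placeholder)


def transform_and_load():
    pass  # run the Spark/Delta transformation (placeholder)


with DAG(
    dag_id="transactions_elt",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",  # Airflow >= 2.4; older versions use schedule_interval
    catchup=False,
    default_args={
        # Error handling and recovery: retry transient failures with backoff.
        "retries": 3,
        "retry_delay": timedelta(minutes=5),
        "retry_exponential_backoff": True,
        # Reliability expectation: alert when a task misses its SLA.
        "sla": timedelta(hours=2),
    },
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(
        task_id="transform_and_load", python_callable=transform_and_load
    )
    extract_task >> load_task
```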
