Senior GCP Data Engineer with Airflow
Wrocław, Lower Silesian Voivodeship, Poland
Xebia sp. z o.o.
20.03.2025
Job information

Technologies expected:


  • Google Cloud Platform
  • Python
  • Databricks
  • Snowflake
  • Spark
  • SQL
  • NoSQL
  • Airflow

Technologies optional:


  • Tableau
  • Looker

About the project:


  • As a Senior Data Engineer at Xebia, you will work closely with engineering, product, and data teams to deliver scalable and robust data solutions to our clients. Your key responsibilities will include designing, building, and maintaining data platforms and pipelines, and mentoring new engineers.

Responsibilities:


  • developing and maintaining data pipelines to ensure seamless data flow from the Loyalty system to the data lake and data warehouse (see the sketch after this list),
  • collaborating with data engineers to ensure data engineering best practices are integrated into the development process,
  • ensuring data integrity, consistency, and availability across all data systems,
  • integrating data from various sources, including transactional databases, third-party APIs, and external data sources, into the data lake,
  • implementing ETL processes to transform and load data into the data warehouse for analytics and reporting,
  • working closely with cross-functional teams including Engineering, Business Analytics, Data Science and Product Management to understand data requirements and deliver solutions,
  • optimizing data storage and retrieval to improve performance and scalability,
  • monitoring and troubleshooting data pipelines to ensure high reliability and efficiency,
  • implementing and enforcing data governance policies to ensure data security, privacy, and compliance,
  • developing documentation and standards for data processes and procedures.
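
For illustration only, a minimal sketch of the kind of pipeline task described above, assuming Airflow 2.x with the Google provider installed; the DAG id, bucket, object path, and BigQuery table are hypothetical placeholders, not part of the role description.

  from datetime import datetime

  from airflow import DAG
  from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

  with DAG(
      dag_id="loyalty_to_warehouse",   # hypothetical pipeline name
      start_date=datetime(2025, 1, 1),
      schedule="@daily",               # assumes Airflow 2.4+; older releases use schedule_interval
      catchup=False,
  ) as dag:
      # Load the day's Loyalty-system export from the data lake (GCS) into the warehouse (BigQuery).
      load_loyalty_events = GCSToBigQueryOperator(
          task_id="load_loyalty_events",
          bucket="example-data-lake",                                     # hypothetical bucket
          source_objects=["loyalty/{{ ds }}/events.json"],
          source_format="NEWLINE_DELIMITED_JSON",
          destination_project_dataset_table="analytics.loyalty_events",  # hypothetical table
          write_disposition="WRITE_APPEND",
      )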

Requirements expected:


  • 7+ years in a data engineering role, with hands-on experience in building data processing pipelines,
  • experience in leading the design and implementation of data pipelines and data products,
  • proficiency with GCP services for large-scale data processing and optimization,
  • extensive experience with Apache Airflow, including DAG creation, triggers, and workflow optimization (see the sketch after this list),
  • knowledge of data partitioning, batch configuration, and performance tuning for terabyte-scale processing,
  • strong Python proficiency, with expertise in modern data libraries and frameworks (e.g., Databricks, Snowflake, Spark, SQL),
  • hands-on experience with ETL tools and processes,
  • practical experience with dbt for data transformation,
  • deep understanding of relational and NoSQL databases, data modelling, and data warehousing concepts,
  • excellent command of oral and written English,
  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
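
A minimal sketch of how Airflow DAG creation and dbt transformations can fit together, assuming Airflow 2.x and a dbt project already configured on the worker; the DAG id, project path, and model selector are hypothetical placeholders.

  from datetime import datetime

  from airflow import DAG
  from airflow.operators.bash import BashOperator

  with DAG(
      dag_id="dbt_transformations",    # hypothetical DAG name
      start_date=datetime(2025, 1, 1),
      schedule="@daily",               # assumes Airflow 2.4+; older releases use schedule_interval
      catchup=False,
  ) as dag:
      # Run the dbt models that build the warehouse layer.
      dbt_run = BashOperator(
          task_id="dbt_run",
          bash_command="dbt run --project-dir /opt/dbt/loyalty --select staging+",
      )

      # Run dbt tests so downstream consumers only see validated tables.
      dbt_test = BashOperator(
          task_id="dbt_test",
          bash_command="dbt test --project-dir /opt/dbt/loyalty",
      )

      dbt_run >> dbt_test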

Offered:


  • development budgets of up to 6,800 PLN,
  • we fund certifications, e.g. AWS, Azure, ISTQB, PSM,
  • access to Udemy and O'Reilly (formerly Safari Books Online),
  • events and technology conferences,
  • technology Guilds,
  • internal training,
  • Xebia Upskill,
  • private medical healthcare,
  • we subsidise a MultiSport card,
  • mental health support,
  • flexible working hours,
  • B2B or permanent contract,
  • contract for an indefinite period,
  • internal and external referral program,
  • welcome gift.

Benefits:


  • sharing the costs of sports activities
  • private medical care
  • sharing the costs of foreign language classes
  • sharing the costs of professional training & courses
  • life insurance
  • flexible working time
  • integration events
  • no dress code
  • video games at work
  • coffee / tea
  • drinks
  • parking space for employees
  • leisure zone
  • employee referral program
