Lead Data Engineer (Databricks) @ Godel Technologies Europe
Wrocław, Lower Silesian Voivodeship, Poland
8 March 2026

About the role

At Godel Technologies, we are passionate about building innovative data platforms and software solutions that empower businesses around the world. We are growing and looking for a Lead Data Engineer with strong Databricks expertise to join our team. If you enjoy working with modern data technologies, solving complex data challenges, and designing scalable cloud data platforms - we would love to hear from you.  

As a Lead Data Engineer, you will play a key role in designing and implementing large-scale data platforms based on Databricks. You will lead the development of distributed data processing pipelines, ensure high performance and reliability of data systems, and guide engineering teams in implementing best practices for modern data architecture.  

You will work closely with Solution Architects, Data Engineers, Data Scientists, and business stakeholders to build secure, scalable, and high-performing data solutions that support advanced analytics, machine learning, and data-driven decision making. 

This is a hybrid role, which means we'd like you to work in the office occasionally, especially during client visits or other important company meetings. We'd also like you to be willing to take occasional short business trips to Warsaw (approximately four times a year).


Must have:

  • 7+ years of experience in Data Engineering (including at least 5 years with Databricks)
  • Strong experience with the Databricks Data Platform for distributed data processing
  • Excellent programming skills in Python and SQL 
  • Strong understanding of data modeling, data lakehouse architecture, and ELT/ETL patterns 
  • Experience designing scalable cloud-based data platforms (AWS / Azure) 
  • Knowledge of data governance, security, and access control best practices (Unity Catalog, dbt) 
  • Experience leading or mentoring engineers is a strong advantage 
  • Strong analytical thinking and problem-solving skills 
  • Excellent communication and collaboration skills 
  • Fluency in English (at least B2) 

Nice to have:

  • Data Streaming: Kafka 
  • Databases: MS SQL (SSIS, SSAS), PostgreSQL, MySQL 
  • BI tools: Power BI

Responsibilities:

  • Design and implement scalable data platforms and pipelines using Apache Spark on Databricks
  • Lead the development of distributed data processing pipelines using PySpark and Spark SQL
  • Build and manage Databricks Workflows for orchestration, scheduling, monitoring, and error handling
  • Optimize Spark workloads by applying join strategies, shuffle optimization, caching, and partitioning techniques
  • Design and maintain Delta Lake architectures, including schema evolution, ACID transactions, and performance tuning
  • Implement data governance and access control using Unity Catalog, including permissions, lineage, and secure data sharing
  • Collaborate with architects and engineering teams to design cloud-native data platforms
  • Ensure data quality, observability, and reliability across pipelines and data products
  • Lead performance optimization of large-scale data processing workloads
  • Mentor and support other Data Engineers, contributing to engineering standards and best practices
  • Participate in architecture discussions and contribute to the evolution of the company's data engineering practices

Requirements: Databricks, Data Engineering, Python, SQL, ETL, Cloud, Unity Catalog, Apache Spark, PySpark, communication skills, Kafka, MySQL, Power BI

Additionally: Flat structure, Small teams, Integration events, Internal trainings/meetups, Private healthcare, Sport subscription, Free coffee, Bike parking, Free snacks, In-house trainings, Modern office, No dress code.
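To give a flavour of the pipeline work described above, here is a minimal, hedged sketch of a medallion-style (bronze → silver → gold) transformation. It uses plain Python lists of dicts as a stand-in for Spark DataFrames, so the data, field names, and helper functions are all illustrative and not part of any Databricks API.

```python
# Minimal sketch of a bronze -> silver -> gold ("medallion") pipeline.
# Plain Python stands in for PySpark; all names here are illustrative.

from collections import defaultdict

# Bronze layer: raw ingested events, possibly duplicated or invalid.
bronze = [
    {"order_id": 1, "country": "PL", "amount": "100.0"},
    {"order_id": 1, "country": "PL", "amount": "100.0"},  # duplicate row
    {"order_id": 2, "country": "DE", "amount": "250.5"},
    {"order_id": 3, "country": "PL", "amount": None},     # invalid row
]

def to_silver(rows):
    """Silver layer: deduplicate on order_id, drop invalid rows, cast types."""
    seen, silver = set(), []
    for r in rows:
        if r["amount"] is None or r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        silver.append({**r, "amount": float(r["amount"])})
    return silver

def to_gold(rows):
    """Gold layer: aggregate cleaned rows into per-country revenue."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["country"]] += r["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'PL': 100.0, 'DE': 250.5}
```

In an actual Databricks job, each stage would typically read from and write to Delta tables, with Databricks Workflows handling scheduling and retries and Unity Catalog governing who may read each layer.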
