Senior Data Engineer (Azure / Databricks)
Warsaw, Masovian Voivodeship, Poland
GETINDATA POLAND sp. z o.o.
14.12.2024
Job information

technologies-expected:


  • Python
  • Scala
  • Java
  • Databricks
  • Kafka
  • Git
  • Microsoft Azure
  • Docker
  • Terraform
  • Azure DevOps

technologies-optional:


  • Apache Airflow
  • Azure Data Factory
  • Prefect
  • Dagster

about-project:


  • We run a variety of projects in which our sweepmasters can excel: Advanced Analytics, Data Platforms, Streaming Analytics Platforms, Machine Learning Models, Generative AI, and more. We like working with top technologies and open-source solutions for Data, ML, and AI. In our portfolio, you can find clients from many industries, e.g., media, e-commerce, retail, fintech, banking, and telcos, such as Truecaller, Spotify, ING, Acast, Volt, Play, and Allegro. You can read some customer stories here - https://getindata.com/blog/.
  • About the role
  • We're excited to share that we're seeking a Data Engineer to join our team! This role plays a vital part in our company, and we're looking for candidates with exceptional skills and expertise. While there isn't a project ready to start at the moment, we'd love to connect and explore potential opportunities together in the future.
  • A Data Engineer's role involves the design, construction, and upkeep of the data architecture, tools, and procedures that enable an organization to collect, store, manipulate, and analyze large volumes of data. The position involves building data platforms on top of commonly provided infrastructure and establishing a streamlined path for the Analytics Engineers who rely on the system.

responsibilities:


  • Working together with Platform Engineers to assess and choose the most suitable technologies and tools for the project
  • Developing and contributing new functionalities to open-source tools
  • Implementing complex data ingestion processes
  • Implementing and enforcing policies in line with the company's strategic plans regarding the technologies used, work organization, etc.
  • Ensuring compliance with industry standards and regulations on security and data privacy in the data processing layer
  • Conducting training and knowledge-sharing

requirements-expected:


  • Proficiency in a programming language such as Python, Scala, or Java
  • Knowledge of Lakehouse platforms - Databricks
  • Experience working with messaging systems - Kafka
  • Familiarity with version control systems, particularly Git
  • Experience as a programmer and knowledge of good software engineering principles, practices, and solutions
  • Extensive experience in Microsoft Azure
  • Knowledge of at least one orchestration and scheduling tool, for example, Airflow, Azure Data Factory, Prefect, Dagster
  • Familiarity with DevOps practices and tools, including Docker, Terraform, CI/CD, Azure DevOps
  • Ability to actively participate/lead discussions with clients to identify and assess concrete and ambitious avenues for improvement

offered:


  • Salary: 160-200 PLN/h net + VAT (B2B contract), depending on knowledge and experience
  • 100% remote work
  • Flexible working hours
  • Possibility to work from the office located in the heart of Warsaw
  • Opportunity to learn and develop with the best Big Data experts
  • International projects
  • Possibility of conducting workshops and training
  • Certifications
  • Co-financed sports card
  • Co-financed health care
  • All equipment needed for work
