Hadoop Data Engineer
Kraków, Lesser Poland Voivodeship, Poland
Mindbox S.A.
14.05.2025
Job details

Technologies expected:

  • Hadoop
  • Hive
  • HDFS
  • Apache Spark
  • Scala
  • GCP
  • Jenkins
  • Airflow
  • SQL

About the project:

  • We are looking for Data Engineers to join the IT team within the Environmental, Social & Governance (ESG) department of the Data and Analytics Office (DAO). In ESG DAO we create ESG domain data assets for consumption elsewhere in the bank.
  • Tech Stack: Hadoop, Hive, HDFS, Apache Spark, Scala, GCP, Jenkins, Airflow, SQL

Responsibilities:

  • The engineering team is responsible for taking business logic and proof-of-concept asset designs and using them to build robust data pipelines in Spark with Scala (a minimal illustrative sketch follows this list). Our pipelines are orchestrated through Airflow and deployed through a Jenkins-based CI/CD pipeline. We operate on a private GCP instance and an on-premises Hadoop cluster. Engineers are embedded in multi-disciplinary teams that include business analysts, data analysts, data engineers, software engineers, and architects.
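
A minimal sketch of the kind of Spark-in-Scala pipeline described above: read a Hive table on the Hadoop cluster, apply business logic, and write a curated asset back for downstream consumers. All object, table, and column names (EsgEmissionsPipeline, raw.esg_emissions, emission_tonnes, etc.) are illustrative assumptions, not taken from the actual codebase.

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object EsgEmissionsPipeline {

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("esg-emissions-pipeline")
      .enableHiveSupport() // read/write Hive tables on the Hadoop cluster
      .getOrCreate()

    // Read a raw source table from Hive (table name is a placeholder).
    val raw = spark.table("raw.esg_emissions")

    // Apply the business logic and publish the curated asset for downstream consumers.
    transform(raw).write
      .mode("overwrite")
      .saveAsTable("curated.esg_emissions_by_counterparty")

    spark.stop()
  }

  // Business logic kept in a pure function so it can be unit tested in isolation.
  def transform(df: DataFrame): DataFrame =
    df.filter(col("emission_tonnes").isNotNull)
      .groupBy(col("counterparty_id"), col("reporting_year"))
      .agg(sum(col("emission_tonnes")).as("total_emission_tonnes"))
}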

Requirements expected:

  • Scala
  • Experience designing and deploying large-scale distributed data processing systems with technologies such as PostgreSQL or equivalent databases, SQL, Hadoop, Spark, and Tableau.
  • Proven ability to define and build architecturally sound solution designs.
  • Demonstrated ability to rapidly build relationships with key stakeholders.
  • Experience with automated unit testing, automated integration testing, and a fully automated build and deployment process as part of DevOps tooling (see the test sketch after this list).
  • Ability to understand and develop the logical flow of applications at the technical code level.
  • Strong interpersonal skills and the ability to work in a team and in global environments.
  • Proactive, with a learning attitude and the ability to adjust to dynamic work environments.
  • Exposure to Enterprise Data Warehouse technologies.
  • Experience in a customer-facing role working with enterprise clients.
  • Experience with industry-standard version control tools (Git, GitHub), automated deployment tools (Ansible, Jenkins), and requirements management in JIRA.
  • Tech Stack: Hadoop, Hive, HDFS, Apache Spark, Scala, GCP, Jenkins, Airflow, SQL
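
As an illustration of the automated unit testing mentioned above, here is a minimal ScalaTest sketch against the hypothetical transform function from the earlier pipeline sketch; the test framework, local SparkSession setup, and sample data are assumptions rather than a mandated toolchain.

import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite

class EsgEmissionsPipelineTest extends AnyFunSuite {

  // A local SparkSession is enough to exercise the pure transform function.
  private val spark = SparkSession.builder()
    .appName("esg-emissions-pipeline-test")
    .master("local[2]")
    .getOrCreate()

  import spark.implicits._

  test("transform sums emissions per counterparty and year, dropping null rows") {
    val input = Seq(
      ("cp1", 2024, Some(10.0)),
      ("cp1", 2024, Some(5.0)),
      ("cp2", 2024, None) // null emissions are filtered out by the transform
    ).toDF("counterparty_id", "reporting_year", "emission_tonnes")

    val result = EsgEmissionsPipeline.transform(input)
      .collect()
      .map(r => (r.getString(0), r.getInt(1), r.getDouble(2)))
      .toSet

    assert(result == Set(("cp1", 2024, 15.0)))
  }
}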

Benefits:

  • sharing the costs of sports activities
  • private medical care
  • sharing the costs of professional training & courses
  • life insurance
