Senior BigData Developer (Python, Spark, Kafka) – Kraków, 06.02.2025
Kraków, Kraków County, Lesser Poland Voivodeship, Poland
DCG Poland
27.02.2025
Job information

As a recruitment company, DCG understands that every business is powered by experienced professionals. Our management style and partnership approach enable us to meet your needs and provide continuous support. Due to our ongoing growth and the large number of recruitment projects we undertake for our partners, we are currently looking for:

Senior BigData Developer

Responsibilities:

  • Facilitate the establishment of a secure data platform on the client's on-premises Cloudera infrastructure
  • Document and develop ETL logic and data flows to facilitate the easy usage of data assets, both batch and real-time streaming
  • Leverage the following (but not limited to) components of Cloudera distribution to achieve project objectives: Sqoop, Hive, Impala, Spark
  • Practice consistent coding and unit testing standards
  • Work with distributed teams

Requirements:

  • Bachelor's degree in Computer Science or a related technical field, or equivalent experience
  • 8+ years of experience in IT, primarily in hands-on development
  • Strong knowledge of architectural principles, frameworks, design patterns and industry best practices for design and development
  • Strong hands-on experience with programming languages: Java, Scala, or Python
  • 4+ years of real project experience as a data wrangler/engineer across design, development, testing, and production implementation for Big Data projects, processing large volumes of structured/unstructured data
  • Strong hands-on experience with Snowflake, Spark and Kafka
  • Experience with Oracle database engine with PL/SQL and performance tuning of SQL Queries
  • Experience in designing efficient and robust ETL/ELT workflows and schedulers
  • Communication skills, both written and verbal; strong analytical and problem-solving skills

Nice to have:

  • Experience supporting the end-to-end development life cycle and SDLC processes
  • Working experience with Data Virtualization tools such as Dremio/Denodo
  • Knowledge of Machine Learning libraries and exposure to Data Mining
  • Working experience with AWS/Azure/GCP
  • Working experience in the financial industry is a plus
