Data Engineer with Expertise in Groovy and Spark, Kraków
Kraków, Lesser Poland Voivodeship, Poland
About the position

Client: DataArt

Location: Kraków

Job Category: Other

EU work permit required: Yes

Job Reference: bdbdde363723

Posted: 23.01.2025

Expiry Date: 09.03.2025

Job Description:

Responsibilities

  • Work closely with data engineers, the product team, and other stakeholders to gather data requirements and to design and build efficient data pipelines
  • Create and maintain algorithms and data processing code in Java/Groovy
  • Implement processes for data validation, cleansing, and transformation to ensure data accuracy and consistency
  • Develop Python scripts to automate data extraction from both new and existing sources
  • Monitor and troubleshoot the performance of data pipelines in Airflow, proactively addressing any issues or bottlenecks
  • Write SQL queries to extract data from BigQuery and develop reports using Google’s Looker Studio
  • Participate in daily stand-ups, sprint planning, and retrospective meetings
  • Engage in peer code reviews, knowledge sharing, and assist other engineers with their work
  • Introduce new technologies and best practices as needed to keep the product up to date
  • Assist in troubleshooting and resolving production escalations and issues

Requirements

  • Bachelor's degree or equivalent programming experience
  • 4-5 years of overall experience as a backend software developer, with at least 2 years as a Data Engineer using Spark with Java/Groovy and/or Python
  • Strong coding skills, and knowledge of data structures, OOP principles, databases, and API design
  • Highly proficient in developing programs and data pipelines in Java/Groovy or Python
  • 2+ years of professional experience with Apache Spark/Hadoop

Nice to have

  • Work experience with AWS (EMR, S3, Lambda, EC2, Glue, RDS)
  • Work experience with SQL (MySQL is a plus) and NoSQL databases
  • Experience with Elasticsearch
  • Experience with Python
  • Experience with Scala (Zeppelin)
  • Experience with Airflow or other ETL tools
  • Certification or verified training in one or more of the following technologies/products: AWS, Elasticsearch
