Poznań, Greater Poland Voivodeship, Poland
Harvey Nash Technology Sp. z o.o.
21.11.2024
Job details
technologies-expected:
Scala
Java
Kafka
Apache Spark
Flink
SQL
technologies-optional:
Apache Airflow
Python
AWS
GCP
about-project:
As our Data Engineer, you’ll play a vital role in crafting, developing, and sustaining real-time data streams and batch data integrations for regulatory operations reporting, built within our AWS and GCP cloud environments. This position offers significant growth opportunities as you contribute to establishing the core infrastructure for our regulatory reporting data flow.
responsibilities:
Design, develop, and support resilient and efficient data pipelines.
Ensure data accuracy and coherence by implementing data engineering best practices.
Diagnose and address issues within data feeds to ensure uninterrupted flow.
Facilitate effective communication between business and technical teams regarding data assets.
Engage in every phase of the system lifecycle, from analysis through to deployment.
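To make the data-quality responsibility above concrete, here is a minimal sketch of the kind of validation and de-duplication step such a pipeline might apply to incoming records. The record shape (`TradeEvent`), field names, and validation rules are hypothetical illustrations, not details from this role; in practice this logic would run inside a Spark, Flink, or Kafka Streams job rather than plain Python.

```python
# Hypothetical sketch of a data-quality gate for a reporting pipeline.
# Record shape and rules are assumptions for illustration only.
from dataclasses import dataclass

KNOWN_CURRENCIES = {"EUR", "PLN", "USD"}

@dataclass(frozen=True)
class TradeEvent:
    event_id: str
    amount_cents: int
    currency: str

def parse(line: str):
    """Parse one CSV-like line into an event, or None if malformed."""
    parts = line.split(",")
    if len(parts) != 3 or not parts[0]:
        return None
    event_id, amount, currency = parts
    try:
        return TradeEvent(event_id, int(amount), currency)
    except ValueError:
        return None

def clean(lines):
    """Keep only valid, de-duplicated events: drop malformed rows,
    non-positive amounts, unknown currencies, and repeated IDs."""
    seen, out = set(), []
    for line in lines:
        event = parse(line)
        if event is None:
            continue
        if event.amount_cents <= 0 or event.currency not in KNOWN_CURRENCIES:
            continue
        if event.event_id in seen:
            continue
        seen.add(event.event_id)
        out.append(event)
    return out
```

The same gate pattern (parse, validate, de-duplicate) transfers directly to the Scala/Java streaming stacks listed above.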
requirements-expected:
3+ years of experience in data integration, real-time streaming, or data warehousing.
Fluent in English, with meticulous attention to data quality.
Skilled in Scala or Java programming (candidates with Python will also be reviewed).
Experienced with Spark, Flink, or Kafka for big data streaming.
Proficient in AWS or GCP data processing solutions.
Comfortable in a dynamic, fast-paced setting.
Effective at prioritizing, troubleshooting, and clear communication.
Familiarity with SQL, Airflow, and data warehousing.
offered:
Competitive salary with a company bonus program.
Memorable team events quarterly and regular company gatherings.
One-time home office setup bonus for a comfortable remote work environment.
Hybrid work model with up to 40 remote workdays per year.
Multisport Card for physical wellness.
Comprehensive private health insurance.
benefits:
Sharing the costs of sports activities
Private medical care
Sharing the costs of professional training and courses