Job Information
- Bachelor’s degree in Computer Science or relevant certification.
- Strong software engineering skills with proven hands-on experience in Spark and PySpark for building scalable data processing solutions (must have).
- Solid experience developing and integrating real-time and batch data workflows using Kafka.
- Experience working with Databricks and Delta Tables is a strong plus.
- Proficient in DevOps CI/CD pipelines, preferably with GitLab, ADO or GitHub.
- Skilled in test-driven development (TDD) and applying software design principles.
- Well-versed in Cloud architecture and delivery, particularly Azure (preferred), AWS, or GCP.
- Experience with Java backend development is a valuable addition, supporting collaboration on the broader Java-based system.
- Familiarity with Kubernetes and modern data infrastructure is advantageous.
Responsibilities
- Design and develop our strategic platform enabling trade executions to flow from Dealstores to Operations and Regulatory applications, turning epics and features into robust functionality.
- Work closely with other agile pod members in sprints to iteratively deliver on the product requirements.
- Work with the product team to understand and implement required functionality.

Key skills: Spark, PySpark, Kafka, DevOps, CI/CD Pipelines, GitLab, ADO, GitHub, TDD, Cloud architecture, Azure, AWS, GCP, Java Backend, Kubernetes, Databricks, Delta Tables, Data infrastructure
Location: Kraków