Bydgoszcz, Kuyavian-Pomeranian Voivodeship, Poland
Huuuge Games Sp. z o.o.
11.10.2025
Information about the position
technologies-expected :
Java
Scala
Python
technologies-optional :
AWS
Google Cloud Platform
Microsoft Azure
Kafka
Flink
about-project :
At Huuuge Games we make top-grossing mobile games that bring people together through fun, social mobile gaming.
We're looking for a passionate Streaming Applications Developer to join our team. You'll be responsible for developing and maintaining high-performance, real-time systems where every millisecond counts, processing thousands of events per second. You'll be using Scala within the Confluent Kafka ecosystem to build these systems. Your work will have a tangible impact on the company's results, and you'll see the direct effects of your efforts implemented and scaled.
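To give a flavor of the kind of stream processing described above, here is a minimal Scala sketch using the Kafka Streams DSL to count incoming events per key in real time. The topic names, application id, and broker address are hypothetical placeholders for illustration only, not part of the actual Huuuge Games stack.

import java.util.Properties

import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.serialization.Serdes._
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}

object EventCountExample extends App {
  // Hypothetical topic names, used purely for illustration.
  val inputTopic  = "game-events"
  val outputTopic = "event-counts"

  val props = new Properties()
  props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-count-example")
  props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")

  val builder = new StreamsBuilder()

  // Read raw events keyed by, e.g., player id, count them per key,
  // and write the running counts to an output topic.
  builder
    .stream[String, String](inputTopic)
    .groupByKey
    .count()
    .toStream
    .mapValues(_.toString)
    .to(outputTopic)

  val streams = new KafkaStreams(builder.build(), props)
  streams.start()
  sys.addShutdownHook(streams.close())
}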
responsibilities :
Develop core modules for our Big Data platform and data applications, which operate on hundreds of servers in AWS using Spark, Kafka, Flink and Kubernetes.
Design, develop, and maintain high-performance, real-time data pipelines.
Research and perform proof-of-concepts for new technologies, tools, and design patterns.
Lead and contribute to technical design discussions, challenging and reviewing architectural definitions.
Conduct code reviews to ensure high development standards and best practices are maintained.
Contribute to high-standard development processes, including CI/CD and unit testing.
Ensure seamless and reliable deployments to production, along with effective production monitoring.
Collaborate with cross-functional teams to deliver scalable and robust data solutions.
requirements-expected :
3+ years of experience working with Java, Scala, Python, or other high-level programming languages.
Solid understanding of software architecture paradigms (e.g., microservices, event-driven architecture) and design patterns for building scalable, distributed systems.
Proven experience in the design and development of large-scale distributed systems.
Practical knowledge of Big Data concepts and technologies, with hands-on experience in processing large datasets.
Familiarity with cloud environments (AWS, GCP, or Azure) and a commitment to high-quality development practices, including unit testing and CI/CD.