We are seeking a passionate Data Engineer to join the newly forming A/B Testing Platform team in the Data Science Hub where we apply analytical techniques, mathematics, and machine learning to solve a wide range of business problems.
The A/B Testing Platform team is a multidisciplinary group of product analysts, software engineers, and data engineers. Our mission is to strategically enhance our A/B testing platform, a critical tool that empowers data-driven decision-making on the rollout of new features by assessing their potential impact through user behavior analysis. Through its work, the team plays a pivotal role in shaping the overall user experience on Allegro, one of the world's largest eCommerce platforms.
responsibilities:
Designing, developing, and maintaining robust, scalable data pipelines.
Collaborating closely with product managers, UX designers, data analysts, and software engineers to understand their requirements and deliver high-quality, prepared data to enable their work.
Building, testing, and maintaining data systems to ensure accuracy and readiness for integration into a larger pipeline with streaming data flows.
Designing and implementing data schemas, data models, message brokers, and SQL/NoSQL databases.
Optimizing data systems, and building them from the ground up, to deliver insights to analytical systems.
Implementing data pipelines and automated workflows required for the A/B testing platform.
Ensuring data privacy and compliance standards across all projects.
Working across multiple platforms and technologies, such as Google Cloud Platform, Azure, and Allegro Data Centers.
Delivering solutions for multiple markets.
Balancing planned work with ad-hoc support for requests from Product Managers and Data Analysts.
requirements-expected:
We are looking for people who:
Have a Bachelor's or Master's degree in Computer Science, Mathematics, or a related field.
Know English at a minimum B2 level.
Have proven experience as a Data Engineer or in a similar role.
Are fluent in SQL, preferably GCP BigQuery.
Have knowledge of big data tools in Google Cloud Platform, AWS, or Azure.
Have experience with message broker systems and streaming data processing, e.g., Pub/Sub, Apache Beam.
Are familiar with data pipeline orchestration tools like Apache Airflow.
Have experience in Python programming and are familiar with software engineering best practices (PEP8, clean architecture, code review, CI/CD etc.).
Have experience with Infrastructure as Code tools (Terraform is welcome).
Have proven commercial experience with DevOps and CI/CD practices.
Have strong communication skills, capable of conveying complex ideas in a clear, concise manner.
Are detail-oriented and capable of working in a fast-paced, dynamic environment.
Have a positive attitude and ability to work in a team.
Are eager to constantly develop and broaden their knowledge.
offered:
A hybrid work model that you will agree on with your leader and the team. We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms).
An annual bonus of up to 10% of your annual gross salary (depending on your annual assessment and the company's results).
A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g., medical, sports or lunch packages, insurance, purchase vouchers).
English classes, paid for by us, related to the specific nature of your job.
16" or 14" MacBook Pro with M1 processor and 32GB RAM or a corresponding Dell with Windows (if you don’t like Macs) and other gadgets that you may need.
Working in a team you can always count on — we have on board top-class specialists and experts in their areas of expertise.
A high degree of autonomy in terms of organizing your team’s work. We encourage you to develop continuously and try out new things.
Hackathons, team tourism, training budget and an internal educational platform, MindUp (including training courses on work organization, means of communication, motivation to work and various technologies and subject-matter issues).
If you want to learn more, check out this webpage or listen to the Allegro Tech Podcast Episode about recent projects in the Data Science Hub.
benefits:
sharing the costs of sports activities
private medical care
sharing the costs of foreign language classes
sharing the costs of professional training & courses