In this role, you will leverage advanced analytical techniques and machine learning models to extract actionable insights from our complex data sets, driving strategic decision-making and operational excellence.

Your responsibilities
- Partner with our product and business teams to understand their needs, translate them into data science solutions, and provide actionable insights
- Develop and implement advanced data science solutions (including ML models, AI products, optimization frameworks, etc.)
- Stay ahead of the curve by exploring cutting-edge methods and keeping up with new trends in Data Science & AI

Requirements
- Education – Bachelor's, Master's or PhD degree in a relevant field, e.g. data science, computer science, mathematics, statistics, econometrics, physics or similar
- Experience – at least 3 years of experience applying data science methods; a strong technical and/or research background is a plus
- Mindset – you are data-driven, goal-oriented and proactive, skilled in change and time management, business-conscious, and able to think long-term and decompose business problems
- Languages – you are proficient in English (knowledge of other languages is a plus)

Nice to have
- Experience applying data science methods at scale or building models used in production by a large number of users
- Previous work on open-source projects
- Experience working in a data-driven environment, designing, executing and analysing many experiments
- Specialized knowledge in some of these areas: logistics, operations research, forecasting, spatial data science, graph methods, product analytics or marketing data science

Technical requirements
- Excellent knowledge of ML solutions and their impact on business, user experience and operational processes (supervised and unsupervised learning, e.g. clustering, recommender systems, regression, classification, linear programming, reinforcement learning, time series, geospatial analysis, etc.)
- Proficiency in Python 3, as well as ML and data analysis libraries (e.g. pandas, numpy, scipy, scikit-learn, statsmodels, TensorFlow/PyTorch, etc.)
- Knowledge of and experience with PySpark, relational databases, and cloud solutions (e.g. Databricks, Azure, GCP, AWS, Snowflake)
- Knowledge of best practices for writing readable code, technical reviews and maintainability

What we offer
- Flexible work
- Contract selection