Your role:
In this role, you will gain a comprehensive understanding of the company's products and services, identifying ideal customer personas based on factors such as demographics, psychographics, and purchasing power. You will collaborate with international teams, tackling challenging projects with stakeholders from around the globe. You will work in domains such as bidding, pacing, performance-based advertising, recommendations, and churn prediction and prevention. Access to proprietary data will allow you to address product challenges and develop impactful solutions.
Responsibilities:
Develop, test, deploy, and maintain scalable low-latency machine learning products and pipelines, considering factors such as data nature, problem complexity, and computational resources
Validate model performance on unseen data, ensuring generalization and addressing potential issues like bias or fairness concerns
Design and develop the next-generation machine learning platform to support numerous model-training pipelines and large-scale daily batch predictions
Research and experiment with the latest ML platform technologies, staying current with industry trends and pushing the boundaries of ML capabilities
Streamline model deployment and ensure engineering quality through unit testing, integration testing, and stress testing
Automate the ML pipeline using CI/CD principles to promote consistency, reproducibility, and agility
Collaborate with Data Scientists to introduce new ML platform features, streamline the model development process, and reduce lead time for model production
Work closely with internal ML teams to improve codebase and product health
Technical leadership opportunities available based on skills and experience
Offer:
Friendly atmosphere focused on teamwork
Comprehensive training and support in developing algorithmic skills
Opportunity to work on multiple projects with the latest technologies
Monthly team-integration budget
Opportunities to attend local and international conferences
Flexible working hours
Private medical care (including family members)
Multisport card
Life insurance
Lunch card
Hybrid work model: 3 days from the office per week (Warsaw)
Attractive relocation package
Requirements:
Tech stack: Python, TensorFlow, PyTorch, AWS, Spark, Snowflake, Snowpark, GitHub Actions, Airflow, Grafana
Must have:
Degree in Computer Science or a related field
Minimum 5 years of industry experience
Familiarity with CI/CD and workflow orchestration (e.g., GitHub Actions, Airflow) and with ETL or big data tools (e.g., MapReduce, Spark, Flink, Kafka, Docker, Kubernetes)
Familiarity with mainstream ML libraries (e.g., TensorFlow, PyTorch, Spark ML) and/or cloud solutions (e.g., AWS, SageMaker)
Programming skills in Python, Go, or another object-oriented language
Experience in SQL and databases, including SQL query optimization
Experience with unit test frameworks
Familiarity with data structures, algorithms, and software engineering principles