About the position
On The Spot is a software development company focused on building R&D offices for well-funded startups from the UK, EU, and Israel. We aim to connect tech talent directly with emerging tech companies worldwide to develop their products from scratch to unicorn.
Domains: cybersecurity, e-commerce, fintech, adtech
Key customers: Orca Security, ironSource (merged with Unity), Cycode, Karma, 365scores
Company staff: 140+ people
About the role
The company specializes in the fine wine investment market, providing a unique opportunity to combine a passion for wine with investment strategy. The product leverages sophisticated data analysis and machine learning to optimize investment decisions in the wine market, supported by a global network that spans over 80 countries and manages assets worth £250M. With a focus on transparency, security, and expert guidance, the company offers an enriching investment experience.
Responsibilities:
- Data Pipeline Development: Design, develop, and maintain scalable ETL (Extract, Transform, Load) processes to collect, process, and store large volumes of data from various sources.
- Database Management: Optimize and manage SQL databases to ensure efficient data storage, retrieval, and overall performance.
- Data Integration: Integrate data from various data sources, ensuring data quality, consistency, and accessibility.
- Collaboration: Work closely with stakeholders to understand data needs and provide robust data solutions.
- Performance Optimization: Analyze and optimize data workflows and queries for maximum performance and efficiency.
- Documentation: Maintain comprehensive documentation of data engineering processes, architectures, and workflows.
- Troubleshooting: Identify and resolve data-related issues, ensuring data integrity and reliability.
Requirements:
- Proficiency in writing complex SQL queries, optimizing queries, and managing relational databases.
- Strong programming skills in Python, with experience in using libraries such as Pandas, NumPy, and others for data processing.
- Experience with ETL tools and frameworks.
- Strong problem-solving abilities and analytical thinking.
- Excellent communication skills, with the ability to collaborate effectively with cross-functional teams.
- English at B2 level.
Preferred qualifications:
- Databricks Experience: Familiarity with Databricks for data processing and analytics.
- PySpark Proficiency: Experience in using PySpark for big data processing.
- Cloud Platforms: Experience with cloud platforms such as Azure.
Work Environment
Work in a highly professional team with an informal and friendly atmosphere.
Ability to work from our comfortable downtown office in Warsaw.
Benefits:
- Paid vacation: 20 business days per year; 100% paid sick leave.
- 5 sick days per year.
- Medical insurance (after the end of the probationary period).
- Partially compensated educational costs (for courses, certifications, professional events, etc.).
- English and Polish classes twice a week (online).
- Legal and accounting support in Poland.
- Inflation-protected salary with regular compensation reviews.
- A vibrant corporate life: company parties and gifts for employees on special occasions.