Data engineer
Kraków, Lesser Poland Voivodeship, Poland
KITOPI POLAND Sp. z o.o.
23.12.2024
Job details

Technologies expected: Snowflake, Python, SQL, dbt

About the project:
As a leading food-tech business, Kitopi's growth has been largely fueled by its innovative and scalable software solutions. Kitopi's kitchens are powered by its proprietary Smart Kitchen Operating System (SKOS), an in-house suite of applications that optimizes cloud kitchen operations in real time. As part of its growth roadmap, technological innovation, data science, artificial intelligence, and robotics will take center stage as Kitopi continues to reinvent the food industry as we know it today.
At Kitopi, we collaborate closely with business and product stakeholders to facilitate data-driven decision-making. Our data organization is structured around data verticals that focus on specific business domains, as well as data horizontals that specialize in different practices. Data engineering is one of these practices, and as a data engineer, you will work alongside other data engineers and cross-functional teams, including analysts, ML engineers, BI specialists, data modelers, platform engineers, and backend engineers.

Responsibilities:
In brief:
  • Design, build, and continuously scale data ingestion, preparation, modeling, and aggregation in our data warehouse, while ensuring data quality and implementing industry best practices.
  • Contribute to improving existing data products and take part in conducting POCs for new data products.
  • Develop a deep understanding of our business domain and use this knowledge to drive the maturation of data in the warehouse, thereby supporting the creation of valuable insights that enable data-driven decision-making across our product and business teams.
In detail:
  • People: Collaborate closely with Horizontal Data Engineers, the Data Architect, and the Data Modeller to develop scalable data pipelines and maintain the infrastructure.
  • People: Work closely with the Data Vertical Leader and Data Analysts to gain domain knowledge of the business and identify opportunities for building the best data solutions.
  • Build ETL solutions for real-time and batch data processing using various platforms and technologies in a cloud computing environment.
  • Design programmatic solutions and translate them into code that works with GitLab CI/CD, Docker, Terraform, and Kubernetes.
  • Build a comprehensive view of all customer interactions to enable personalisation and a better user experience with reliable data.
  • Data Horizon: Help identify new sources of data from your squad's developments, and collaborate closely with the Vertical Data Analyst to build business-driven models for use by other business functions and Data Scientists.
  • Partnership: Work in partnership with the Horizontal Data Team to improve platform capabilities around data modeling, testing platforms, data visualization, and data architecture.
  • Governance: Incorporate and help shape company-wide data governance policies.
  • Maintain and monitor the data pipelines within the data products that you build, as well as the Horizontal Data Team infrastructure.
  • Data Quality: Adopt best practices in reporting and analysis, including data integrity, data security, analysis, validation, and documentation, to ensure data quality.

Requirements expected:
  • Bachelor's or Master's degree in Engineering, Computer Science, Technology, or similar.
  • 2+ years of hands-on experience developing production-quality code in data engineering at a high-growth consumer product company or similar.
  • Professional experience using Python and SQL for data processing.
  • Demonstrably deep understanding of SQL and analytical data warehouses (Snowflake preferred, with Azure-based solutions such as Microsoft OneLake being a plus), and hands-on experience implementing ETL (or ELT) best practices at scale.
  • Hands-on experience with data pipeline tools (Airflow, Airbyte, dbt).
  • Strong data modeling skills and familiarity with the Kimball methodology.
  • Knowledge of testing methods, automation, and good practices within the data warehouse.
  • Desire to continually keep up with advancements in data engineering practices, and ability to catch bugs and style issues in code reviews.
  • Ability to work independently; critical thinking, problem-solving, stakeholder management, and effective presentation skills; a good team player with inclusiveness and constructive thinking.

Offered:
  • A choice of employment contract (UoP) or B2B agreement, tailored to your preferences (26 paid days off on B2B).
  • ESOP - Employee Stock Option Plan.
  • Additional paid days off for volunteer activities.
  • U-Day - time for medical check-ups, examinations, or diagnosis.
  • Top equipment: high-end MacBook Pro + additional accessories.
  • Support for your development: 2 000 PLN annual growth budget available to every employee; internal initiatives (webinars, workshops, knowledge-sharing sessions, internal conferences); mentoring program; free English classes with a native speaker.
  • Worksmile benefit platform (private medical healthcare, Multisport card, vouchers, etc.).
  • Life insurance.
  • Mental health support - free access to online sessions with a professional therapist.
  • Wellbeing program - tailored to the needs of our employees, including physical and mental health and socializing activities.
  • Referral bonus.
  • Flextime: adjust your daily schedule to your individual needs.
  • Great office with fruit and snacks, a social budget for every team, away days, and more!

Benefits:
  • sharing the costs of sports activities
  • private medical care
  • sharing the costs of foreign language classes
  • life insurance
  • remote work opportunities
  • flexible working time
  • fruits
  • integration events
  • computer available for private use
  • corporate library
  • no dress code
  • video games at work
  • coffee / tea
  • drinks
  • parking space for employees
  • leisure zone
  • employee referral program
  • charity initiatives
  • extra leave
  • ESOP - Employee Stock Option Plan
