Technologies expected: Python, SQL, Google Cloud Platform

About the project

What projects do we work with? We run a variety of projects in which our sweepmasters can excel: Advanced Analytics, Data Platforms, Streaming Analytics Platforms, Machine Learning Models, Generative AI, and more. We like working with top technologies and open-source solutions for Data & AI and ML/AI. Our portfolio includes clients from many industries, e.g., media, e-commerce, retail, fintech, banking, and telcos, such as Truecaller, Spotify, ING, Acast, Volt, Play, and Allegro. You can read some customer stories here.

What else do we do besides working on projects? We run many knowledge-sharing initiatives, such as Guilds and Labs. We build a community around Data & AI through our conference Big Data Technology Warsaw Summit, the Warsaw Data Tech Talks meetup, the Radio Data podcast, and the DATA Pill newsletter. The Data & AI projects we run and the company's philosophy of sharing knowledge and ideas in this field make GetInData | Part of Xebia not only a great place to work but also a place that gives you a real opportunity to boost your career. If you want to stay up to date with the latest news from us, please follow our LinkedIn profile.

About the role

A Data Engineer's role involves designing, building, and maintaining the architecture, tools, and processes an organization needs to collect, store, transform, and analyze large amounts of data. The position involves building data platforms on typically provided infrastructure and establishing a clear path for the Analytics Engineers who use the system.
Responsibilities:
- Working alongside Platform Engineers to assess and choose suitable technologies and tools for the project
- Supporting and diagnosing technical problems in the data processing layer
- Implementing complex data ingestion procedures
- Building efficient data models
- Implementing and executing policies aligned with the company's strategic plans regarding technologies used, work organization, etc.
- Ensuring compliance with industry standards and regulations on security and data privacy in the data processing layer
- Providing training and fostering knowledge sharing

Requirements expected:
- Proficiency in a programming language such as Python, plus SQL
- Knowledge of the BigQuery DWH platform
- Experience as a programmer and knowledge of software engineering principles, good practices, and solutions
- Familiarity with Google Cloud Platform (GCP)
- Familiarity with the DevOps area and its tools, e.g., GKE and Docker
- Experience with a version control system, preferably Git
- Experience with tools such as Spark or Airflow is nice to have
- Ability to actively participate in and lead discussions with clients to identify and assess concrete, ambitious avenues for improvement

Offered:
- Salary: 110-160 PLN net + VAT/h, B2B (depending on knowledge and experience)
- 100% remote work
- Flexible working hours
- Possibility to work from the office located in the heart of Warsaw
- Opportunity to learn and develop with the best Big Data experts
- International projects
- Possibility of conducting workshops and training
- Certifications
- Co-financed sports card
- Co-financed health care
- All equipment needed for work