Databricks Data Engineer (m/f/x)
Location: Wrocław

Technologies we use
Expected:
- SQL
- PySpark
- Python
- Azure
- AWS
- Google Cloud Platform

Your responsibilities
- Designing and developing Data Engineering solutions
- Building and maintaining ETL/ELT processes
- Working with large volumes of data in a distributed environment
- Modeling and processing data for analytical purposes
- Collaborating with technical and business teams
- Participating in the design of cloud-based data architectures

Our requirements
- Minimum 5 years of experience in Data Engineering
- Minimum 2 years of experience with Databricks
- Very good knowledge of SQL, PySpark, and Python
- Experience with data warehousing, ETL, distributed data processing, and data modeling
- Strong analytical and problem-solving skills in a Big Data environment
- Experience working with structured, semi-structured, and unstructured data
- Experience with at least one public cloud platform (Azure, AWS, or GCP)
- Knowledge of relational and non-relational database design
- Knowledge of Data Mart, Data Warehouse, Data Lake, and Data Mesh concepts
- Very good command of English
- Experience working with Agile methodologies (Scrum, Kanban) and knowledge of DevOps and CI/CD principles

All information about the processing of personal data in this recruitment can be found in the application form after clicking the "Apply Now" button.
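To give a feel for the day-to-day work described above, here is a minimal sketch of the kind of PySpark batch transform the role involves; the table and column names are hypothetical examples, not part of this posting or any specific project.

```python
# Illustrative PySpark ETL step: read raw data, clean it, aggregate, and publish
# a curated table. All table and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read an assumed raw source table and keep only well-formed rows.
raw = spark.read.table("raw.events")
clean = (
    raw.filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Aggregate for analytical consumption and write to a curated table.
daily = (
    clean.groupBy("event_date", "event_type")
         .agg(F.count("*").alias("event_count"))
)
daily.write.mode("overwrite").saveAsTable("curated.daily_event_counts")
```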