Senior Azure Data Engineer
Reference number: PL26/AZUREDATAENGINEER/REMOTE
At Cyclad, we work with top international IT companies to boost their potential in delivering outstanding, cutting-edge technologies that shape the world of the future. We are seeking an experienced Azure Data Engineer with Databricks expertise to join a remote development team. The role focuses on designing, building, and maintaining scalable, cloud-based data platforms using the Microsoft Azure analytics stack and modern Databricks features. The candidate will collaborate closely with architects, data scientists, and business stakeholders to deliver secure, production-ready data solutions.
Project information:
- Type of project: IT services
- Office location: Poland
- Work model: Remote
- Budget: 150-180 PLN net/h (B2B)
- Project length: Long-term
- Open only to candidates with EU citizenship and residence in Poland
- Start date: ASAP (depending on candidate's availability)
Project scope:
- Design, build, and deploy scalable data pipelines using Azure Databricks and the Azure Analytics stack
- Curate structured, semi-structured, and unstructured data by creating efficient, cost-effective, and scalable pipelines
- Create and maintain robust data pipeline architecture ensuring data quality, reliability, and scalability
- Assemble and manage large, complex datasets to meet functional and non-functional business requirements
- Identify, design, and implement process improvements, including automation and optimization of data delivery
- Work with real-time and streaming analytics solutions where applicable
- Collaborate closely with Product Owners, Scrum Masters, architects, and data leadership to ensure timely, high-quality delivery
- Align with Data Engineering chapter standards, processes, and best practices
- Apply solid software engineering practices, including unit testing, CI/CD, and version control
- Troubleshoot complex data-related issues and perform root cause analysis
- Ensure data security, governance, and compliance with data management frameworks
Tech Stack:
- Cloud: Microsoft Azure (Data Factory, Event Hubs, Cosmos DB, Databricks)
- Data Processing & Analytics: Databricks, Spark, SQL, real-time analytics tools
- Programming & Scripting: Python, SQL
- DevOps & CI/CD: Git, CI/CD pipelines, unit testing
- Workflow & Orchestration: Airflow
- Data Management & Governance: Data modelling, Collibra/Alation (optional), DAMA frameworks
Requirements:
- Minimum 3 years' hands-on experience with Azure Data Factory and Databricks (modern features, including Unity Catalog and 2025 capabilities)
- At least 5 years' experience in data engineering or backend/full-stack software development
- Solid software engineering background: writing unit tests in Python and proficiency with Git and CI/CD pipelines
- Strong SQL skills and experience structuring and modelling data in both relational and non-relational formats
- Experience with data transformation tools, Spark, and real-time analytics solutions on Azure
- Familiarity with modern cloud infrastructure and analytics tools, including Azure Event Hubs, Cosmos DB, Spark Streaming, or Airflow, is a plus
- Exposure to data catalogue tools (Collibra, Alation) and data management frameworks (e.g., DAMA) is a plus
- Strong verbal and written communication skills in English
We offer:
- Full-time engagement based on a B2B contract
- Private medical care including dental care (70% of costs covered) and a rehabilitation package; family package option available
- Multisport card (also for an accompanying person)
- Life insurance
- Flexibility and international environment