We are looking for an experienced Senior Data Engineer to help design and implement a modern data lakehouse platform from scratch. This role combines hands-on engineering with strategic decision-making around scalable data solutions.
Responsibilities:
Design and implement a greenfield data lakehouse architecture.
Build and optimize data models, schemas, and integration patterns.
Develop scalable data storage and processing solutions using Azure and open-source components.
Implement data ingestion pipelines (ETL/ELT) and high-performance query APIs.
Integrate the platform with enterprise systems, ensuring data consistency and quality.
Create technical documentation, establish standards, and define data governance policies.
Requirements:
At least 4 years of hands-on experience with Microsoft Azure data services.
Proven track record of delivering large-scale data projects using Azure, Databricks, and Spark.
Proficiency with Microsoft Azure Data Platform (Synapse, Data Factory, SQL Databases, Data Lake).
Practical experience with Spark or other big data frameworks.
Excellent knowledge of SQL, including query optimization.
Good command of .NET Core.
Solid understanding of data modeling techniques and pipeline design (ETL/ELT).