Senior Data Platform Engineer (Python)
We are seeking a Senior Data Platform Engineer to build and refine the data ingestion systems that keep reliable, high-quality data flowing into analytics and product development. The role involves automating and managing infrastructure with Infrastructure as Code (IaC) and AWS services; proficiency in Python and PySpark is required. You will design scalable data architectures driven by business needs and work closely with stakeholders to provide technical guidance for current and future data platform initiatives.
Your responsibilities
- Design and optimize data ingestion systems that deliver a consistent flow of high-quality data into the platform, creating a solid foundation for data products and for comprehensive KPI tracking and analysis (see the ingestion sketch after this list).
- Apply Infrastructure as Code (IaC) practices, using tools such as Terraform or CloudFormation to automate infrastructure deployment and management.
- Translate business requirements into robust technical architecture, including the design of physical schemas and logical data models.
- Engage with stakeholders to understand their needs, providing technical guidance for current and future data platform projects.
- Analyze user interaction with the data platform, focusing on patterns of use to identify areas for improvement and optimize user engagement over time.
- Implement and maintain data governance frameworks, with an emphasis on automated data quality checks, compliance, and secure data handling. Collaborate with engineering teams to integrate governance controls into data pipelines and the platform architecture.
- Participate actively in all Data Platform Engineering team meetings and knowledge-sharing sessions, contributing to team learning and process improvement.
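To give a flavor of the ingestion and quality work described above, here is a minimal PySpark sketch. It is illustrative only, not taken from our codebase: the bucket paths, column names, and checks are assumptions for the example. It reads raw events, drops records that fail basic quality checks, and writes the curated data as partitioned Parquet ready for Glue/Athena.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative ingestion step: paths and column names are hypothetical.
spark = SparkSession.builder.appName("events-ingestion").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/events/2024-06-01/")

# Basic quality gate: required fields must be present and timestamps parseable.
clean = (
    raw.filter(F.col("event_id").isNotNull() & F.col("user_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .filter(F.col("event_ts").isNotNull())
)

# Track how much the gate rejected so the pipeline can alert on spikes.
total = raw.count()
rejected = total - clean.count()
print(f"Rejected {rejected} of {total} records")

# Write the curated layer partitioned by event date for downstream queries.
(clean.withColumn("event_date", F.to_date("event_ts"))
      .write.mode("append")
      .partitionBy("event_date")
      .parquet("s3://example-curated-bucket/events/"))
```

In a production pipeline this step would typically be orchestrated (for example via MWAA) and the rejection rate fed into monitoring rather than printed.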
Our requirements
- Expertise in AWS services, especially Glue, Athena, EMR, EC2, IAM, MWAA
- Proficiency in Python and PySpark
- Experience in building data platforms, not just using them
- Proficiency in data modeling techniques and best practices
- Experience in implementing data contracts
- Experience in applying data governance policies
- Experience with data quality frameworks such as Great Expectations or Soda (a simple validation sketch follows this list)
- Familiarity with the data mesh architecture and its principles
- Experience with Django
- Strong Python knowledge is essential
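As a simple illustration of the data contract and quality checks listed above, here is a plain-Python sketch. The "orders" contract, its fields, and the sample records are invented for the example; in practice a framework such as Great Expectations or Soda, or a schema registry, would automate this kind of validation.

```python
from dataclasses import dataclass

# Hypothetical data contract for an "orders" dataset: fields and types are examples only.
ORDERS_CONTRACT = {
    "order_id": str,
    "customer_id": str,
    "amount": float,
    "currency": str,
}

@dataclass
class ContractViolation:
    record_index: int
    field: str
    problem: str

def validate_records(records, contract):
    """Check each record against the contract; return a list of violations."""
    violations = []
    for i, record in enumerate(records):
        for field, expected_type in contract.items():
            if field not in record:
                violations.append(ContractViolation(i, field, "missing field"))
            elif not isinstance(record[field], expected_type):
                violations.append(
                    ContractViolation(i, field, f"expected {expected_type.__name__}")
                )
    return violations

# Example usage with two records, one of which breaks the contract.
sample = [
    {"order_id": "o-1", "customer_id": "c-9", "amount": 19.99, "currency": "EUR"},
    {"order_id": "o-2", "customer_id": "c-3", "amount": "free"},  # bad amount, missing currency
]
for v in validate_records(sample, ORDERS_CONTRACT):
    print(v)
```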
What we offer
- Opportunity to work on bleeding-edge projects
- Work with a highly motivated and dedicated team
- Competitive salary
- Flexible schedule
- Benefits package: medical insurance and sports benefits
- Corporate social events
- Professional development opportunities
- Well-equipped office