About the role
We are seeking a seasoned Data Architect with extensive experience in designing and implementing robust data solutions. As a senior engineer, you will play a pivotal role in shaping data architecture and engineering practices at a large retail company. You'll collaborate with solution architects, product managers, and engineering teams to ensure seamless integration and accessibility of data while building scalable, high-performance data ecosystems.
Your tasks
- Collaborate with solution architects and product managers to determine the most suitable data architecture
- Partner with software engineers and data product managers to understand data and reporting requirements and translate them into effective data architecture solutions
- Design and implement data structures, ETL pipelines, and integration solutions for collecting data from both internal and external systems
- Develop performant, reliable, and scalable data solutions using modern architecture concepts such as data mesh, data lake, and data lakehouse
- Ensure data quality by writing tests, performing code reviews, and debugging complex data pipelines
- Work closely with engineering teams to ensure seamless integration of data across various systems and platforms
- Implement data governance and data quality assurance practices, ensuring data integrity across the organization
- Assist product managers in developing conceptual and detailed data models and blueprints that enable the development of advanced data products
- Support the design and development of ETL processes using tools like Snowflake and Airflow, and message brokers like RabbitMQ and SNS/SQS
Requirements
- 7 years of experience in data architecture, data engineering, or a related role
- Expertise in data engineering, database design, and data warehousing concepts
- Knowledge of modern data architecture concepts, including data mesh, data virtualization, data lakes, and data lakehouses
- Proficiency with data integration and pipeline tools such as Snowflake and Airflow
- Strong knowledge of ETL processes and message brokers like RabbitMQ and SNS/SQS
- Advanced proficiency in SQL and Java
- Familiarity with cloud platforms, especially Azure Databricks and AWS
- Strong understanding of data governance and data quality assurance practices
Nice-to-have requirements
- Knowledge of Python
- Experience with event-based customer analytics
Location: Warszawa