Working with a modern analytics stack, including Google Cloud, BigQuery, Informatica, and Python, the Senior Data Engineer will build data pipelines and transform, structure, and operationalize data to support BI, Analytics, and Data Science projects.
Your responsibilities
- Contribute to the design and development of the data platform, including designing and executing data extraction, transformation, and optimal data pipeline architectures and solutions at enterprise scale to enable business intelligence, analytics, and data science use cases
- Design and implement processes for data loading, data transformation, and feature engineering, as well as access control, data security, and data privacy
- Develop complex code-based ETL/ELT data pipelines with performance-optimized data modeling
- Develop source-of-truth data models to power business analytics solutions, using data warehouse modeling techniques, data quality checks, reconciliation processes, and thorough testing practices
- Establish and implement best practices in coding, monitoring, and alerting, using CI/CD and other relevant techniques
- Ensure code is managed, validated, and released using DevOps best practices, making effective use of automation for these processes
- Maintain a repeatable and automated approach to data management including a continuous deployment pipeline, centralized configuration management, automated testing patterns, provisioning of environments, and controlled release to production
- Ensure all features and new releases are tested, upgraded, and changed as necessary from a data perspective
Our requirements
- A minimum of 4 years' experience in a similar role
- Proficiency in designing efficient and robust ETL/ELT workflows, data modeling, and data pipelines (batch and/or event-driven)
- Proficiency in designing data solutions and creating solution architecture
- Experience working in a complex enterprise data warehouse environment and developing and maintaining pipeline processes to move and transform data
- Experience with data warehousing principles, including orchestration frameworks, schema design, and data architecture principles
- Experience with cloud solutions (preferably GCP)
- Experience with Informatica ETL and Governance
- Ability to write robust code in SQL and Python
Benefits
- sharing the costs of sports activities
- private medical care
- sharing the costs of professional training & courses
- life insurance
- coffee / tea
- meal passes
- sharing the commuting costs
- extra leave
- work in international teams
- friendly atmosphere
- copyright tax relief for IT roles
- hybrid work
Recruitment stages
- INITIAL INTERVIEW (30 min)
- TECHNICAL TEST (optional - 45-90 min)
- TECHNICAL INTERVIEW (60-90 min)
- FINAL INTERVIEW (60-90 min)
- FEEDBACK / OFFER
Established in 1928, Genuine Parts Company (GPC) is a leading global service organization specializing in the distribution of automotive and industrial replacement parts. GPC's commitment to innovation and technology is evident in the GPC Global Technology Center in Krakow, established in 2022. This center serves as a hub for research and development, supporting GPC's digital transformation efforts. The center's team of highly skilled IT engineers focuses on developing advanced technologies and solutions that enhance GPC's operations and growth. Their work spans various areas, including e-commerce and data platforms, supply chain solutions, selling systems, and cyber security. Learn more at genpt.com.