Senior Data Engineer @ Bayer
Warsaw, Masovian Voivodeship, Poland
Bayer
9. 11. 2025
About the job


At Bayer we’re visionaries, driven to solve the world’s toughest challenges and striving for a world where ‘Health for all, Hunger for none’ is no longer a dream, but a real possibility. We’re doing it with energy, curiosity and sheer dedication, always learning from unique perspectives of those around us, expanding our thinking, growing our capabilities and redefining ‘impossible’. There are so many reasons to join us. If you’re hungry to build a varied and meaningful career in a community of brilliant and diverse minds to make a real difference, there’s only one choice.


Senior Data Engineer 


As a Senior Data Engineer, you will be part of Bayer’s Pharma Data & AI Team, with varying assignments according to project needs. You will develop and operate robust, observable, and cost-efficient data pipelines and services powering analytics and data products on our Commercial Data Platform.



  • Bachelor’s degree in Computer Science, Data Science, Information Technology, or a related field.
  • 5+ years of experience in data engineering, data operations, or a similar role, delivering production-grade data pipelines and services.
  • Familiarity with the data product lifecycle across disciplines: demonstrated expertise designing, implementing, and operating robust data ingestion, transformation, and serving pipelines across batch and streaming architectures; expert knowledge of SQL and Python; and experience orchestrating workflows with modern platforms such as Dagster or Airflow, including sound workflow and dependency design.
  • Experience with at least one major cloud service provider (AWS, Azure, or GCP), demonstrating the ability to leverage cloud technologies for business solutions.
  • Good experience in building and maintaining streaming integrations with Kafka and/or Kinesis, ensuring exactly-once or idempotent processing, managing schema evolution, and implementing efficient replay strategies for reliable near-real-time data delivery.
  • Good hands-on experience in leveraging solutions such as dbt Core to implement modular, tested, and well-documented data and analytics models; enforcing best practices for code quality, testing standards, and documentation throughout the data engineering lifecycle.
  • Good experience with data quality frameworks (e.g. dbt tests, Great Expectations, PyTest), integrating comprehensive data validation and quality gates within CI/CD pipelines, and supporting incident triage and root cause analysis.
  • Basic knowledge of containerized solutions.
  • Hands-on experience working with APIs such as GraphQL, OData, and REST.
  • Competent in usage of data integration and ingestion solutions (e.g. AWS API Gateway, dltHub, AWS AppFlow).
  • Basic experience with modern lakehouse table formats (Apache Iceberg, Delta Lake).
  • Good experience with cost and performance optimization and FinOps practices for cloud data systems such as Snowflake or Databricks, ensuring reliable operations and scalable data delivery in production environments.
  • Strong ownership of deliverables, high standards for code review and peer mentorship, and a commitment to clear documentation, metadata/lineage publishing, and sustainable engineering practices.
  • Good experience with agile development methodologies and tooling such as Azure DevOps.
  • Excellent analytical and problem-solving skills.
  • Good communication and collaboration skills; able to work independently and as part of a team.
  • Strong adaptability and willingness to quickly learn and adopt new technologies and approaches in a dynamic, evolving data platform and business environment.
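As an illustration of the exactly-once/idempotent processing the streaming requirement above refers to, here is a minimal, hedged sketch in plain Python. It uses SQLite as a stand-in deduplication store; every name here (`make_dedup_store`, `process_once`) is hypothetical and not part of any specific stack at Bayer — real systems would typically lean on Kafka consumer offsets plus a durable dedup key store.

```python
import sqlite3

def make_dedup_store(path=":memory:"):
    """Create a tiny store of already-processed event ids (sketch only)."""
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS processed (event_id TEXT PRIMARY KEY)")
    return conn

def process_once(conn, event, handler):
    """Apply handler(event) at most once per event id.

    Returns True if the event was handled, False if it was a duplicate
    (e.g. redelivered after a consumer restart or a replay).
    """
    cur = conn.execute(
        "INSERT OR IGNORE INTO processed (event_id) VALUES (?)", (event["id"],)
    )
    if cur.rowcount == 0:
        return False  # already seen: skip, keeping processing idempotent
    handler(event)
    conn.commit()  # only mark the id as processed once the handler succeeded
    return True
```

Replaying the same event a second time is then a no-op, which is what makes replay strategies safe for near-real-time delivery.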




Responsibilities:
  • Collaborate closely with data product team members, data source owners, and data consumers (e.g. data scientists, data analysts) to understand data requirements and clarify acceptance criteria.
  • Contribute to the design of analytical and domain data models that support the organization’s data & analytics requirements now and in the future.
  • Design, develop, and maintain scalable data pipelines for ingesting, transforming, and storing large volumes of data from diverse sources. Build and operate data solutions on AWS and in Snowflake or Databricks, applying modern lakehouse formats (Delta/Iceberg) where appropriate.
  • Follow documented standards and industry best practices for data engineering, ensuring compliance with regulations and data governance policies.
  • Embed automated data quality checks and validation gates within pipelines and CI/CD.
  • Optimize data processing to improve performance and cost across storage and compute.
  • Troubleshoot and resolve data-related incidents, ensuring minimal downtime and disruption to business operations. Lead root-cause analysis and drive blameless postmortems to reduce MTTR and prevent recurrences.
  • Contribute to the data engineering community, defining and documenting data infrastructure, processes, and best practices for knowledge sharing and future reference.
  • Participate in “data engineer on duty” rotations during regular office hours to support data services and incident response.
  • Mentor junior engineers and uphold high standards in code review, testing, and documentation.

Requirements: Python, SQL, AWS, Airflow, Kafka
Tools: GitHub, Git, Agile

Additionally: Sport subscription, Training budget, Private healthcare, Flat structure, Lunch card, International projects, Free coffee, Canteen, Bike parking, Playroom, Free parking, In-house trainings, In-house hack days, Modern office, Startup atmosphere.
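The automated data quality checks and CI/CD validation gates mentioned above can be sketched as a small, dependency-free check. This is a hypothetical illustration only (function and column names invented for the example); in practice such gates would be implemented with dbt tests or Great Expectations, as the requirements note.

```python
def quality_gate(rows, required, unique_key):
    """Return a list of violations for a batch of records.

    A non-empty result can be used to fail a CI step before the
    batch is published downstream.
    """
    violations = []
    seen = set()
    for i, row in enumerate(rows):
        # completeness check: required columns must be populated
        for col in required:
            if row.get(col) in (None, ""):
                violations.append(f"row {i}: missing {col}")
        # uniqueness check: the key column must not repeat
        key = row.get(unique_key)
        if key in seen:
            violations.append(f"row {i}: duplicate {unique_key}={key}")
        seen.add(key)
    return violations
```

Wiring such a check into CI means a failing gate blocks the deploy, which is the essence of "quality gates within CI/CD pipelines".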
