About the role
At Bayer we’re visionaries, driven to solve the world’s toughest challenges and striving for a world where ‘Health for all, Hunger for none’ is no longer a dream, but a real possibility. We’re doing it with energy, curiosity and sheer dedication, always learning from the unique perspectives of those around us, expanding our thinking, growing our capabilities and redefining ‘impossible’. There are so many reasons to join us. If you’re hungry to build a varied and meaningful career in a community of brilliant and diverse minds to make a real difference, there’s only one choice.
Qualifications & Competencies (education, skills, experience):
- Strong proficiency in Python for ETL processes and automation.
- Practical knowledge of AWS Cloud services and Terraform.
- Familiarity with Docker and Kubernetes (EKS) for containerization.
- Understanding of workflow engines such as Argo Workflows or Airflow.
- Knowledge of PySpark and Databricks is a plus.
- Ability to apply best practices and design patterns in development.
- Strong teamwork and communication skills for collaboration in DevOps teams.
- Fluent in English, both written and spoken.
Responsibilities:
- Design, develop, and optimize ETL workflows and data pipelines to support data migration and consolidation, leveraging APIs and Python.
- Implement data processing and transformation logic using Pandas and SQLAlchemy, with PySpark and Databricks knowledge as a plus.
- Develop and maintain relational database solutions, particularly with PostgreSQL.
- Collaborate with business analysts, data engineers, and IT teams to define ETL strategies and understand requirements.
- Apply best practices and design patterns to ensure efficient and scalable development.
- Troubleshoot and resolve issues throughout the entire software lifecycle, ensuring readiness for go-live.
- Engage actively with data scientists, IT product managers, and IT architects throughout the product lifecycle.
Requirements: Python, AWS, Terraform, Docker, Kubernetes, Airflow, Databricks, PySpark.
Tools: Jira, GitHub, Git, Agile, Scrum.
Additionally: sport subscription, training budget, private healthcare, flat structure, small teams, international projects, canteen, bike parking, playroom, in-house trainings, in-house hack days, modern office, no dress code.