Senior Data Engineer (Python, Snowflake, Airflow)
Warszawa, mazowieckie, Poland
Square One Resources Sp. z o.o.
10.04.2025
About the position

We are looking for an experienced Data Engineer to join a dynamic team on an exciting long-term project with one of our key clients in the pharmaceutical industry. The project focuses on building and optimizing modern data pipelines and ensuring seamless integration across cloud-based data platforms.

The ideal candidate will have strong experience in data engineering with Python, Snowflake, and Airflow, combined with a deep understanding of data integration, cloud infrastructure, and modern ETL processes.

Your responsibilities

  • Design, develop, and maintain data pipelines using Python, Airflow, and Snowflake (a minimal sketch follows this list).
  • Collaborate with cross-functional teams to understand data requirements and build efficient solutions for data integration.
  • Implement and optimize ETL/ELT processes, ensuring high-quality data transformation and delivery.
  • Work with cloud-based platforms (AWS, GCP, or Azure) to ensure seamless data storage and processing.
  • Leverage DBT for data modeling and transformation tasks.
  • Ensure that continuous integration and delivery (CI/CD) pipelines effectively support data workflows.
  • Provide support for troubleshooting, optimizing, and maintaining existing data systems.
  • Design and implement best practices for managing data architectures (including Data Vault, Kimball, SCD).

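To give a flavor of the first responsibility above, here is a minimal sketch of such a pipeline, assuming Airflow 2 with the Snowflake provider and the pandas/pyarrow stack installed; the connection id, file paths, and table name below are hypothetical placeholders, not details taken from this posting.

import pendulum
import pandas as pd
from airflow.decorators import dag, task
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook
from snowflake.connector.pandas_tools import write_pandas

@dag(schedule="@daily", start_date=pendulum.datetime(2025, 1, 1), catchup=False)
def sales_to_snowflake():
    @task
    def extract() -> str:
        # Hypothetical source file; in practice this might land from S3/GCS.
        df = pd.read_csv("/data/incoming/sales.csv")
        staged = "/tmp/sales.parquet"
        df.to_parquet(staged)  # parquet staging goes through pyarrow
        return staged

    @task
    def load(path: str) -> None:
        df = pd.read_parquet(path)
        hook = SnowflakeHook(snowflake_conn_id="snowflake_default")
        # write_pandas bulk-loads a DataFrame into an existing Snowflake table
        write_pandas(hook.get_conn(), df, table_name="RAW_SALES")

    load(extract())

sales_to_snowflake()
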
Our requirements

  • Strong experience in Data Engineering and data pipeline development.
  • Proficiency in Python, particularly for data processing, and familiarity with libraries like pandas, pyarrow, and SQLAlchemy.
  • Solid experience with Snowflake (data warehousing), including working with virtual warehouses and large-scale queries.
  • Hands-on experience with Apache Airflow for workflow orchestration.
  • Strong knowledge of ETL/ELT processes and best practices.
  • Familiarity with DBT and its use in data transformation.
  • Experience with cloud platforms, such as AWS, GCP, or Azure.
  • Understanding of modern data architecture concepts, including Data Vault, Kimball, and Slowly Changing Dimensions (SCD); a toy SCD example follows this list.
  • Experience working in CI/CD environments for data pipelines.
  • Experience with machine learning (ML) or related technologies.
  • Familiarity with containerization technologies such as Docker.
  • Exposure to additional data tools like Kafka, Spark, or Hadoop.

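To make the SCD requirement above concrete, here is a toy Slowly Changing Dimensions Type 2 sketch in pandas; in this stack the same pattern would more likely be expressed as a dbt snapshot or a Snowflake MERGE, and all table and column names below are hypothetical.

import pandas as pd

# Existing dimension: one current row per customer, with a validity interval.
dim = pd.DataFrame({
    "customer_id": [1],
    "city": ["Warszawa"],
    "valid_from": [pd.Timestamp("2025-01-01")],
    "valid_to": [pd.NaT],        # NaT marks the open-ended current row
    "is_current": [True],
})

# Incoming snapshot with a changed attribute.
incoming = pd.DataFrame({"customer_id": [1], "city": ["Kraków"]})
now = pd.Timestamp("2025-04-10")

# Find current rows whose tracked attribute changed.
merged = incoming.merge(dim[dim.is_current], on="customer_id", suffixes=("", "_old"))
changed = merged[merged.city != merged.city_old]

# Close out the old versions of the changed rows...
dim.loc[dim.customer_id.isin(changed.customer_id) & dim.is_current,
        ["valid_to", "is_current"]] = [now, False]

# ...and append the new current versions.
new_rows = changed[["customer_id", "city"]].assign(
    valid_from=now, valid_to=pd.NaT, is_current=True)
dim = pd.concat([dim, new_rows], ignore_index=True)
print(dim)  # two rows for customer 1: one expired, one current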