Senior Data Engineer (Python, Snowflake, Airflow)
Warsaw, Masovian Voivodeship, Poland
SQUARE ONE RESOURCES sp. z o.o.
10 April 2025
Job details

Technologies expected:


  • Python
  • Apache Airflow
  • Snowflake Data Cloud
  • dbt
  • ETL
  • Pandas
  • AWS
  • Google Cloud Platform
  • Microsoft Azure

Technologies optional:


  • Kafka
  • Spark

About the project:


  • We are looking for an experienced Data Engineer to join a dynamic team in an exciting long-term project with one of our key clients in the Pharmaceutical industry. This project focuses on building and optimizing modern data pipelines and ensuring seamless integration across cloud-based data platforms.
  • The ideal candidate will have strong experience in data engineering with Python, Snowflake, and Airflow, combined with a deep understanding of data integration, cloud infrastructure, and modern ETL processes.

Responsibilities:


  • Design, develop, and maintain data pipelines using Python, Airflow, and Snowflake.
  • Collaborate with cross-functional teams to understand data requirements and build efficient solutions for data integration.
  • Implement and optimize ETL/ELT processes, ensuring high-quality data transformation and delivery.
  • Work with cloud-based platforms (AWS, GCP, or Azure) to ensure seamless data storage and processing.
  • Leverage dbt for data modeling and transformation tasks.
  • Ensure continuous integration and delivery (CI/CD) pipelines are effectively supporting data workflows.
  • Provide support for troubleshooting, optimizing, and maintaining existing data systems.
  • Design and implement best practices for managing data architectures (including Data Vault, Kimball, SCD).
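Purely as an illustration of the extract-transform-load flow these responsibilities describe (not part of the posting itself), a minimal sketch in plain Python; all names are hypothetical, and a real pipeline would use Airflow operators and a Snowflake connector rather than in-memory lists:

```python
# Hypothetical ETL sketch: extract -> transform -> load, stubbed in memory.

def extract(source_rows):
    """Pull raw records from a source (stubbed as an in-memory list)."""
    return list(source_rows)

def transform(records):
    """Clean and reshape records: drop incomplete rows, round amounts."""
    return [
        {"id": r["id"], "amount": round(r["amount"], 2)}
        for r in records
        if r.get("id") is not None and "amount" in r
    ]

def load(records, target):
    """Append transformed records to a target store (stubbed as a list)."""
    target.extend(records)
    return len(records)

warehouse = []
raw = [{"id": 1, "amount": 10.456}, {"id": None, "amount": 3.0}]
loaded = load(transform(extract(raw)), warehouse)
# Only the complete record survives the transform step.
```

In an Airflow deployment, each of the three functions above would typically become a task in a DAG, with the orchestrator handling scheduling and retries.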

Requirements expected:


  • Strong experience in Data Engineering and data pipeline development
  • Proficiency in Python, particularly for data processing, and familiarity with libraries like pandas, pyarrow, and SQLAlchemy.
  • Solid experience with Snowflake (data warehousing), including working with warehouses and queries at scale.
  • Hands-on experience with Apache Airflow for workflow orchestration.
  • Strong knowledge of ETL/ELT processes and best practices.
  • Familiarity with dbt and its use in data transformation.
  • Experience with cloud platforms, such as AWS, GCP, or Azure.
  • Understanding of modern data architecture concepts, including Data Vault, Kimball, and Slowly Changing Dimensions (SCD).
  • Experience working in CI/CD environments for data pipelines.
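As an aside on the Slowly Changing Dimension Type 2 concept the requirements mention (not part of the posting), a minimal in-memory sketch of the pattern; all names are hypothetical, and in practice this is done in SQL or via dbt snapshots:

```python
# Hypothetical SCD Type 2 upsert: expire the current row, append a new version.
from datetime import date

def scd2_upsert(dimension, key, new_attrs, today):
    """Close the current row for `key` (if its attributes changed)
    and append a new current row carrying the new attributes."""
    for row in dimension:
        if row["key"] == key and row["current"]:
            if row["attrs"] == new_attrs:
                return dimension          # no change: keep history as-is
            row["current"] = False        # expire the old version
            row["valid_to"] = today
    dimension.append({
        "key": key, "attrs": new_attrs,
        "valid_from": today, "valid_to": None, "current": True,
    })
    return dimension

dim = [{"key": "cust-1", "attrs": {"city": "Warsaw"},
        "valid_from": date(2024, 1, 1), "valid_to": None, "current": True}]
scd2_upsert(dim, "cust-1", {"city": "Krakow"}, date(2025, 4, 10))
# dim now holds two versions: the expired Warsaw row and a current Krakow row.
```

The point of Type 2 (as opposed to Type 1, which overwrites in place) is that every historical version of a dimension row stays queryable via its validity interval.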
