Data Engineer
Warsaw, Masovian Voivodeship, Poland
KUBO
15.12.2024
About the position

Technologies expected:


  • Python
  • PySpark
  • AWS
  • Azure Cosmos DB
  • ETL

About the project:


  • We are looking for experienced Regular / Senior Data Engineers to join a forward-thinking IT services company known for its expertise in managing sophisticated data platforms. The company is a managed service provider that empowers clients across industries to achieve business outcomes through purpose-built, workload-optimized technology solutions. You will join one of several teams collaborating to deliver a large-scale, cloud-oriented data platform.

Responsibilities:


  • Design and manage data pipelines (ETL/ELT): design, implement, and optimize data pipelines using Python and PySpark to process large volumes of data in both real-time and batch modes, ensuring data quality and transformation best practices
  • Build and maintain a cloud-based data platform (AWS or Azure): develop and maintain a scalable data platform in the cloud using services such as Amazon S3, AWS Glue, Azure Data Lake, or Azure Synapse, while ensuring system security, monitoring, and performance aligned with business requirements
  • Automate and optimize data processing workflows: create automated solutions for data processing and integrate various data sources, leveraging advanced ETL tools and cloud infrastructure to minimize costs and enhance operational efficiency
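To illustrate the extract/transform/load pattern the responsibilities above describe, here is a minimal sketch in plain Python. The data and field names are hypothetical; a pipeline at the scale this role involves would use PySpark DataFrames on services such as AWS Glue or Azure Synapse rather than in-memory lists.

```python
# Minimal batch ETL sketch (hypothetical records; stand-in for a
# PySpark job writing to S3 / Azure Data Lake as described above).

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(record):
    """Transform: normalize fields and reject invalid rows (data-quality step)."""
    if record.get("amount") is None:
        return None  # drop records that fail the quality check
    return {
        "city": record["city"].strip().title(),
        "amount": round(float(record["amount"]), 2),
    }

def load(records):
    """Load: collect cleaned records (stand-in for a write to cloud storage)."""
    return [r for r in (transform(r) for r in records) if r is not None]

raw = [
    {"city": " warsaw ", "amount": "120.456"},
    {"city": "krakow", "amount": None},  # rejected by the quality check
]
cleaned = load(extract(raw))
```

The same separation of stages (extract, per-record transform with quality checks, load) maps directly onto PySpark's `read` → `DataFrame` transformations → `write` flow.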

Requirements expected:


  • Demonstrated ability to design, implement, and optimize data pipelines and large-scale data processing solutions.
  • Required technical skills: Python, PySpark, SQL.
  • Good working knowledge and skills in the following areas: building ETL processes, data integration, and orchestration tools (e.g., Apache Airflow, Luigi, AWS Glue, Azure Data Factory).
  • Familiarity with at least ONE cloud environment: AWS, Azure, or GCP, with a focus on data services (e.g., S3, Redshift, Databricks, BigQuery).
  • A solution-oriented style of conversation with a customer-centric approach to data-related problem-solving.
  • Demonstrated ability to partner with Product Owners to ensure the technical implementation aligns with business objectives.
  • Personal accountability to your teammates to deliver on sprint commitments and ensure high-quality, scalable solutions.
  • Collaboration amongst teammates to clearly define and estimate user stories, focusing on data engineering best practices.
  • Demonstrated hands-on experience in writing and automating unit and integration tests for data pipelines.

Offered:


  • Working mode: 100% remote
  • Form of cooperation: B2B
  • Salary: 120 - 180 PLN / h (B2B)
  • Benefits: private medical care, life insurance, Multisport

Benefits:


  • sharing the costs of sports activities
  • private medical care
  • life insurance
  • remote work opportunities
