Data Engineer @ Amelco UK LTD
Warsaw, Masovian Voivodeship, Poland
10. 12. 2025

About the role

As a Data Engineer, you will design, build, and maintain scalable data pipelines and architectures that power our analytical and operational systems.

You’ll work closely with Data Analysts, BI Developers, and Software Engineers to ensure data quality, performance, and scalability across all environments — from real-time streaming to batch processing.

You’ll be part of a modern data ecosystem leveraging AWS, Snowflake, Spark, Airflow, and Kafka, supporting mission-critical reporting and predictive analytics across the business.


Essential

  • At least 3 years of experience as a Data Engineer or in a similar data-focused role.
  • Strong proficiency in SQL and experience with relational databases (PostgreSQL, Snowflake, Redshift, BigQuery, etc.).
  • Proven experience building ETL pipelines using Python (Pandas, PySpark, Airflow, or dbt).
  • Solid understanding of data warehousing principles and dimensional modelling.
  • Familiarity with cloud platforms (AWS, Azure, or GCP).
  • Strong problem-solving skills and ability to work in fast-paced, data-intensive environments.
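To illustrate the SQL-plus-Python ETL work these requirements describe, here is a minimal, self-contained sketch; an in-memory SQLite database stands in for a warehouse such as Snowflake, and the table, columns, and figures are invented for the example (GGR, mentioned under "Desirable" below, is gross gaming revenue: stakes minus payouts):

```python
import sqlite3

# In-memory SQLite stands in for a relational warehouse (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_bets (bet_id INTEGER, stake REAL, payout REAL)")

# Extract: pretend these rows arrived from an upstream betting feed.
rows = [(1, 10.0, 0.0), (2, 25.0, 30.0), (3, 5.0, 0.0)]
conn.executemany("INSERT INTO raw_bets VALUES (?, ?, ?)", rows)

# Transform: compute GGR (total stakes minus total payouts) in SQL.
ggr = conn.execute(
    "SELECT SUM(stake) - SUM(payout) FROM raw_bets"
).fetchone()[0]
print(ggr)  # 10.0
```

In a production pipeline the same transform would typically run as a scheduled Airflow task or dbt model rather than an ad-hoc script.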

Desirable

  • Experience with real-time data streaming (Kafka, Kinesis, or Pub/Sub).
  • Knowledge of sports betting or iGaming data models (bets, markets, transactions, GGR, etc.).
  • Exposure to data governance, data observability, or catalog tools.
  • Experience with CI/CD for data (GitHub Actions, Terraform).
  • Familiarity with Power BI or other BI tools.

Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines to ingest and transform large-scale data from multiple sources (betting, casino, player, finance, CRM, and external feeds).
  • Implement data models and schemas (Bronze/Silver/Gold layers) in Snowflake / Redshift / Postgres, ensuring data consistency and governance.
  • Work on real-time streaming and event-driven architecture using Kafka, Solace, or similar tools.
  • Optimize SQL and Python-based data transformations for performance and scalability.
  • Develop data validation, monitoring, and alerting frameworks.
  • Collaborate with BI developers to deliver Power BI and analytical datasets.
  • Partner with stakeholders across Trading, Finance, Compliance, and Risk to deliver high-impact data solutions.
  • Contribute to data quality standards, documentation, and CI/CD automation of pipelines (GitHub).

Requirements: SQL, ETL, Python, Kafka, CI/CD, Power BI

Additionally: Small teams, International projects, Free coffee, Canteen, Bike parking, Free beverages, Free snacks, Modern office, Startup atmosphere, No dress code.
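The Bronze/Silver/Gold layering mentioned in the responsibilities is often called a medallion architecture: raw data lands untouched, is cleaned and typed, then aggregated for business use. A toy, warehouse-free sketch of the idea in plain Python (all record fields and values here are invented for illustration):

```python
from collections import defaultdict

# Bronze layer: raw events exactly as they arrived, including a dirty row.
bronze = [
    {"bet_id": "1", "market": "match_odds", "stake": "10.0"},
    {"bet_id": "2", "market": "match_odds", "stake": "25.0"},
    {"bet_id": "3", "market": "over_under", "stake": "bad"},  # malformed stake
]

# Silver layer: cleaned and typed; rows that fail validation are dropped.
silver = []
for row in bronze:
    try:
        silver.append({
            "bet_id": int(row["bet_id"]),
            "market": row["market"],
            "stake": float(row["stake"]),
        })
    except ValueError:
        pass  # a real pipeline would route this to a quarantine table

# Gold layer: business-level aggregate, e.g. total stake per market.
gold = defaultdict(float)
for row in silver:
    gold[row["market"]] += row["stake"]

print(dict(gold))  # {'match_odds': 35.0}
```

In practice each layer would be a table in Snowflake/Redshift/Postgres and the transforms would run under Airflow or dbt; the structure of the three steps is the same.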
