About the position
We are a leading trading platform that is ambitiously expanding to the four corners of the globe. Our top-rated products have won prestigious industry awards for their cutting-edge technology and seamless client experience. We deliver only the best, so we are always in search of the best people to join our ever-growing talented team.
We’re looking for a skilled Data Engineer to help design, build, and optimize the data systems that power our regulatory reporting and analytics. You’ll take ownership of our PostgreSQL environment and Airflow pipelines, ensuring performance, reliability, and data integrity across critical reporting workflows.
- 2+ years’ experience as a Data Engineer (or similar) working heavily with PostgreSQL.
- Expert SQL and strong PL/pgSQL: able to design, implement, and maintain complex stored procedures/functions.
- Deep PostgreSQL fundamentals: schema design and normalization, indexing strategies, constraints (PK/FK/UNIQUE/CHECK), triggers, table partitioning.
- Experience with EXPLAIN/ANALYZE, query optimization, VACUUM/ANALYZE, bloat detection/mitigation, and use of pg_stat_* views.
- Experience building and operating DAGs in Apache Airflow, including SLAs, retries, and dependencies (a minimal sketch follows this list).
- Experience with version control and CI/CD (GitHub/GitLab) and with code reviews.
- Data quality and governance mindset: validation checks, lineage/documentation, and auditability.
- Nice to have: domain familiarity with reporting in fintech/brokerage (e.g., trade/transaction reporting for CFD brokers; understanding of controls, reconciliations, and audit needs).
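
To make the Airflow expectation above concrete, here is a minimal sketch of a DAG with retries, a task-level SLA, and a load-then-validate dependency. It assumes Airflow 2.4+ with the Postgres provider installed; the DAG id, the connection id (reporting_db), and the procedures it calls are hypothetical placeholders, not a description of the actual stack.

```python
# Minimal sketch: an Airflow DAG with retries, an SLA, and task dependencies.
# The dag_id, connection id ("reporting_db"), and SQL objects are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

default_args = {
    "retries": 2,                          # re-run a failed task twice
    "retry_delay": timedelta(minutes=5),   # wait between retries
    "sla": timedelta(hours=1),             # flag tasks that run past 1 hour
}

with DAG(
    dag_id="daily_trade_report",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",                  # every day at 06:00
    catchup=False,
    default_args=default_args,
) as dag:
    load_trades = PostgresOperator(
        task_id="load_trades",
        postgres_conn_id="reporting_db",
        sql="CALL reporting.load_daily_trades();",   # hypothetical PL/pgSQL procedure
    )

    validate = PostgresOperator(
        task_id="validate_row_counts",
        postgres_conn_id="reporting_db",
        sql="SELECT reporting.check_row_counts();",  # hypothetical data-quality check
    )

    load_trades >> validate                # validation runs only after a successful load
```

The retries and SLA live in default_args so every task inherits them, and the explicit dependency keeps validation from running against a partial load.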
Responsibilities:
- Design, implement, and optimize database objects and complex PL/pgSQL procedures to support regulatory reporting use cases.
- Build and maintain robust DB dependencies: primary/foreign keys, constraints, triggers, partitions, and scheduled maintenance routines.
- Own Postgres performance and hygiene: capacity planning, index maintenance, partition strategy, VACUUM/ANALYZE, and systematic tuning using EXPLAIN and pg_stat_statements (see the sketch at the end of this listing).
- Develop, schedule, and operate data pipelines as Airflow DAGs: ingestion, transformations, validations, and deliveries to reporting targets.
- Implement proactive alerting and on-failure workflows.

Requirements: PostgreSQL, SQL, Data analytics, Data analysis, GitLab, Data engineering, Airflow, Automation, PL/pgSQL, CI/CD, Version control system, Communication skills

Tools: Confluence, GitLab.

Additionally: Private healthcare, International projects, Multisport, Training budget, Small teams, Refer a friend scheme, Annual bonus, Annual assessment, Free coffee, Bike parking, Playroom, Shower, Free snacks, Free parking, Modern office, Startup atmosphere, No dress code.
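
As a rough illustration of the tuning loop named in the performance bullet above (EXPLAIN plus pg_stat_statements), here is a minimal Python sketch. It assumes psycopg2 is installed and the pg_stat_statements extension is enabled; the DSN, table, and query are hypothetical placeholders.

```python
# Minimal sketch: capture an execution plan with EXPLAIN (ANALYZE, BUFFERS) and
# list the most expensive statements from pg_stat_statements.
# The DSN, table, and query below are hypothetical placeholders.
import psycopg2

DSN = "dbname=reporting user=analyst host=localhost"            # hypothetical
QUERY = "SELECT * FROM reporting.trades WHERE trade_date = %s"  # hypothetical

with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
    # Inspect the actual execution plan of a parameterized query.
    cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + QUERY, ("2024-01-02",))
    for (plan_line,) in cur.fetchall():
        print(plan_line)

    # Top 5 statements by total execution time (column names as of PostgreSQL 13+).
    cur.execute(
        """
        SELECT query, calls, total_exec_time, mean_exec_time
        FROM pg_stat_statements
        ORDER BY total_exec_time DESC
        LIMIT 5
        """
    )
    for query, calls, total_ms, mean_ms in cur.fetchall():
        print(f"{total_ms:10.1f} ms total | {calls:6d} calls | {query[:80]}")
```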