Senior Data Architect IT
Warsaw, Masovian Voivodeship, Poland
OPTIVEUM SPÓŁKA Z OGRANICZONĄ ODPOWIEDZIALNOŚCIĄ
6.03.2026
About the position

Technologies expected:


  • SQL
  • RDBMS
  • Airflow
  • Prefect
  • Redshift
  • Snowflake
  • BigQuery

Technologies optional:


  • AWS
  • Google Cloud Platform
  • Microsoft Azure
  • Python
  • Pandas
  • PySpark
  • Docker
  • Kubernetes
  • Lambda

About the project:


  • Our client is a US-based leader in AI-powered enterprise operations, delivering digital solutions and consulting services that transform high-growth businesses and private equity-backed platforms. With over a decade of deep domain expertise in private capital markets, the company operates an integrated ecosystem spanning PaaS, SaaS, and a Solutions & Consulting Suite.
  • We are seeking a Senior Data Architect to join the company's growing Warsaw engineering centre. In this role you will own enterprise data architecture from strategy through execution — designing cloud-native data platforms, establishing governance standards, and enabling AI/ML-ready data infrastructure that powers business intelligence across the portfolio.
  • This is a hybrid position — you will be expected to work from the Warsaw office at least 3 days per week.

Responsibilities:


  • Design, develop, and maintain enterprise data architecture strategies, standards, and blueprints supporting operational, analytical, and AI/ML workloads
  • Architect cloud-native data solutions on AWS (Redshift, RDS, Glue, Lake Formation) or equivalent platforms, ensuring scalability, security, and cost efficiency
  • Define and enforce data modeling standards: dimensional modeling, denormalized schemas, OLTP/OLAP design patterns, and AI-friendly ontologies
  • Architect and oversee data transformation layers using DBT, delivering modular, tested, and well-documented models across the analytics stack
  • Lead design of data integration and orchestration patterns with Prefect and Airflow — batch ETL, real-time streaming, event-driven, and API-based data exchange
  • Define and implement data validation, quality control, and automated testing frameworks across pipelines and warehouses
  • Establish data quality SLAs, monitoring, and alerting standards; design automated reconciliation processes to catch issues before downstream impact
  • Build and maintain data governance frameworks: data quality, lineage, cataloging, classification, and access control
  • Collaborate with Data Engineers, Software Engineers, Product, and Analytics teams to translate business requirements into scalable designs
  • Evaluate and recommend data technologies and tools; own technical decision-making for data infrastructure within assigned domains
  • Design data partitioning, indexing, and optimization strategies for high-performance queries and big data workloads
  • Ensure architectures support AI/ML consumption — feature stores, embedding pipelines, and model training datasets
  • Perform architecture and code reviews to uphold data standards, optimal execution patterns, and long-term maintainability
  • Mentor data engineers on best practices in modeling, architecture patterns, and cloud data design
  • Assist with CI/CD processes and automated release management for data infrastructure deployments
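To make the validation and reconciliation responsibilities above concrete, a minimal, framework-agnostic sketch follows. It is purely illustrative (the function name, fields, and tolerance are hypothetical, not taken from the client's stack): an automated check that reconciles row counts, a summed measure, and key coverage between a source extract and its warehouse copy before downstream jobs run.

```python
def reconcile(source_rows, target_rows, key="id", measure="amount", tolerance=0.0):
    """Compare two loads of the same dataset and report discrepancies.

    Checks three things: row counts match, the summed measure agrees
    within `tolerance`, and every source key is present in the target.
    Returns a list of human-readable issues; an empty list means the
    loads reconcile and downstream jobs may proceed.
    """
    issues = []

    # 1. Row-count reconciliation.
    if len(source_rows) != len(target_rows):
        issues.append(
            f"row count mismatch: {len(source_rows)} vs {len(target_rows)}"
        )

    # 2. Aggregate reconciliation on a numeric measure.
    src_sum = sum(r[measure] for r in source_rows)
    tgt_sum = sum(r[measure] for r in target_rows)
    if abs(src_sum - tgt_sum) > tolerance:
        issues.append(f"{measure} sum mismatch: {src_sum} vs {tgt_sum}")

    # 3. Key coverage: flag source records missing from the target.
    missing = {r[key] for r in source_rows} - {r[key] for r in target_rows}
    if missing:
        issues.append(f"keys missing in target: {sorted(missing)}")

    return issues
```

In practice a check like this would run as a task inside an Airflow or Prefect pipeline, with a non-empty result raising an alert and blocking downstream models, which is the "catch issues before downstream impact" pattern the role describes.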

Requirements expected:


  • 7+ years of experience in data architecture, data engineering, or related technical roles
  • 5+ years designing and implementing cloud-based data architectures (AWS, GCP, or Azure)
  • 5+ years writing complex SQL queries across RDBMSes
  • 5+ years developing and deploying ETL/ELT pipelines using Airflow, Prefect, or similar tools
  • Strong experience with DBT for data transformation, testing, and documentation
  • Experience with data warehouse design: OLTP, OLAP, star schemas, snowflake schemas, dimensions, and facts
  • Experience with data modeling tools and methodologies (conceptual, logical, physical models)
  • Hands-on experience with cloud-based data warehouses such as Redshift, Snowflake, or BigQuery
  • Experience implementing data validation frameworks, quality control processes, and automated testing for data pipelines
  • Familiarity with how data architectures serve AI/ML workloads, including feature stores and vector-based retrieval patterns
  • Strong understanding of data governance, data quality frameworks, and metadata management
  • Bachelor's degree in Computer Science or equivalent — preferred

Offered:


  • B2B contract with monthly compensation up to $8,000
  • Strategic, high-ownership role in a fast-growing global fintech
  • Direct influence over data infrastructure decisions and team direction
  • Mentorship opportunities and clear career progression
  • Collaborative, open, and ambitious team culture
  • Hybrid model — minimum 3 days/week in the Warsaw office

Benefits:


  • remote work opportunities
  • integration events