Enterprise BI & Analytics Platform DevOps Engineer (Databricks | Azure | On-Call Duty) @ Square One Resources
Warsaw, Masovian Voivodeship, Poland
Square One Resources
20 January 2026
About the Position

Project Description

For a leading German telecommunications and IT services provider, we are building a new European Technology Center in Poland. As part of this initiative, the client is establishing a modern Enterprise BI, Analytics, and Data Engineering platform based on Azure Databricks.

The project focuses on designing, configuring, and operating a scalable Databricks environment that supports business intelligence, advanced analytics, and AI/ML workloads within a unified lakehouse architecture. The platform will serve multiple teams, including BI Analysts, Data Engineers, and ML Engineers, with strong emphasis on security, governance, automation, and operational excellence.

Role Purpose

We are looking for a Regular DevOps Engineer to take ownership of the Databricks platform setup and operations, ensuring it is production-ready, secure, cost-efficient, and fully integrated with BI and downstream analytics tools.

The role requires close collaboration with ML Ops Engineers, Data Engineers, and Security teams to deliver a shared, enterprise-grade data platform supporting both traditional BI and AI use cases.
The position includes On-Call Duty.


Required Skills & Experience

  • 3+ years of experience in DevOps or data platform operations
  • Strong hands-on experience with cloud-based DevOps environments
  • Practical knowledge of Databricks platform administration
  • Proficiency in Python for automation and scripting
  • Solid understanding of SQL and data engineering fundamentals
  • Experience with orchestration and scheduling tools (Databricks Workflows, Airflow, Azure Data Factory)
  • Knowledge of cloud Identity & Access Management (IAM)
  • English proficiency at B2 level or higher

Must-Have Requirements

(Candidates without these skills will not proceed to the client interview)

  • Terraform (Infrastructure as Code)
  • Python (automation and platform tooling)
  • English – minimum B2 level

Nice-to-Have Requirements

  • Azure Databricks environment administration
  • Experience with Azure ecosystem (Azure Data Lake, Synapse Analytics)
  • Infrastructure-as-Code tooling beyond Terraform (ARM templates)
  • Performance tuning for BI queries on large-scale datasets
  • Knowledge of Delta Lake architecture


Responsibilities

Databricks Environment Setup & Configuration

  • Deploy and configure Azure Databricks workspaces for multi-team usage
  • Design and manage shared clusters, job clusters, and interactive analysis clusters
  • Implement role-based access control aligned with data governance policies

Data Integration & Enablement

  • Configure secure connectivity to on-premise and cloud data sources (SQL Server, Data Lakes, APIs)
  • Build and maintain shared ingestion pipelines for BI and analytics teams
  • Automate daily and weekly data refresh processes

BI Tool Connectivity

  • Integrate Databricks with BI platforms (e.g. Power BI)
  • Optimize JDBC/ODBC connectors for performance and scalability

Operational Excellence

  • Implement monitoring, alerting, and logging for Databricks jobs and pipelines
  • Define and maintain backup and disaster recovery procedures
  • Track and optimize infrastructure and cluster costs

Automation & CI/CD

  • Design and maintain CI/CD pipelines for data engineering and analytics code
  • Automate deployment of notebooks, SQL queries, and data models

Collaboration

  • Work closely with ML Ops Engineers to align shared infrastructure (Delta Lake, storage, compute)
  • Support Data Engineers in maintaining centralized data assets
  • Cooperate with Security teams on access control and sensitive data protection

Governance & Compliance

  • Enforce data governance and compliance requirements (GDPR, internal policies)
  • Maintain auditing, logging, and documentation of platform operations

Keywords: DevOps, Data Engineering, Databricks, Airflow, Azure, Cloud, IAM, Terraform, Python, Azure Data, ARM, Delta Lake
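To give a flavor of the automation work described above (an illustrative sketch, not part of the posting): scheduled refreshes can be triggered programmatically through the Databricks Jobs REST API (`POST /api/2.1/jobs/run-now`). The workspace URL, token, and job ID below are placeholder values, and the helper function name is our own.

```python
# Illustrative sketch: triggering a Databricks job run via the Jobs REST API.
# Workspace URL, token, and job_id are placeholders, not values from the posting.
import json
import urllib.request
from typing import Optional


def build_run_now_request(workspace_url: str, token: str, job_id: int,
                          notebook_params: Optional[dict] = None) -> urllib.request.Request:
    """Build an authenticated run-now request for a Databricks job."""
    payload = {"job_id": job_id}
    if notebook_params:
        payload["notebook_params"] = notebook_params
    return urllib.request.Request(
        url=f"{workspace_url}/api/2.1/jobs/run-now",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Example: a daily refresh job parameterized by run date (request is built
# but not sent, so no live workspace is needed).
req = build_run_now_request(
    "https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace
    "dapiXXXXXXXX",                                        # placeholder token
    job_id=42,
    notebook_params={"run_date": "2026-01-20"},
)
print(req.get_full_url())
```

In practice the token would come from a secret store rather than source code, and the request would be sent with `urllib.request.urlopen(req)` or an HTTP client of choice; a scheduler such as Databricks Workflows, Airflow, or Azure Data Factory would own the daily/weekly cadence.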
