Enterprise BI Solutions & Operations - Regular DevOps Engineer
Warszawa, mazowieckie, Polska
Square One Resources Sp. z o.o.
14.12.2025
About the position

We are looking for an experienced DevOps Engineer to set up, configure, and operationalize a new Databricks environment focused on business intelligence (BI), analytics, and data engineering workflows.

Working closely with an ML Ops Engineer, you will ensure the Databricks platform supports both traditional BI/data processing use cases and AI workloads. This includes secure access for data analysts, seamless integration with downstream AI/BI tools, and optimized data pipelines.

This role includes on-call duty.

Your responsibilities

  • Deploy and configure Databricks workspaces for multi-team usage.
  • Set up resource management policies for shared clusters, automated job clusters, and interactive analytical clusters.
  • Configure role-based access controls aligned with data governance standards.
  • Establish secure connectivity to on-premises and cloud data sources (SQL Server, data lake, APIs, etc.).
  • Build shared data ingestion pipelines for BI and analytics teams.
  • Automate daily and weekly data refresh schedules (see the first sketch after this list).
  • Integrate Databricks with BI platforms (e.g., Power BI).
  • Configure and optimize JDBC/ODBC connectors to ensure performance and reliability.
  • Implement monitoring and logging for Databricks jobs and pipelines (see the second sketch after this list).
  • Define backup and disaster recovery processes for key data sets.
  • Apply cost tracking, budgeting, and optimization practices for cluster usage.
  • Set up CI/CD pipelines for data engineering code and deployments.
  • Manage deployment workflows for notebooks, SQL queries, and data models.
  • Work with ML Ops Engineers to maintain shared infrastructure (storage, Delta Lake tables) supporting both BI and ML use cases.
  • Partner with Data Engineers to maintain central data sources in Databricks.
  • Collaborate with security teams to implement access controls for sensitive data.
  • Enforce data governance (GDPR and internal compliance) including workspace auditing and logging.
  • Document procedures for configuration, usage, and operations for all teams.
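
For illustration only (this is not part of the job description), a minimal sketch of the kind of automation these responsibilities involve, using the Databricks REST API from Python. The workspace URL, token handling, notebook path, runtime version, node type, and e-mail address below are placeholder assumptions; a real setup would more likely be managed through Terraform or the Databricks SDK.

"""Minimal sketch: create a cluster policy and a scheduled daily-refresh job
via the Databricks REST API. All names, paths and node types are placeholders."""
import json
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]    # placeholder, e.g. the workspace URL
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}


def create_cluster_policy() -> str:
    """Create a policy that caps shared-cluster size and enforces auto-termination."""
    definition = {
        "autotermination_minutes": {"type": "range", "maxValue": 60, "defaultValue": 30},
        "num_workers": {"type": "range", "maxValue": 8},
        "custom_tags.team": {"type": "fixed", "value": "bi-analytics"},  # cost tracking tag
    }
    resp = requests.post(
        f"{HOST}/api/2.0/policies/clusters/create",
        headers=HEADERS,
        json={"name": "bi-shared-clusters", "definition": json.dumps(definition)},
    )
    resp.raise_for_status()
    return resp.json()["policy_id"]


def create_daily_refresh_job(policy_id: str) -> int:
    """Create a job that runs a refresh notebook every day at 05:00 Warsaw time."""
    settings = {
        "name": "daily-bi-refresh",
        "schedule": {
            "quartz_cron_expression": "0 0 5 * * ?",
            "timezone_id": "Europe/Warsaw",
            "pause_status": "UNPAUSED",
        },
        "job_clusters": [{
            "job_cluster_key": "etl",
            "new_cluster": {
                "spark_version": "15.4.x-scala2.12",   # placeholder runtime version
                "node_type_id": "Standard_D4ds_v5",    # placeholder (Azure) node type
                "num_workers": 2,
                "policy_id": policy_id,                # job cluster governed by the policy above
            },
        }],
        "tasks": [{
            "task_key": "refresh_bi_tables",
            "job_cluster_key": "etl",
            "notebook_task": {"notebook_path": "/Repos/bi/daily_refresh"},  # placeholder path
        }],
        "email_notifications": {"on_failure": ["bi-ops@example.com"]},      # placeholder address
    }
    resp = requests.post(f"{HOST}/api/2.1/jobs/create", headers=HEADERS, json=settings)
    resp.raise_for_status()
    return resp.json()["job_id"]


if __name__ == "__main__":
    policy_id = create_cluster_policy()
    print("created job:", create_daily_refresh_job(policy_id))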
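
A similarly minimal sketch for the monitoring side: polling recently completed job runs through the Jobs API and logging failures. Host and token handling are again placeholder assumptions; in practice the output would feed an alerting or central logging system rather than a local logger.

"""Minimal sketch: list recently completed Databricks job runs and log failures."""
import logging
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]    # placeholder workspace URL
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
logging.basicConfig(level=logging.INFO)


def report_failed_runs(limit: int = 25) -> None:
    """Fetch the most recent completed runs and log any that did not succeed."""
    resp = requests.get(
        f"{HOST}/api/2.1/jobs/runs/list",
        headers=HEADERS,
        params={"completed_only": "true", "limit": limit},
    )
    resp.raise_for_status()
    for run in resp.json().get("runs", []):
        state = run.get("state", {})
        if state.get("result_state") != "SUCCESS":
            logging.warning(
                "run %s (%s) ended with %s: %s",
                run.get("run_id"),
                run.get("run_name", "unnamed"),
                state.get("result_state"),
                state.get("state_message", ""),
            )


if __name__ == "__main__":
    report_failed_runs()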

Our requirements

  • 3+ years of experience in DevOps or Data Platform operations with cloud technologies.
  • Hands-on experience with Databricks environment administration.
  • Proficiency in Python (automation, scripting).
  • Familiarity with BI/analytics tool integration via Databricks connectors.
  • Solid knowledge of SQL and data engineering fundamentals.
  • Experience with orchestration tools (Databricks Workflows, Airflow, Azure Data Factory).
  • Understanding of Identity & Access Management in cloud environments.
  • Experience with Terraform
  • Python (for automation)
  • English at B2 level
