Data Engineer @ VirtusLab
Kielce, Swietokrzyskie, Poland
VirtusLab
12 June 2025
About the position

We are #VLteam – tech enthusiasts constantly striving for growth. The team is our foundation, which is why we care most about a friendly atmosphere, plenty of self-development opportunities, and good working conditions. Trust and autonomy are two essential qualities that drive our performance. We simply believe in the idea of “measuring outcomes, not hours”. Join us & see for yourself!

Project scope

We are building a modern Data and Integration Platform for a fast-scaling insurance client. Our work consolidates fragmented legacy systems, organizes data from 200+ sources, and creates a standardized, future-proof, cloud-native environment. We aim to unlock the full value of the company's data, enable faster and better-informed decision-making, and provide the backbone for business growth, integration, and AI readiness. This includes setting up a transparent, role-based, governed data environment and engineering a robust, scalable integration hub to connect internal systems and third-party services.

Tech stack

SQL, Data modelling, Data Quality, Python, Azure, Apache Iceberg, Trino, Airflow, dbt, DevOps, IaC
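To make the stack concrete, here is a minimal sketch of how these pieces commonly fit together, assuming hypothetical project paths and a hypothetical DAG name (none of this is taken from the posting): an Airflow DAG that orchestrates dbt builds, with the dbt-trino adapter materializing models as Apache Iceberg tables behind Trino.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical paths; in a real setup these would point at the dbt
    # project whose models dbt-trino materializes as Iceberg tables.
    DBT_ARGS = "--project-dir /opt/dbt/platform --profiles-dir /opt/dbt"

    with DAG(
        dag_id="daily_dbt_build",            # hypothetical DAG name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Build the models, then enforce data quality: a failed dbt test
        # fails the task and keeps bad data from reaching consumers.
        dbt_run = BashOperator(task_id="dbt_run", bash_command=f"dbt run {DBT_ARGS}")
        dbt_test = BashOperator(task_id="dbt_test", bash_command=f"dbt test {DBT_ARGS}")
        dbt_run >> dbt_test

Keeping dbt test as a separate downstream task makes the quality gate visible in the DAG rather than buried inside the build step.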

Challenges

We focus on delivering a trusted, high-quality, and well-governed data platform to replace a highly fragmented and immature technology landscape. The key challenges include consolidating over 200 legacy systems into a streamlined, standardized technology stack, and designing and implementing a modern cloud-native data platform leveraging Azure, Starburst with Iceberg, Airflow, and the Power Platform suite. We are also building an integration layer and API hub to support third-party data ingestion, such as sanctions checks, foreign exchange, and entity validation. Another primary task is phasing out outdated tooling like KNIME and replacing it with maintainable, scalable workflows. Embedding strong DevOps practices, including Infrastructure as Code (IaC), automated testing, and CI/CD pipelines, is critical to platform delivery. Additionally, we aim to enable tactical business outcomes, such as early data marts and reporting capabilities, while building towards a complete platform. Enhancing the developer experience, ensuring operational excellence, and embedding strong data governance with role-based access control are fundamental. All initiatives are entirely cloud-native and designed for automation, traceability, scalability, and business agility.
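As a hedged illustration of the integration-layer idea (a sketch under assumptions, not the client's actual design: the host, catalog, schema, and table names below are hypothetical), a third-party feed such as FX rates could be landed in a governed Iceberg table through Trino's official Python DB-API client:

    from datetime import date

    import trino  # official Trino Python client: pip install trino

    conn = trino.dbapi.connect(
        host="trino.example.internal",  # hypothetical coordinator host
        port=443,
        http_scheme="https",
        user="integration-hub",         # hypothetical service identity
        catalog="iceberg",
        schema="thirdparty",
    )
    cur = conn.cursor()
    # Simplified to one parameterized INSERT; in practice this would run as
    # an idempotent, orchestrated pipeline task with retries and auditing.
    cur.execute(
        "INSERT INTO fx_rates (ccy_pair, rate, as_of) VALUES (?, ?, ?)",
        ("EURPLN", 4.27, date(2025, 6, 12)),
    )

Because consumers read the same table through Trino under their own roles and grants, role-based access control is enforced where the data lives rather than in each ingesting application.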

Team

We aim to build a small, agile, cross-functional team capable of delivering the complete data and integration project, from initial architecture to production operations. The team will be flexible and multidisciplinary to foster strong ownership, collaboration, and rapid delivery. It will work closely with the client's CTO and business stakeholders to ensure technical excellence, effective knowledge transfer, and alignment with strategic goals.

Requirements

    • Strong SQL skills
    • Strong engineering skills
    • Experience with modern data pipelines powered by robust orchestration tools
    • Strong focus on delivering high data quality
    • Polyglot engineer with experience in traditional Data Engineering and knowledge of current trends like the Modern Data Stack
    • Demonstrated ability in the design, build, and implementation of software solutions with an unwavering focus on quality
    • Ability to work in an agile environment, partnering with team members and peers to find solutions to challenging problems with transparency
    • Experience working with CI/CD and a DevSecOps approach
    • Strong modelling skills

    Don’t worry if you don’t meet all the requirements. What matters most is your passion and willingness to develop. Moreover, B2B does not have to be the only form of cooperation. Apply and find out!

Requirements: Python, Data modeling, SQL, Azure, Apache Airflow, Trino/Starburst with Iceberg, Power BI or other BI tool, Power Automate, dbt, Terraform

Additionally: Building tech community, Flexible hybrid work model, Home office reimbursement, Language lessons, MyBenefit points, Private healthcare, Stretching, Training Package, Virtusity / in-house training, Free coffee, Bike parking, No dress code, Shower, Free snacks, Free beverages, Modern office, Kitchen.
