GCP Data Architect
Wrocław, Lower Silesian Voivodeship, Poland
Capgemini Polska
10. 6. 2024
About the position

technologies-expected :


  • BigQuery
  • Dataflow
  • GKE
  • Cloud Bigtable
  • Pub/Sub
  • Dataproc
  • Google Data Studio
  • Apache Beam
  • Kafka
  • Spark
  • Scala
  • Python
  • Java

about-project :


  • Insights & Data practice delivers cutting-edge data centric solutions.
  • Most of our projects are Cloud & Big Data engineering. We develop solutions to process large, also unstructured, datasets, with dedicated Cloud Data services on AWS, Azure or GCP.
  • We are responsible for full SDLC of the solution: apart from using data processing tools (e.g., ETL), we code a lot in Python, Scala or Java and use DevOps tools and best practices. The data is either exposed to downstream systems via API, outbound interfaces or visualized on reports and dashboards.
  • Within our AI CoE we deliver Data Science and Machine Learning projects with focus on NLP, Anomaly Detection and Computer Vision.
  • Additionally, we are exploring the area of Quantum Computing, searching for practical growth opportunities for both us and our clients.
  • Currently, over 250 of our Data Architects, Engineers and Scientists work on exciting projects for over 30 clients from different sectors (Financial Services, Logistics, Automotive, Telco and others)
  • Come on Board! :)

responsibilities :


  • you will design the architecture of systems that process large and unstructured data sets (Data Lake Architecture, Streaming Architecture);
  • you will implement, optimize and test modern DWH/Big Data solutions based on Google Cloud Platform in a Continuous Integration/Continuous Delivery environment (a minimal sketch of such a pipeline follows this list);
  • you will be responsible for improving data processing efficiency and for migrations from on-prem to public cloud platforms;
  • you will build and supervise an internal GCP development and training program, as well as lead workshops and mentoring sessions.
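To illustrate the kind of pipeline referred to above, here is a minimal sketch of a streaming job on the stack named in this posting (Apache Beam on Dataflow, Pub/Sub, BigQuery). The project, topic, table and schema names are hypothetical placeholders, not part of the role description.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming options; to run on Dataflow rather than locally, pass the
    # usual --runner=DataflowRunner, --project and --region flags.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Hypothetical Pub/Sub topic, used only for illustration.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/example-events")
            # Messages arrive as bytes; assume they carry JSON payloads.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Hypothetical BigQuery table and schema.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:example_dataset.events",
                schema="event_id:STRING,payload:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)
        )


if __name__ == "__main__":
    run()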

requirements-expected :


  • you have at least 5 years of experience in Big Data or Cloud projects covering the processing and visualization of large and unstructured datasets (across different phases of the Software Development Life Cycle);
  • you have experience as an Architect and/or Technical Leader in GCP projects;
  • you have practical knowledge of GCP in the Storage, Compute (including Serverless), Networking and DevOps areas, backed by commercial project experience;
  • you are familiar with several of the following services: BigQuery, Dataflow, GKE, Cloud Bigtable, Pub/Sub, Dataproc, Google Data Studio, Apache Beam, Kafka, Spark;
  • you have a good knowledge of one of the following programming languages: Scala, Python or Java;
  • you are organized, independent and willing to share your knowledge;
  • you have a very good command of English.

benefits :


  • private medical care
  • sharing the costs of foreign language classes
  • sharing the costs of professional training & courses
  • life insurance
  • flexible working time
  • integration events
  • corporate sports team
  • no dress code
  • parking space for employees
  • extra social benefits
  • sharing the costs of tickets to the movies, theater
  • redeployment package
  • Christmas gifts
  • employee referral program
  • charity initiatives
  • free chat/call with a therapist
  • Multisport card
