GCP Engineer
Warsaw, Masovian Voivodeship, Poland
Senovo IT
16. 9. 2024
Job details

Expected technologies:


  • Google Cloud Platform
  • Python

Optional technologies:


  • Java

About the project:


  • Senovo IT is urgently looking for a Google Cloud Platform (GCP) Data Engineer for a freelance contract role.
  • Start: ASAP
  • Location: fully remote; candidates must be located within the EU
  • Language: English
  • As a GCP Data Engineer, you will work as a core member of the data platform team to build data and analytics solutions on Google Cloud Platform (GCP).
  • If you are interested and available, please send your CV for immediate consideration.

Responsibilities:


  • Build large-scale data and analytics solutions on GCP
  • Use modern data and analytics technologies on-premises and in the cloud
  • Use GCP efficiently to integrate large datasets from multiple data sources, analyse data, build data models, and exploit/visualise data
  • Design and build automated data pipelines
  • Engineer data solutions on GCP using BigQuery, Cloud Dataproc, Cloud Dataflow, Pub/Sub, Bigtable, Cloud Storage, Spark, Hive, and AI/ML solutions
  • Implement DevOps and CI/CD practices
  • Extract, load, transform, clean, and validate data (a minimal pipeline sketch follows this list)
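To make the pipeline and ETL bullets above concrete, here is a minimal, illustrative sketch using the Apache Beam Python SDK (the SDK that Cloud Dataflow runs). The project, bucket, table, schema, and field names are hypothetical placeholders, not part of this role description.

```python
# Illustrative only: a small batch ETL pipeline that reads JSON lines from
# Cloud Storage, cleans and validates them, and appends the result to BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def clean_record(line):
    """Parse one JSON line and keep only the validated fields (example rules)."""
    record = json.loads(line)
    return {
        "id": int(record["id"]),
        "event": record.get("event", "unknown"),
        "value": float(record.get("value", 0.0)),
    }


def run():
    # DirectRunner executes locally; switching to runner="DataflowRunner"
    # (plus project/region/temp_location options) would submit the job to Cloud Dataflow.
    options = PipelineOptions(runner="DirectRunner")
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/raw/events.json")
            | "Clean" >> beam.Map(clean_record)
            | "Validate" >> beam.Filter(lambda r: r["value"] >= 0)
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="id:INTEGER,event:STRING,value:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```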

Expected requirements:


  • Bachelor's degree in Computer Science or a related field
  • Certification in Google Cloud Platform (GCP) technologies
  • Proficiency in managing cloud-based infrastructure
  • Experience with designing, developing, and deploying applications on GCP
  • Strong knowledge of networking, security, and storage within GCP
  • Ability to troubleshoot and optimize GCP environments
  • Familiarity with automation tools and scripting languages
  • Excellent communication and teamwork skills
  • Willingness to keep up with the latest trends and technologies in cloud computing
  • Hands-on experience with Google Cloud
  • GCP Professional certification preferred
  • Experience with data migration from legacy systems (Oracle, Big Data platforms, Netezza) to GCP
  • Experience with Cloud Dataproc, Cloud Dataflow, Pub/Sub, BigQuery, Bigtable, Cloud Storage, Spark, and Hive
  • Experience with data lake and data warehouse ETL design and build (a minimal load-job sketch follows this list)
  • Experience with Compute Engine and Cloud Fusion
  • Experience with AI/ML solutions
  • Python, Java, or Spark (Java) experience preferred
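As a similarly hedged illustration of the data warehouse loading work listed above (for example, landing a legacy-system export in BigQuery), the sketch below uses the google-cloud-bigquery Python client; the project, bucket, dataset, and table names are hypothetical.

```python
# Illustrative only: load a CSV export from Cloud Storage into a BigQuery table.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema; real migrations would declare it explicitly
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/customers_*.csv",
    "example-project.warehouse.customers",
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

table = client.get_table("example-project.warehouse.customers")
print(f"Loaded {table.num_rows} rows into {table.full_table_id}")
```

In a production migration the schema, partitioning, and write disposition would be chosen per table rather than relying on autodetection.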
