We are looking for a GCP Data Lake Developer to join a fully remote team from Poland. This is a hands-on role focused on designing, building, and optimizing scalable data pipelines and analytics solutions on Google Cloud Platform for business-critical projects.
Industry: Consulting
Location: 100% remote, no business travel
Type of employment: B2B
Rate: 175 net/hour (B2B)
Project language: English
Recruitment stages: HR screen, technical meeting with PM, and final meeting with the customer
Start date: ASAP
Responsibilities:
Data Infrastructure Setup: Design, build, and maintain scalable data pipelines on GCP using BigQuery, Databricks, and Vertex AI to process large datasets
Data Modeling & Analysis: Create and optimize BigQuery data models for efficient querying and analysis
Machine Learning Integration: Implement ML models and pipelines on Vertex AI, automating model training and deployment
Requirements:
Strong experience with GCP services, specifically BigQuery and Vertex AI
Proficient in managing large datasets, optimizing queries, and designing data models in BigQuery
Solid experience with Oracle databases, including complex queries, data migration, and performance tuning
Experience with Databricks for collaborative data science and ML workflows
Experience in building Power BI dashboards and reports for business stakeholders
We offer:
Fully remote work from Poland
Professional development opportunities, including training and certifications
Flexible working hours
Collaborative and supportive international team environment
Access to modern tools and technologies to work on challenging projects
Private medical care (with dental care financed 70% by the company)
Life insurance
Multisport card