GCP Data Lake Developer
Reference number: PL25/IC/DATALAKEDEV/REMOTE
We are looking for a GCP Data Lake Developer to join a fully remote team from Poland. This is a hands-on role focused on designing, building, and optimizing scalable data pipelines and analytics solutions on Google Cloud Platform for business-critical projects.
Project information:
- Industry: Consulting
- Location: 100% remote, no business travel
- Type of employment: B2B
- Budget: 175 net/hour (B2B)
- Project language: English
- Recruitment stages: HR screen, technical meeting with PM, and final meeting with the customer
- Start date: ASAP
Project Tasks:
- Data Infrastructure Setup: Design, build, and maintain scalable data pipelines on GCP using BigQuery, Databricks, and Vertex AI to process large datasets
- Data Modeling & Analysis: Create and optimize BigQuery data models for efficient querying and analysis
- Machine Learning Integration: Implement ML models and pipelines on Vertex AI, automating model training and deployment
Must-Have Qualifications:
- Strong experience with GCP services, specifically BigQuery and Vertex AI
- Proficient in managing large datasets, optimizing queries, and designing data models in BigQuery
- Solid experience with Oracle databases, including complex queries, data migration, and performance tuning
- Experience with Databricks for collaborative data science and ML workflows
- Experience in building Power BI dashboards and reports for business stakeholders
Nice-to-Have:
- Experience in building and deploying ML models on platforms such as Vertex AI
- Familiarity with Data Lakes and unstructured data management
What We Offer:
- Fully remote work from Poland
- Professional development opportunities, including training and certifications
- Flexible working hours
- Collaborative and supportive international team environment
- Access to modern tools and technologies to work on challenging projects
- Private medical care (with dental care financed 70% by the company)
- Life insurance
- Multisport card