Work within Agile Scrum methodologies using Jira, Confluence, Git, and GitHub
Design, develop, and maintain a modern cloud-based data warehouse on Google Cloud Platform (GCP)
Build and enhance data pipelines using Python, Java, and BigQuery SQL
Leverage GCP services such as Cloud Build, BigQuery Data Transfer Service, Workflows, Cloud Scheduler, and Cloud Functions
Develop and manage data models and transformation layers using tools like dbt Cloud
Implement and maintain Infrastructure as Code (IaC) using Terraform
Contribute to the development of a scalable data platform supporting analytics, BI, and machine learning use cases
Support the organization’s digital transformation by delivering modern data solutions
Proactively propose and implement improvements in data architecture and engineering practices
Collaborate with a cross-functional, international team including BI, Data Science, DevOps, and Software Engineering
Expected requirements:
Openness to continuous learning and agility in addressing business challenges
Experience in designing, developing, documenting, and maintaining software written in Python and/or Java, as well as dataflows written in SQL
Knowledge of different data modelling principles and the ability to select the right one for the needs of the consumer
Comfort working with Linux, Git, and Docker to extend and maintain our CI/CD workflows and pipelines, or eagerness to dive into this topic
Experience in working in cloud environments (preferably GCP or AWS)
Experience with, or knowledge of, Infrastructure-as-Code tools such as Terraform and data transformation tools such as dbt and Dataform
Cultural awareness, sensitivity, and adaptability in building connections with people from diverse backgrounds
Excellent conceptual thinking and a proactive approach to decision-making, taking ownership to ensure project success