Join a dynamic team focused on revolutionizing Transfer Pricing operations through advanced CDP solutions. Our client is on a mission to optimize performance and drive cost-efficiency at every level. If you thrive on solving complex challenges and creating tangible improvements, this role offers the perfect platform to make a significant impact.
Responsibilities:
- Work with the Development team to create and implement solutions on the Cloudera platform based on provided requirements.
- Work closely with Cloudera’s teams at all levels to help ensure application needs are met.
- Analyze complex distributed production deployments and make recommendations to optimize performance.
- Help design and implement Big Data architectures and configurations.
- Write technical and supporting documentation and knowledge-base articles.
Requirements:
- Experience working with Cloudera Data Platform (CDP).
- Good understanding of AWS.
- Experience with containerization technologies such as Docker and Kubernetes.
- Experience designing and deploying large-scale production Hadoop solutions.
- Ability to understand and translate customer requirements into technical requirements.
- Experience designing data queries in a Hadoop environment using tools such as Apache Hive, Apache Phoenix, Apache Spark, or others.
- Experience installing and administering multi-node Hadoop clusters.
- Strong experience implementing solutions in an Enterprise Linux or Unix environment.
- Good understanding of network configuration, devices, protocols, speeds, and optimizations.
- Knowledge of programming and scripting languages.
- Strong understanding of using network-based APIs, preferably REST/JSON or XML/SOAP.
- Experience implementing big data use cases and an understanding of standard design patterns commonly used in Hadoop-based deployments.