Our customer is one of the world's largest technology companies, headquartered in Silicon Valley with operations worldwide. In this project, we work at the cutting edge of Big Data technology to develop a high-performance data analytics platform that handles petabyte-scale datasets.
We are looking for an experienced Big Data Engineer.
Responsibilities:
Design, develop, and maintain robust data pipelines using Scala, Spark, HDFS, Gradle, Kubernetes, and Airflow.
Implement scalable and efficient data workflows to support data ingestion, processing, and analysis.
Develop and maintain continuous reporting systems and dashboards to provide real-time insights.
Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet them.
Optimize and tune data processing systems for performance and reliability.
Ensure data quality and integrity throughout the data lifecycle.
Troubleshoot and resolve data processing and pipeline issues.
Stay up-to-date with industry trends and emerging technologies to drive innovation in data engineering.
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Proven experience as a Data Engineer or in a similar role.
Strong proficiency in Spark, HDFS, and Airflow.
Solid programming skills in Scala.
Experience with data pipeline architecture and implementation.
Expertise in building and maintaining continuous reporting systems and dashboards.
Familiarity with data warehousing concepts and technologies.
Strong problem-solving skills and attention to detail.
Excellent communication and collaboration skills.
Ability to work independently and in a remote team environment.