We are excited to present an opportunity for a Senior Data Engineer to join our client’s team, contributing to the development of an innovative blockchain-based crypto asset solution. The role offers the chance to work on cutting-edge technology that helps audit teams verify client assets, including cryptocurrencies. If you are passionate about data engineering, blockchain, and large-scale systems, this could be the perfect role for you!
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Python, PySpark, and Databricks.
- Implement data orchestration workflows using Airflow or similar tools.
- Manage and optimize data storage solutions on Azure Cloud.
- Develop and maintain containerized microservices using Docker.
- Collaborate with cross-functional teams to ensure data quality and integrity.
- Support blockchain setups and build tools to extract on-chain data for analysis in Databricks (see the illustrative sketch after this list).
- Maintain production infrastructure, ensuring effective monitoring and automation with tools like Datadog.
- Participate in agile development and planning within cross-functional teams.
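For a flavor of the day-to-day work, here is a minimal, illustrative PySpark sketch of the kind of pipeline described above: reading previously ingested on-chain transaction data and producing a curated Delta table for audit analysis. The table and column names (raw_btc_transactions, curated.btc_daily_volume, block_time, value_btc) are hypothetical placeholders, not part of the client’s actual codebase.

```python
# Illustrative sketch only; names and schema are assumptions, not the client's code.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("btc-daily-volume").getOrCreate()

# Read previously ingested on-chain transactions
# (assumed columns: tx_hash, block_time, value_btc).
txs = spark.read.table("raw_btc_transactions")

# Aggregate daily transferred volume and transaction counts
# for downstream audit analysis.
daily_volume = (
    txs.withColumn("tx_date", F.to_date("block_time"))
       .groupBy("tx_date")
       .agg(
           F.sum("value_btc").alias("total_value_btc"),
           F.count("tx_hash").alias("tx_count"),
       )
)

# Persist as a Delta table for analysts and audit tooling.
daily_volume.write.format("delta").mode("overwrite").saveAsTable("curated.btc_daily_volume")
```

In practice, a job like this would be scheduled and monitored through an orchestrator such as Airflow, as noted in the responsibilities above.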
Key Requirements:
- Bachelor’s or Master’s degree in Computer Science, Data Science, or a related field.
- Advanced knowledge of Python for data processing and scripting.
- Proven experience with Spark and SQL for data engineering and analysis.
- Experience with data orchestration tools like Airflow.
- Hands-on experience with cloud platforms such as Azure, AWS, or GCP.
- Proficiency with relational (RDBMS) and NoSQL databases.
- Experience with Docker containerization (Kubernetes is a plus).
- Strong understanding of the software development lifecycle.
- Familiarity with blockchain technologies and data structures.
- Knowledge of the cryptography used in blockchains is a plus.
- Experience with Databricks and familiarity with Delta Lake are a plus.
- Strong communication and interpersonal skills.