Poznań, Greater Poland Voivodeship, Poland
BCF Software Sp. z o.o.
3.11.2024
Job details
Technologies expected:
PySpark
Python
Docker
SQL
Technologies optional:
Apache Airflow
Azure DevOps
Kubernetes
Microsoft Azure
AWS
Google Cloud Platform
Databricks
About the project:
We are excited to present an opportunity for a Senior Data Engineer to join our client’s team, contributing to the development of an innovative blockchain crypto solution. This role offers the chance to work on cutting-edge technology that helps audit teams verify client assets, including cryptocurrencies. If you are passionate about data engineering, blockchain, and large-scale systems, this is the perfect role for you!
Responsibilities:
Design, develop, and maintain scalable data pipelines using Python, PySpark, and Databricks (an illustrative pipeline sketch follows this list).
Implement data orchestration workflows using Airflow or similar tools (see the orchestration sketch after this list).
Manage and optimize data storage solutions on Azure Cloud.
Develop and maintain containerized microservices using Docker.
Collaborate with cross-functional teams to ensure data quality and integrity.
Support blockchain setups and build tools to extract data from blockchains for analysis using Databricks.
Maintain production infrastructure, ensuring effective monitoring and automation with tools like Datadog.
Participate in agile development and planning, working within cross-functional teams.
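
A minimal sketch of the kind of PySpark pipeline described above. The paths, column names (wallet, asset, amount, ts), and aggregation are hypothetical placeholders, not the client's actual pipeline.

```python
# Illustrative only: a small PySpark batch job that cleans raw transaction
# records and writes a curated daily per-asset summary for downstream analysis.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("asset-balance-pipeline").getOrCreate()

# Read raw transaction records (hypothetical schema: wallet, asset, amount, ts).
raw = spark.read.parquet("/mnt/raw/transactions")

# Basic cleansing and a daily per-asset aggregation.
daily_balances = (
    raw.dropna(subset=["wallet", "asset", "amount"])
       .withColumn("date", F.to_date("ts"))
       .groupBy("date", "asset")
       .agg(
           F.sum("amount").alias("total_amount"),
           F.countDistinct("wallet").alias("active_wallets"),
       )
)

# Persist the curated layer, partitioned by date, e.g. for analysis in Databricks.
daily_balances.write.mode("overwrite").partitionBy("date").parquet(
    "/mnt/curated/daily_balances"
)
```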
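
A minimal sketch of how such a job might be orchestrated with Airflow (2.4+ assumed). The DAG id, schedule, and task callables are placeholders for illustration only.

```python
# Illustrative only: a two-task daily DAG with a simple extract -> transform dependency.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw on-chain data into the raw storage layer.
    print("extracting raw blockchain data")


def transform():
    # Placeholder: trigger the Spark job that builds the curated layer.
    print("running the Spark transformation")


with DAG(
    dag_id="daily_asset_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task
```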
Requirements expected:
Bachelor’s or Master’s degree in Computer Science, Data Science, or a related field.
Advanced knowledge of Python for data processing and scripting.
Proven experience with Spark and SQL for data engineering and analysis.
Experience with data orchestration tools like Airflow.
Hands-on experience with cloud platforms such as Azure, AWS, or GCP.
Proficiency with relational (RDBMS) and NoSQL databases.
Experience with Docker containerization (Kubernetes is a plus).
Strong understanding of the software development lifecycle.
Familiarity with blockchain technologies and data structures.