We are looking for a Middle Data Engineer to join our team, focusing on data pipeline development. You will work on scheduled processes and databases that manage streaming IoT and ML data. Your work will contribute to generating insights and data aggregations for NFT minting, and will support projects such as EnerGPT and a finance service.
Project description: Our customer is dedicated to helping oil and gas companies produce clean energy profitably. By leveraging AI and Computer Vision, they automate Health, Safety, and Environment (HSE), Environmental, Social, and Governance (ESG), and operational processes.
Technical stack: Python, Apache Airflow, Computer Vision, IoT sensor data, Docker, Amazon Aurora, Amazon RDS, AWS Lambda, Amazon MSK (Kafka), Streamlit, GitLab, pandas, Matplotlib, and Plotly.
If you are passionate about data engineering and excited to work with cutting-edge technologies in a dynamic environment, we would love to hear from you!
Responsibilities:
- Develop and maintain data pipelines using Python and Apache Airflow
- Implement ETL/ELT processes
- Manage and optimize data storage, including AWS S3 and PostgreSQL
- Develop and consume REST APIs using FastAPI
- Perform data processing and analysis with SQL queries
- Collaborate with team members to ensure data integrity and efficient workflows

Requirements: IoT, AI, Computer Vision, Python, Apache Airflow, Docker, AWS, Amazon Aurora, Amazon RDS, AWS Lambda, Amazon MSK, Kafka, AWS S3, PostgreSQL, FastAPI, SQL, GitLab, pandas, Matplotlib, data engineering, data modeling, ETL, data pipelines, REST API, HTTP, multithreading, Infrastructure as Code, communication skills