Join our team as a Data Engineer! Design and implement robust data pipelines, build scalable data lakes, and enable business insights through efficient data systems. Work remotely from Poland on a B2B contract.
Build and maintain scalable data pipelines (ETL/ELT) to collect, transform, and integrate data from various systems into a centralized Data Lake.
Design and optimize data architectures for reporting and analytics purposes.
Ensure data quality, security, and availability across systems.
Collaborate with analysts and stakeholders to define data requirements and support business intelligence efforts.
Monitor and troubleshoot data pipelines to ensure smooth operation.
Stay updated on industry trends and best practices in data engineering.
Proven experience designing and maintaining data pipelines using tools such as Apache Airflow, Talend, or similar.
Strong knowledge of SQL and programming languages such as Python.
Experience with data lakes and data warehouses (e.g., Snowflake, AWS S3, or BigQuery).
Familiarity with cloud platforms (AWS, GCP, Azure) and related services for data processing.
Understanding of data modeling and database performance optimization.
Analytical mindset with a problem-solving approach.
Strong communication skills to work effectively with technical and non-technical stakeholders.
Process-oriented and detail-focused, with an emphasis on quality.
Must be available for a 4-hour overlap between 15:00 and 19:00 Poland time to ensure real-time collaboration.