We are one of the fastest-growing AWS Partners in Europe, and our strength lies in the experience gained from delivering innovative IT projects for companies across various industries – from telecommunications and finance to startups.
Here, you’ll have the opportunity to dive deep into AWS, working on migration projects, ensuring security, and exploring the latest technologies in the Data & AI area.
Together, we drive innovation and help businesses unlock limitless possibilities!
Responsibilities:
Design and implement scalable data pipelines in the AWS cloud and Snowflake
Develop data processing solutions using AWS services such as AWS Glue, Redshift, EMR, Kinesis, Lambda, Athena, and S3 Tables
Build data transformations using dbt or Apache Spark for analytics and data modeling
Orchestrate workflows using Apache Airflow and AWS Step Functions
Build and manage data lakes, data warehouses, and data lakehouses (Apache Iceberg)
Implement end-to-end solutions with Infrastructure as Code (IaC) tools
Requirements:
Minimum 1.5-2 years of experience in data engineering, ETL/ELT development, or work with related AWS data services
Strong proficiency in SQL and Python, including familiarity with data processing frameworks
Experience with data pipeline orchestration, data transformation tools, and workflow management
Understanding of data modeling, data warehousing concepts, and big data technologies
Willingness to help customers when they are struggling with technical problems
Good understanding of software engineering best practices such as code reviews, source control management, build processes, testing, deployment, release, and change management
Fluent Polish and English
What we offer:
Continuous education
Paid leave
Sponsored AWS certification
Training budget
Medical care package
Multisport card subsidy
Flexible schedule
Access to a language learning platform
Benefits:
Sharing the costs of sports activities
Private medical care
Sharing the costs of foreign language classes
Sharing the costs of professional training & courses