About the position
Why join GFT?
You will work with and learn from top IT experts. You will join a crew of experienced engineers: 60% of our employees are senior level.
Interested in the cloud? You will enjoy our full support in developing your skills: training programs, certifications and our internal community of experts. We have strong partnerships with top cloud providers: Google, Amazon and Microsoft - we are number one in Poland in the number of GCP certificates. Apart from GCP, you can also develop in AWS or Azure.
We are focused on development and knowledge sharing. Internal expert communities provide a comfortable environment where you can develop your skillset in areas such as blockchain, Big Data, cloud computing or artificial intelligence.
You will work in a stable company (32 years on the market) on demanding and challenging projects for the biggest financial institutions in the world.
We offer you:
- Hybrid work - 2 office days per week
- Working in a highly experienced and dedicated team
- Extra benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
- Contract of employment or B2B contract
- Online training and certifications suited to your career path
- Social events
- Access to an e-learning platform
- Ergonomic and functional working space
Requirements:
- Openness to working in a hybrid model (2 days from the office per week)
- Openness to visiting the client's office in Cracow once every two months (for 3 days)
- 2-4 years of experience working on Data Engineering topics
- At least 2 years of experience working with Spark and Scala
- Strong SQL and Python skills
- Experience in working with big data – Spark, Hadoop, Hive
- Knowledge of GCP or Azure Databricks is considered a strong plus
- Experience and expertise in data integration and data management with high data volumes
- Experience working in an agile continuous integration/DevOps paradigm and toolset (Git, GitHub, Jenkins, Jira)
- Experience with different database structures, including Postgres, SQL and Hive
- Fluent English is a must (both written and spoken)
Nice-to-have skills:
- CI/CD: Jenkins, GitHub Actions
- Orchestration: Control-M, Airflow
- Scripting: Bash, Python
We are looking for professionals at various levels, from Mid through Senior to Expert, to join our team. Your responsibilities will include performance tuning and optimization of existing solutions, building and maintaining ETL pipelines, as well as testing and documenting current data flows. You will also be involved in implementing tools and processes to support data-related projects and in promoting the best development standards across the team (an illustrative sketch of a typical ETL step appears at the end of this listing).

As a Data Engineer with Scala, your mission will be to develop, test and deploy the technical and functional specifications from the Solution Designers / Business Architects / Business Analysts, guaranteeing correct operability and compliance with the internal quality levels.

Tech stack: Spark, Scala, Python, Google Cloud Platform, SQL, data engineering, Hadoop, Hive, GCP, PostgreSQL, Jenkins, GitHub Actions, Control-M, Airflow, Bash scripting, Azure Databricks.

Additionally: home office, knowledge sharing, life insurance, sport subscription, training budget, private healthcare, international projects, integration events, English lessons, the Mindgram platform, free coffee, a playroom, free snacks, free beverages, in-house trainings, in-house hack days, a modern office, free fruit.
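For illustration only, here is a minimal sketch of the kind of Spark/Scala ETL step mentioned in the responsibilities above (extract raw data, apply a simple transformation, load a curated Hive table). All paths, table names and column names are hypothetical and not part of the role description.

// Purely illustrative sketch of a minimal Spark/Scala ETL step.
// All paths, table names and column names below are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object TransactionsEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("transactions-etl")
      .enableHiveSupport()
      .getOrCreate()

    // Extract: raw CSV files landed by an upstream process (hypothetical path)
    val raw = spark.read
      .option("header", "true")
      .csv("/data/landing/transactions")

    // Transform: drop rows without an id and normalise the amount column
    val cleaned = raw
      .filter(col("transaction_id").isNotNull)
      .withColumn("amount", col("amount").cast("decimal(18,2)"))

    // Load: write the curated Hive table, partitioned by business date
    cleaned.write
      .mode("overwrite")
      .partitionBy("business_date")
      .saveAsTable("curated.transactions")

    spark.stop()
  }
}

In practice, steps like this would typically be scheduled with an orchestrator such as Airflow or Control-M and deployed through the CI/CD tooling listed above.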