Why join GFT?
You will work with and learn from top IT experts. You will join a crew of experienced engineers: 60% of our employees are senior level.
Interested in the cloud? You will enjoy our full support in developing your skills: training programs, certifications and our internal community of experts. We have strong partnerships with the top cloud providers: Google, Amazon and Microsoft - we are number one in Poland in the number of GCP certifications. Apart from GCP, you can also develop in AWS or Azure.
We are focused on development and knowledge sharing. Internal expert communities provide a comfortable environment where you can develop your skillset in areas such as blockchain, Big Data, cloud computing or artificial intelligence.
You will work in a stable company (32 years on the market) on demanding and challenging projects for the biggest financial institutions in the world.
Your skills
Nice to have
Your responsibilities
Performance tuning and optimization of existing solutions
Building and maintaining ETL pipelines
Testing and documenting current data flows
Implementing tools and processes to support data-related projects and promoting the best development standards across the team
Designing, building, testing and deploying cloud and on-premise data models and transformations in cloud-native or dedicated toolsets
Optimizing data views for specific visualization use cases, making use of schema design, partitions, indexes, down-sampling, archiving, etc. to manage trade-offs such as performance and flexibility
Reviewing, refining, interpreting and implementing business and technical requirements
Contributing to ongoing productivity and priorities by refining User Stories, Epics and Backlogs in Jira
Onboarding new data sources: designing, building, testing and deploying cloud data ingest, pipelines, warehouses and data models/products

Requirements: Python, PySpark, SQL, query optimization, ETL, Hadoop, data warehouses, data lakes, Cloud, GCP, AWS, Azure, Java, Scala, Databricks

Additionally: home office, knowledge sharing, life insurance, sport subscription, training budget, private healthcare, international projects, integration events, English lessons, Mindgram platform, free coffee, playroom, free snacks, free beverages, in-house trainings, in-house hack days, modern office, free fruits.