Ryanair Labs is the technology brand of Ryanair. Labs is a state-of-the-art digital & IT innovation hub creating Europe's Leading Travel Experience for our customers. The Ryanair platform has over 1 billion visits per year. By joining Ryanair, you will develop cutting-edge tech solutions inside Ryanair, transforming aviation for Pilots, Cabin Crew & Ground Ops, as well as driving the tech experience for our customers on Europe's largest travel website!
Ryanair Labs has more than 550 employees across our offices in Dublin, Madrid, Poland, and Portugal. Our plan is to continue growing our IT Labs Team, so we are always on the lookout for the best talent. Apply today for more information.
We are looking for a Senior Data Engineer to work in a multi-disciplined team alongside Developers, Designers and Product Owners, owning their work from the initial idea to the final implementation. We are continually gathering information on travel-related events and are looking for someone to help us make the most of a cloud-based data system. The role offers opportunities to work in a variety of areas such as commercial, marketing, engineering and logistics, plus anything else that you may think of, or that will come your way.
The Data Engineering team runs the majority of its projects on AWS, but there is also the chance to work with other clouds, such as GCP and Azure.
As a Senior, you will also be required from time to time to review other Data Engineers' work through code review processes.
The Senior Data Engineer that we are looking for should have the following skills:
- Strong knowledge of general computing systems (operating systems, networking, memory layers, processes, etc.).
- Proficient in data analysis.
- Good intuition for exploring and interpreting data.
- Solid foundational maths/statistics skills and the ability to apply them to data quality processes.
- Good knowledge of Data Warehouse concepts.
- Proficient in SQL, covering both DDL and DML scripts; able to optimize queries and understand the processes generated behind them.
- Deep knowledge of Big Data technologies, especially HDFS and Spark (2.x minimum).
- Proficient in data processing with either Python or Scala, with good knowledge of both.
- Proficient coding skills, good code style, testing skills and documentation.
- Containerization with Docker. Experience with Kubernetes is also a plus.
- Job Orchestration with Airflow or similar.
- Good knowledge of IaC and CI/CD principles.
- Good data visualization skills through dashboards.
- Good communication skills.
- Proactivity and good team spirit.
Your responsibilities will include:
- Designing, developing and maintaining data processing jobs in the cloud.
- Making the solutions fault-tolerant.
- Monitoring, performance-tuning and troubleshooting existing jobs.
- Supporting data scientists and analysts in deploying their solutions.
- Making data available and reusable within the organization.
Requirements:
- 5+ years of experience working on data processing projects (ETL, data analysis, etc.).
- Experience working with multiple data formats and sources (JSON, CSV, Parquet, etc.; APIs; different DB engines).
- Experience in Data Modelling on analytical databases.
- Some experience building event-driven architectures; building them on AWS is a plus.
- 2+ years working on AWS. Experience with the following services is a plus: EMR, Batch, Lambda, SNS, SQS, DynamoDB, Glue, Athena, MWAA.
- Experience defining process monitoring and troubleshooting.
- Experience working with Data Science teams, productionizing their pipelines and assisting them with data wrangling.
Our tech stack includes (but isn’t limited to):
- AWS Stack:
- Batch processing with EMR, AWS Batch, EKS and Lambdas
- Job orchestration with MWAA (Airflow 2.x)
- Event-based processing with SNS, SQS, DynamoDB, Lambdas, Kinesis.
- Data Warehouse on Glue.
- Process monitoring and data visualization with Grafana.
- IaC with CloudFormation.
- CI/CD pipelines on every project with CodeCommit + CodeBuild + CodePipeline.
- Programming languages: Python, Scala.
- Git as VCS, JIRA Kanban boards for teamwork organization.
- Other services in other clouds:
- GCP: BigQuery, Cloud Functions (IaC with Terraform).
- Azure: Synapse, Blob Storage.
Benefits and forms of employment:
Contract of employment (permanent contract after trial period)
- Possible hybrid model (2 days in the office weekly)
- Option to participate in training and conferences
- Staff travel benefits from day one
- Creative work tax deduction
- Multisport card
- Private health care
- Group insurance scheme
- - - - or - - -
- Full remote
- Possible permanent place in the office
- Possibility of taking part in training and certifications
- Great chance to meet your colleagues in other offices
- Annual events (e.g. St. Patrick's Day)
- Regular social meetings
- Paid referral system
- New office building surrounded by great eateries right in the city centre
Apply today to discuss the role in more detail!