We offer:
- Participation in interesting and challenging projects
- Flexible working hours
- A great, non-corporate atmosphere
- Stable employment conditions (contract of employment or B2B contract)
- Opportunities for development and promotion
- An attractive benefits package
- Remote work
Your tasks:
- Developing components using object-oriented programming, following good coding practices, and extensively testing those components
- Involvement in the full lifecycle of forecasting application development: from data preparation, through modeling, evaluation, and productization, to maintenance
We are looking for you, if you have:
- 3+ years of experience with a programming language focused on data pipelines, e.g. Python or R
- 3+ years of experience working with SQL
- 2+ years of experience maintaining data pipelines
- 2+ years of experience with different types of storage (filesystem, relational, MPP, NoSQL) and with various kinds of data (structured, unstructured, metrics, logs, etc.)
- 2+ years of experience working with data architecture concepts (in any of the following areas: data modeling, metadata management, workflow management, ETL/ELT, real-time streaming, data quality, distributed systems)
- 2+ years of experience with cloud technologies, with an emphasis on data pipelines (Airflow, Glue, Dataflow, but also other cloud solutions for handling data: Elastic, Redshift, BigQuery, Lambda, S3, EBS, etc.)
- 1+ years of experience in Java and/or Scala
- Experience working in a Snowflake cloud environment
- Very good knowledge of relational databases (optional)
- Very good knowledge of data serialization formats such as JSON, XML, and YAML
- Excellent knowledge of Git, Gitflow, and DevOps tools (e.g. Docker, Bamboo, Jenkins, Terraform)
- Capability to conduct performance analysis, troubleshooting and remediation (optional)
- Excellent knowledge of Unix
- Familiarity with pharma data formats (e.g. SDTM) is a big plus
We reserve the right to contact only selected candidates.