Collaborate within a multidisciplinary team of software developers, data scientists, and architects to develop innovative data solutions for the lab.
Apply a proven track record of building ETL/ELT pipelines and data warehouses, integrating data from diverse sources including legacy databases and unstructured documents.
Own and maintain end-to-end data pipeline frameworks for ingesting and exporting data, enabling the creation of high-value data products.
Partner with operations, product, and engineering teams to promote best practices and deliver tailored data solutions across various business needs.
Contribute to the design and strategic roadmap for advancing UL’s data maturity and infrastructure.
Requirements:
Bachelor’s Degree in Computer Science, Engineering, Mathematics, Statistics, or a related field
3+ years of experience with common data engineering tools (e.g., Airflow, Spark, Snowflake)
3+ years of experience building ETL/ELT pipelines
3+ years of hands-on experience with Python and SQL
Proven experience supporting and collaborating with cross-functional teams
Offered:
Competitive remuneration package with yearly bonus
Mediclaim scheme for family
Group Term Life Insurance
Group Personal Accident Insurance
Training and Development (provided by UL University)
Benefits:
Private medical care
Sharing the costs of professional training & courses