We are looking for a Data Engineer to support the development and optimisation of our Snowflake-based data warehouse and the pipelines powering our core business applications. You will work across Java, SQL, Python/TypeScript, ETL/ELT processes, and Salesforce integrations, helping us bring structure, reliability, and cost efficiency to a fast-growing data ecosystem.
Responsibilities:
Develop, maintain, and optimise the Snowflake data warehouse, ensuring cost efficiency and performance.
Build and improve data ingestion processes (ETL/ELT), primarily between Salesforce and Snowflake.
Implement Java backend components that support data workflows and integrations.
Work with SQL and Python/TypeScript to support data transformations, automation, and validation.
Collaborate with Data Engineering, Product, and Application teams (React/Node/TypeScript) to define data requirements and integration patterns.
Ensure data privacy and proper handling of sensitive information (PII).
Prepare documentation to support offshore maintenance teams and enable smooth handovers.
Contribute to a pragmatic engineering culture focused on execution and measurable business value.
Requirements:
Solid hands-on experience working with Snowflake (architecture, performance optimisation, credit usage efficiency).
Strong SQL skills and experience designing or maintaining ETL/ELT pipelines.
Knowledge of data modelling for analytics and warehouse environments.
Experience integrating Salesforce data into analytical systems.
Practical experience with Java in data-related or backend components.
Familiarity with Python or TypeScript for data processing tasks.
Ability to work in distributed teams and communicate clearly with both technical and non-technical stakeholders.
A pragmatic mindset focused on delivery rather than unnecessary complexity.
What we offer:
Fully remote work
Long-term collaboration
Engaging projects with real impact
A supportive, experienced team and a modern development environment