Design and implement data pipelines using dbt for data transformation and modeling;
Manage and optimize data warehouse solutions on Snowflake;
Develop and maintain ETL processes using Fivetran for data ingestion;
Utilize Terraform for infrastructure as code (IaC) to provision and manage resources in AWS, Snowflake, Kubernetes, and Fivetran;
Collaborate with cross-functional teams to understand data requirements and deliver scalable solutions;
Implement workflow automation using Argo Workflows to streamline data processing tasks;
Ensure data quality and integrity throughout the data lifecycle.
Requirements:
Bachelor’s degree in Computer Science, Engineering, or related field;
5+ years of experience working with Python;
Proven experience as a Data Engineer with a focus on dbt, Snowflake, Argo Workflows, and Fivetran;
Strong SQL skills for data manipulation and querying;
Experience with cloud platforms like AWS or Azure;
Experience with Kubernetes;
Familiarity with data modeling concepts and best practices;
Excellent problem-solving skills and attention to detail;
Ability to work independently and collaborate effectively in a team environment;
Upper-intermediate English level.
We offer:
Professional growth: Accelerate your career with mentorship, TechTalks, and personalized growth roadmaps
Competitive compensation: We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities
A selection of exciting projects: Join projects involving modern solution development for top-tier clients, including Fortune 500 enterprises and leading product brands
Flextime: Tailor your schedule for an optimal work-life balance, with the option of working from home or going to the office – whichever makes you happiest and most productive.