Job Description
Job Title: Sr. Data Engineer
Open Positions: 1
Experience: 5+ Years
Location: Bangalore, India
About Us: Our mission is to leverage data to drive transformative solutions and create impactful experiences for our customers. Join us and be a part of a dynamic team that is dedicated to pushing the boundaries of what’s possible with data.
Role Overview: We are seeking an experienced Sr. Data Engineer with 5+ years of hands-on experience to join our talented team. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines, optimizing data processes, and ensuring the seamless integration of data from various sources. If you are passionate about data and excited to work on cutting-edge technologies, we want to hear from you!
Key Responsibilities:
- Design, develop, and maintain robust and scalable data pipelines to support business needs.
- Develop BigQuery procedures, functions, and other database objects using expert-level BigQuery SQL.
- Optimize ETL processes and data workflows to ensure efficiency, performance, and high scalability.
- Implement and manage data integration solutions using tools such as DBT, Apache Airflow, Fivetran, Python, and Celigo.
- Design an ETL framework that leverages reusable components and automated data quality checks.
- Develop and implement features and enhancements in BigQuery, leveraging your expertise in SQL and cloud-based data warehousing technologies.
- Ensure data quality and consistency through rigorous testing and validation processes.
- Troubleshoot and resolve data-related issues promptly.
- Stay updated with industry trends and technologies to continuously improve our data engineering practices.
What We’re Looking For:
- 5+ years of experience in Data/ETL Engineering, Data/ETL Architecture, and pipeline development.
- Proficiency in SQL, DBT, and Python (Apache Airflow, Composer), plus hands-on experience with ETL tools such as Talend, Fivetran, or similar.
- Proven experience in building and maintaining a scalable Data Warehouse in a cloud-based data platform, preferably in Google’s BigQuery.
- Experience in Git for version control and DBT for data transformation.
- Proven experience in designing, building and maintaining ETL processes, automated data quality checks and reusable ETL components.
- Knowledge of Data Lake and Data Warehousing concepts and Data Modelling techniques.
- Strong problem-solving skills and the ability to work independently as well as collaboratively.
- Excellent communication skills and the ability to articulate technical concepts to non-technical stakeholders.
- Bachelor's degree in Computer Science, MIS, or CIS, or equivalent experience.
More context about the role and the team:
Here at Applied we have embarked on the journey of elevating our Corporate Data Services function. This function's mandate will encompass three key areas: Data Governance and Architecture, Data Engineering, and Reporting and Analytics, with Machine Learning to follow. This role will report to the Manager of Data Engineering and will play a key role in building Applied's Data Lakehouse from the ground up. This is an exciting opportunity for a motivated, high-performing individual who is eager to help shape and influence the way we design and build a flexible, scalable data foundation.
We are looking for an individual with solid, hands-on ETL and data integration experience: someone who can leverage metadata to automate data integrations, prioritizes automation and proactive job monitoring and alerting, is very familiar with data warehousing concepts and data quality reporting, and can design and build reusable code components that enable efficiency and scalability.