This role is for one of Weekday's clients.
Salary range: INR 15,00,000 - 25,00,000 (i.e., INR 15-25 LPA)
Minimum Experience: 5 years
Job Type: Full-time
We are looking for a skilled Data Engineer to design, develop, and maintain scalable and efficient data pipelines and warehousing solutions. The ideal candidate will have strong experience in ETL/ELT processes, data modeling, and modern cloud and big data technologies, supporting data-driven decision-making across the organization.
Key Responsibilities:
- Design, build, and maintain robust ETL/ELT data pipelines ensuring accuracy, completeness, and timely delivery of data.
- Collaborate with cross-functional teams to gather data requirements and translate them into scalable data models and solutions.
- Develop and optimize data pipelines using technologies such as Elasticsearch, AWS S3, Snowflake, and NFS.
- Design and manage data warehouse schemas to support analytics and business intelligence initiatives.
- Implement data validation, quality checks, and monitoring systems to ensure data integrity and proactively address issues.
- Partner with data scientists and analysts to ensure accessible and usable data for analytical applications.
- Stay up to date with best practices in data engineering, including CI/CD, DevSecFinOps, and Agile/Scrum methodologies.
- Contribute to the continuous improvement of our data infrastructure and warehousing architecture.
Requirements:
Mandatory:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in Data Engineering with a focus on ETL/ELT workflows.
- 3+ years of hands-on experience with Snowflake data warehousing solutions.
- 3+ years of experience in building and maintaining ETL pipelines using Airflow.
- Minimum 3 years of professional experience using Python for data processing and automation tasks.
- Experience with Elasticsearch in the context of data pipelines and analytics.
- Strong command of SQL and data modeling best practices.
- Hands-on experience with AWS S3 and other cloud-based data storage services.
- Familiarity with NFS and similar file storage systems.
- Strong analytical thinking and problem-solving skills.
- Excellent communication and collaboration abilities.
Technical Skills:
- Languages: Python, SQL
- Data Engineering: ETL/ELT, Airflow, Data Modeling
- Tools & Platforms: Snowflake, Elasticsearch, AWS S3, NFS
- Methodologies: CI/CD, Agile/Scrum, DevSecFinOps