Lead Data Engineer – Snowflake
Key Responsibilities:
- Lead data warehouse development activities for a team.
- Lead more than one project in parallel.
- Assign data warehouse development tasks to a team of 1–6 developers (onsite/offshore).
- Be a subject-matter expert (SME) in the data warehouse domain and the relevant business function.
- Work with minimal supervision and provide status updates and escalation to leadership as appropriate.
- Provide application support as part of an on-call rotation, resolving outages and user questions.
- Review project objectives and determine the best technology to apply.
- Design and build large scale datasets for a wide range of consumer needs.
- Build, test and implement highly efficient data pipelines using a variety of technologies.
- Develop best practice standards for solution design and data structures.
- Create and maintain project-specific documentation, such as architecture diagrams and flow charts.
- Analyze new data sources to understand quality, availability, and content.
- Write technical design documentation.
- Conduct peer code reviews.
- Create and execute unit tests.
- Support the QA team during testing.
- Mentor less experienced members of the development team.
Roles and Responsibilities:
- As a hands-on Lead Data Engineer, lead a team of data engineers (onsite/offshore) to build new data pipelines, loading millions of transformed, accurate records into our cloud data warehouse in a timely manner to support evolving analytical needs for critical business decisions.
- Manage existing data pipelines, keeping data relevant, accurate, and ready for analytical consumption.
Technical Expertise:
- 6+ years of experience with Snowflake SQL – advanced SQL expertise.
- 6+ years of data warehousing experience – hands-on knowledge of methods to identify, collect, manipulate, transform, normalize, clean, and validate data; star schemas; normalization/denormalization; dimensions; aggregations; etc.
- 6+ years of experience working in reporting and analytics environments – development, data profiling, metric development, CI/CD, production deployment, troubleshooting, query tuning, Atlassian Bitbucket and Bamboo, etc.
- 4+ years of Python – advanced Python expertise.
- 4+ years on any cloud platform – AWS preferred.
- Hands-on experience with AWS Lambda, S3, SNS/SQS, and EC2 is a bare minimum.
- 4+ years with any ETL/ELT tool – Informatica, Pentaho, Fivetran, dbt, etc.
- 4+ years developing functional metrics in a specific business vertical (finance, retail, telecom, etc.).
Skills Required:
- Clear communication – strong written and verbal communication, especially around time off, delivery delays, etc.
- Team player – works in the team and with the team.
- Enterprise experience – understands and follows enterprise guidelines for CI/CD, security, change management, RCA, on-call rotation, etc.
- Technical certifications from AWS, Microsoft Azure, GCP, or any other recognized software vendor.
- 4+ years of team-lead experience.
- 3+ years in a large-scale support organization supporting thousands of users.
Apply Now