Employment Type: Full Time (Direct Hire)
Employment Setup: Remote
What impact will you make?
Our client is seeking a highly skilled Lead Data Engineer Consultant to join their team and take a leading role in implementing robust data solutions for our clients, primarily on the Snowflake platform. As a Lead Data Engineer, you will be responsible for designing, developing, maintaining, and optimizing SQL, pipelines, databases, and related applications. You will collaborate with cross-functional teams, stakeholders, and data analysts to analyze requirements and deliver scalable, high-performing data solutions.
What will you be doing?
- Provide architecture and infrastructure guidance on Snowflake capabilities to accommodate business and technical use cases.
- Leverage technical expertise in all aspects of the Snowflake platform.
- Monitor the health and growth of the cloud Snowflake instance, tuning Snowflake for performance and utilization optimization.
- Design, develop, automate, monitor, maintain, and performance-tune ELT/ETL pipelines to manage high-volume data transfers to and from internal and external systems.
- Migrate data from on-premises systems to cloud technologies, primarily the Snowflake, AWS, and Azure ecosystems.
- Collaborate cross-functionally with project managers and agile teams to estimate development efforts and ensure complete delivery of solutions that fulfill requirements.
- Automate and manage provisioning needs such as Snowflake storage and compute, the Role-Based Access Control (RBAC) model, and permissions; ensure data security and monitor user access.
- Configure and manage monitoring and alerting around replication latency and performance (cluster and query).
- Deploy CI/CD-enabled pipelines and adopt best practices using tools such as GitHub.
- Provide release management support for databases and related applications in addition to maintaining schemas and objects.
- Implement data models and schemas to support business requirements and ensure data consistency and integrity.
- Provide technical expertise, troubleshooting, and recommendations to enhance existing systems and processes.
- Create and maintain technical documentation, including data flow diagrams, data dictionaries, and process documentation.
- Write and optimize SQL queries, stored procedures, and views for data extraction, transformation, and loading.
What skills do you need to be successful?
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- Snowflake Certification is required.
- Experience with ELT/ETL tools; dbt is mandatory and Fivetran is highly preferred.
- Strong customer-facing communication and presentation skills are a must.
- Ability to work across multiple clients simultaneously.
- Ability to work with teams across geographical regions.
- 10+ years of hands-on experience working extensively with relational databases.
- Strong hands-on Snowflake architecture experience: warehouse sizing and health checks, data structures, performance tuning, role-based access control, and SnowSQL.
- Deep understanding of SQL; ability to write multi-layer statements that handle complex transformation needs.
- Hands-on experience developing data pipelines in a cloud environment.
- Demonstrated ability to understand and discover business needs and craft data merges, transformations, and aggregations to deliver data that can be used in reports, analytic repositories, and applications.
- Working knowledge of Python; other languages (Java, JavaScript, C#) a plus.