Job Responsibilities:
- Analyze, develop, refactor, test, review, and deploy ETL functionality and bug fixes that move data between Snowflake data layers.
- Perform database and query tuning; diagnose and resolve performance issues, applying ELT push-down optimization where required.
- Enhance ETL frameworks, continuous data quality frameworks, and automation in the data pipeline.
- Meet service data-availability SLOs, engaging with the Security, Infrastructure, and Workload Management teams to resolve issues.
- Adhere to Agile SDLC controls in Jira, ServiceNow Change and Incident Management, and the DataOps GitLab CI/CD pipeline.
- Participate in daily standups, lead design reviews, and coordinate with offshore teams.
Requirements:
- 2 or more years of experience developing, deploying, and supporting fault-tolerant data pipelines leveraging ETL and streaming ingestion technologies.
- Proficiency in Linux scripting (Python, Shell), SQL development, and NoSQL databases.
- Experience with cloud-based data offerings (AWS, Amazon Redshift, Snowflake, Google Cloud Platform, Microsoft Azure).
- Familiarity with ETL tools such as Informatica, DataStage, and dbt (Data Build Tool).
- Experience in the P&C Insurance Domain.
Competencies:
- Informatica PowerCenter
- Python
- Snowflake
- PL/SQL
Essential Skills:
Snowflake, dbt, Informatica, SQL, Python
Desirable Skills:
P&C Insurance Domain