Data Architect - Snowflake (15+ Years Experience)
Location: Dallas, TX
Duration: 12+ months
Key Responsibilities:
Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and ETL tools.
SQL Optimization: Write and optimize complex SQL queries to ensure high performance and efficiency.
Data Integration: Integrate data from various sources, ensuring consistency, accuracy, and reliability.
Database Management: Manage and maintain SQL Server and PostgreSQL databases.
ETL Processes: Develop and manage ETL processes to support data warehousing and analytics.
Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions.
Documentation: Maintain comprehensive documentation of data models, data flows, and ETL processes.
Troubleshooting: Identify and resolve data-related issues and discrepancies.
Python Scripting: Utilize Python for data manipulation, automation, and integration tasks.
Technical Skills:
- Proficiency in Snowflake, dbt, SnapLogic, SQL Server, PostgreSQL, and Azure Data Factory.
- Strong SQL skills with the ability to write and optimize complex queries.
- Knowledge of Python for data manipulation and automation.
- Knowledge of data governance frameworks and best practices.
Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Positive attitude and ability to work well in a team environment.
Certifications: Relevant certifications (e.g., Snowflake, Azure) are a plus.