*No C2C candidates; sponsorship is not offered*
*This is an onsite position in Lehi, UT; candidates requiring relocation will not be considered*
Job Description:
As a Data Engineer, you will play a critical role in designing, developing, and maintaining our data architecture to support business needs. You will work closely with data scientists, analysts, and other stakeholders to ensure the availability, reliability, and scalability of our data systems.
Key Responsibilities:
- Data Architecture & Design:
  - Design, implement, and optimize scalable data pipelines and ETL processes.
  - Develop and maintain data models and schemas to support analytics and reporting needs.
- Database Management:
  - Manage and optimize databases and data warehouses, with a particular focus on Snowflake and Azure.
  - Ensure data integrity, consistency, and security across all systems.
- Data Integration:
  - Integrate data from multiple sources and systems into unified data structures.
  - Develop APIs and data services for seamless data access and utilization.
- Performance Optimization:
  - Monitor and improve the performance of data processing systems.
  - Implement best practices for data storage, retrieval, and management.
- Collaboration & Support:
  - Work with cross-functional teams to understand data requirements and deliver solutions.
  - Provide technical support and guidance to team members and stakeholders.
Required Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience as a Data Engineer or in a similar role.
- Strong proficiency in SQL and experience with database management.
- Hands-on experience with Snowflake and Azure Data Services.
- Solid understanding of ETL processes and data pipeline architecture.
- Experience with data modeling, data warehousing, and data integration techniques.
- Familiarity with cloud platforms and tools, particularly Azure.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
Preferred Qualifications:
- Experience with additional cloud platforms (e.g., AWS, Google Cloud).
- Knowledge of data governance and compliance best practices.
- Familiarity with big data technologies such as Hadoop and Spark.
- Experience with scripting languages like Python or Bash for data manipulation.