We are seeking an Analytics Engineer with expertise in modern data stack technologies to take on a 12+ month contract. In this role, you will bridge the gap between data engineering and data analysis, working to design, build, and maintain scalable data pipelines and analytical frameworks that empower our business to make data-driven decisions. You will be instrumental in transforming raw data into clean, reliable data sets that support data marts, warehousing, and advanced analytics.
You'll work extensively with Snowflake, SQL, and Airflow to ensure that the data infrastructure is robust, efficient, and accurate. You will also leverage your strong background in data warehousing and de-duplication to create well-structured data models and maintain data integrity across various domains.
Responsibilities:
- Design and build scalable data pipelines using Airflow and SQL to extract, transform, and load data into Snowflake, supporting analytics and reporting needs.
- Develop, maintain, and optimize data marts and data warehouses, ensuring they are structured for high performance and ease of use.
- Implement and maintain de-duplication processes, ensuring data accuracy and consistency across all layers of the data architecture.
- Collaborate with cross-functional teams, including data engineers, analysts, and business stakeholders, to understand data requirements and design robust data models.
- Write, optimize, and maintain complex SQL queries to aggregate, clean, and transform data for analytics purposes.
- Leverage GitHub for version control, collaboration, and to ensure code quality and consistency across the data team.
- Monitor and troubleshoot ETL workflows, addressing any failures or performance issues in a timely manner.
- Maintain documentation for data models, pipelines, and workflows to ensure that processes are transparent and repeatable.
- Stay up to date with emerging technologies and best practices in data engineering, analytics, and data warehousing.
Qualifications:
- 5+ years of experience in a data engineering, analytics engineering, or similar role.
- Strong proficiency in SQL for data transformation, aggregation, and analysis.
- Hands-on experience with Snowflake.
- Proficiency with Airflow for scheduling and managing ETL workflows.
- Experience using GitHub (or similar version control systems) for code management, collaboration, and deployment workflows.
- Solid understanding of data mart and data warehouse design principles, including dimensional modeling and normalization/denormalization techniques.
- Proven experience with de-duplication techniques to ensure data integrity and consistency.
- Strong analytical and problem-solving skills, with an ability to work independently and as part of a team.
- Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders.