As a Senior Analytics Engineer, you will support the BI lead across the data pipeline and ecosystem from end to end. You will partner with cross-functional business teams to understand their needs and requirements and to transform raw data into clean, actionable datasets. You will help design and build the next version of our data ecosystem and, ultimately, be responsible for maintaining and enhancing our data tools. This role reports to the Director, Business Intelligence.
Six Month Expectations
• Contribute to the data pipeline process, creating new pipelines that expand data availability on our BI platform
• Take ownership of portions of our data stack
• Proactively solve issues or errors within the data pipeline
• Take an active role in designing the future of our data pipeline or BI platforms
• Collaborate with the team on best practices and overall business strategy
• Take ownership of data office hours and teach stakeholders how to interact more effectively with our data tools
Twelve Month Expectations
• Taken complete ownership of our data pipeline from end to end
• Established close relationships with data stakeholders across the company
• Revamped our data visibility and anomaly tools through new and improved error reporting
• Taken a lead role in managing the development and architecture design of our data infrastructure
• Collaborated with key stakeholders and executives to define a core set of KPIs and reporting
• Developed a vision of the future of data and how our tools interact with stakeholders to streamline processes and analytics
Qualifications
• 4+ years of experience working in data engineering, BI/analytics engineering, solutions architecture or related data fields
• 3+ years of working experience with ETL and relational databases
• 2+ years of working experience in BI and Web Analytics platforms
• Strong expertise in SQL (Python knowledge is a plus)
• Expertise and familiarity with our current data stack:
   o Data warehousing with Snowflake
   o dbt & Fivetran
   o Looker as a BI platform
   o Languages: SQL & Python
   o AWS for serving infrastructure (Lambdas, EC2, S3, Postgres RDS)
   o Deployments and containerization with Docker and Kubernetes
• Experience building data transformation pipelines and designing dimensional models with dbt or a similar ETL tool (see the example sketch after this list)
• Ability to learn autonomously and quickly
• Analytical, creative and commercial mindset
• Extremely organized and detail-oriented with effective multitasking and prioritization skills
• Highly motivated and willing to take ownership of work, driven to solve problems, and able to work effectively under pressure
• Excellent written and verbal communication skills, with a willingness to proactively engage other team members and foster a strong, collaborative, team-oriented environment
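For candidates less familiar with dbt, the following is a minimal sketch of the kind of dimensional model referenced above. The model and column names (stg_customers, stg_orders, dim_customers, customer_id, order_date) are hypothetical and used for illustration only; they are not part of our actual project.

```sql
-- dim_customers.sql -- hypothetical dbt model, for illustration only
-- Builds a simple customer dimension from two assumed staging models,
-- stg_customers and stg_orders, and materializes it as a table in Snowflake.

{{ config(materialized='table') }}

with customers as (
    select * from {{ ref('stg_customers') }}
),

orders as (
    select * from {{ ref('stg_orders') }}
),

customer_orders as (
    -- one row per customer with basic order facts
    select
        customer_id,
        min(order_date) as first_order_date,
        count(*)        as lifetime_order_count
    from orders
    group by customer_id
)

select
    customers.customer_id,
    customers.customer_name,
    customer_orders.first_order_date,
    customer_orders.lifetime_order_count
from customers
left join customer_orders
    on customers.customer_id = customer_orders.customer_id
```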