Senior Data Engineer - W2 or 1099
Location: Boston - 3 days on-site per week (local candidates only at this time)
Contract Length: 6 months
Rate: $85 - $110 per hour
Candidates must be a US citizen or Green Card holder for this position - No third parties.
We are looking for a Senior Data Engineer with deep expertise in building scalable data solutions using Snowflake, DBT, AWS services, and MLOps. The ideal candidate will have prior start-up experience and a passion for applying data engineering best practices to support machine learning pipelines.
Key Responsibilities:
- Design, implement, and maintain scalable data pipelines using Snowflake and AWS services (S3, Redshift, Glue, Lambda, etc.).
- Collaborate with data scientists and ML engineers to develop and operationalize MLOps pipelines, ensuring efficient deployment and monitoring of machine learning models.
- Utilize DBT (Data Build Tool) to manage and transform data, ensuring data quality, consistency, and governance.
- Work with various structured and unstructured datasets to deliver reliable data solutions for real-time analytics and reporting.
- Collaborate with cross-functional teams to ensure alignment on data architecture and strategy.
- Continuously improve data pipeline performance, scalability, and reliability.
- Implement data security best practices, ensuring compliance with company and industry standards.
- Participate in code reviews, mentoring, and knowledge sharing within the team.
Required Qualifications:
- 5+ years of experience as a Data Engineer, working with modern data platforms and tools.
- Hands-on experience with Snowflake for data warehousing and analytics.
- Strong knowledge and practical experience with DBT for data transformation.
- Proficiency with AWS services (S3, Redshift, Glue, Lambda, etc.).
- Experience in MLOps: working knowledge of deploying and monitoring machine learning models in production.
- Prior experience working with start-ups or in a fast-paced environment.
- Strong understanding of data warehousing concepts, ETL/ELT processes, and performance optimization.
- Solid programming skills in Python or similar languages.
- Excellent communication and problem-solving skills, with a passion for scaling and optimizing data systems.
Preferred Qualifications:
- Experience with Kubernetes, Docker, and CI/CD pipelines.
- Familiarity with orchestration tools such as Airflow.
- Knowledge of data governance and privacy frameworks (GDPR, CCPA, etc.).