Requires experience in: AWS, Terraform, Docker, Python, GitHub Actions, Snowflake
Contract Assignment: Senior Data Engineer (Specialized)
12-month assignment
Open to remote; candidates must work Eastern Time core business hours and must not exceed 40 hours/week.
For this contract assignment, we are looking for a Senior Data Engineer to join the Marketing Research and Analytics team and support the building of new data capabilities and data infrastructure. In this role, you will be an integral part of a team that supports the analysis, creative modeling, measurement, and optimization of advertising campaigns across channels. As a Senior Data Engineer, you will be responsible for designing, building, and maintaining data pipelines; integrating and productionizing models; and supporting the dashboards/scorecards used for analytics to solve business problems.
This contractor will primarily use tools such as Python, Terraform, and GitHub Actions to deploy cloud infrastructure and develop internal tooling. They will support the building of Marketing’s data infrastructure and will integrate with external APIs to exchange data with our vendor partners. They will work closely with data analysts and data scientists to prepare datasets for analytics and model training, automate scoring processes, and enable the use of new tools for the Marketing Research and Analytics team.
Day-to-Day Responsibilities:
• Work with Marketing data partners to build data pipelines that automate data feeds from the partners to internal systems on Snowflake/SQL Server.
• Work with Data Analysts to understand their data needs and prepare datasets for analytics.
• Work with Data Scientists to build the infrastructure to deploy models, monitor their performance, and build the necessary audit infrastructure.
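As a rough illustration of the kind of pipeline step described above, the sketch below (all names and field layouts are hypothetical, not from this posting) prepares a raw partner CSV feed for loading into a warehouse table such as Snowflake:

```python
import csv
import io
from datetime import date

def prepare_partner_feed(raw_csv: str) -> list[dict]:
    """Normalize a raw partner feed into analytics-ready rows.

    Hypothetical example: column names and types are assumptions
    standing in for whatever a real vendor feed would contain.
    """
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        rows.append({
            # Standardize column names to snake_case for downstream use.
            "campaign_id": rec["Campaign ID"].strip(),
            "channel": rec["Channel"].strip().lower(),
            # Coerce spend to float; empty values become 0.0.
            "spend_usd": float(rec["Spend"] or 0.0),
            # ISO-formatted dates parse directly.
            "report_date": date.fromisoformat(rec["Date"]),
        })
    return rows

feed = "Campaign ID,Channel,Spend,Date\n C-101 ,Email,250.75,2024-03-01\n"
print(prepare_partner_feed(feed))
```

In practice a step like this would sit inside an orchestrated flow (e.g., scheduled in Airflow or Prefect) with the cleaned rows staged to S3 and copied into Snowflake.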
Required skills:
• Experience building data pipelines, data pipeline infrastructure, and related tools and environments used in analytics and data science (e.g., Python, Unix)
• Experience developing analytic workloads with AWS services such as S3, Simple Queue Service (SQS), Simple Notification Service (SNS), Lambda, EC2, ECR, and Secrets Manager.
• Strong proficiency in Python, SQL, Linux/Unix shell scripting, GitHub Actions or Docker, Terraform or CloudFormation, and Snowflake.
• Order of importance: Terraform, Docker, GitHub Actions or Jenkins
• Experience automating data ingestion, processing, and reporting/monitoring.
• Experience with orchestration tools such as Prefect, DBT, or Airflow.
• Experience with other relevant tools used in data engineering (e.g., SQL, Git)
• Ability to set up Dev, QA, and Prod environments using GitHub repositories and GitHub branching rules/methodologies, and to maintain them via SQL coding and proper versioning.
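As a sketch of the Dev/QA/Prod setup mentioned above (branch, environment, and file names are illustrative assumptions, not from the posting), a GitHub Actions workflow might map branches to protected environments and gate Terraform deploys per environment:

```yaml
# Illustrative only: deploy Terraform per environment based on branch.
name: deploy
on:
  push:
    branches: [dev, qa, main]
jobs:
  terraform:
    runs-on: ubuntu-latest
    # Map the branch to a GitHub environment (dev/qa/prod), each holding
    # its own secrets and required-reviewer protection rules.
    environment: ${{ github.ref_name == 'main' && 'prod' || github.ref_name }}
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - run: terraform init
      - run: terraform apply -auto-approve -var-file="${{ github.ref_name }}.tfvars"
```

Environment protection rules (e.g., a required approval on prod) are what keep the same workflow safe to run across all three tiers.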
EDUCATION AND/OR EXPERIENCE REQUIRED:
- Bachelor's degree or higher in an Information Technology discipline or related field of study, and a minimum of two years of work experience designing, programming, and supporting software programs or applications.
- In lieu of a degree, a minimum of four years of related work experience designing, programming, and supporting software programs or applications may be accepted.