Location: Charlotte, NC (hybrid, 2-3 days/week in office)
3-year contract with opportunity for extension or full-time hire
W-2 only (cannot accommodate Corp-to-Corp or 1099)
Brooksource is searching for an AWS Data Engineer with expertise in data warehousing using AWS Redshift to join our Fortune 500 Energy & Utilities client in Charlotte, NC.
RESPONSIBILITIES:
- Provides technical direction, guides the team on key technical aspects and is responsible for product tech delivery
- Lead the design, build, test and deployment of components
- Where applicable, collaborate with Lead Developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead)
- Understand requirements / use cases to outline technical scope and lead delivery of the technical solution
- Confirm required developers and skillsets specific to the product
- Provides leadership, direction, peer review and accountability to developers on the product (key responsibility)
- Works closely with the Product Owner to align on delivery goals and timing
- Assists the Product Owner with prioritizing and managing the team backlog
- Collaborates with Data and Solution Architects on key technical decisions
- Owns the architecture and design needed to deliver the requirements and functionality
- Skilled in developing data pipelines, focusing on long-term reliability and maintaining high data quality
- Designs data warehousing solutions with the end user in mind, ensuring ease of use without compromising on performance
- Manage and resolve issues in production data warehouse environments on AWS

REQUIRED SKILLS:
- 5+ years of AWS experience, specifically including AWS Redshift
- AWS services: S3, EMR, Glue Jobs, Lambda, Athena, CloudTrail, SNS, SQS, CloudWatch, Step Functions, QuickSight
- Experience with Kafka/messaging, preferably Confluent Kafka
- Experience with EMR databases such as Glue Catalog, Lake Formation, Redshift, DynamoDB and Aurora
- Experience with Amazon Redshift for AWS data warehousing
- Proven track record in the design and implementation of data warehouse solutions using AWS
- Skilled in data modeling and executing ETL processes tailored for data warehousing
- Competence in developing and refining data pipelines within AWS
- Proficient in handling both real-time and batch data processing tasks
- Extensive understanding of database management fundamentals
- Expertise in creating alerts and automated solutions for handling production problems
- Tools and languages: Python, Spark, PySpark and Pandas
- Infrastructure as Code technology: Terraform/CloudFormation
- Experience with a secrets management platform such as Vault or AWS Secrets Manager
- Experience with event-driven architecture
- DevOps pipelines (CI/CD); Bitbucket; Concourse
- Experience with RDBMS platforms and strong proficiency with SQL
- Experience with REST APIs and API Gateway
- Deep knowledge of IAM roles and policies
- Experience using AWS monitoring services such as CloudWatch, CloudTrail and CloudWatch Events
- Deep understanding of networking: DNS, TCP/IP and VPN
- Experience with an AWS workflow orchestration tool such as Airflow or Step Functions

PREFERRED SKILLS:
- Experience with native AWS technologies for data and analytics such as Kinesis and OpenSearch
- Databases: DocumentDB, MongoDB
- Hadoop platform (Hive; HBase; Druid)
- Java, Scala, Node.js
- Workflow automation
- Experience transitioning on-premise big data platforms into cloud-based platforms such as AWS
- Strong background in Kubernetes, distributed systems, microservice architecture and containers

ADDITIONAL REQUIREMENTS:
- Ability to perform hands-on development and peer review for certain components / tech stack on the product
- Standing up development instances and migration paths (with required security and access/roles)
- Develop components and related processes (e.g. data pipelines and associated ETL processes, workflows)
- Lead implementation of an integrated data quality framework
- Ensures optimal framework design and load testing scope to optimize performance (specifically for Big Data)
- Supports data scientists with testing and validation of models
- Performs impact analysis and identifies risks from design changes
- Ability to build new data pipelines, identify existing data gaps and provide automated solutions to deliver analytical capabilities and enriched data to applications
- Ability to implement data pipelines with the right attentiveness to durability and data quality
- Implements data warehousing products with the end user's experience in mind (ease of use with the right performance)
- Ensures test-driven development
- 5+ years of experience leading teams to deliver complex products
- Strong technical and communication skills
- Strong skills with business stakeholder interactions
- Strong solutioning and architecture skills
- 5+ years of experience building real-time data ingestion streams (event-driven)
- Ensure data security and permissions solutions, including data encryption, user access controls and logging
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.