Position: Data Engineer (Business Data Analyst/Consultant)
Location: REMOTE (Must work in PST hours)
Duration: 4+ Months (High Possibility to extend)
Pay Rate: $55-65/hr on W2 (No C2C)
Visa: Only H4 EAD / GC EAD / Green Card and US Citizens (No OPT/CPT/H1)
Will the worker need to be remote local to the office, or remote anywhere in the US?: Would prefer PST time zone; must work PST hours
How many years of related experience are you looking for in your ideal candidate?: 3-5 years min
Specific Systems Knowledge Required: Cloud Service – Azure, Advanced SQL
Specific Systems Knowledge Preferred: Databricks, Python, PowerBI
Expected Shift: M-F 9-5 PST
Interview Process: Will interview with 2 from my team and 2 from IT
Top Three things Worker will be doing: Developing scripts to automate the Audit and Monitoring process
1) Take business requirements and build queries to spec
2) Create alert reporting when audit conditions are met
3) Create automation to reduce the amount of manual review of audit scenarios
4) Complete documentation of products and train business users to manage and maintain queries and reports
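As an illustration of item 2 above (alert reporting when audit conditions are met), a minimal Python sketch against a toy dataset follows. All table names, columns, and the audit threshold here are hypothetical, not taken from the posting:

```python
# Illustrative sketch only: the claims table, its columns, and the
# billed_amount threshold are hypothetical audit-condition examples.
import sqlite3

def find_audit_alerts(conn):
    """Return rows where a hypothetical audit condition is met:
    claims whose billed amount exceeds a fixed threshold."""
    query = """
        SELECT claim_id, provider_id, billed_amount
        FROM claims
        WHERE billed_amount > 10000  -- hypothetical audit threshold
    """
    return conn.execute(query).fetchall()

# Build a small in-memory dataset to run the audit query against.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE claims (claim_id INTEGER, provider_id TEXT, billed_amount REAL)"
)
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [(1, "P100", 2500.0), (2, "P200", 15000.0), (3, "P100", 12000.0)],
)

alerts = find_audit_alerts(conn)
print(alerts)  # claims 2 and 3 exceed the threshold
```

In practice the query logic would come from business requirements and the result set would feed an alerting or reporting channel rather than a print statement.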
Top Three Skillsets needed:
1) Must be fully conversant in English and be able to work in a collaborative environment
2) Advanced SQL development skills
3) Self-starter – able to produce results with very little guidance
Anything else important we need to know to fill your role?: This is not a purely Technical role.
The individual must be able to work with business users to translate business requirements into technical specifications in a collaborative and interactive manner. Active listening skills are critical. Knowledge of HIPAA rules and audit framework highly desirable.
This is done primarily by building, aggregating, and manipulating rich datasets that support the analysis, models, and data visualizations.
You will work primarily with business users to translate business requirements into automated processes, while also partnering with database architects, data analysts, data scientists, and other data engineers.
Key Responsibilities include, but are not limited to:
Data Pipeline:
- Build robust data pipelines: develop jobs to process, validate, transport, collate, aggregate, and distribute data
- Build workflows that empower analysts to use data efficiently
- Develop data virtualizations and actual data movement and transformation processes, using the appropriate technologies, tools, and techniques to balance performance and cost objectives
- Define and establish processes to maintain the integrity of data within our data pipeline and warehouse
- Develop tests to validate and monitor data transfer integrity and efficiency
- Work collaboratively across the organization to address and predict data performance issues
- Develop documentation for the tools and data products deployed
Required Qualifications:
- Strong SQL writing skills
- Sound understanding of areas of computer science such as algorithms, data structures and databases
- Experience with relational databases
- Software engineering experience with expertise in at least one high-level programming language (preferably Python)
- Experience with the Linux command line and bash scripting
- Knowledge of distributed computing frameworks (e.g., MapReduce, HDFS)
- Experience working with Microsoft Azure cloud services
- Experience with DevOps tools and processes
- Bachelor's degree in computer science or a related field; 3-5 years of experience