The Data Engineer III will play a pivotal role in designing, developing, and maintaining the data architecture for the organization. This individual will be responsible for T-SQL development, ETL processes, and Python scripting to ensure efficient data transfer, transformation, and integration. The primary focus will be on automating data workflows, supporting diverse file formats, and utilizing APIs for seamless data import and export. The Data Engineer will collaborate closely with cross-functional teams to enhance data-driven decision-making and contribute to the overall success of our data infrastructure.
JOB RESPONSIBILITIES:
- Utilize advanced T-SQL skills to design, optimize, and maintain database structures.
- Develop complex SQL queries for reporting and data transformation.
- Design, implement, and optimize ETL processes for efficient data extraction, transformation, and loading.
- Develop and maintain ETL scripts using Python for seamless integration with existing data workflows.
- Automate SFTP file transfer processes, ensuring secure and reliable data exchange.
- Use Python to import and export data and files in various formats, including flat files, XML, JSON, X12, Excel, and Parquet.
- Utilize Python to integrate with APIs for importing and exporting data.
- Collaborate with team members and stakeholders to review and optimize code for efficiency and maintainability.
- Collaborate with analysts and business stakeholders to understand data visualization requirements.
- Work closely with cross-functional teams to understand data requirements and ensure data integrity.
- Document data engineering processes, standards, and best practices.
- Use Git and GitHub for version control of scripts and queries, ensuring a collaborative and organized development environment.
- Integrate Tableau for effective data representation and reporting.
QUALIFICATIONS:
Education: Bachelor's Degree in Computer Science, Information Technology, or a related field.
Experience: Minimum of 5 years of hands-on experience in T-SQL development and ETL processes. Proven experience in Python scripting for data manipulation and automation.
Technical Skills: Strong expertise in database design, optimization, and maintenance using T-SQL. Proficient in developing and optimizing ETL processes. Experience with automating SFTP file transfers and handling diverse file formats. Knowledge of data formats such as XML, JSON, X12, Excel, and Parquet. Familiarity with API integration for data import and export using Python. Proficient in version control of scripts and queries using Git and GitHub.
Preferred Skills: Experience in data visualization using Tableau. Strong problem-solving skills and the ability to troubleshoot data-related issues effectively. Ability to identify and implement solutions to complex technical problems. Ability to work independently and in a team environment. Excellent communication and collaboration skills. Knowledge of health care delivery and operations is a plus.
If you are a data engineering professional with a solid background in T-SQL, ETL, and Python scripting, and a passion for enhancing data infrastructure, we encourage you to apply and contribute to our dynamic and innovative team.
Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities, and activities may change at any time with or without notice.