Applied AI/ML Data Engineering Vice President
About the role
As a Data Engineering Vice President in our Data and AI team, you will contribute to the firm’s objectives by delivering rapid and scalable solutions that unlock value for Cerberus desks, portfolio companies, or other businesses/investments. You’ll do this by designing and implementing Data and AI systems for a broad range of business objectives. You may also participate in due diligence analyses of future investments.
Responsibilities:
- Build and deliver Data, AI and Engineering projects
- Delivery focused: Help design solutions using a rigorous hypothesis-based approach, partner with cross-functional teams, and execute development and implementation with a focus on value.
- Agile and pragmatic: Rapidly and iteratively deliver results in high-pressure projects, with the skill to pivot quickly as business needs and value dictate.
- Contemporary and innovative approach: Develop novel solutions using modern platforms (e.g., Cloud-based platforms like Azure, AWS, GCP), languages/technologies (e.g., Python, Spark, SQL, DBT), orchestration tools (e.g., ADF, Airflow, Dagster), and BI tools (e.g., Power BI), with an ability to build IP into reusable software.
- Structured approach: Bring order to disparate requirements with high tolerance for ambiguity, very strong problem-solving ability, and excellent stakeholder engagement skills.
- Synthesize analytical findings for consumption by senior business executives
- Communicator: Break down complex structures and problems into succinct components for a range of clients and colleagues at all levels of seniority
- Storytelling: Be a storyteller capable of delivering practical insights in a compelling manner
- Build a reputation as a trusted technologist and voice on Data and AI topics both internally and externally
- Technology polymath: Experienced with a wide range of technologies and able to learn and develop solutions across the full data science lifecycle and application stack
- Test & Learn mentality: Challenge our current best thinking, test theories, and iterate rapidly.
- Creativity: Invent new analyses to solve business problems
- Trusted voice: Establish a reputation for delivering on commitments and build trusted relationships with stakeholders.
- Subject matter expertise: Develop deep subject matter expertise across techniques, technologies, and industries.
Requirements:
General
- University degree in a STEM field or equivalent.
- 10+ years of hands-on experience delivering Data solutions in a production environment.
- Exceptional intellectual curiosity, problem solving, data intuition, and effectiveness in a team.
- Ability to communicate ideas and solutions to stakeholders clearly.
- Ability to manage and implement end-to-end data solutions supporting various AI/ML use cases.
Programming
- Highly proficient with SQL and at least one of Python, Java, or Scala
- Deep understanding of data structures and their run-time and memory complexities (e.g., lists, tuples, queues, trees, graphs)
- Able to perform code optimization, including measuring the complexity of alternative approaches.
- Writes efficient general-purpose libraries that the team can reuse across projects.
- Has good intuition for separating interface functions and implementation logic; comments functions appropriately.
- Reviews pull requests; provides feedback and guidance to junior team members during code review.
- Enforces best practices for source code management: Git policies, documentation, naming conventions, and directory structure.
Data
- Highly proficient with at least one Relational DB (e.g., MySQL, PostgreSQL, SQL Server, etc.)
- Deep understanding of OLAP vs OLTP solutions and storage solutions (Row Store vs Columnar Store)
- Proficient in developing entity-relationship diagrams (ERDs).
- Proficient with Data Warehousing solutions (e.g., Redshift, Snowflake, BigQuery, Lakehouse, Data Mesh, Medallion Architecture)
- Proficient with data modeling; can design complex data models (3NF, star schema, galaxy schema)
- Proficient with data privacy and security.
Cloud
- Proficiency with at least one Cloud platform (e.g., Azure) and knowledge of others
- Proficient with Cloud storage and file storage (e.g., Blob, Samba, NFS, SFTP)
- Proficient with Cloud services (e.g., Azure functions, API Management, Orchestration, Messaging queues, Kafka)
- Familiarity with Cloud monitoring and logging.
Deployment
- Proficient with version control (e.g., Git), containers (e.g., Docker), APIs, and orchestration
- Proficient with CI/CD pipelines
- Proficient with deploying APIs to serve model results; familiar with standard Python backend web frameworks (e.g., Flask, Django)
About Cerberus Capital Management & Cerberus Technology Solutions
Cerberus Capital Management (CCM) is a private equity firm with partial or full ownership stakes in over 40 companies in a variety of industries. Cerberus Technology Solutions (CTS) is a subsidiary of CCM focused exclusively on leveraging emerging technology, data, and advanced analytics to drive transformations.
Our expert technologists work closely with Cerberus investment and operating professionals across our global businesses and platforms on a variety of operating initiatives targeted at improving systems and generating value from data.
The base salary for this position is expected to be between $170,000.00 and $230,000.00. The base salary offered to the chosen candidate will be commensurate with a candidate’s relevant experience and other qualifications for the position, as determined by the Company in its sole discretion. In addition to base salary, this position is eligible for an annual discretionary bonus, which is often a meaningful portion of the compensation package, and a robust benefits package.