Job title: Data Information Architect
Location: Cambridge, MA 02141
Duration: 6 months
Job description:
- Candidate should have strong communication and presentation skills
- The team comprises 8 internal members and works with multiple teams in Digital, depending on the initiatives taken on, on a rolling basis.
- Top must-have skills:
  - Minimum 8 years of related experience
  - Data Mesh, domain-driven design
  - Data and object modeling
  - Database/data warehouse: AWS RDS, Snowflake, AWS Redshift
- Nice-to-have skills:
  - Previous biotech/pharma experience
  - Master Data Management and Governance
  - Big Data, Data Analytics
  - TOGAF and other architectural frameworks (TOGAF is an architectural framework like Zachman, UAM, and others)
Description:
- Designs and implements systems and frameworks that can succeed long term (the Data Architect designs architecture that meets business needs within the agreed requirements and is both pragmatic and supportive of the strategic architecture direction)
- Works in close collaboration with Business, Technology and Product stakeholders to contextualize the architecture design
- Verifies that non-functional requirements are considered as part of the solution architecture
- Produces design documents and solution roadmaps showing the transition states of the architecture
- Leverages the existing portfolio of digital products and catalog of services, and promotes the use of existing building blocks
- Defines and manages standards, guidelines and processes for analytics solutions and architecture, aligning with the broader architecture council
- Remains up to date on IT industry practices and emerging standards
- Recommends emerging technologies for the Data Analytics tech stack and optimizes the existing stack
- Contributes to the prioritization of technology enablers in solution roadmaps
- Champions high standards to ensure that Digital Data Standards for processes and technologies are shared and used
Key Functional Requirements & Qualifications:
- Experience with Agile methodologies, supporting and working with cross-functional teams
- Experience in all phases of the data lifecycle: concept & design, development, implementation, change, and operation
- Demonstrates excellent skills in translating requirements into workable, fit-for-purpose technical designs and solutions
- Extensive experience in technical solution identification and rationalization
- Facilitates design and review sessions to define broad solutions with the big picture in mind
- Ability to understand architectural dependencies of technologies in a vast and complex environment
- Explores how new technologies can be applied to solve challenging business problems and compiles architecture decision records (ADRs)
- Desires to work in a fast-paced, constantly evolving environment; ability to manage multiple priorities
- Ability to work independently as well as in a team with a proven ability to influence
- Structured, planned and traceable approach
- Integrity and trust -- unwavering commitment to "doing the right thing"
- Good communication and facilitation skills.
Key Technical Requirements & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Software Engineering, or a similar field
- Strong experience designing and implementing enterprise Data & Analytics architectures
Strong background in the following areas of data:
- Integration Services: Informatica Cloud, Airflow (a minimal pipeline sketch follows this list)
- Database/Data Warehouse: AWS RDS, Snowflake, AWS Redshift; knowledge of SQL
- Data modeling: SQLDBM
- Big Data: Experience with distributed systems (Spark, Hadoop) is a plus
- Data Quality: Informatica CDQ
- Master Data Management: Informatica MDM
- Metadata management/Data Catalog: Informatica EDC
- Data Mesh and domain-driven design
- Data Governance
- Data Security: Experience working within compliance (e.g., quality and regulatory requirements such as data privacy, GxP, SOX) and cybersecurity requirements is a plus
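To give a flavor of the integration work above, here is a minimal sketch of the kind of Airflow pipeline this role would design, assuming Airflow 2.x's TaskFlow API; the DAG, task, and data names are hypothetical placeholders:

    from datetime import datetime
    from airflow.decorators import dag, task

    # Hypothetical daily pipeline: extract records from a source system
    # and load them into the warehouse. All names and data are placeholders.
    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def example_customer_load():
        @task
        def extract() -> list[dict]:
            # In practice this would read from a source system
            # (e.g., via Informatica Cloud or a database connection).
            return [{"id": 1, "name": "Acme"}]

        @task
        def load(rows: list[dict]) -> None:
            # In practice this would write to AWS RDS, Snowflake, or Redshift.
            print(f"loading {len(rows)} rows")

        load(extract())

    example_customer_load()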
- Experience in Agile development processes and DevOps methodology/principles:
- Cloud-based services (AWS, Azure)
- Infrastructure as code (Terraform)
- Git-based source code management (GitHub) and Continuous Integration / Continuous Delivery (CI/CD) practices (GitHub Actions)
- Python, Shell scripting
- Data storage technologies (object, structured, and unstructured); an object-storage sketch follows at the end of this section
- Automating deployment, scaling, and management of containerized applications
- Experience with Data Science / cloud-based machine learning platforms (AWS Bedrock, AWS SageMaker, Snowflake Cortex, etc.) is a plus
- A minimum of 8 years of related experience is required.
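As a small illustration of the object-storage and Python scripting points above, a minimal sketch using boto3 against S3; the bucket and key names are hypothetical, and it assumes AWS credentials are already configured in the environment:

    import json
    import boto3

    # Hypothetical bucket and key; assumes AWS credentials are configured
    # (e.g., via environment variables or an instance profile).
    s3 = boto3.client("s3")

    record = {"id": 1, "name": "Acme"}

    # Write a small JSON object to object storage...
    s3.put_object(
        Bucket="example-data-bucket",
        Key="raw/customers/1.json",
        Body=json.dumps(record).encode("utf-8"),
    )

    # ...and read it back.
    obj = s3.get_object(Bucket="example-data-bucket", Key="raw/customers/1.json")
    print(json.loads(obj["Body"].read()))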