We have a requirement for a Big Data Enterprise Architect position.
Big Data Enterprise Architect
Dallas, TX
Onsite W2 Contract
Role & Responsibilities
- Support technical pre-sales efforts and attend customer meetings as needed, acting as a trusted advisor to clients.
- Guide clients in defining and executing the roadmap and vision for the enterprise data warehouse, big data, BI & analytics, and data management.
- Recommend and drive standards for data quality across multiple datasets and data governance best practices, building data trust across the client enterprise.
- Lead cloud strategy development and modernization, defining business drivers, the cloud adoption roadmap, and business value.
- Provide program-level support on client development projects in big data and data warehouse/data lake.
- Participate in and oversee the design of cloud solutions, from conceptual through logical and physical design, to meet the client's business and technical requirements, leveraging architecture patterns and following established development processes.
- Partner with client business and technology stakeholders to drive the future-state architecture for enterprise data, reporting, and analytics platforms and solutions.
- Define architecture blueprints and advise clients on technology strategy, migration/modernization, and cloud adoption.
- Lead data migration and modernization projects for databases and data warehouses (Oracle or Teradata) from on-premises to AWS/GCP cloud.
Qualifications
- 15+ years of IT experience in big data, data warehousing, and data architecture.
- Experience with RDBMS and data warehouse/data lake platforms such as Oracle, Teradata, and Snowflake, and with BI and ETL tools such as DataStage and Tableau.
- Experience implementing, managing, and tuning data warehouses with complex ETL data pipelines.
- Experience in data modeling, schema design, query tuning and optimization, and data migration and integration.
- Prior project experience in data normalization, metadata management, and source-to-target data mapping documentation using industry-standard tools.
- Prior experience executing data quality and data governance projects.
- Detailed understanding of cloud infrastructure for building and maintaining large enterprise data estates on platforms such as GCP (BigQuery, Bigtable, Dataflow, Dataproc, Dataprep, Pub/Sub, Machine Learning).
- Minimum of 3 years of experience developing and implementing enterprise-level data solutions using large data sets.
- Good knowledge of data governance and compliance regulations such as GDPR/CCPA/P.