Data Engineer
- Location: Dallas, TX
- Position: On-site
- Client: Mid-sized bank
- Contract: 12+ months
- Open to C2C or W2
The client is seeking a Data Engineer with experience in Google BigQuery, GCP, ETL pipelines, BI, cloud skills, and Microsoft SQL Server, with experience in both design and build.
Job Description -
We are looking for a Senior Google Cloud Data Engineer who wants to collaborate in an agile team of peers developing a cloud-based analytics platform that integrates data from a broad range of systems to enable next-generation analytical products.
The Data Engineering Google Cloud Platform (GCP) Engineer is responsible for developing and delivering effective cloud solutions for different business units. The position requires in-depth knowledge of and expertise in GCP services, architecture, and best practices. They will collaborate with cross-functional teams to design, implement, and manage scalable, reliable cloud solutions.
Qualifications:
What will help you succeed:
- Bachelor's degree in Computer Science/IT; Master's in Data Analytics/Information Technology/Management Information Systems preferred
- 5-10 years of professional experience building data engineering capabilities across various analytics portfolios, including at least 2-3 years on GCP/cloud-based platforms
- Strong understanding of data fundamentals, knowledge of data engineering, and familiarity with core cloud concepts
- Must have solid implementation experience with GCP data storage and processing services such as BigQuery, Dataflow, Bigtable, Dataform, Data Fusion, Cloud Spanner, and Cloud SQL
- Must have solid programming experience with SQL, Python, and Apache Spark
Your expertise in one or more of the following areas is highly valued:
- Google Cloud Platform, ideally with Google BigQuery, Cloud Composer, Cloud Data Fusion, Cloud Spanner, and Cloud SQL
- Experience with legacy data warehouses (on SQL Server or any relational data warehouse platform)
- Experience with our main tools: dbt (Data Build Tool), Terraform/Terragrunt, and Git (CI/CD)
- Experience with a testing framework
- Experience with business intelligence tools such as Power BI and/or Looker
What sets you apart:
- Experience in complex migrations from legacy data warehousing solutions or on-prem Data Lakes to GCP
- Experience building generic, reusable capabilities, and an understanding of data governance and quality frameworks
- Experience in building real-time ingestion and processing frameworks on GCP.
- Adaptability to learn new technologies and products as the job demands.
- Multi-cloud & hybrid cloud experience
- Any cloud certification (preference for GCP certifications)
- Experience working in the financial services and banking industry
Responsibilities:
- Work directly on the platform, based on Google BigQuery and other GCP services, to integrate new data sources and model the data up to the serving layer.
- Contribute to a unique opportunity: the program is set up to completely rethink reporting and analytics with cloud technology.
- Collaborate with different business groups and users to understand their business requirements, then design and deliver the GCP architecture and data engineering scope of work.
- Work on a large-scale data transformation program with the goal of establishing a scalable, efficient, and future-proof data and analytics platform.
- Work with cross-functional teams to design, implement, and manage scalable and reliable cloud solutions on GCP.
- Stay up to date with the latest GCP technologies, trends, and best practices, and assess their applicability to client solutions.