GCP Data Architect

Added: April 11, 2023
  • Country: United States
  • Region: Michigan



Position: GCP Data Architect - Hadoop

Duration: Contract

Location: Dearborn, MI (Day 1 onsite)

 

Detailed Job Description


  • Good to have: GCP certification (either GCP Data Engineer or GCP Cloud
Architect)
  • 15+ years of experience architecting data projects, with knowledge of
multiple Hadoop/Hive/Spark/ML implementations
  • 5+ years of experience in data modeling and in data warehouse and data
lake implementation
  • Working experience implementing Hadoop-to-GCS and Hive-to-BigQuery
migration projects
  • Ability to identify and gather requirements to define a solution to be
built and operated on GCP, and to perform high-level and low-level design for
the GCP platform
  • Ability to implement and provide GCP operations and deployment guidance
and best practices throughout the lifecycle of a project
  • GCP technology areas: Datastore, BigQuery, Cloud Storage, Persistent Disk,
IAM, roles, projects, and organizations
  • Databases including Bigtable, Cloud SQL, Cloud Spanner, and Memorystore;
data analytics with Dataflow, Dataproc, and Cloud Pub/Sub; Kubernetes, Docker,
managing containers, container autoscaling, and container security
  • Experience in design, deployment, configuration, and integration of
application infrastructure resources including GKE clusters, Anthos, Apigee,
and DevOps platforms
  • Application development concepts and technologies (e.g., CI/CD, Java,
Python)

