GCP Data Architect

Employer Active

Job Location

Phoenix, AZ - USA

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Role: GCP Data Architect with 12 years of experience

Location: Phoenix, AZ (Hybrid)

Duration: Long Term Contract

Must be certified in GCP

Key Skills: Data Architecture, GCP, Data Governance, Metadata Management, DevOps, CI/CD, Python, Scala

Strong experience with GCP (not AWS/Azure)

JOB SUMMARY & PRINCIPAL DUTIES:


Solid experience with, and an understanding of, the considerations for large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.
Monitor the Data Lake and Warehouse to ensure that the appropriate support teams are engaged at the right times.
Design, build, and test scalable data ingestion pipelines, and perform end-to-end automation of the ETL process for the various datasets being ingested (see the sketch after this list).
Participate in peer reviews and provide feedback to engineers, keeping development best practices and business and technical requirements in view.
Determine the best way to extract application telemetry data, structure it, and send it to the proper tool for reporting (Kafka, Splunk).
Work with business and cross-functional teams to gather and document requirements that meet business needs.
Provide support as required to ensure the availability and performance of ETL/ELT jobs.
Provide technical assistance and cross-training to business and internal team members.
Collaborate with business partners on continuous improvement opportunities.
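To make the ingestion-pipeline duty above concrete, here is a minimal sketch of a batch ETL pipeline using the Apache Beam Python SDK (the programming model behind Dataflow). The bucket path, file pattern, table name, and schema are illustrative placeholders, not details from this posting.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def to_row(line):
        # Parse one newline-delimited JSON record into a BigQuery row dict.
        record = json.loads(line)
        return {"event_id": str(record.get("id")), "payload": line}


    def run():
        # With the DataflowRunner, project/region/temp_location options
        # would be supplied here; the local DirectRunner is the default.
        options = PipelineOptions()
        with beam.Pipeline(options=options) as p:
            (
                p
                # Hypothetical bucket and file pattern.
                | "Read" >> beam.io.ReadFromText("gs://example-bucket/raw/events-*.json")
                | "Parse" >> beam.Map(to_row)
                # Hypothetical dataset.table and schema.
                | "Write" >> beam.io.WriteToBigQuery(
                    "example_dataset.events",
                    schema="event_id:STRING,payload:STRING",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )


    if __name__ == "__main__":
        run()

End-to-end automation then typically means scheduling a pipeline like this (for example from Cloud Composer) and wiring its success or failure into monitoring.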


Requirements

JOB SPECIFICATIONS:

Education: Bachelor's degree, with 12 years of experience



Experience, Skills & Qualifications:


6 years of experience in Data Engineering, with an emphasis on Data Warehousing and Data Analytics.
4 years of experience with one of the leading public clouds.
4 years of experience in the design and build of scalable data pipelines that handle extraction, transformation, and loading.
4 years of experience with Python/Scala, with a working knowledge of notebooks.
2 years of hands-on experience on GCP cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.); see the orchestration sketch after this list.
At least 2 years of experience in Data Governance and Metadata Management.
Ability to work independently, solve problems, and keep stakeholders updated.
Analyze, design, develop, and deploy solutions per business requirements.
Strong understanding of relational and dimensional data modeling.
Experience with DevOps and CI/CD-related technologies.
Excellent written and verbal communication skills, including experience in technical documentation and the ability to communicate with senior business managers and executives.
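As a minimal illustration of the Cloud Composer/Airflow item above, the DAG below loads files from Cloud Storage into a BigQuery staging table and then runs a SQL transform. It assumes the apache-airflow-providers-google package; the bucket, dataset, table, and field names are hypothetical.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="example_daily_etl",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Load newline-delimited JSON from a (hypothetical) bucket into a
        # staging table, autodetecting the schema.
        load = GCSToBigQueryOperator(
            task_id="load_raw",
            bucket="example-bucket",
            source_objects=["raw/events-*.json"],
            destination_project_dataset_table="example_dataset.events_raw",
            source_format="NEWLINE_DELIMITED_JSON",
            write_disposition="WRITE_APPEND",
            autodetect=True,
        )

        # Rebuild a daily summary table from the staging data;
        # event_ts is an assumed TIMESTAMP field in the raw records.
        transform = BigQueryInsertJobOperator(
            task_id="build_summary",
            configuration={
                "query": {
                    "query": (
                        "CREATE OR REPLACE TABLE example_dataset.daily_summary AS "
                        "SELECT DATE(event_ts) AS day, COUNT(*) AS events "
                        "FROM example_dataset.events_raw GROUP BY day"
                    ),
                    "useLegacySql": False,
                }
            },
        )

        load >> transform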


Employment Type

Full Time
