GCP Data Engineer
Location: Remote (candidates near Grand Rapids, MI will have an advantage)
Primary Skills: GCP BigQuery, GCP data engineering stack
JD:
10 to 15 years of proven experience in modern cloud data engineering, broad exposure to the wider data landscape, and solid software engineering experience.
Prior experience architecting and building successful self-service, enterprise-scale data platforms in a greenfield environment with a microservices-based architecture.
Proficiency in building end-to-end data platforms and data services in GCP is a must.
Proficiency in tools and technologies: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, Pub/Sub.
Experience with microservices architectures, Kubernetes, and Docker. Our microservices are built on the TypeScript, NestJS, Node.js stack; candidates with this experience are preferred.
Proficiency in architecting, designing, and developing batch and real-time streaming infrastructure and workloads.
Solid experience architecting and implementing metadata management, including data catalogs, data lineage, data quality, and data observability for big data workflows.
Hands-on experience with the GCP ecosystem and data lakehouse architectures.
Strong understanding of data modeling, data architecture, and data governance principles.
Strong experience with DataOps principles and test automation.
Strong experience with observability tooling: Grafana, Datadog.
Experience with Data Mesh architecture.
Experience building semantic layers for data platforms.
Experience building scalable IoT architectures.
Full-time