
CDB AI & Analytics - Development - Sr. Developer - MDV115

Employer Active

This job posting is outdated and the position may already have been filled.

Job Location

Dearborn - USA

Monthly Salary

Not Disclosed

Job Description

1. Job title: Looking for a Sr. Developer
2. Job summary:
   - Minimum 8+ years of working experience in Big Data, Hadoop, Spark, Python, Scala, Kafka, SQL, ETL development, and data modelling
   - Hands-on experience in GCP: BigQuery, GCS buckets, Cloud Functions, Cloud Dataflow, Pub/Sub, Cloud Shell, gsutil, the bq command-line utility, Dataproc, and Stackdriver
   - Experience writing a Cloud Function to load on-arrival files from a GCS bucket into BigQuery
   - Experience writing a program to maintain raw file archival in a GCS bucket
3. Experience: 5 to 9 years
4. Required skills:
   - Technical: Python, Hadoop Administration, Big Data Management
   - Domain: Industrial Manufacturing, Manufacturing Oper-Manlog
5. Nice-to-have skills:
   - Technical: Big Data, Java
   - Domain: none specified
6. Technology: Data Management
7. Shift: Day, 9:00 AM - 7:00 PM
8. Roles & responsibilities:
   - Minimum 8+ years of working experience in Big Data, Hadoop, Spark, Python, Scala, Kafka, SQL, ETL development, and data modelling
   - Hands-on experience in GCP: BigQuery, GCS buckets, Cloud Functions, Cloud Dataflow, Pub/Sub, Cloud Shell, gsutil, the bq command-line utility, Dataproc, and Stackdriver
   - Write a Cloud Function to load on-arrival files from a GCS bucket into BigQuery (a minimal sketch of this pattern follows the description)
   - Write a program to maintain raw file archival in a GCS bucket
   - Design schemas in BigQuery for full-scan OLAP/BI use cases; use technologies for disk I/O throughput and cloud-platform economy of scale, including combining MapReduce and BigQuery for better performance
   - Load data incrementally into the BigQuery raw and UDM layers using SOQL, Google Dataproc, GCS buckets, Hive, Spark, Scala, Python, gsutil, and shell scripts
   - Write programs to download database dumps (SQL Server, Oracle, DB2) and load them into a GCS bucket, from the GCS bucket into a database hosted in Google Cloud, and into BigQuery using Python/Spark/Scala/Dataproc
   - Process and load bounded and unbounded data from Google Pub/Sub to BigQuery using Cloud Dataflow with a scripting language
   - Use the BigQuery REST API with Python/Spark/Scala to ingest data from other sites into BigQuery and build App Engine-based dashboards
   - Participate in the architecture council for database architecture recommendations
   - Perform deep analysis of SQL execution plans and recommend hints, restructuring, indexes, or materialized views for better performance
   - Open an SSH tunnel to Google Dataproc to access the YARN manager and monitor Spark jobs
   - Submit Spark jobs using gsutil and spark-submit for execution on a Dataproc cluster (also sketched after this description)

   Qualifications:
   - 2+ years of experience with Google Cloud Platform technologies
   - Experience with private, hybrid, or public cloud technology
   - Process GCP and other cloud implementation an…
9. Job location: Primary: USMIDEAC01-Dearborn - MI, USA, CLT; Alternate: not specified
10. Job type: 60CW00 Business Associate
11. Demand requires travel? No
12. Certification(s) required: NA
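
To illustrate the GCS-to-BigQuery load pattern mentioned in the responsibilities, here is a minimal sketch of a Python background Cloud Function that fires when a file arrives in a bucket and loads it into BigQuery. The project, dataset, and table names are placeholders, and a CSV file with schema autodetection is assumed; none of these specifics come from the posting.

```python
# Hypothetical sketch: background Cloud Function (GCS finalize trigger) that loads
# a newly arrived CSV file from the bucket into a BigQuery table.
# The destination table name below is a placeholder, not part of the posting.
from google.cloud import bigquery

BQ_TABLE = "my-project.raw_layer.incoming_events"  # placeholder destination table

def load_gcs_file_to_bq(event, context):
    """Triggered when a file is finalized (arrives) in the watched GCS bucket."""
    uri = f"gs://{event['bucket']}/{event['name']}"

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,          # assume a header row
        autodetect=True,              # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # Start the load job and block until it finishes so errors surface in the logs.
    load_job = client.load_table_from_uri(uri, BQ_TABLE, job_config=job_config)
    load_job.result()
    print(f"Loaded {uri} into {BQ_TABLE}")
```

Such a function would typically be deployed with a google.storage.object.finalize trigger on the landing bucket, so every on-arrival file is appended to the raw table automatically.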
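The responsibilities also mention submitting Spark jobs for execution on a Dataproc cluster. The sketch below shows one way to do that from Python with the google-cloud-dataproc client; the project, region, cluster name, and gs:// path are assumptions for illustration, and the gsutil/spark-submit route named in the posting is an equally valid alternative.

```python
# Hypothetical sketch: submit a PySpark job to an existing Dataproc cluster and
# wait for it to finish. Project, region, cluster, and file URI are placeholders.
from google.cloud import dataproc_v1

PROJECT = "my-project"
REGION = "us-central1"
CLUSTER = "etl-cluster"

def submit_pyspark_job(main_file="gs://my-bucket/jobs/load_to_bq.py"):
    client = dataproc_v1.JobControllerClient(
        client_options={"api_endpoint": f"{REGION}-dataproc.googleapis.com:443"}
    )
    job = {
        "placement": {"cluster_name": CLUSTER},
        "pyspark_job": {"main_python_file_uri": main_file},
    }
    # Submit the job and block until it reaches a terminal state.
    operation = client.submit_job_as_operation(
        request={"project_id": PROJECT, "region": REGION, "job": job}
    )
    result = operation.result()
    print(f"Job {result.reference.job_id} finished in state {result.status.state.name}")

if __name__ == "__main__":
    submit_pyspark_job()
```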

Employment Type

Full Time

Company Industry

IT - Software Services

About Company

10 employees
Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We always make certain that our clients do not endorse any requests for money payments, and we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please contact us via the Contact Us page.