Salary not disclosed
Additional Information:
A high priority will be given to Dearborn-local candidates (or those willing to relocate). The expectation is in-office at least 1 day per week and remote the rest of the time, subject to change.
DATA ENGINEER ROLE; NOT A DEVELOPER
HACKERRANK TEST IN GCP
HACKERRANK CODING IS REQUIRED. You will provide me the email address to send the HackerRank link. The sooner candidates take the HackerRank test and I get results, the sooner I can talk to them and submit.
Job Description:
We're seeking a Data Engineer who has experience building data products on a cloud analytics platform.
You will work on ingesting, transforming, and analyzing large datasets to support the Enterprise in the Data Factory on Google Cloud Platform (GCP).
Experience with large-scale solutioning and operationalization of data lakes, data warehouses, and analytics platforms on Google Cloud Platform or other cloud environments is a must.
We are looking for candidates who have a broad set of technical skills across these areas.
You will:
Work in a collaborative environment that leverages paired programming
Work on a small agile team to deliver curated data products
Work effectively with fellow data engineers, product owners, data champions, and other technical experts
Demonstrate technical knowledge and communication skills, with the ability to advocate for well-designed solutions
Develop exceptional analytical data products using both streaming and batch ingestion patterns on Google Cloud Platform, with solid data warehouse principles
Be the Subject Matter Expert in Data Engineering, with a focus on GCP-native services and other well-integrated third-party technologies
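The streaming and batch ingestion patterns mentioned above can be sketched in plain Python. This is only an illustrative sketch: the record shape and field names are invented, and on GCP these patterns would map to services like Pub/Sub and Dataflow (streaming) or BigQuery load jobs (batch) rather than in-process code.

```python
from queue import Queue
from typing import Iterable

def curate(r: dict) -> dict:
    """Shared transform: derive a dollar amount from raw cents (illustrative fields)."""
    return {**r, "amount_usd": round(r["amount_cents"] / 100, 2)}

def batch_ingest(rows: Iterable[dict]) -> list[dict]:
    """Batch pattern: read a bounded dataset and transform it all at once."""
    return [curate(r) for r in rows]

def stream_ingest(source: Queue) -> Iterable[dict]:
    """Streaming pattern: transform records one at a time as they arrive."""
    while True:
        r = source.get()
        if r is None:  # sentinel marks end of stream
            break
        yield curate(r)

# Both patterns yield the same curated records for the same input.
raw = [{"id": 1, "amount_cents": 1250}, {"id": 2, "amount_cents": 300}]
batch_out = batch_ingest(raw)

q: Queue = Queue()
for r in raw:
    q.put(r)
q.put(None)
stream_out = list(stream_ingest(q))
```

The design point is that the curation logic is shared; only the delivery mechanism (bounded read vs. unbounded arrival) differs between the two patterns.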
Primary Skills Required:
Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment; implement methods to automate all parts of the pipeline to minimize labor in development and production
Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build analytical domains and reusable data products
Experience working with architects to evaluate and productionize data pipelines for data ingestion, curation, and consumption
Experience working with stakeholders to formulate business problems as technical data requirements, then identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management
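The ingestion-to-consumption pipeline described in the skills above can be sketched as three composable stages. All names and the in-memory "warehouse" are illustrative stand-ins; in this role each stage would target GCP services (e.g., Cloud Storage, Dataflow, BigQuery) instead.

```python
def ingest(raw: list[str]) -> list[dict]:
    """Ingestion: parse raw comma-separated records into typed rows."""
    rows = []
    for line in raw:
        name, value = line.split(",")
        rows.append({"name": name.strip(), "value": int(value)})
    return rows

def curate(rows: list[dict]) -> list[dict]:
    """Curation: drop invalid rows, deduplicate by name keeping the max value."""
    best: dict[str, int] = {}
    for r in rows:
        if r["value"] >= 0:  # negative values treated as invalid here
            best[r["name"]] = max(best.get(r["name"], r["value"]), r["value"])
    return [{"name": k, "value": v} for k, v in sorted(best.items())]

def publish(rows: list[dict], warehouse: list[dict]) -> list[dict]:
    """Consumption: append curated rows to the (stand-in) warehouse."""
    warehouse.extend(rows)
    return warehouse

# Running the whole pipeline end to end:
warehouse: list[dict] = []
publish(curate(ingest(["a, 3", "b, -1", "a, 5"])), warehouse)
```

Keeping each stage a pure function of its input is what makes the pipeline easy to automate and test end to end.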
Additional Skills Preferred:
Strong drive for results and ability to multitask and work independently
Self-starter with proven innovation skills
Ability to communicate and work with cross-functional teams and all levels of management
Demonstrated commitment to quality and project timing
Demonstrated ability to document complex systems
Experience in creating and executing detailed test plans
Experience Required:
5 years of SQL development experience
5 years of analytics/data product development experience required
3 years of Google Cloud experience with solutions designed and implemented at production scale
Experience working with GCP-native (or equivalent) services such as BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Build, etc.
Experience migrating Teradata to GCP
Experience working with Airflow for scheduling and orchestration of data pipelines
Experience working with Terraform to provision Infrastructure as Code
2 years of professional development experience in Java or Python
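To give a flavor of the SQL development work the requirements above describe, here is a small aggregation that builds a curated product from raw events. SQLite stands in for BigQuery purely so the example is self-contained; the table and column names are invented.

```python
import sqlite3

# In-memory database standing in for a cloud data warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("u1", 10.0), ("u1", 5.0), ("u2", 7.5)],
)

# Curated product: total spend per user, largest first.
totals = conn.execute(
    """
    SELECT user_id, SUM(amount) AS total
    FROM events
    GROUP BY user_id
    ORDER BY total DESC
    """
).fetchall()

print(totals)  # [('u1', 15.0), ('u2', 7.5)]
```

The same GROUP BY/ORDER BY shape carries over to BigQuery SQL, where it would run over partitioned tables at production scale.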
Additional Experience Preferred:
In-depth understanding of Google's product technology (or another cloud platform) and the underlying architectures
Experience in working with DBT/Dataform
Experience with Dataplex or other data catalogs
Experience with a development ecosystem such as Tekton, Git, and Jenkins for CI/CD pipelines
Exceptional problem-solving and communication skills
Full-time