
Data Engineer - GCP

Employer Active

1 Vacancy

Job Location

Pune - India

Monthly Salary

Not Disclosed

Number of Vacancies

1 Vacancy

Job Description

We are seeking a skilled and experienced GCP (Google Cloud Platform) Data Engineering Specialist to join our team. The ideal candidate should have 3 years of relevant experience, with expertise in BigQuery, Dataflow, Spark, and Pub/Sub. As a Data Engineering Specialist, you will be responsible for designing, developing, and maintaining data pipelines, data integration solutions, and ETL processes on the GCP platform to support our data-driven applications and analytics initiatives.
Responsibilities:
  • Design, develop, and maintain data pipelines and ETL processes using BigQuery, Dataflow, Spark, and Pub/Sub on the GCP platform (a minimal streaming-pipeline sketch follows this list).
  • Collaborate with data scientists, data analysts, and other stakeholders to gather requirements and define data engineering solutions that meet business needs.
  • Optimize and troubleshoot data pipelines for performance, reliability, and scalability.
  • Ensure data quality and integrity by implementing data validation, cleansing, and transformation processes.
  • Monitor and manage data processing workflows; troubleshoot and resolve data processing issues.
  • Develop and maintain documentation for data engineering processes, workflows, and best practices.
  • Stay updated with the latest advancements in GCP data engineering technologies, and recommend and implement improvements to existing data engineering processes.
  • Strong GCP, Hadoop, Hive, Spark, Unix shell scripting, and Python experience.
  • Exceptional troubleshooting skills in the following: GCP, Hadoop, Hive, Spark, and Unix shell scripting.
  • Construct and maintain ELT/ETL job processes sourcing from disparate systems throughout the enterprise and loading them into an enterprise data lake.
  • Effectively acquire and translate user requirements into technical specifications to develop automated data pipelines that satisfy business demand.
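
As an illustration of the pipeline work described above, the following is a minimal sketch of a streaming Dataflow job written with the Apache Beam Python SDK, reading events from Pub/Sub and appending them to a BigQuery table. The project, subscription, table, and field names are hypothetical placeholders and are not taken from this posting.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(message: bytes) -> dict:
        """Decode a Pub/Sub message payload into a BigQuery row dict."""
        event = json.loads(message.decode("utf-8"))
        # Field names here are hypothetical placeholders.
        return {"user_id": event["user_id"], "action": event["action"], "ts": event["ts"]}


    def run() -> None:
        # streaming=True marks this as an unbounded (streaming) pipeline;
        # the runner, project, and region are supplied via command-line flags.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    subscription="projects/my-project/subscriptions/events-sub")
                | "ParseJson" >> beam.Map(parse_event)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    table="my-project:analytics.events",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )


    if __name__ == "__main__":
        run()

A job like this would typically be launched with the DataflowRunner, passing the project, region, and temp-location flags on the command line.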
Skills and Qualifications:
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 3 years of relevant experience in data engineering, with a strong focus on GCP technologies including BigQuery, Dataflow, Spark, and Pub/Sub.
  • Hands-on experience in designing, developing, and optimizing data pipelines, ETL processes, and data integration solutions on GCP (a minimal PySpark cleansing sketch follows this list).
  • Proficiency in programming languages such as Python or Scala, with experience in writing efficient and scalable code for data processing.
  • Strong understanding of data modeling, data warehousing, and data integration concepts.
  • Knowledge of data lake and data warehouse architectures, data governance, and data security best practices on GCP.
  • Excellent problem-solving skills, with the ability to troubleshoot and resolve complex data engineering issues.
  • Strong communication and collaboration skills to work effectively with cross-functional teams and stakeholders.
  • GCP certifications, such as Google Cloud Certified Data Engineer, Google Cloud Certified Professional Data Engineer, or related certifications, are a plus.
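
For the hands-on pipeline and data-quality items above, the following is a minimal PySpark sketch of a batch validation and cleansing step: it de-duplicates records, filters out invalid rows, and writes a partitioned, curated copy back to a data lake bucket. The bucket paths and column names are hypothetical placeholders, not details from this posting.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-cleansing").getOrCreate()

    # Read raw JSON landed in the data lake (hypothetical path).
    raw = spark.read.json("gs://my-bucket/raw/orders/")

    clean = (
        raw
        .dropDuplicates(["order_id"])                                  # de-duplicate on the business key
        .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))   # basic validation rule
        .withColumn("order_date", F.to_date("created_at"))             # normalize the date column
    )

    # Write the curated output back, partitioned for efficient downstream reads.
    clean.write.mode("overwrite").partitionBy("order_date").parquet("gs://my-bucket/curated/orders/")

On GCP, a step like this would typically run as a Dataproc batch job, with the gs:// paths served by the Cloud Storage connector.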

Skills: GCP, Spark, Dataflow, Data Engineering, Data Integration, ETL

Employment Type

Full Time

About the Company

Disclaimer: Drjobs is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We take care to ensure that no monetary payments are requested by our clients, and we therefore advise against sharing any personal or bank-account-related information with any third party. If you suspect fraud or misconduct, please contact us by filling out the form on the Contact Us page.