Role: GCP Data Engineer
Location: Phoenix, AZ
Min Exp: 11 years
Prefers candidates who are certified in GCP
The client is looking for someone with:
- Strong skills in GCP and Python
- Knowledge of migration projects
- Experience building reusable, idempotent pipelines (see the sketch after this list)
- An emphasis on data engineering principles in the pipelines being built
- Comprehensive knowledge of core GCP services
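
As a point of reference for the "reusable, idempotent pipelines" bullet above, here is a minimal sketch in Python; the table, column names, and data are hypothetical and not from the client. The point it illustrates is that an idempotent load is keyed by batch, so re-running the same batch converges to the same state instead of duplicating rows:

```python
import sqlite3

def load_batch(conn: sqlite3.Connection, batch_date: str, rows: list[tuple]) -> None:
    """Idempotent load: delete-then-insert keyed by batch_date,
    so re-running the same batch never duplicates rows."""
    with conn:  # one transaction: a rerun either fully replaces the batch or changes nothing
        conn.execute("DELETE FROM sales WHERE batch_date = ?", (batch_date,))
        conn.executemany(
            "INSERT INTO sales (batch_date, sku, amount) VALUES (?, ?, ?)",
            [(batch_date, sku, amount) for sku, amount in rows],
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (batch_date TEXT, sku TEXT, amount REAL)")
load_batch(conn, "2024-01-01", [("A", 10.0), ("B", 5.0)])
load_batch(conn, "2024-01-01", [("A", 10.0), ("B", 5.0)])  # rerun: still 2 rows
assert conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0] == 2
```

The same delete-and-reload (or MERGE) pattern applies to partitioned warehouse tables such as BigQuery on GCP.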
JOB SUMMARY & PRINCIPAL DUTIES:
- Solid experience with, and understanding of, the considerations for large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.
- Monitor the data lake and warehouse to ensure that the appropriate support teams are engaged at the right times.
- Design, build, and test scalable data ingestion pipelines; perform end-to-end automation of the ETL process for the various datasets being ingested.
- Participate in peer reviews and provide feedback to engineers, keeping development best practices and business and technical requirements in view.
- Determine the best way to extract application telemetry data, structure it, and send it to the proper tool for reporting (Kafka, Splunk).
- Work with business and cross-functional teams to gather and document requirements that meet business needs.
- Provide support as required to ensure the availability and performance of ETL/ELT jobs.
- Provide technical assistance and cross-training to business and internal team members.
- Collaborate with business partners to identify continuous improvement opportunities.
JOB SPECIFICATIONS:
- Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field
Experience, Skills & Qualifications:
- 6 years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.
- 4 years of experience with one of the leading public clouds.
- 4 years of experience in the design and build of scalable data pipelines covering extraction, transformation, and loading.
- 4 years of experience with Python or Scala, with working knowledge of notebooks.
- 2 years of hands-on experience on GCP cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.); see the DAG sketch at the end of this posting.
- At least 2 years of experience in data governance and metadata management.
- Ability to work independently, solve problems, and keep stakeholders updated.
- Analyze, design, develop, and deploy solutions per business requirements.
- Strong understanding of relational and dimensional data modeling.
- Experience with DevOps and CI/CD-related technologies.
- Excellent written and verbal communication skills, including experience in technical documentation and the ability to communicate with senior business managers and executives.
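
For the Cloud Composer / Airflow item under Experience above, a minimal sketch of what such pipelines are built around, assuming Airflow 2.x (which Cloud Composer 2 runs); the DAG name and callables are illustrative, not from the posting:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Pull raw records from the source system (stubbed here).
    print("extracting")

def transform():
    # Clean and reshape the extracted data (stubbed here).
    print("transforming")

def load():
    # Write the transformed data to the warehouse (stubbed here).
    print("loading")

with DAG(
    dag_id="example_daily_etl",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",      # one scheduled run per day
    catchup=False,                   # skip backfilling past runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task  # linear extract -> transform -> load dependency
```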