Key Responsibilities
- Designing and implementing scalable, high-performance, GCP-based big data solutions
- Leading architecture design discussions with stakeholders to understand requirements and align on solutions
- Developing data pipelines and ETL processes to move, transform, and aggregate large volumes of data
- Optimizing data storage, retrieval, and processing using GCP services such as BigQuery, Datastore, and Pub/Sub
- Implementing security and access controls for big data solutions on GCP
- Ensuring data governance and compliance with regulatory requirements
- Collaborating with data scientists and analysts to provide access to and enable analysis of big data
- Monitoring and optimizing the performance of GCP big data solutions
- Identifying opportunities for innovation and improvement in technology and processes
- Providing technical guidance and mentoring to junior team members
Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred
- 5 years of experience in architecting and implementing GCP solutions for big data
- Proficiency in big data technologies and frameworks such as Hadoop, Spark, or Kafka
- Expertise in GCP services such as BigQuery, Dataflow, Dataproc, and Pub/Sub
- Demonstrated experience in designing and implementing large-scale data systems
- Strong understanding of cloud security principles and best practices
- Excellent problem-solving and troubleshooting skills
- Experience with agile development methodologies and DevOps practices
- Ability to communicate complex technical concepts effectively to non-technical stakeholders
- Certifications in GCP and big data technologies are a plus
Keywords: GCP, big data, cloud, data management, data warehouse, data architecture