Job description
- Act as a subject matter expert in data engineering and GCP data technologies.
- Work with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.
- Work with Agile and DevOps techniques and implementation approaches in the delivery.
- Showcase your GCP data engineering experience when communicating with clients about their requirements, turning these into technical data solutions.
- Build and deliver data solutions using GCP products and offerings.
- Liaise with and be part of our extensive GCP community, contributing to the platform's knowledge-exchange learning programme.
Skills:
- Deep, hands-on experience working with Google data products (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.).
- Experience in Spark (Scala/Python/Java) and Kafka.
- Experience in MDM, Metadata Management, Data Quality and Data Lineage tools.
- End-to-end (E2E) data engineering and lifecycle management, including non-functional requirements and operations.
- Regulatory and Compliance work in Data Management.
- E2E solution design skills: prototyping, usability testing, and data visualization literacy.
- Experience with SQL and NoSQL modern data stores.