Experience: 12-15 years
As GCP Lead, you will be responsible for supervising D2P activity (development, code review, test phases, implementation plan) for solutions on GCP. You will contribute to pre-sales and design; implement scalable information solutions covering data security, data privacy, data governance, metadata management, multi-tenancy, and mixed-workload management; and provide delivery oversight.
Here’s how you’ll contribute:
As GCP Lead, you will participate in the end-to-end cycle from opportunity identification to closure, take complete ownership of project execution, and provide valuable expertise on the project. You will do this by:
- Understanding customer requirements and creating technical propositions
- Managing and owning all aspects of technical development and delivery
- Contributing to SoWs, technical project roadmaps, and other artifacts required for successful project execution, leveraging a technical scoping and solutioning approach
- Providing technical leadership and acting as a role model and coach to software engineers pursuing a technical career path in engineering
- Ensuring code reviews and developing best practices
- Planning the end-to-end technical scope of the project and customer engagement, including sprint and delivery planning
- Estimating effort, identifying risks, and providing technical support whenever needed
- Multitasking and re-prioritizing responsibilities based on dynamic requirements
- Providing timely updates and guidance to the leadership team on status, risks, etc.
- Leading and mentoring teams as needed
- Understanding technical requirements and taking part in technical discussions
- Proposing and planning technical solutions accordingly (design, development, and implementation)
- Defining and dividing tasks based on requirements
- Taking part in and hosting regular knowledge-sharing sessions, mentoring junior members of the team, and supporting the continuous development of our practice
Skills required to contribute:
12-15+ years of overall IT experience, with:
- 6+ years of experience with Google Cloud Platform (GCP) products, including BigQuery, Cloud Storage, Cloud Functions, Dataproc, and Data Studio
- Must have Google BigQuery experience, including datasets, objects, IAM roles/bindings, Logs Explorer, and troubleshooting issues
- Understanding of CI/CD pipelines and Terraform scripting for deploying objects and IAM bindings
- Knowledge of data modelling (Erwin) and governance, including object reviews and best practices
- Good knowledge of data warehouse concepts and ETL pipelines, including Informatica/Talend and IICS, plus any RDBMS (Teradata is nice to have)
- Excellent communication and presentation skills.
- Extensive experience with the Google Cloud stack: Google Cloud Storage, Google BigQuery, Google Dataflow, Google Dataproc, Google Data Studio, etc.
- Experience in job scheduling using Oozie, Airflow, or any other ETL scheduler
- Experience designing and building production data pipelines, from ingestion to consumption, within a big data architecture, using Java, Python, or Scala
- Good experience in designing and delivering data analytics solutions using GCP cloud-native services
- Good experience in requirements analysis and solution architecture design, data modelling, ETL, data integration, and data migration design
- Well versed in Waterfall, Agile, Scrum, and similar project delivery methodologies
- Experienced in both internal and external stakeholder management
- A Google Cloud Professional Data Engineer certification is an added advantage
- Nice-to-have skills: working experience with Snowflake, Databricks, and open-source big data stacks such as Hadoop and Hive
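To give candidates a flavour of the "production data pipelines from ingestion to consumption" skill above, here is a minimal, dependency-free Python sketch. The stage names and record schema are hypothetical illustrations only; a real pipeline would read from Cloud Storage and load into BigQuery (e.g. via Dataflow) rather than use in-memory lists:

```python
import json

def extract(raw_lines):
    """Ingestion: parse newline-delimited JSON records (e.g. exported to Cloud Storage)."""
    return [json.loads(line) for line in raw_lines]

def transform(records):
    """Cleansing/enrichment: drop rows missing a user_id, normalise amounts to cents."""
    out = []
    for r in records:
        if r.get("user_id") is None:
            continue  # a real pipeline would route this to a dead-letter sink
        out.append({"user_id": r["user_id"],
                    "amount_cents": int(round(r["amount"] * 100))})
    return out

def load(records, table):
    """Consumption: append to a destination table (a plain list stands in for BigQuery)."""
    table.extend(records)
    return len(records)

raw = ['{"user_id": 1, "amount": 9.99}', '{"user_id": null, "amount": 5.0}']
table = []
loaded = load(transform(extract(raw)), table)
```

The three-function split mirrors the extract/transform/load separation the role expects candidates to design at production scale.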
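The "IAM roles/bindings" item can likewise be sketched without any cloud dependency. The dict below mimics the role-to-members binding shape that the BigQuery API and Terraform use; the role names are real predefined BigQuery roles, but the group and service-account members are made up, and in practice you would manage bindings through Terraform or the Google Cloud client libraries rather than raw dicts:

```python
# Hypothetical dataset IAM policy: a list of role -> members bindings.
policy = {
    "bindings": [
        {"role": "roles/bigquery.dataViewer",
         "members": ["group:analysts@example.com"]},
    ]
}

def grant(policy, role, member):
    """Idempotently add a member to the binding for `role`, creating it if absent."""
    for binding in policy["bindings"]:
        if binding["role"] == role:
            if member not in binding["members"]:
                binding["members"].append(member)
            return policy
    policy["bindings"].append({"role": role, "members": [member]})
    return policy

grant(policy, "roles/bigquery.dataEditor",
      "serviceAccount:etl@example.iam.gserviceaccount.com")
grant(policy, "roles/bigquery.dataViewer",
      "group:analysts@example.com")  # already present: no duplicate added
```

Idempotent grants matter here because CI/CD and Terraform re-apply the same desired state on every run.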