• Remote

12-15 YRS

As GCP Lead, you will be responsible for supervising D2P activity (development, code review, test phases, implementation plan) for solutions on GCP. You will contribute to pre-sales and design; implement scalable information solutions covering data security, data privacy, data governance, metadata management, multi-tenancy, and mixed workload management; and provide delivery oversight.

Here’s how you’ll contribute:

The GCP Lead participates in the end-to-end cycle from opportunity identification to closure, takes complete ownership of project execution, and provides valuable expertise to the project. You will do this by:

  • Understanding customer requirements and creating technical propositions
  • Managing and owning all aspects of technical development and delivery
  • Contributing to SoWs, technical project roadmaps, and other artifacts required for successful project execution, leveraging a Technical Scoping & Solutioning approach
  • Providing technical leadership and acting as a role model and coach to software engineers pursuing a technical career path in engineering
  • Ensuring code reviews and developing best practices
  • Planning the end-to-end technical scope of the project and customer engagement, including sprint and delivery planning
  • Estimating effort, identifying risks, and providing technical support whenever needed
  • Demonstrating the ability to multitask and re-prioritize responsibilities based on dynamic requirements
  • Providing regular, timely updates and guidance to the leadership team regarding status, risks, etc.
  • Leading and mentoring teams as needed
  • Understanding technical requirements and taking part in technical discussions
  • Proposing and planning technical solutions accordingly (design, development, and implementation)
  • Defining and dividing tasks based on requirements
  • Taking part in and hosting regular knowledge-sharing sessions, mentoring more junior members of the team, and supporting the continuous development of our practice

Skills required to contribute:

12-15+ years of overall IT experience, with:

  1. 6+ years of experience with Google Cloud Platform (GCP) products, including BigQuery, Cloud Storage, Cloud Functions, Dataproc, and Data Studio.
  2. Must have Google Cloud BigQuery experience, including datasets, objects, IAM roles/bindings, Logs Explorer, and issue troubleshooting (see the BigQuery sketch after this list).
  3. Understanding of CI/CD pipelines and Terraform scripting for deploying objects and IAM bindings.
  4. Knowledge of data modelling (Erwin) and governance, object reviews, and best practices.
  5. Good knowledge of data warehouse concepts and ETL pipelines, including Informatica/Talend, IICS, and any RDBMS (Teradata nice to have).
  6. Excellent communication and presentation skills.
  7. Extensive experience with the Google Cloud stack: Cloud Storage, BigQuery, Dataflow, Dataproc, Data Studio, etc.
  8. Experience in job scheduling using Oozie, Airflow, or any other ETL scheduler (see the Airflow sketch after this list).
  9. Ability to design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
  10. Good experience designing and delivering data analytics solutions using GCP cloud-native services.
  11. Good experience in requirements analysis, solution architecture design, data modelling, ETL, data integration, and data migration design.
  12. Well versed in Waterfall, Agile, Scrum, and similar project delivery methodologies.
  13. Experienced in both internal and external stakeholder management.
  14. Google Cloud Professional Data Engineer certification is an added advantage.
  15. Nice-to-have skills: working experience with Snowflake, Databricks, and open-source big data stacks such as Hadoop, Hive, etc.
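
To make item 2 concrete, here is a minimal Python sketch of the kind of BigQuery dataset and access-binding work the role involves, using the google-cloud-bigquery client library rather than the Terraform workflow named in item 3. The project ID, dataset name, and analyst email are hypothetical placeholders, not part of this role's actual environment.

    # Minimal sketch: create a BigQuery dataset and add a dataset-level
    # access entry with the google-cloud-bigquery client library.
    # Project ID, dataset name, and email are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    # Create the dataset if it does not already exist.
    dataset = bigquery.Dataset("example-project.analytics_raw")
    dataset.location = "US"
    dataset = client.create_dataset(dataset, exists_ok=True)

    # Grant read access to a (hypothetical) analyst account.
    entries = list(dataset.access_entries)
    entries.append(
        bigquery.AccessEntry(
            role="READER",
            entity_type="userByEmail",
            entity_id="analyst@example.com",
        )
    )
    dataset.access_entries = entries
    client.update_dataset(dataset, ["access_entries"])

In practice, a team like this would typically express the same dataset and bindings declaratively in Terraform and deploy them through the CI/CD pipeline; the client-library version above is only the quickest way to show the objects involved.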
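
Similarly, as a sketch of the scheduling experience in item 8, here is a minimal Airflow 2.x DAG with two dependent daily tasks. The DAG ID and the commands inside each task are illustrative placeholders, not a prescribed pipeline.

    # Minimal Airflow 2.x sketch: a daily two-step pipeline.
    # DAG id and bash commands are illustrative placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_ingest_to_bq",     # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",      # run once per day
        catchup=False,                   # skip backfill of past runs
    ) as dag:
        extract = BashOperator(
            task_id="extract",
            bash_command="echo 'extract from source'",
        )
        load = BashOperator(
            task_id="load",
            bash_command="echo 'load into BigQuery'",
        )
        extract >> load  # load runs only after extract succeeds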