From data-agent-kit-starter-pack
Automates declarative provisioning of GCP data pipeline resources (BigQuery datasets/tables, Dataform, Dataproc, DTS) via `deployment.yaml` for dev/staging/prod environments.
```
npx claudepluginhub gemini-cli-extensions/data-agent-kit-starter-pack --plugin data-agent-kit-starter-pack
```

This skill uses the workspace's default tool permissions.
Create or update the existing `deployment.yaml` file and deploy resources. All
configuration files MUST be maintained together in the repository root.
> [!IMPORTANT]
> Whenever you generate resource definitions in `deployment.yaml`, you MUST directly populate the `datacloud` label under `definition.labels` for every resource to track the source of creation. Determine the value based on your current IDE environment:
>
> - For Antigravity, set `datacloud: "antigravity"`
> - For VS Code, set `datacloud: "vscode"`
> - For any other environment, set `datacloud: "other"`
>
> Do not use a variable substitution for this label; hardcode the appropriate string value directly into each resource definition (e.g., replacing `__REQUIRED_LABEL__` placeholders).

Special rule for BigQuery DTS ingestion: Whenever you generate a `bigquerydatatransfer.transferConfig` in `deployment.yaml`, you MUST also explicitly define its target destination `bigquery.dataset` in the same file and apply the `datacloud` label to it. Do this even if the dataset already exists, to ensure the destination dataset's labels are patched and updated.
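To illustrate the DTS rule, a paired definition might look like the following sketch. The field layout is an assumption modeled on the rule above, not a verified schema; consult the reference spec for the authoritative template.

```yaml
resources:
  - type: bigquery.dataset                # destination dataset declared explicitly,
    definition:                           # even if it already exists
      datasetId: sales_ingest
      labels:
        datacloud: "vscode"               # hardcoded, never a {{ VAR }} substitution
  - type: bigquerydatatransfer.transferConfig
    definition:
      displayName: daily-sales-load
      destinationDatasetId: sales_ingest  # matches the dataset declared above
      labels:
        datacloud: "vscode"
```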
The framework supports deploying various GCP resources. To see the comprehensive list of supported resource types, run the following command:
```
gcloud beta orchestration-pipelines resource-types list
```
Refer to `references/gcp-pipeline-resource-provisioning_spec.md` to understand
the template for `deployment.yaml`.
Before generating configurations, discover the actual values for the target project, region, environment, and commit SHA.
> [!TIP]
> If `deployment.yaml` already exists in the repository root, prioritize extracting `project` and `region` from the target environment configuration (e.g., `dev`).
- Project ID: `gcloud config get project`
- Project Number: `gcloud projects describe $(gcloud config get project) --format="value(projectNumber)"`
- Region: `gcloud config get-value compute/region`
- Commit SHA: `git rev-parse HEAD`
- Environment Name: If initialization is needed, you MUST ask the user for the environment name. If the user does not provide it, use `dev` as the default.
> [!TIP]
> Use these commands to replace placeholders like `YOUR_PROJECT_ID` with actual values. Always remove associated comments that start with `TODO` once replaced.
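The substitution step above can be scripted. Here is a minimal sketch; the placeholder names and the trailing `# TODO` comment convention are assumptions based on the tip above, not part of the framework itself.

```python
import re

def fill_placeholders(yaml_text: str, values: dict[str, str]) -> str:
    """Replace placeholder tokens with discovered values and drop the
    associated trailing '# TODO ...' comment on each substituted line."""
    out = []
    for line in yaml_text.splitlines():
        replaced = False
        for placeholder, value in values.items():
            if placeholder in line:
                line = line.replace(placeholder, value)
                replaced = True
        if replaced:
            # Remove the TODO comment once the value is filled in.
            line = re.sub(r"\s*#\s*TODO.*$", "", line)
        out.append(line)
    return "\n".join(out)

# Example: substitute values discovered with gcloud/git.
text = "project: YOUR_PROJECT_ID  # TODO: replace\nregion: YOUR_REGION  # TODO: replace"
print(fill_placeholders(text, {"YOUR_PROJECT_ID": "my-proj", "YOUR_REGION": "us-central1"}))
```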
Create or update `deployment.yaml` in the repository root. This file maps
supported environments (`dev`, `stage`, `prod`) to their specific
configurations and resources.
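A minimal skeleton might look like the following. The exact keys are illustrative assumptions; the authoritative template is in the reference spec.

```yaml
# Illustrative sketch only -- see the reference spec for the real template.
environments:
  dev:
    project: my-dev-project   # discovered via `gcloud config get project`
    region: us-central1       # discovered via `gcloud config get-value compute/region`
    resources:
      - type: bigquery.dataset
        definition:
          datasetId: analytics_raw
          labels:
            datacloud: "vscode"   # hardcoded per the labeling rule above
```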
> [!TIP]
> Use the Reference Spec: The agent can use the `references/gcp_pipeline_resource_provisioning_spec.md` file as a template. It includes sample definitions for select supported resource types. Copy and adapt the required resource blocks into the `deployment.yaml`. Use `gcloud beta orchestration-pipelines resource-types list` when needed.
> [!IMPORTANT]
> Handling Secrets & Privacy (CRITICAL): NEVER hardcode plain-text secrets in `deployment.yaml`.
>
> - Sensitive Data (Secrets): Passwords, API keys, and other sensitive information MUST be stored in Secret Manager and declared in the `secrets:` block of `deployment.yaml`.
> - Non-Sensitive Data (Variables): General configuration (e.g., dataset names, table IDs, regions) may be declared in the `variables:` block.
> - Substitution via `{{ VAR }}`: Both `variables:` and `secrets:` MUST be used as `{{ VARIABLE_NAME }}` substitutions in resource definitions.
> - No Creation: The agent MUST NOT use the framework to create new secrets. If `gcloud` indicates the secret does not exist, the agent MUST ask the user to create it manually and then re-verify.
> - Reference-Only Policy: The agent's role is strictly limited to referencing existing secrets. The agent MUST NEVER read, print, or inspect the values of secrets.
> - Safe Deployment: The actual value injection happens during deployment execution; the agent only provides the reference.
> - Manual Secret Management: Advise the user to manage secret payloads and versions manually.
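As a sketch of the substitution pattern (block names follow the rules above; the secret reference format and resource fields are illustrative assumptions):

```yaml
variables:
  DATASET_ID: analytics_raw                       # non-sensitive configuration
secrets:
  DB_PASSWORD: projects/my-proj/secrets/db-pass   # reference only; value is never read
resources:
  - type: bigquerydatatransfer.transferConfig
    definition:
      destinationDatasetId: "{{ DATASET_ID }}"    # substituted at deployment time
      params:
        connector.authentication.password: "{{ DB_PASSWORD }}"
```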
The agent MUST validate `deployment.yaml` before generating the deployment
script. This ensures the configuration is syntactically correct and all
variables are resolvable.

```
gcloud beta orchestration-pipelines validate --environment=<ENV_NAME>
```
Run the following command to deploy the resources to the target environment.

```
gcloud beta orchestration-pipelines deploy --environment=<ENV_NAME> --local
```
> [!NOTE]
> If a new transfer is being created, make sure NOT to remove the DTS transfer resource from `deployment.yaml` after it completes the run.
`deployment.yaml` exists in the repository root with actual discovered
values (no placeholders) and correct resource definitions.