Category: Security
1. Introduction
Cloud Asset Inventory is Google Cloud’s inventory and discovery service for resources, IAM policies, and governance policies across your Google Cloud environment. It gives you a consistent way to answer questions like: What resources do we have? Who can access them? What changed, and when?
In simple terms, Cloud Asset Inventory provides a searchable “catalog” of your Google Cloud assets—projects, buckets, service accounts, firewall rules, keys, and more—along with the security and policy metadata attached to them.
Technically, Cloud Asset Inventory is exposed through the Cloud Asset API and related tooling (Console, gcloud, client libraries). It can:
– List and search assets across projects, folders, or organizations
– Export asset snapshots to Cloud Storage or BigQuery for reporting and audits
– Track history of asset metadata and IAM/policy changes over time
– Publish real-time change notifications via feeds to Pub/Sub for automation
The core problem it solves is visibility: in a fast-changing cloud environment, security, operations, and compliance all depend on having an accurate, queryable view of resources and who/what can access them—plus a reliable way to detect and respond to change.
Service name note: The product is currently called Cloud Asset Inventory and is accessed programmatically via the Cloud Asset API. Verify the latest naming and feature set in the official documentation: https://cloud.google.com/asset-inventory/docs
2. What is Cloud Asset Inventory?
Official purpose
Cloud Asset Inventory is designed to help you inventory, search, monitor, and analyze Google Cloud assets and their associated policies (such as IAM policies and organization policies).
Core capabilities
Cloud Asset Inventory commonly supports these workflows (exact availability depends on asset type and scope; verify in docs for your environment):
- Asset inventory: enumerate resources (assets) across a scope (project/folder/org)
- Search:
  - Search resources across a scope using query syntax (resource properties, labels, names)
  - Search IAM policies to find principals and bindings
- Export:
  - Export point-in-time snapshots (resources + selected policy types) to Cloud Storage or BigQuery
- Historical views:
  - Retrieve change history for supported asset types (resource metadata and/or policies)
- Real-time feeds:
  - Publish change events to Pub/Sub when assets change (create/update/delete, policy updates), enabling automation
Major components
- Cloud Asset API: REST API used by gcloud and client libraries. Reference: https://cloud.google.com/asset-inventory/docs/reference/rest
- Asset Search & Inventory: list/search endpoints, query syntax, scoping rules
- Export jobs: snapshot exports to Cloud Storage or BigQuery
- Feeds: configuration objects that send asset change notifications to Pub/Sub
- IAM/Org policy analysis (where supported): analysis methods to evaluate access and policy effects (verify current supported methods in docs)
Service type
- Managed Google Cloud service (control-plane API)
- Primarily metadata-oriented; it does not move your application data. It inventories resource and policy metadata.
Scope and locality (global vs regional)
Cloud Asset Inventory is typically considered a global service from a user perspective:
– You query by scope: projects/…, folders/…, or organizations/…
– Feeds are created under a project, folder, or organization parent (verify current feed configuration requirements in the docs)
Even though your resources may live in specific regions/zones, Cloud Asset Inventory is about centralized metadata access across your hierarchy.
How it fits into the Google Cloud ecosystem
Cloud Asset Inventory is foundational for security and governance because it connects well with:
– IAM (who has access)
– Organization Policy Service (policy constraints)
– Access Context Manager (access policies; where applicable)
– Cloud Logging / Audit Logs (who changed what; event source of truth)
– Pub/Sub + automation (respond to changes)
– BigQuery (analytics/reporting)
– Security Command Center (security posture; CAI often supplies context and supports detection/response workflows)
3. Why use Cloud Asset Inventory?
Business reasons
- Faster audits: produce consistent asset and access reports for internal and external audits.
- Reduced risk: quickly find risky configurations (public buckets, over-permissive roles, unmanaged keys).
- Better change control: understand what changed after incidents or outages.
Technical reasons
- Single inventory API across many Google Cloud services.
- Search at scale: find resources by name, label, property, location, or IAM principal.
- Programmatic exports: create repeatable reports without scraping the Console.
Operational reasons
- Inventory drift detection: detect when reality diverges from intended infrastructure.
- Automation triggers: respond to changes in near real time using feeds → Pub/Sub.
- Central visibility: inventory across many projects from an org-level view (with the right permissions).
Security/compliance reasons
- Who-has-access analysis: find where a principal appears in IAM policies.
- Policy visibility: export IAM policies and organization policies to prove compliance posture.
- Forensics: historical asset views support investigations (for supported asset types and time ranges—verify).
Scalability/performance reasons
- Designed for large organizations with many projects and resources.
- Export/search workflows reduce the need for per-project scripts and manual enumeration.
When teams should choose it
Choose Cloud Asset Inventory when you need:
– Centralized inventory and search across Google Cloud
– Repeatable exports for governance reporting
– Change monitoring hooks for security automation
– IAM/policy visibility as part of your security program
When teams should not choose it
Cloud Asset Inventory may not be the right primary tool if:
– You need a full configuration compliance engine with remediation workflows out of the box (consider pairing with policy-as-code, SCC, or third-party CSPM tools).
– You need real-time application telemetry (use Cloud Monitoring/Logging).
– You need a CMDB with business-level relationships and approvals; CAI is metadata-centric and cloud-native, not a full ITSM platform.
4. Where is Cloud Asset Inventory used?
Industries
- Financial services (SOX, PCI, internal controls)
- Healthcare (HIPAA-aligned governance and auditing)
- Retail/e-commerce (fast-changing environments; least privilege)
- SaaS and technology companies (multi-project scale, automation)
- Public sector (strict compliance, traceability)
Team types
- Security engineering and cloud security (visibility and detection)
- Platform engineering (inventory, standardization, automation)
- SRE/operations (change tracking, incident response)
- DevOps teams (drift and governance guardrails)
- Compliance and audit teams (evidence generation)
Workloads and architectures
- Multi-project organizations (dev/test/prod separated)
- Microservices on GKE/Cloud Run plus managed services
- Data platforms (BigQuery, GCS, Dataproc) with sensitive datasets
- Hybrid organizations using shared VPC and centralized networking
Real-world deployment contexts
- Org-level asset reporting into BigQuery for dashboards
- Near real-time change detection via Pub/Sub feeds and automated responders
- IAM access reviews across many projects
- M&A / consolidation: understanding what exists before moving projects/folders
Production vs dev/test usage
- In dev/test, CAI helps track resource sprawl and ensure sandbox boundaries.
- In production, CAI becomes a governance primitive: scheduled exports, audit evidence, change-triggered automation, and security investigations.
5. Top Use Cases and Scenarios
Below are realistic, production-relevant use cases. Each includes the problem, why Cloud Asset Inventory fits, and a brief scenario.
1) Organization-wide resource inventory for audits
- Problem: You need a complete list of cloud resources across hundreds of projects.
- Why it fits: CAI can list and export resources across an organization/folder scope.
- Scenario: Quarterly audit requires a snapshot of all storage buckets, KMS keys, and service accounts.
2) IAM access review (“Who has access to what?”)
- Problem: You must identify users/groups/service accounts with privileged roles across the estate.
- Why it fits: CAI can search IAM policies and export IAM policy data for analysis.
- Scenario: Security team searches for roles/owner bindings or a specific external principal across all projects.
3) Detect creation of risky resources (public endpoints, public buckets)
- Problem: Resources are created outside standards, introducing exposure.
- Why it fits: Feeds can publish asset changes to Pub/Sub, enabling automated checks.
- Scenario: On new bucket creation, a Cloud Run job checks IAM bindings and flags public access.
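The bucket check in this scenario can be sketched in shell. Everything below is illustrative: the policy document is inline sample data rather than one fetched from a live bucket, and a real responder would read the policy from the feed event or via gcloud.

```shell
# Hypothetical sketch: flag public access in a bucket IAM policy document.
# policy.json is inline sample data standing in for a real bucket policy.
cat > policy.json <<'EOF'
{"bindings":[{"role":"roles/storage.objectViewer","members":["allUsers"]}]}
EOF

# allUsers or allAuthenticatedUsers in any binding means public access.
if grep -Eq '"(allUsers|allAuthenticatedUsers)"' policy.json; then
  echo "PUBLIC_ACCESS_FOUND: review this bucket"
fi
```

A production check would parse bindings properly (for example with jq) instead of pattern-matching, but the decision logic is the same.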
4) Change tracking during incident response
- Problem: An outage occurs; you need to know what changed recently.
- Why it fits: CAI supports asset history for supported types and time windows.
- Scenario: You retrieve recent history for a firewall rule or IAM policy to correlate with incident timing.
5) Compliance reporting into BigQuery
- Problem: Auditors want recurring evidence in a queryable format.
- Why it fits: Export snapshots to BigQuery for SQL-based compliance checks.
- Scenario: A nightly export powers dashboards that highlight noncompliant IAM patterns.
6) Migrations and reorganizations (project/folder moves)
- Problem: Moving projects can break inherited policies and access patterns.
- Why it fits: CAI includes analysis capabilities related to policy impact and moves (verify supported analysis methods).
- Scenario: Before moving projects under a new folder, you evaluate potential policy changes and access effects.
7) Enforcing tagging/labeling standards
- Problem: Teams don’t consistently apply labels, making cost/security governance harder.
- Why it fits: Search resources missing labels and export results for remediation tickets.
- Scenario: Weekly report identifies compute instances missing env, owner, or data_classification labels.
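A first pass at this report can be run against a Cloud Storage export file. The sketch below uses inline sample lines and assumes a top-level labels key; real export records nest labels inside the resource payload, so verify the schema before relying on this.

```shell
# Hypothetical sketch: surface exported assets that carry no labels.
# assets.ndjson stands in for a newline-delimited JSON export file.
cat > assets.ndjson <<'EOF'
{"name":"//compute.googleapis.com/projects/demo/zones/us-central1-a/instances/vm-a","labels":{"env":"prod","owner":"team-a"}}
{"name":"//compute.googleapis.com/projects/demo/zones/us-central1-a/instances/vm-b"}
EOF

# Lines with no "labels" key become remediation-ticket candidates.
grep -v '"labels"' assets.ndjson > unlabeled.txt
cat unlabeled.txt
```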
8) Build an internal CMDB-like cloud catalog
- Problem: You need a searchable internal portal for cloud assets.
- Why it fits: CAI provides authoritative inventory; you can sync to an internal database.
- Scenario: A scheduled export loads into BigQuery; an internal UI queries BigQuery and shows ownership/labels.
9) Least-privilege cleanup for service accounts
- Problem: Service accounts accumulate permissions over time.
- Why it fits: Search IAM policies for a service account principal across all projects/resources.
- Scenario: A decommission initiative uses CAI to find all bindings for old service accounts.
10) Validate guardrails after landing zone changes
- Problem: Platform team updates org policies and wants to confirm coverage.
- Why it fits: CAI exports org policies and governed resources/containers (where supported).
- Scenario: After restricting external IPs, team checks whether any projects still allow exceptions.
11) Security posture enrichment for alerts
- Problem: An alert fires, but you need context about the resource and its IAM.
- Why it fits: CAI lookups enrich incidents with asset metadata and access context.
- Scenario: SOC tooling queries CAI to attach labels, project ancestry, and IAM bindings to alerts.
12) Inventory-based cost optimization inputs
- Problem: You want to find unused or oversized resources.
- Why it fits: CAI provides the “what exists” baseline; pair with monitoring for utilization.
- Scenario: Weekly inventory feeds a pipeline that joins CAI exports with Cloud Monitoring metrics.
6. Core Features
Feature availability varies by asset type and scope. Always verify supported asset types, content types, and API methods in the official docs: https://cloud.google.com/asset-inventory/docs
6.1 Asset listing (inventory)
- What it does: Lists assets (resources) under a project, folder, or organization.
- Why it matters: Gives you a reliable baseline of what exists.
- Practical benefit: Replace ad-hoc scripts that call dozens of service-specific APIs.
- Limitations/caveats: Not every Google Cloud product exposes identical metadata fields; some assets may have partial metadata.
6.2 Resource search
- What it does: Searches resources across a scope using query syntax (names, labels, properties).
- Why it matters: Find assets quickly without knowing which project they’re in.
- Practical benefit: “Find all buckets labeled pci=true” or “find all instances with external IP” (where metadata supports it).
- Limitations/caveats: Search results depend on indexed metadata; confirm query syntax and supported fields.
6.3 IAM policy search
- What it does: Searches IAM policies across a scope to find where principals have access.
- Why it matters: Central tool for access reviews and incident response.
- Practical benefit: Find all bindings for a user, group, or service account across the org.
- Limitations/caveats: Interpreting effective access can be complex (inheritance, conditions, organization policies). Consider using analysis methods where supported.
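Post-processing a saved IAM policy search result can be sketched as follows. The file content is a hand-written sample shaped loosely like a search result; take the real response format from the API reference.

```shell
# Hypothetical sketch: list resources where a principal appears, from a
# saved search-all-iam-policies result (iam_search.json is sample data).
cat > iam_search.json <<'EOF'
{"resource":"//cloudresourcemanager.googleapis.com/projects/demo","policy":{"bindings":[{"role":"roles/owner","members":["user:alice@example.com"]}]}}
EOF

PRINCIPAL="user:alice@example.com"
grep "$PRINCIPAL" iam_search.json \
  | sed -n 's/.*"resource":"\([^"]*\)".*/\1/p'
```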
6.4 Export asset snapshots to Cloud Storage
- What it does: Exports selected content types (resources and/or policies) to Cloud Storage as files.
- Why it matters: Low-friction archival, evidence retention, and offline processing.
- Practical benefit: Store daily snapshots for compliance; run batch jobs on exported data.
- Limitations/caveats: Storage costs apply; exporting frequently at org scale creates many objects.
6.5 Export asset snapshots to BigQuery
- What it does: Exports assets into BigQuery tables for SQL analysis.
- Why it matters: BigQuery makes it easy to build dashboards and recurring compliance queries.
- Practical benefit: Join asset inventory with other datasets (billing export, security findings, CMDB data).
- Limitations/caveats: BigQuery storage/query costs apply; schema can change as asset types evolve—design queries defensively.
6.6 Asset history
- What it does: Retrieves historical states of supported assets over a time window.
- Why it matters: Supports investigations and change reviews.
- Practical benefit: See when an IAM policy changed and what it was before.
- Limitations/caveats: History retention and support vary—verify time ranges and supported assets.
6.7 Real-time feeds to Pub/Sub
- What it does: Publishes change notifications for selected assets/content types to a Pub/Sub topic.
- Why it matters: Enables near real-time governance automation.
- Practical benefit: Trigger remediation checks when a sensitive resource is created or a policy changes.
- Limitations/caveats: Pub/Sub costs apply; you must grant the Cloud Asset service agent publish rights; event delivery is “near real-time,” not a strict SLA for instantaneous triggers—design idempotent consumers.
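One way to make consumers idempotent is to derive a deduplication key from stable fields of each notification, so a redelivered message maps to the same key. A sketch with made-up values:

```shell
# Hypothetical sketch: stable dedup key for a feed notification.
# ASSET_NAME and UPDATE_TIME would come from the parsed message payload.
ASSET_NAME="//storage.googleapis.com/projects/_/buckets/demo-bucket"
UPDATE_TIME="2024-01-01T00:00:00Z"

# Same asset and same change time always yield the same key; record the
# key in a processed-events store and skip messages already seen.
DEDUP_KEY=$(printf '%s|%s' "$ASSET_NAME" "$UPDATE_TIME" | sha256sum | cut -d' ' -f1)
echo "$DEDUP_KEY"
```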
6.8 IAM and policy analysis (where supported)
- What it does: Provides analysis endpoints that help evaluate IAM access, policies, and potential impacts (method availability evolves).
- Why it matters: Moves beyond “what is configured” to “what is effectively governed.”
- Practical benefit: Support risk assessments and safer migrations.
- Limitations/caveats: Coverage varies; verify in the API reference for the exact analysis methods you plan to use.
7. Architecture and How It Works
High-level service architecture
Cloud Asset Inventory aggregates metadata from Google Cloud resource APIs and policy systems. You access it via:
– Google Cloud Console (limited views)
– gcloud asset … commands
– Cloud Asset API (REST + client libraries)
Key flows:
1. Read/query flow: user or system queries CAI for current state or search results.
2. Export flow: CAI writes a snapshot to Cloud Storage or BigQuery.
3. Change notification flow: CAI detects changes and publishes notifications to Pub/Sub topics configured by feeds.
Request/data/control flow
- Control plane request: Your caller identity (user or service account) calls the Cloud Asset API.
- Authorization: IAM checks whether the caller has permission to read assets under the requested scope.
- Data retrieval: CAI returns asset metadata/policies (or launches an export job).
- Export delivery: CAI writes into Cloud Storage or BigQuery using service-managed mechanisms; you must ensure the destination permissions are correct.
- Feed delivery: Cloud Asset service agent publishes messages to Pub/Sub.
Integrations with related services
Common, practical integrations:
– Pub/Sub: event-driven asset change processing (feeds)
– Cloud Functions / Cloud Run: consumers that validate and remediate changes
– BigQuery: compliance analytics and dashboards
– Cloud Storage: archival exports, evidence retention
– Cloud Logging: correlate CAI changes with Admin Activity audit logs
– Security Command Center (SCC): CAI exports/search can enrich findings and support posture reporting (integration patterns vary; verify for your SCC tier)
Dependency services
- IAM and the resource hierarchy (Resource Manager) are fundamental.
- Destination services for exports and feeds: BigQuery, Cloud Storage, Pub/Sub.
Security/authentication model
- Access to CAI is governed by IAM permissions on the scope (org/folder/project).
- Feeds require a service agent to publish to Pub/Sub; you grant Pub/Sub publisher permissions on the topic.
Networking model
- CAI is accessed over Google APIs endpoints.
- Your callers typically do not need VPC connectivity to use the API, but may use:
- Private access patterns (for workloads) via standard Google API access approaches (verify your organization’s networking requirements)
- Data egress charges usually relate to destinations or consumers, not CAI itself.
Monitoring/logging/governance considerations
- Use Cloud Audit Logs for tracking who called the Cloud Asset API and who changed resources/policies.
- Monitor Pub/Sub subscriptions for feed consumer lag/backlog.
- Treat exports as sensitive: inventory and IAM policy snapshots often contain security-relevant metadata.
Simple architecture diagram
flowchart LR
A[Engineer / Automation] -->|gcloud / API| B[Cloud Asset Inventory]
B --> C[Search & List Results]
B -->|Export| D[Cloud Storage]
B -->|Export| E[BigQuery]
B -->|Feed notifications| F[Pub/Sub Topic]
F --> G[Cloud Run / Functions Consumer]
G --> H[Ticketing / Slack / Remediation]
Production-style architecture diagram
flowchart TB
subgraph Org[Google Cloud Organization]
O1[Folders / Projects]
O2[IAM Policies & Org Policies]
O3["Resources (GCS, GCE, GKE, KMS, ...)"]
end
subgraph CAI[Cloud Asset Inventory]
S1[Inventory & Search]
S2[Asset History]
S3[Exports]
S4[Feeds]
end
Org --> CAI
S3 -->|Daily export| BQ[(BigQuery Dataset: Asset Snapshots)]
S3 -->|Archive| GCS[(Cloud Storage: Evidence)]
S4 --> PST[Pub/Sub Topic: Asset Changes]
PST --> RUN[Cloud Run: Policy Checker]
PST --> SIEM[SIEM / SOAR Ingestion]
RUN --> LOGS[Cloud Logging]
RUN --> REM["Automated Remediation<br/>(e.g., remove public binding)"]
RUN --> CASES[Case Management / Tickets]
BQ --> DASH[Security & Compliance Dashboards]
LOGS --> ALERTS[Monitoring Alerts]
8. Prerequisites
Account and project requirements
- A Google Cloud account with access to a billing-enabled project (for any paid dependent services like Pub/Sub, BigQuery, Cloud Storage).
- If you want org/folder-wide inventory, you need access to an Organization and appropriate permissions.
Permissions / IAM roles
You need IAM permissions in three categories:
1. To query/export assets via Cloud Asset Inventory (viewer/search/export permissions on the scope)
2. To write to destinations (Cloud Storage bucket permissions, BigQuery dataset permissions)
3. To create and run feeds (Cloud Asset feed permissions + Pub/Sub permissions)
Common roles (verify exact role names and required permissions in official docs):
– Cloud Asset Inventory roles: https://cloud.google.com/asset-inventory/docs/access-control
– Pub/Sub roles: https://cloud.google.com/pubsub/docs/access-control
– Cloud Storage roles: https://cloud.google.com/storage/docs/access-control
– BigQuery roles: https://cloud.google.com/bigquery/docs/access-control
For the hands-on lab, a practical minimum is typically:
– Ability to enable APIs (roles/serviceusage.serviceUsageAdmin or broader)
– Cloud Asset Inventory permissions to list/search/export and create feeds
– Pub/Sub admin permissions (to create topic/subscription and grant publisher role)
– Storage admin or bucket-level permissions to create a bucket and write exports
Billing requirements
- Cloud Asset Inventory pricing may be listed as no additional charge (verify on the official pricing page).
- You still pay for dependent services used in the tutorial:
- Pub/Sub (messages, storage)
- Cloud Storage (objects, storage, operations)
- BigQuery (storage, queries) if you choose to export there
CLI / tools
- Google Cloud SDK (gcloud): https://cloud.google.com/sdk/docs/install
- Optional: jq for JSON formatting in the terminal
Region availability
- Cloud Asset Inventory is accessed globally.
- Your export destinations (buckets/datasets) are regional/multi-regional—choose according to data residency needs.
Quotas / limits
- Cloud Asset Inventory API quotas apply (request rates, export sizes, etc.). Verify in:
- Quotas documentation: https://cloud.google.com/asset-inventory/quotas (verify exact URL in docs navigation)
- Pub/Sub quotas may matter if you generate many changes.
Prerequisite services
- Enable the Cloud Asset API (cloudasset.googleapis.com)
- For this lab, also:
  - Cloud Storage API (usually enabled by default in many projects)
  - Pub/Sub API
9. Pricing / Cost
Current pricing model (how to verify)
Check the official pricing page and calculator:
– Cloud Asset Inventory pricing: https://cloud.google.com/asset-inventory/pricing
– Google Cloud Pricing Calculator: https://cloud.google.com/products/calculator
As of this writing, Google Cloud commonly positions Cloud Asset Inventory as no additional charge, but you must verify the current pricing and any billable SKUs in the official pricing page.
Pricing dimensions to understand
Even if Cloud Asset Inventory itself is free (verify), your solution cost is driven by:
- Export destinations
  - Cloud Storage: storage (GB-month), operations (PUT/GET/LIST), lifecycle transitions (if used)
  - BigQuery: storage for exported tables, query processing (on-demand or flat-rate), scheduled queries / BI Engine (if used)
- Real-time feeds
  - Pub/Sub: message delivery and storage, subscription delivery type (pull vs push), retention and backlog (storage cost if backlog grows)
- Automation consumers
  - Cloud Run / Cloud Functions: compute time, requests, logging volume
- Logging and monitoring
  - Cloud Logging ingestion and retention for high-volume consumers can become non-trivial.
Free tier considerations
- Some dependent services have free tiers (Pub/Sub, Cloud Storage, BigQuery) depending on account and region. Free tiers change—verify in official pricing docs.
Cost drivers (practical)
- Scope size: org-wide exports with thousands of projects and millions of assets are much heavier than a single project.
- Export frequency: hourly exports cost more than daily.
- Feed volume: if you emit events for many asset types, Pub/Sub volume can spike.
- BigQuery query patterns: dashboards that scan full tables frequently can be expensive.
Hidden/indirect costs
- Operational overhead: building and maintaining consumers, dashboards, and remediation logic.
- Data retention: compliance often demands long retention; storage costs accumulate.
- Security overhead: access control to exported IAM policies may require extra governance and tooling.
Network/data transfer implications
- CAI is metadata-based; most costs are not classic “data egress.”
- Costs primarily come from API usage and destination services rather than moving large application data.
- If you export and then move data across regions or out of Google Cloud, standard egress may apply.
How to optimize cost
- Export only the content types you need (resources vs IAM/org/access policies).
- Limit export scope (start at folder/project before org-wide).
- Use Cloud Storage lifecycle rules to archive/delete older snapshots.
- Partition and cluster BigQuery tables if you query them heavily (verify exported schema patterns).
- Keep Pub/Sub consumers healthy to avoid backlog storage.
- Narrow feed scope: monitor only sensitive asset types in feeds (e.g., IAM policy changes, buckets, KMS keys) rather than “everything.”
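For the lifecycle-rule suggestion above, a minimal rule file might look like the sketch below. The 90-day threshold is an example, not a recommendation; the apply command is shown commented out because it needs a real bucket, and the lifecycle JSON schema should be verified in the Cloud Storage docs.

```shell
# Hypothetical sketch: delete snapshot objects after 90 days (example value).
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 90}
    }
  ]
}
EOF

# Apply to a real bucket (reference only):
# gcloud storage buckets update "gs://$EXPORT_BUCKET" --lifecycle-file=lifecycle.json
cat lifecycle.json
```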
Example low-cost starter estimate (no fabricated numbers)
A low-cost starter setup typically includes:
– Infrequent exports (weekly/daily) to a small Cloud Storage bucket
– One Pub/Sub topic + subscription
– A lightweight consumer that only logs/alerts on critical events
Your actual cost depends on:
– Number of exported assets
– Export frequency
– Pub/Sub event volume
– Retention duration in Storage/BigQuery
Use the pricing calculator to model your specific environment: https://cloud.google.com/products/calculator
Example production cost considerations
For production, the big cost levers are:
– Org-wide BigQuery exports (storage + frequent dashboard queries)
– High-volume Pub/Sub feeds (especially if monitoring many asset types)
– Multiple consumers and SIEM ingestion
– Long retention of daily snapshots for audit evidence
10. Step-by-Step Hands-On Tutorial
Objective
Set up a practical Cloud Asset Inventory workflow in a Google Cloud project:
1. Enable the Cloud Asset API
2. Search and list assets
3. Export an asset snapshot to Cloud Storage
4. Create a real-time feed to Pub/Sub
5. Trigger a change and observe notifications
6. Clean up resources to avoid ongoing cost
This lab is designed to be safe and low-cost, but it still uses billable services (Pub/Sub, Cloud Storage). Keep retention short and clean up.
Lab Overview
You will create:
– A Cloud Storage bucket to store an export snapshot
– A Pub/Sub topic and subscription
– A Cloud Asset Inventory feed that publishes changes to the Pub/Sub topic
– A test resource change (create a storage bucket) to generate a feed event
You will verify:
– You can search assets
– A snapshot export file is created in Cloud Storage
– A Pub/Sub message arrives when an asset changes
Step 1: Select a project and configure your environment
1) Authenticate and set your project:
gcloud auth login
gcloud config set project YOUR_PROJECT_ID
2) Store variables:
export PROJECT_ID="$(gcloud config get-value project)"
export PROJECT_NUMBER="$(gcloud projects describe "$PROJECT_ID" --format='value(projectNumber)')"
export REGION="us-central1"
Expected outcome: PROJECT_ID and PROJECT_NUMBER are set and point to the project you will use.
Verify:
echo "$PROJECT_ID"
echo "$PROJECT_NUMBER"
Step 2: Enable required APIs
Enable Cloud Asset API and Pub/Sub (and Storage if needed):
gcloud services enable cloudasset.googleapis.com
gcloud services enable pubsub.googleapis.com
gcloud services enable storage.googleapis.com
Expected outcome: APIs are enabled without errors.
Verify enabled services (optional):
gcloud services list --enabled --filter="name:cloudasset.googleapis.com OR name:pubsub.googleapis.com"
Step 3: Run a basic Cloud Asset Inventory search
Search for resources in the project (example: all storage buckets, if any exist):
gcloud asset search-all-resources \
--scope="projects/$PROJECT_ID" \
--asset-types="storage.googleapis.com/Bucket" \
--format="table(name, assetType, location, project)"
If the project has no buckets yet, the result may be empty. That’s fine.
Now search broadly (no asset type filter):
gcloud asset search-all-resources \
--scope="projects/$PROJECT_ID" \
--query='state:ACTIVE' \
--format="table(name, assetType, location)"
Expected outcome: You see a list of assets (depends on what exists in your project).
Step 4: Create a Cloud Storage bucket for exports
Create a dedicated bucket for CAI exports. Bucket names must be globally unique.
export EXPORT_BUCKET="cai-export-$PROJECT_ID-$RANDOM"
gcloud storage buckets create "gs://$EXPORT_BUCKET" --location="$REGION"
Expected outcome: A new bucket is created.
Verify:
gcloud storage buckets list --filter="name:$EXPORT_BUCKET"
Step 5: Export an asset snapshot to Cloud Storage
Run an export to Cloud Storage. This creates a point-in-time snapshot.
export EXPORT_PATH="gs://$EXPORT_BUCKET/snapshots/$(date +%Y%m%d-%H%M%S)-assets.json"
gcloud asset export \
--project="$PROJECT_ID" \
--output-path="$EXPORT_PATH" \
--content-type="resource"
Notes:
– --content-type controls what is exported. Common values include resource and policy-related types (exact values and naming can vary between CLI/API versions—verify via gcloud asset export --help and docs).
– For IAM policy exports, you may use a policy content type if supported by your gcloud version and permissions.
Expected outcome: The export operation completes and writes one or more objects to your Cloud Storage path.
Verify:
gcloud storage ls "$EXPORT_PATH"
Download one file locally (optional):
gcloud storage cp "$(gcloud storage ls "$EXPORT_PATH" | head -n 1)" .
ls -lah
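Export files written to Cloud Storage are typically newline-delimited JSON, one asset per line (verify the format for your content type). A quick summary pass over a downloaded file can be sketched like this, using inline sample lines and an assumed assetType field name:

```shell
# Hypothetical sketch: count exported assets per assetType.
# sample-export.json stands in for a file downloaded from the export path;
# the assetType field name is an assumption; check a real export file.
cat > sample-export.json <<'EOF'
{"name":"//storage.googleapis.com/projects/_/buckets/bucket-1","assetType":"storage.googleapis.com/Bucket"}
{"name":"//storage.googleapis.com/projects/_/buckets/bucket-2","assetType":"storage.googleapis.com/Bucket"}
EOF

sed -n 's/.*"assetType":"\([^"]*\)".*/\1/p' sample-export.json | sort | uniq -c
```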
Step 6: Create a Pub/Sub topic and subscription
Create a topic and a pull subscription:
export TOPIC="cai-asset-changes"
export SUBSCRIPTION="cai-asset-changes-sub"
gcloud pubsub topics create "$TOPIC"
gcloud pubsub subscriptions create "$SUBSCRIPTION" --topic="$TOPIC"
Expected outcome: Topic and subscription exist.
Verify:
gcloud pubsub topics list --filter="name:$TOPIC"
gcloud pubsub subscriptions list --filter="name:$SUBSCRIPTION"
Step 7: Grant the Cloud Asset service agent permission to publish to Pub/Sub
Cloud Asset Inventory feeds publish to Pub/Sub using a service agent. You must grant that identity permission on the topic.
The Cloud Asset service agent commonly follows this pattern (verify in official docs for feeds):
– service-PROJECT_NUMBER@gcp-sa-cloudasset.iam.gserviceaccount.com
Set it:
export CLOUDASSET_SERVICE_AGENT="service-${PROJECT_NUMBER}@gcp-sa-cloudasset.iam.gserviceaccount.com"
echo "$CLOUDASSET_SERVICE_AGENT"
Grant Pub/Sub Publisher on the topic:
gcloud pubsub topics add-iam-policy-binding "$TOPIC" \
--member="serviceAccount:$CLOUDASSET_SERVICE_AGENT" \
--role="roles/pubsub.publisher"
Expected outcome: IAM policy binding is added successfully.
Troubleshooting note: If this fails because the service agent doesn’t exist yet, create a feed first (next step) and retry, or verify the correct service agent identity in the feeds documentation.
Step 8: Create a Cloud Asset Inventory feed
Create a feed scoped to your project that publishes changes for Storage Buckets.
Feeds are created under a parent scope (project, folder, or organization). The exact CLI flags can change; use gcloud asset feeds create --help to confirm.
Example:
export FEED_ID="bucket-changes-feed"
gcloud asset feeds create "$FEED_ID" \
--project="$PROJECT_ID" \
--asset-types="storage.googleapis.com/Bucket" \
--content-type="resource" \
--pubsub-topic="projects/$PROJECT_ID/topics/$TOPIC"
Expected outcome: Feed is created.
Verify:
gcloud asset feeds list --project="$PROJECT_ID"
gcloud asset feeds describe "$FEED_ID" --project="$PROJECT_ID"
If feed creation fails due to permissions, confirm: – You have Cloud Asset feed permissions – The Pub/Sub topic exists and has the publisher binding for the service agent – APIs are enabled
Step 9: Trigger a change (create a bucket) and pull a feed message
Create a new bucket to generate an asset change event:
export TEST_BUCKET="cai-test-$PROJECT_ID-$RANDOM"
gcloud storage buckets create "gs://$TEST_BUCKET" --location="$REGION"
Wait a short period (feeds are near real-time; allow a minute or two), then pull messages:
gcloud pubsub subscriptions pull "$SUBSCRIPTION" --limit=5 --auto-ack
Expected outcome: You see one or more messages related to the bucket creation/change. The payload format is defined by Cloud Asset Inventory feed notifications—treat it as an event envelope you parse in downstream automation. If messages don’t arrive immediately, wait and retry.
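Downstream automation usually starts by pulling the asset name and type out of the message body. The sketch below parses an inline sample shaped like the feed's temporal-asset envelope; the field names are assumptions for illustration, so confirm them against the feed documentation.

```shell
# Hypothetical sketch: extract fields from a feed notification body.
# message.json is inline sample data; real payloads come from Pub/Sub.
cat > message.json <<'EOF'
{"asset":{"name":"//storage.googleapis.com/projects/_/buckets/cai-test-demo","assetType":"storage.googleapis.com/Bucket"}}
EOF

NAME=$(sed -n 's/.*"name":"\([^"]*\)".*/\1/p' message.json)
TYPE=$(sed -n 's/.*"assetType":"\([^"]*\)".*/\1/p' message.json)
echo "changed: $TYPE $NAME"
```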
Validation
Use this checklist:
1) Search works
– You can run gcloud asset search-all-resources and get results (even if limited).
2) Export created objects
– gcloud storage ls "$EXPORT_PATH" shows one or more objects.
3) Feed is active
– gcloud asset feeds describe "$FEED_ID" returns details, including the Pub/Sub topic.
4) Change generated a Pub/Sub message
– Pulling the subscription returns at least one message after creating TEST_BUCKET.
Optional: confirm the bucket appears in CAI search:
gcloud asset search-all-resources \
--scope="projects/$PROJECT_ID" \
--asset-types="storage.googleapis.com/Bucket" \
--query="name:$TEST_BUCKET" \
--format="table(name, assetType, location)"
Troubleshooting
Common issues and fixes:
1) Permission denied when searching or exporting
– Ensure your identity has Cloud Asset Inventory permissions on the scope.
– If searching at org/folder scope, you need permissions at that level.
2) Export fails writing to Cloud Storage
– Ensure your identity has permission to write to the bucket path.
– Confirm bucket exists and you spelled gs:// path correctly.
3) Feed creation fails with Pub/Sub permission errors
– Ensure the Cloud Asset service agent has roles/pubsub.publisher on the topic.
– Confirm the service agent email format in official docs for your project/service.
4) No Pub/Sub messages arrive
– Wait longer; then pull again.
– Confirm your feed asset types and content type.
– Confirm you created/changed a resource that matches the feed filter.
– Confirm subscription pull is correct and messages weren’t already acked.
5) gcloud asset command flags differ
– Run:
– gcloud asset --help
– gcloud asset feeds create --help
– CLI options can change across Cloud SDK versions—use the help output as the source of truth.
Cleanup
To avoid ongoing costs, delete the resources you created:
Delete the test bucket and export bucket:
gcloud storage rm --recursive "gs://$TEST_BUCKET"
gcloud storage rm --recursive "gs://$EXPORT_BUCKET"
Delete the feed:
gcloud asset feeds delete "$FEED_ID" --project="$PROJECT_ID"
Delete Pub/Sub resources:
gcloud pubsub subscriptions delete "$SUBSCRIPTION"
gcloud pubsub topics delete "$TOPIC"
(Optional) If this was a dedicated lab project, delete the entire project (be careful):
# gcloud projects delete "$PROJECT_ID"
11. Best Practices
Architecture best practices
- Start with scope design: decide whether inventory is managed at project, folder, or org scope.
- Separate duties:
- Security team runs org-wide exports/search
- App teams operate within project scope
- Use BigQuery for analytics, Storage for archives:
- BigQuery for dashboards and compliance queries
- Storage for immutable snapshots and long-term evidence
IAM/security best practices
- Apply least privilege:
- Read-only roles for inventory viewers
- Separate admin role for feed creation/modification
- Restrict who can export IAM policies; inventory data can be security-sensitive.
- Use dedicated service accounts for automation and scheduled exports.
- Use conditional IAM (where applicable) to limit export actions (for example, time-bound access), but test carefully.
Cost best practices
- Don’t export everything at high frequency by default.
- Use lifecycle policies on export buckets (delete or archive old snapshots).
- In BigQuery, reduce query costs by:
- Scheduling incremental logic (when possible)
- Avoiding repeated full-table scans
- Using partitions/clustering if compatible with export schemas (verify)
Performance best practices
- Prefer export + query for large-scale analysis rather than repeated API calls.
- Keep feed consumers idempotent and backpressure-aware (Pub/Sub retry patterns).
- Use filtering (asset types) to reduce feed volume.
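Because Pub/Sub delivery is at-least-once, the idempotency advice above is worth making concrete. This is a sketch, not a production consumer: the event key format and the in-memory set are illustrative assumptions (a real service would derive the key from stable message fields and persist seen keys in a durable store such as Firestore).

```python
class IdempotentHandler:
    """Deduplicate feed events by a stable key before acting on them."""

    def __init__(self):
        self._seen = set()   # in production, use a durable store, not memory
        self.processed = []

    def handle(self, event_key: str, payload: dict) -> bool:
        """Process the event once; return False for duplicate deliveries."""
        if event_key in self._seen:
            return False           # redelivery: safely ignore
        self._seen.add(event_key)
        self.processed.append(payload)  # stand-in for real side effects
        return True

handler = IdempotentHandler()
evt = {"asset": "//storage.googleapis.com/demo-bucket"}
print(handler.handle("demo-bucket@t1", evt))  # True: first delivery
print(handler.handle("demo-bucket@t1", evt))  # False: duplicate is a no-op
```

The same pattern keeps multi-instance consumers (e.g., several Cloud Run replicas) from double-filing tickets when a message is redelivered.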
Reliability best practices
- Run feed consumers in multiple instances (Cloud Run) and handle duplicates.
- Alert on Pub/Sub backlog and dead-letter patterns (if used).
- Maintain runbooks for “inventory data stale” scenarios (exports failing, feeds paused).
Operations best practices
- Centralize logs for feed consumers and export pipelines.
- Track export job success/failure and notify owners.
- Maintain a versioned “inventory schema contract” for downstream consumers (schemas can evolve).
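A "schema contract" can be as simple as a pinned set of required fields that downstream consumers check before trusting a record. The required keys below are purely illustrative assumptions; pin them to whatever fields your dashboards and pipelines actually depend on.

```python
# Hypothetical contract, version 1: the keys downstream consumers require.
REQUIRED_KEYS = {"name", "asset_type"}

def violations(record: dict) -> set:
    """Return the contract keys missing from an exported asset record."""
    return REQUIRED_KEYS - record.keys()

good = {"name": "//storage.googleapis.com/b1",
        "asset_type": "storage.googleapis.com/Bucket"}
bad = {"name": "//storage.googleapis.com/b2"}

print(violations(good))  # set() — record satisfies the contract
print(violations(bad))   # the missing keys
```

Running a check like this at ingest time turns silent schema drift into an explicit, alertable failure.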
Governance/tagging/naming best practices
- Enforce consistent labels/tags (owner, env, data classification).
- Maintain naming conventions so searches are easier (e.g., app-env-component).
- Use folders to represent business units and environments to make scope-based inventory meaningful.
12. Security Considerations
Identity and access model
- Cloud Asset Inventory access is controlled by IAM.
- Inventory and policy exports can reveal:
- Project structure and names
- Resource identifiers
- IAM principals and role bindings
- Treat exports as sensitive security artifacts.
Recommendations:
– Grant org-wide visibility only to security/platform roles.
– If you export IAM policies to BigQuery/Storage, apply strict dataset/bucket access controls.
– Prefer short-lived credentials for human operators.
Encryption
- Google Cloud encrypts data at rest by default in Storage and BigQuery.
- For higher assurance, use Customer-Managed Encryption Keys (CMEK) where supported by the destination service (verify CMEK support for your chosen destination and configuration).
Network exposure
- CAI is accessed via Google APIs endpoints; enforce secure access patterns:
- MFA for users
- Controlled admin workstations
- Organization policies restricting risky configurations (where applicable)
- For feed consumers, keep services private when possible and avoid exposing admin endpoints publicly.
Secrets handling
- If your automation posts to external systems (ticketing, chatops), store secrets in Secret Manager.
- Do not embed webhook URLs or credentials in code.
Audit/logging
Use:
– Cloud Audit Logs to track:
  – Who changed resources/policies
  – Who called Cloud Asset API methods
– Pub/Sub logs/metrics and consumer logs for operational visibility.
Compliance considerations
- Exports can be used as compliance evidence, but you must define:
- Retention period
- Access controls
- Evidence integrity model (e.g., bucket retention lock / object versioning—verify suitability)
- Ensure data residency by selecting appropriate Storage locations and access constraints.
Common security mistakes
- Exporting IAM policies to a broadly accessible bucket/dataset.
- Allowing too many users to create or modify feeds (attackers could suppress or flood monitoring).
- Not monitoring feed consumer health (missed events).
- Assuming CAI replaces Audit Logs—CAI is complementary; Audit Logs are the primary record of actions.
Secure deployment recommendations
- Use separate projects for:
- Central security tooling (feeds, exports, dashboards)
- Application workloads
- Use org/folder-level permissions sparingly and monitor their assignment.
- Implement “break-glass” procedures for temporary elevated access to CAI.
13. Limitations and Gotchas
These are common realities in production. Always verify current limits, quotas, and supported assets in official docs.
- Not all assets have identical metadata: some services expose richer searchable fields than others.
- History coverage varies: asset history availability, depth, and retention vary by asset type and policy type.
- Schema evolution: BigQuery export schemas and asset metadata fields can change as Google Cloud services evolve.
- Event delivery semantics: feed notifications are near real-time and can be duplicated; consumers must be idempotent.
- Permission complexity:
- Exporting to BigQuery requires dataset permissions
- Exporting to Storage requires bucket/object permissions
- Feed publishing requires Pub/Sub permissions for the Cloud Asset service agent
- Scope matters: org-level operations require org-level permissions; project-level roles won’t be enough.
- Pricing surprises (usually indirect):
- BigQuery dashboards scanning large exports frequently
- Pub/Sub backlog storage if consumers fail
- Logging volume from verbose consumers
- Query syntax gotchas:
- Search queries are not the same as SQL; learn the supported fields/operators in CAI docs.
- CLI/API differences:
- gcloud flags can change with SDK versions; always consult --help and the API reference.
- Data sensitivity:
- Inventory exports can be sensitive even if they don’t contain customer data.
14. Comparison with Alternatives
Cloud Asset Inventory is best understood as inventory + search + export + change notification for Google Cloud metadata. It is not a full CSPM by itself, not an IaC state manager, and not a logging system.
Comparison table
| Option | Best For | Strengths | Weaknesses | When to Choose |
|---|---|---|---|---|
| Cloud Asset Inventory (Google Cloud) | Inventory, search, exports, asset change notifications | Native coverage of Google Cloud assets and policies; org/folder/project scoping; exports to BigQuery/Storage; feeds to Pub/Sub | Not a full compliance/remediation platform by itself; schema/coverage varies by asset type | You need authoritative GCP inventory and policy visibility with automation hooks |
| Security Command Center (Google Cloud) | Security posture management and findings | Centralized findings, detectors, posture (varies by tier), security workflows | Different primary purpose; may not replace raw inventory/export needs | Use SCC for security findings and posture; pair with CAI for inventory and enrichment |
| Cloud Logging + Audit Logs (Google Cloud) | Forensic record of admin actions and events | Authoritative “who did what” logs; strong querying/log sinks | Not an inventory; harder to reconstruct current state without additional processing | Use when you need action/event trails; pair with CAI for current state and exports |
| Resource Manager + service-specific APIs | Custom inventory scripts for narrow services | Direct control, service-specific details | High engineering overhead; inconsistent schemas; doesn’t scale well | Only for specialized metadata not well covered or for niche workflows |
| AWS Config (AWS) | Resource inventory and configuration history in AWS | Built-in change history and rules in AWS | Different cloud; not applicable to GCP workloads | Multi-cloud orgs may use AWS Config for AWS and CAI for GCP |
| Azure Resource Graph (Azure) | Query Azure resources at scale | Fast KQL queries across Azure subscriptions | Different cloud; not applicable to GCP workloads | Multi-cloud orgs may use ARG for Azure and CAI for GCP |
| Open-source: Cloud Custodian / Steampipe | Policy-as-code or SQL-like inventory across clouds | Flexible; multi-cloud; integrate with CI/CD | Requires setup, credentials management, maintenance; inventory source varies | Choose when you need multi-cloud abstraction or custom policy frameworks (often alongside CAI) |
| Terraform state / IaC repos | Intended infrastructure state | Shows what you meant to deploy; great for review workflows | Not always equal to reality; drift possible; doesn’t cover console-created resources well | Use for desired-state governance; pair with CAI for actual-state verification |
15. Real-World Example
Enterprise example (regulated industry)
Problem
A regulated enterprise has:
– Hundreds of projects across multiple business units
– Strict access review requirements
– Auditors requesting repeatable evidence of IAM controls and resource inventory
Proposed architecture
– Org-level Cloud Asset Inventory exports:
  – Daily exports of resources + IAM policies (content types as required) to BigQuery
  – Weekly immutable exports to Cloud Storage for audit evidence retention
– Near real-time feeds:
  – Feeds for sensitive asset types (e.g., buckets, KMS keys, IAM policy changes) to Pub/Sub
  – Cloud Run consumer validates changes and files tickets for violations
– Governance:
  – Separate security tooling project
  – Least-privilege IAM with tightly controlled export access
  – Monitoring on Pub/Sub backlog and export job success
Why Cloud Asset Inventory was chosen
– Native org-wide inventory and IAM policy visibility
– BigQuery exports support repeatable, queryable evidence generation
– Feeds enable automation without scraping logs for every use case
Expected outcomes
– Faster audit evidence generation (hours instead of weeks)
– Reduced misconfiguration dwell time (near real-time detection)
– Improved access review accuracy with centralized policy search
Startup / small-team example
Problem
A startup runs production on Google Cloud with multiple environments. They want:
– A weekly inventory of what exists (to avoid resource sprawl)
– A basic alert when someone creates a public-facing resource accidentally
Proposed architecture
– Weekly Cloud Asset Inventory export to a small Cloud Storage bucket
– Simple Cloud Asset feed to Pub/Sub for a few high-risk asset types
– Lightweight Cloud Run service that checks for obvious policy violations and posts to a chat channel
Why Cloud Asset Inventory was chosen
– Minimal setup, native Google Cloud integration
– Avoids building an inventory crawler across many APIs
– Works well with serverless and Pub/Sub
Expected outcomes
– Better visibility with low operational overhead
– Early warning system for risky changes
– A foundation they can later expand into BigQuery reporting as they grow
16. FAQ
1) Is Cloud Asset Inventory the same as Cloud Asset API?
Cloud Asset Inventory is the product/service; the Cloud Asset API is the programmatic interface used to access inventory, search, exports, feeds, and analysis methods.
2) What’s the difference between Cloud Asset Inventory and Audit Logs?
Audit Logs record actions (who did what). Cloud Asset Inventory provides current and historical metadata state (what resources/policies exist). They are complementary.
3) Can Cloud Asset Inventory show assets across my entire organization?
Yes—if you query with an organization scope and have the required org-level IAM permissions.
4) Does Cloud Asset Inventory include IAM policies?
Yes, it can return/search/export IAM policies depending on the method and content type you use, and your permissions.
5) Can I export to BigQuery?
Yes, Cloud Asset Inventory supports exports to BigQuery. BigQuery costs and dataset IAM controls apply.
6) Can I export to Cloud Storage?
Yes. This is a common low-friction way to store snapshot files for audits and offline processing.
7) Are real-time notifications supported?
Yes, via feeds that publish to Pub/Sub. Design consumers for retries and duplicates.
8) Does Cloud Asset Inventory support configuration history?
It supports asset history for certain asset types and policy types. Coverage and retention vary—verify for your assets.
9) Can I use Cloud Asset Inventory to detect public buckets?
Cloud Asset Inventory can help you find bucket resources and associated IAM policy data; you can also trigger automation via feeds. For “public,” you typically check IAM bindings for allUsers/allAuthenticatedUsers (verify best practice for your org).
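The allUsers/allAuthenticatedUsers check mentioned above is easy to automate against IAM policy data from search or exports. This sketch assumes the standard IAM policy JSON shape (bindings with role and members); confirm the exact field names in the results your chosen method returns.

```python
# Special principals that make a binding world-readable/-accessible.
PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}

def public_roles(iam_policy: dict) -> list:
    """Return the roles an IAM policy grants to public principals."""
    return [
        binding["role"]
        for binding in iam_policy.get("bindings", [])
        if PUBLIC_PRINCIPALS & set(binding.get("members", []))
    ]

# Illustrative policy: one public binding, one private binding.
policy = {
    "bindings": [
        {"role": "roles/storage.objectViewer", "members": ["allUsers"]},
        {"role": "roles/storage.admin", "members": ["user:alice@example.com"]},
    ]
}
print(public_roles(policy))  # ['roles/storage.objectViewer']
```

Running this over exported policies, or inside a feed consumer, gives you a basic public-exposure detector; pair it with org policy constraints for prevention rather than just detection.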
10) How do I restrict who can export IAM policy data?
Use least-privilege IAM: only allow a small set of users/service accounts to call export methods and to read the export destinations (bucket/dataset).
11) Does Cloud Asset Inventory cover GKE objects (like Kubernetes Deployments)?
Cloud Asset Inventory focuses on Google Cloud resource assets. Some Kubernetes-related assets may be represented at the cloud resource level (clusters, node pools). In-cluster Kubernetes objects usually require Kubernetes-native inventory tools (unless integrated via other products). Verify current coverage for your use case.
12) How do feeds authenticate to Pub/Sub?
Feeds publish using a Google-managed Cloud Asset service agent. You grant that service account roles/pubsub.publisher on the topic.
13) Why do I see duplicate feed messages?
Pub/Sub delivery is at-least-once. Your consumer must be idempotent and able to deduplicate.
14) Should I run exports from every project separately?
Not usually. Many orgs centralize exports at folder/org scope to reduce complexity—if permissions and governance allow.
15) Is Cloud Asset Inventory a CSPM tool?
Not by itself. It provides inventory/search/exports/feeds that are often used as building blocks inside broader security posture management programs (native or third-party).
16) What’s the best storage format for compliance snapshots?
Cloud Storage exports are straightforward for retention. BigQuery exports are best for analytics. Many enterprises do both: BigQuery for dashboards + Storage for immutable archives.
17) How do I keep inventory data from becoming stale?
Use a combination of:
– Scheduled exports (daily/weekly)
– Feeds for near real-time changes
– Monitoring/alerts for failed jobs and Pub/Sub backlog
17. Top Online Resources to Learn Cloud Asset Inventory
| Resource Type | Name | Why It Is Useful |
|---|---|---|
| Official documentation | Cloud Asset Inventory docs — https://cloud.google.com/asset-inventory/docs | Primary source for concepts, supported assets, and workflows |
| API reference | Cloud Asset API REST reference — https://cloud.google.com/asset-inventory/docs/reference/rest | Definitive for endpoints, request/response fields, and methods |
| CLI reference | gcloud asset reference — https://cloud.google.com/sdk/gcloud/reference/asset | Practical commands for search, export, feeds, and automation |
| Pricing | Cloud Asset Inventory pricing — https://cloud.google.com/asset-inventory/pricing | Verify whether CAI has direct charges and what’s included |
| Pricing tool | Google Cloud Pricing Calculator — https://cloud.google.com/products/calculator | Model costs for Storage/BigQuery/Pub/Sub components |
| Tutorial/workflow | Export assets (docs section) — https://cloud.google.com/asset-inventory/docs/exporting-to-cloud-storage (verify exact URL in docs) | Step-by-step guidance for exports and formats |
| Tutorial/workflow | Export to BigQuery (docs section) — https://cloud.google.com/asset-inventory/docs/exporting-to-bigquery (verify exact URL in docs) | Shows how to build BigQuery-based inventory reporting |
| Tutorial/workflow | Feeds overview/manage feeds — https://cloud.google.com/asset-inventory/docs/monitoring-asset-changes (verify exact URL in docs) | Explains Pub/Sub notifications and feed setup |
| IAM | Access control for CAI — https://cloud.google.com/asset-inventory/docs/access-control | Clarifies roles/permissions and least-privilege guidance |
| Video | Google Cloud Tech (YouTube) — https://www.youtube.com/googlecloudtech | Often includes official walkthroughs and best practices (search “Cloud Asset Inventory”) |
| Samples | Google Cloud samples (GitHub) — https://github.com/GoogleCloudPlatform | Look for Cloud Asset API usage examples; verify repo relevance and recency |
18. Training and Certification Providers
| Institute | Suitable Audience | Likely Learning Focus | Mode | Website URL |
|---|---|---|---|---|
| DevOpsSchool.com | DevOps engineers, SREs, platform teams | Google Cloud operations, automation, governance fundamentals | Check website | https://www.devopsschool.com |
| ScmGalaxy.com | Beginners to intermediate IT professionals | DevOps/Cloud fundamentals and toolchain training | Check website | https://www.scmgalaxy.com |
| CLoudOpsNow.in | Cloud operations teams | CloudOps practices, monitoring, reliability, security basics | Check website | https://www.cloudopsnow.in |
| SreSchool.com | SREs, operations, platform engineers | Reliability engineering, incident response, operational readiness | Check website | https://www.sreschool.com |
| AiOpsSchool.com | Ops teams adopting AIOps | AIOps concepts, automation, observability-driven operations | Check website | https://www.aiopsschool.com |
19. Top Trainers
| Platform/Site | Likely Specialization | Suitable Audience | Website URL |
|---|---|---|---|
| RajeshKumar.xyz | DevOps/Cloud training content | Beginners to working professionals | https://www.rajeshkumar.xyz |
| devopstrainer.in | DevOps tooling and practices | DevOps engineers, CI/CD practitioners | https://www.devopstrainer.in |
| devopsfreelancer.com | Freelance DevOps enablement | Startups and small teams needing hands-on guidance | https://www.devopsfreelancer.com |
| devopssupport.in | Operational support and training | Ops teams and engineers needing practical support | https://www.devopssupport.in |
20. Top Consulting Companies
| Company | Likely Service Area | Where They May Help | Consulting Use Case Examples | Website URL |
|---|---|---|---|---|
| cotocus.com | Cloud/DevOps consulting | Cloud governance, automation, operational best practices | Designing an inventory export pipeline; building Pub/Sub-driven remediation | https://www.cotocus.com |
| DevOpsSchool.com | DevOps and cloud consulting | Platform engineering, DevOps transformation, training + delivery | Implementing CAI exports to BigQuery; building runbooks and dashboards | https://www.devopsschool.com |
| DEVOPSCONSULTING.IN | DevOps consulting services | CI/CD, cloud operations, security-oriented automation | Implementing change-notification consumers; integrating inventory with ticketing | https://www.devopsconsulting.in |
21. Career and Learning Roadmap
What to learn before Cloud Asset Inventory
- Google Cloud basics:
- Projects, folders, organizations, billing accounts
- Resource hierarchy and inheritance
- IAM fundamentals:
- Principals, roles, bindings, conditions
- Service accounts and least privilege
- Core ops tools:
- gcloud CLI basics
- Cloud Logging and Audit Logs basics
- Data basics (helpful):
- Cloud Storage buckets and IAM
- BigQuery datasets/tables and access control
- Pub/Sub topics/subscriptions
What to learn after Cloud Asset Inventory
- Policy and governance:
- Organization Policy Service constraints
- Access Context Manager (if used)
- Policy-as-code approaches
- Security operations:
- Security Command Center
- Threat detection and response patterns
- Automation:
- Cloud Run/Functions event consumers
- Idempotency and retry-safe design for Pub/Sub
- Reporting:
- BigQuery optimization (partitioning, clustering, cost controls)
- Looker / dashboards (if applicable)
Job roles that use it
- Cloud Security Engineer / Security Architect
- Platform Engineer
- SRE / Operations Engineer
- DevOps Engineer
- Cloud Architect
- Compliance / GRC engineering roles (technical)
Certification path (Google Cloud)
Cloud Asset Inventory is not a standalone certification topic, but it commonly appears in:
– Google Cloud Digital Leader (foundations)
– Associate Cloud Engineer (operations basics)
– Professional Cloud Architect (governance at scale)
– Professional Cloud Security Engineer (IAM, inventory, monitoring, compliance)
Verify the latest certification guides: https://cloud.google.com/learn/certification
Project ideas for practice
- Weekly inventory export pipeline: export to Storage and track diffs.
- BigQuery compliance dashboard: export to BigQuery, write queries for risky IAM patterns.
- Pub/Sub feed responder: when a bucket is created, validate its IAM and labels; alert on violations.
- Access review tool: search IAM policies for a principal and generate a report for managers.
- Change correlation: join CAI exports with Audit Logs to show who changed sensitive policies.
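For the weekly-export-with-diffs idea, the core logic is a set comparison between snapshots. This sketch treats each snapshot as a set of asset names; in practice you would load those from the exported files in Cloud Storage or from BigQuery tables.

```python
def diff_snapshots(previous: set, current: set) -> dict:
    """Compare two inventory snapshots of asset names."""
    return {
        "added": sorted(current - previous),    # new assets since last snapshot
        "removed": sorted(previous - current),  # assets that disappeared
    }

# Illustrative snapshots from two weekly exports.
week1 = {"//storage.googleapis.com/a", "//storage.googleapis.com/b"}
week2 = {"//storage.googleapis.com/b", "//storage.googleapis.com/c"}

print(diff_snapshots(week1, week2))
```

Even this simple diff surfaces resource sprawl (unexpected additions) and accidental deletions between export runs.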
22. Glossary
- Asset: A representation of a Google Cloud resource (and optionally its related policy metadata) in Cloud Asset Inventory.
- Scope: The resource container you query: projects/*, folders/*, or organizations/*.
- Resource hierarchy: Organization → folders → projects → resources; affects policy inheritance and visibility.
- IAM policy: A set of bindings granting roles to principals for a resource.
- Principal: An identity (user, group, domain, service account) referenced in IAM bindings.
- Binding: A role assignment to one or more principals, optionally with a condition.
- IAM condition: A conditional expression that restricts when a binding applies.
- Organization policy (Org Policy): Governance constraints applied to resources in a hierarchy.
- Access policy: Policies typically associated with Access Context Manager (context-based access), where applicable.
- Export: A snapshot output of assets/policies written to Cloud Storage or BigQuery.
- Feed: A Cloud Asset Inventory configuration that publishes asset change notifications to Pub/Sub.
- Pub/Sub: Messaging service used for asynchronous event delivery from feeds to consumers.
- At-least-once delivery: A message delivery model where duplicates are possible; consumers must handle idempotency.
- Service agent: A Google-managed service account used by a Google service (like CAI) to perform actions (like publishing to Pub/Sub).
- BigQuery dataset: A container for tables/views where exported asset data can be stored and queried.
- Cloud Storage bucket: Object storage container commonly used for snapshot exports and retention.
23. Summary
Cloud Asset Inventory is Google Cloud’s native inventory, search, export, and change-notification service for cloud resources and their associated policy metadata. It matters because security, compliance, and operations all depend on having accurate visibility into what exists, who can access it, and what changed across projects, folders, and organizations.
From an architecture standpoint, Cloud Asset Inventory is a control-plane capability that integrates naturally with BigQuery (analytics), Cloud Storage (evidence snapshots), and Pub/Sub (real-time change automation). Cost is often driven less by Cloud Asset Inventory itself (verify current pricing) and more by the destination services and your export/feed volume—especially BigQuery query patterns and Pub/Sub backlog.
Use Cloud Asset Inventory when you need authoritative Google Cloud inventory at scale, recurring exports for audits, and event-driven security automation. As a next step, implement either:
– A scheduled export pipeline to BigQuery for reporting, or
– A Pub/Sub feed + Cloud Run consumer for near real-time policy guardrails
Then expand toward organization-wide governance with least-privilege IAM and well-defined retention controls.