Category
Security
1. Introduction
What this service is
“Audit Manager” is not currently a standalone, first-party Google Cloud product name in the way “Cloud Logging” or “Security Command Center” is. In Google Cloud environments, the capabilities people often expect from an “audit manager” (collecting audit evidence, centralizing logs, retaining them immutably, and producing audit-ready reports) are typically implemented using Cloud Audit Logs + Cloud Logging (Log Router) + sinks to BigQuery/Cloud Storage, optionally enriched with Cloud Asset Inventory and Security Command Center.
If you found “Audit Manager” referenced in internal documentation, a third-party tool, or another cloud provider, verify in official Google Cloud docs whether it refers to a partner solution or an internal pattern. This tutorial stays strictly within Google Cloud Security services and shows how to build an “Audit Manager” capability using official Google Cloud features.
One-paragraph simple explanation
Audit Manager on Google Cloud (as a practical capability) means turning on the right audit logs, centralizing them to a protected location, keeping them for the required retention period, and making them searchable and reportable for compliance, security investigations, and operational governance.
One-paragraph technical explanation
Technically, you implement Audit Manager by configuring Cloud Audit Logs (Admin Activity, Data Access, System Event, Policy Denied), using Cloud Logging Log Router with project/folder/organization-level sinks to route logs into a central logging project and onward into BigQuery datasets (for analytics/reporting) and/or Cloud Storage buckets (for long-term retention). You secure access with IAM, protect retention with log buckets retention policies and object retention policies (for Storage), optionally use CMEK via Cloud KMS, and operationalize alerts via Cloud Monitoring and log-based metrics.
What problem it solves
Audit Manager solves these common problems:
- Audit evidence is scattered across projects and teams.
- Logs are retained inconsistently (or deleted), breaking compliance.
- Investigations take too long because there’s no centralized, queryable audit trail.
- It’s hard to prove “who did what, when, and from where” across an organization.
- Security teams need alerts on sensitive changes (IAM edits, firewall changes, KMS key changes, etc.).
2. What is Audit Manager?
Official purpose (in Google Cloud terms)
Because “Audit Manager” is not an official single Google Cloud product, the official purpose is realized by combining these first-party services:
- Cloud Audit Logs: Records administrative actions and data access events across Google Cloud services.
- Cloud Logging: Stores, searches, routes, and exports logs (including audit logs) via Log Router.
- BigQuery / Cloud Storage: Provide analytics and long-term storage for audit evidence.
- Cloud Asset Inventory: Provides asset state/history and can export inventory for compliance evidence.
- Security Command Center (optional): Consolidates security findings and posture (not an audit log store, but useful for audit narratives and evidence).
Core capabilities (what an “Audit Manager” capability typically includes)
- Centralized collection of audit logs across many projects/folders
- Separation of duties (teams can’t tamper with audit evidence)
- Retention controls aligned to compliance requirements
- Query/reporting workflows (e.g., BigQuery)
- Alerting on sensitive administrative activity
- Support for investigations and compliance audits (SOC 2, ISO 27001, PCI DSS, HIPAA—requirements vary)
Major components (Google Cloud building blocks)
- Cloud Audit Logs (emitted automatically by supported services)
- Log buckets and Log Router (in Cloud Logging)
- Sinks:
- Project-level sinks
- Folder-level sinks
- Aggregated sinks (organization or folder level) to centralize logs at scale
- Destinations:
- BigQuery dataset (reporting/analytics)
- Cloud Storage bucket (archival/immutability)
- Pub/Sub topic (streaming workflows, optional)
- IAM for access control, plus retention policies and CMEK (optional)
Service type
Audit Manager (as used in this tutorial) is a Security governance pattern built using managed Google Cloud services (Logging, BigQuery, Storage, Monitoring).
Scope (regional/global/project/org)
- Cloud Audit Logs are generated per Google Cloud resource (project/folder/org) and viewed via Cloud Logging.
- Cloud Logging is a global service, but:
- Log buckets exist in specific locations (including global in many cases; verify in official docs for available locations).
- Destinations (BigQuery dataset location, Storage bucket location) are regional/multi-regional and impact data residency.
- Centralization is commonly done at the organization scope using aggregated sinks.
How it fits into the Google Cloud ecosystem
Audit Manager sits at the intersection of:
- Security: detection, investigation, governance
- Operations: change tracking, troubleshooting, incident response
- Compliance: evidence collection, retention, access controls
It complements (does not replace) services like Security Command Center (findings/posture) and Cloud Monitoring (metrics/alerts).
3. Why use Audit Manager?
Business reasons
- Reduce audit preparation time by keeping evidence centralized and queryable.
- Lower compliance risk by enforcing retention and controlling who can delete or alter evidence.
- Improve accountability across engineering and operations teams.
Technical reasons
- One place to answer: who changed IAM, who modified firewall rules, who created service accounts, who accessed sensitive datasets, etc.
- BigQuery enables scalable analysis across large log volumes.
Operational reasons
- Standardize logging and retention across many projects.
- Enable consistent alerting and investigation workflows.
- Support incident response with fast, organization-wide searches.
Security/compliance reasons
- Helps meet requirements for:
- Audit logging and monitoring
- Change management evidence
- Access review evidence
- Retention and integrity controls
Exact mappings depend on your framework; validate with your compliance team.
Scalability/performance reasons
- Aggregated sinks scale better than manually configuring every project.
- BigQuery scales for analytics across billions of log rows (cost must be managed).
When teams should choose it
Choose an Audit Manager approach on Google Cloud when you need:
- Centralized audit evidence across multiple projects/folders
- Separation of duties (security/compliance owns evidence store)
- Compliance-driven retention and reporting
- Alerts on sensitive actions
When teams should not choose it
Avoid (or postpone) a full Audit Manager build if:
- You only have one small project and basic troubleshooting needs.
- You don’t have a clear retention/reporting requirement (you can start with default Logging views).
- You can’t commit to ongoing operations (access control, cost management, query optimization).
- Your organization requires a certified third-party GRC tool with workflow approvals—Google Cloud primitives may be necessary but not sufficient.
4. Where is Audit Manager used?
Industries
- Financial services (change tracking, access oversight)
- Healthcare (access and administrative audit trails)
- E-commerce and SaaS (SOC 2 evidence, incident response)
- Public sector (governance, data residency, accountability)
- Education and research (access monitoring and incident investigations)
Team types
- Security engineering / SOC
- Compliance and risk teams (with engineering support)
- Platform engineering (central logging platform)
- SRE/Operations (incident response)
- Data platform teams (BigQuery logging analytics)
Workloads and architectures
- Multi-project organizations with shared networking and shared services
- Microservices on GKE/Cloud Run (audit of IAM, config, deployment changes)
- Data platforms (BigQuery, Cloud Storage, Dataproc) needing access audit trails
- Hybrid/multi-cloud environments (Google Cloud as one audit domain)
Real-world deployment contexts
- Central “Security/Audit” project that receives logs from all other projects
- Dual-destination exports:
- BigQuery for analysis
- Cloud Storage for long-term retention and immutability
Production vs dev/test usage
- Production: enforce retention, limited access, alerting, and documented procedures.
- Dev/test: smaller retention and simpler exports; be careful not to leak sensitive logs to less-controlled projects.
5. Top Use Cases and Scenarios
Below are realistic Audit Manager use cases implemented with Google Cloud audit logging and exports.
1) Organization-wide IAM change tracking
- Problem: IAM policy changes can grant excessive privileges without detection.
- Why Audit Manager fits: Admin Activity audit logs capture IAM policy changes; centralized export enables reporting and alerts.
- Example: Alert whenever roles/owner is granted to any principal in any project.
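As a sketch, a Cloud Logging filter along these lines could drive such an alert (the policyDelta field path varies by service and log entry; verify it against real entries before relying on it):

```
logName:"cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName="SetIamPolicy"
protoPayload.serviceData.policyDelta.bindingDeltas.action="ADD"
protoPayload.serviceData.policyDelta.bindingDeltas.role="roles/owner"
```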
2) Firewall and VPC security change monitoring
- Problem: Network rule changes can expose workloads publicly.
- Why it fits: Audit logs capture changes to VPC firewall rules and routes.
- Example: Alert if a firewall rule allowing 0.0.0.0/0 to sensitive ports is created.
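A hedged sketch of a matching Logging filter (the request fields are Compute-specific; confirm the exact payload shape in your own logs):

```
logName:"cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName:"compute.firewalls.insert"
protoPayload.request.sourceRanges:"0.0.0.0/0"
```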
3) KMS key and encryption policy auditing
- Problem: Key rotation, IAM on keys, or disabling keys can cause outages or weaken controls.
- Why it fits: Admin Activity logs track KMS administrative actions.
- Example: Weekly report of all KMS key IAM changes and key state changes.
4) BigQuery dataset access auditing (sensitive data)
- Problem: Need evidence of who accessed regulated data.
- Why it fits: BigQuery Data Access logs can capture read events (subject to configuration and service behavior).
- Example: Monthly access report for datasets tagged “PII”.
5) Service account lifecycle governance
- Problem: Orphaned service accounts and keys increase risk.
- Why it fits: Audit logs track service account creation, key creation, and deletion.
- Example: Alert on any service account key creation, with a ticket automatically created.
6) Change management evidence for deployments
- Problem: Auditors want evidence of controlled changes.
- Why it fits: Audit logs record deployments/updates to many services (e.g., Cloud Run revisions, IAM changes, GKE control plane actions).
- Example: Produce a change log for production services during a release window.
7) Multi-project incident investigation (“blast radius” search)
- Problem: Incident responders need to search across many projects quickly.
- Why it fits: Centralized BigQuery dataset enables cross-project queries.
- Example: Search for all IAM policy changes by a suspicious user across the org.
8) Policy denied events and misconfiguration detection
- Problem: You need visibility into blocked actions to tune policies and detect abuse.
- Why it fits: Policy Denied audit logs show when org policies or IAM deny actions.
- Example: Identify repeated denied attempts to disable logging exports.
9) Evidence retention and legal hold
- Problem: Compliance requires long retention and integrity.
- Why it fits: Route logs to Storage with retention policies; lock down delete permissions.
- Example: Keep immutable audit archives for 1–7 years (as required) in a dedicated bucket.
10) Internal/external audit preparation (“evidence pack”)
- Problem: Audit cycles require consistent evidence packages.
- Why it fits: Use BigQuery queries and scheduled exports to produce standardized reports.
- Example: Quarterly report: “All privileged role grants + all firewall changes + all KMS changes”.
11) Cloud Asset Inventory for configuration evidence
- Problem: Auditors ask for proof of configuration state (assets, IAM bindings, policies).
- Why it fits: Cloud Asset Inventory exports provide point-in-time and history evidence.
- Example: Export org asset inventory weekly to BigQuery for compliance snapshots.
12) Separation-of-duties logging platform
- Problem: Engineers shouldn’t be able to delete or alter audit evidence.
- Why it fits: Central sinks and IAM separation protect evidence.
- Example: Only compliance team can access the audit dataset; engineers can’t modify sinks.
6. Core Features
Because Audit Manager is implemented via Google Cloud services, the “features” below map to what you can configure today using official Google Cloud capabilities.
Feature 1: Cloud Audit Logs (Admin Activity, Data Access, System Event, Policy Denied)
- What it does: Records actions taken in Google Cloud services.
- Why it matters: Provides authoritative “who did what” evidence.
- Practical benefit: Enables investigations and compliance evidence without adding agents.
- Limitations/caveats:
- Not all services emit the same level of audit detail.
- Data Access logs can be high volume and may have different default enablement depending on service; verify per-service behavior in docs.
Official docs: https://cloud.google.com/logging/docs/audit
Feature 2: Centralization with Log Router and sinks (including aggregated sinks)
- What it does: Routes logs to a central project and/or export destinations.
- Why it matters: Eliminates per-project silos.
- Practical benefit: One analytics location and one retention policy.
- Limitations/caveats:
- Requires correct IAM for sink writers and destination permissions.
- Aggregated sinks require organization/folder privileges.
Official docs: https://cloud.google.com/logging/docs/routing/overview
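For illustration, an organization-level aggregated sink might be created like this (ORG_ID, AUDIT_PROJECT, and the dataset name are placeholders; this requires org-level logging permissions):

```shell
# Create an org-scoped aggregated sink that routes Admin Activity audit
# logs from all child projects/folders to one central BigQuery dataset.
gcloud logging sinks create org-audit-sink \
  bigquery.googleapis.com/projects/AUDIT_PROJECT/datasets/org_audit_logs \
  --organization=ORG_ID \
  --include-children \
  --log-filter='logName:"cloudaudit.googleapis.com%2Factivity"'

# The sink's writer identity must then be granted write access on the
# destination dataset, as in the hands-on tutorial later in this guide.
gcloud logging sinks describe org-audit-sink --organization=ORG_ID \
  --format="value(writerIdentity)"
```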
Feature 3: Log buckets with retention policies
- What it does: Stores logs in Cloud Logging with controlled retention.
- Why it matters: Retention is a compliance control; prevents “accidental short retention.”
- Practical benefit: Standard retention per environment (prod vs dev) and log type.
- Limitations/caveats:
- Retention is not the same as immutability (deletion control still depends on IAM and policies).
Official docs (retention and buckets): https://cloud.google.com/logging/docs/storage
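As a sketch (the location and retention values are examples; verify supported bucket locations and the pricing impact of extended retention):

```shell
# Extend retention on the project's _Default log bucket to 365 days.
gcloud logging buckets update _Default \
  --location=global \
  --retention-days=365

# Confirm the configured retention.
gcloud logging buckets describe _Default --location=global \
  --format="value(retentionDays)"
```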
Feature 4: Export to BigQuery for audit analytics and reporting
- What it does: Copies log entries into BigQuery tables.
- Why it matters: BigQuery is suited for cross-project, large-scale queries and reporting.
- Practical benefit: Create reusable SQL reports for auditors and security teams.
- Limitations/caveats:
- BigQuery query costs can grow if queries scan large tables.
- Schema is nested; you must learn how audit log fields are structured.
Official docs (export): https://cloud.google.com/logging/docs/export/configure_export_v2
Feature 5: Export to Cloud Storage for long-term archival
- What it does: Writes logs to Storage objects (often as batched files).
- Why it matters: Storage can be cheaper for long retention and supports retention policies.
- Practical benefit: Create an immutable archive (with correct bucket settings and IAM).
- Limitations/caveats:
- Searching archived logs is less convenient than BigQuery/Logging.
- Immutability requires correct configuration (retention policy + lock, and restricted permissions).
Official docs (export destinations): https://cloud.google.com/logging/docs/export
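For example (the bucket name is hypothetical; a locked retention policy is irreversible, so rehearse on a throwaway bucket first):

```shell
# Set a 7-year retention policy on the archive bucket, then lock it.
# Once locked, the policy cannot be removed or reduced.
gsutil retention set 7y gs://my-audit-archive-bucket
gsutil retention lock gs://my-audit-archive-bucket

# Verify the effective policy.
gsutil retention get gs://my-audit-archive-bucket
```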
Feature 6: Alerting with log-based metrics + Cloud Monitoring
- What it does: Turns matching log events into metrics and triggers alerts.
- Why it matters: Auditing is not only retrospective; you need real-time detection.
- Practical benefit: Page/on-call when critical changes occur.
- Limitations/caveats:
- Alert noise is common unless filters are precise.
- Some events are frequent (e.g., automated changes) and need allowlists.
Official docs: https://cloud.google.com/logging/docs/logs-based-metrics
Feature 7: Evidence enrichment with Cloud Asset Inventory (optional)
- What it does: Inventories resources and IAM policies; can export snapshots and history.
- Why it matters: Audits often require configuration evidence, not only activity logs.
- Practical benefit: Produce reports like “all public buckets” or “all service accounts with keys”.
- Limitations/caveats:
- Asset Inventory exports are different from audit logs; use both for a complete story.
Official docs: https://cloud.google.com/asset-inventory/docs/overview
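As an illustration (project, dataset, and table names are placeholders), a point-in-time IAM policy snapshot can be exported to BigQuery:

```shell
# Export all IAM policies in the project to a BigQuery table as
# compliance evidence. Requires the Cloud Asset API to be enabled and
# write access to the destination dataset.
gcloud asset export \
  --project=YOUR_PROJECT_ID \
  --content-type=iam-policy \
  --bigquery-table=//bigquery.googleapis.com/projects/YOUR_PROJECT_ID/datasets/compliance_snapshots/tables/iam_policies \
  --output-bigquery-force
```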
Feature 8: Separation of duties via IAM, dedicated audit project, and restricted sink management
- What it does: Prevents project owners in workload projects from disabling audit exports.
- Why it matters: Integrity of audit evidence is a core compliance requirement.
- Practical benefit: Central team controls the pipeline; workload teams have least privilege.
- Limitations/caveats:
- Requires org-level governance and careful IAM design.
- Misconfiguration can lock you out; use break-glass accounts.
7. Architecture and How It Works
High-level architecture
- Google Cloud services emit Cloud Audit Logs into Cloud Logging for each project.
- A sink (project/folder/org) routes matching logs to a central destination:
  - a Cloud Logging log bucket in a central audit project, and/or
  - a BigQuery dataset, and/or
  - a Cloud Storage bucket.
- Security/compliance teams query BigQuery and review archived logs; Monitoring alerts on high-risk patterns.
Request/data/control flow
- Control plane actions (e.g., IAM updates) generate Admin Activity audit logs automatically.
- Data plane access (e.g., reading objects) may generate Data Access logs depending on service and configuration.
- The Log Router evaluates sink filters and exports matching logs.
- Exports write to destinations using sink writer identities (service accounts managed by Logging).
Integrations with related services
- Cloud Monitoring: alerting on log-based metrics.
- BigQuery: reporting dashboards (Looker/Looker Studio can be layered on top; verify product fit and governance).
- Cloud Storage: retention/archival, optionally with CMEK.
- Security Command Center: posture and findings context for audit narratives (not a log store).
- Cloud Asset Inventory: periodic evidence snapshots.
Dependency services
- Cloud Logging and Cloud Audit Logs are foundational.
- BigQuery and/or Cloud Storage are common destinations.
- IAM and Cloud Resource Manager govern scope (project/folder/org).
Security/authentication model
- Access to view/query logs is governed by IAM roles on:
- Logging buckets/views
- BigQuery datasets/tables
- Storage buckets/objects
- Export uses sink writer service accounts that must be granted write permissions to destinations.
Networking model
- Audit logs are generated and routed within Google’s managed control plane.
- Export destinations (BigQuery/Storage) are Google-managed services; network design focuses on:
- Data residency choices (dataset/bucket locations)
- Private access patterns for analysts (e.g., via corp network/VPN) as needed
Monitoring/logging/governance considerations
- Monitor export pipeline health (sink errors, destination permissions).
- Track log ingestion and BigQuery query costs.
- Govern with org policies and least privilege.
Simple architecture diagram (Mermaid)
flowchart LR
A[Google Cloud Projects] --> B[Cloud Audit Logs]
B --> C[Cloud Logging]
C --> D[Log Router Sink]
D --> E["BigQuery Dataset (Audit Analytics)"]
D --> F["Cloud Storage Bucket (Archive)"]
C --> G[Cloud Monitoring Alerts]
Production-style architecture diagram (Mermaid)
flowchart TB
subgraph ORG[Google Cloud Organization]
subgraph FOLDERS[Folders / Environments]
P1[Prod Projects]
P2[Non-Prod Projects]
P3[Shared Services Projects]
end
end
P1 --> L1[Cloud Logging + Audit Logs]
P2 --> L2[Cloud Logging + Audit Logs]
P3 --> L3[Cloud Logging + Audit Logs]
L1 --> SINK["Org/Folder Aggregated Sink<br/>(filter: audit logs)"]
L2 --> SINK
L3 --> SINK
subgraph AUDIT[Central Audit Project]
LB["Central Log Bucket<br/>Retention Policy"]
BQ["BigQuery Dataset<br/>Audit Reporting"]
CS["Cloud Storage Bucket<br/>Retention + Optional Lock"]
MON["Cloud Monitoring<br/>Alerts/Dashboards"]
KMS["Cloud KMS (Optional CMEK)"]
end
SINK --> LB
LB -->|Export| BQ
LB -->|Export| CS
LB --> MON
KMS -.optional.-> BQ
KMS -.optional.-> CS
subgraph USERS[Security / Compliance / SRE]
AN[Analysts]
IR[Incident Responders]
AU["Auditors (Read-only)"]
end
AN --> BQ
IR --> BQ
AU --> BQ
AU --> CS
8. Prerequisites
Account/project requirements
- A Google Cloud billing account.
- At least one Google Cloud project for workloads.
- Recommended for production patterns:
- A Google Cloud Organization and optionally folders for environments.
- A dedicated central audit project.
Permissions / IAM roles (minimum guidance)
Exact roles depend on scope (project vs org). Common roles include:
- For configuring sinks and logging: roles/logging.configWriter or roles/logging.admin (scope-limited)
- For viewing logs: roles/logging.viewer
- For BigQuery destination setup: roles/bigquery.admin (setup) or dataset-level permissions
- For Storage destination setup: roles/storage.admin (setup) or bucket-level permissions
- For enabling APIs: roles/serviceusage.serviceUsageAdmin
- For organization-level setup (if using aggregated sinks): roles/resourcemanager.organizationAdmin or more scoped admin roles as appropriate (verify least-privilege options)
Tip: For production, avoid broad primitives and prefer custom roles or scoped predefined roles. Always test in a non-production folder first.
Billing requirements
- Cloud Logging ingestion/retention beyond free allotments may incur costs.
- BigQuery storage and queries cost money.
- Cloud Storage archive costs money.
CLI/SDK/tools
- Google Cloud SDK (gcloud)
- bq CLI (included with the Cloud SDK)
- Access to the Google Cloud Console
Region availability and data residency
- Choose BigQuery dataset location (US/EU/regional) and Storage bucket location based on requirements.
- Cloud Logging bucket location options vary; verify in official docs for current availability and constraints.
Quotas/limits
Quotas apply to:
- Logging sinks, export volume, and API usage
- BigQuery query and load quotas
- Storage request rates and lifecycle operations
Check quotas in Google Cloud Console and verify current limits in official docs.
Prerequisite services / APIs
Enable as needed:
- Cloud Logging API (typically enabled)
- BigQuery API
- Cloud Resource Manager API (usually enabled)
- IAM API (usually enabled)
- Cloud Asset API (optional)
- Cloud Monitoring API (for alerting)
9. Pricing / Cost
Pricing model (accurate, non-fabricated)
There is no single “Audit Manager” SKU in Google Cloud. Costs come from the underlying services:
- Cloud Logging. Pricing dimensions typically include:
  - Log ingestion volume (GiB)
  - Retention beyond default periods
  - Log routing/export (some aspects may be free; verify current pricing)
  - Official pricing: https://cloud.google.com/logging/pricing
- BigQuery. Pricing dimensions:
  - Storage (active/long-term)
  - Query processing (bytes scanned) for on-demand, or capacity-based pricing for editions/reservations
  - Streaming inserts (if used)
  - Official pricing: https://cloud.google.com/bigquery/pricing
- Cloud Storage. Pricing dimensions:
  - Storage (by class: Standard, Nearline, Coldline, Archive)
  - Operations (PUT/GET/LIST)
  - Data retrieval and early delete (depending on class)
  - Egress (if accessed across regions or to the internet)
  - Official pricing: https://cloud.google.com/storage/pricing
- Cloud Monitoring (alerts/metrics). Pricing varies by metrics volume, log-based metrics, and monitoring usage.
  - Official pricing: https://cloud.google.com/monitoring/pricing
- Cloud KMS (optional). Priced by key versions and cryptographic operations.
  - Official pricing: https://cloud.google.com/kms/pricing
Free tier
Many Google Cloud services have free tiers or free allotments (especially for Logging/Monitoring), but they change over time and can be region/service dependent. Verify in official pricing pages.
Primary cost drivers
- High-volume Data Access logs (especially for data platforms)
- Exporting everything to BigQuery and running frequent, unoptimized queries
- Long retention in Logging and/or Storage with large daily ingestion
- Retaining duplicated copies (Logging + BigQuery + Storage) without a clear purpose
Hidden or indirect costs
- BigQuery query costs from dashboards that refresh frequently
- Storing logs in multiple places without lifecycle rules
- Egress charges if you export/query across regions
- Operational overhead (time) for maintaining filters, permissions, and reports
Network/data transfer implications
- Intra-service movement within Google can still have location constraints.
- Keep dataset/bucket locations aligned with where you query from and your compliance boundaries.
- If analysts download large results to on-prem or another cloud, egress applies.
How to optimize cost
- Be selective with exported logs:
- Start with Admin Activity for governance.
- Add Data Access logs only for datasets/buckets that truly require it.
- Use sink filters to export only what you need for compliance.
- Partition and cluster BigQuery tables when possible (Logging exports often create partitioned tables automatically; verify current behavior).
- Use scheduled queries that scan only relevant partitions/time ranges.
- Use Storage lifecycle rules to transition older archives to colder classes.
- Avoid duplicate retention (e.g., if Cloud Storage is the long-term archive, you may not need very long BigQuery retention).
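For instance, a lifecycle configuration along these lines (a sketch; the 90/365-day ages are illustrative and should match your retention plan) transitions older archives to colder classes:

```shell
# lifecycle.json: transition archived audit logs to colder storage classes.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
     "condition": {"age": 90}},
    {"action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
     "condition": {"age": 365}}
  ]
}
EOF
# Sanity-check the JSON before applying it.
python3 -m json.tool lifecycle.json > /dev/null && echo "lifecycle.json OK"
```

Apply it with `gsutil lifecycle set lifecycle.json gs://YOUR_ARCHIVE_BUCKET` (the bucket name is a placeholder).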
Example low-cost starter estimate (model, not numbers)
A low-cost starter typically:
- Centralizes Admin Activity logs only
- Exports to BigQuery for 30–90 days
- Archives to Cloud Storage with lifecycle transitions

Costs depend on daily log volume, retention, and query frequency. Estimate with the Google Cloud Pricing Calculator: https://cloud.google.com/products/calculator
Example production cost considerations
In production, watch:
- Data Access log volumes (can be orders of magnitude higher than Admin Activity)
- BigQuery query usage from security analytics (SOC) and dashboards
- Retention requirements (1–7 years) driving the Storage footprint
- Multi-region organizations requiring separate audit stores
10. Step-by-Step Hands-On Tutorial
This lab builds a practical “Audit Manager” pipeline in a single project (low-cost starter). It centralizes Admin Activity audit logs into BigQuery and demonstrates auditing queries and a simple alert signal.
Objective
- Enable and validate Cloud Audit Logs
- Route Admin Activity audit logs to a BigQuery dataset using a Logging sink
- Query audit logs in BigQuery to produce audit evidence
- Create a basic detection signal (log-based metric) for IAM policy changes
- Clean up resources to avoid ongoing costs
Lab Overview
You will:
1. Create a BigQuery dataset for audit logs.
2. Create a Cloud Logging sink that exports Admin Activity audit logs to BigQuery.
3. Generate a real audit event (create a service account).
4. Query BigQuery to confirm the audit entry landed.
5. Create a log-based metric to count IAM policy changes (signal for alerting).
6. Clean up.
Expected cost: Low for a short lab. BigQuery queries and Logging ingestion can cost money depending on usage and free tiers. Keep the lab short and run only the provided queries.
Step 1: Set your project and enable required APIs
1) Pick or create a project.
gcloud config set project YOUR_PROJECT_ID
2) Enable BigQuery API (and ensure Logging is available).
gcloud services enable bigquery.googleapis.com
Expected outcome: API enablement succeeds.
Verify:
gcloud services list --enabled --format="value(config.name)" | grep -E "bigquery.googleapis.com"
Step 2: Create a BigQuery dataset for audit logs
Choose a dataset location that matches your needs (example uses US). For EU or regional, change accordingly.
bq --location=US mk -d \
--description "Audit Manager dataset for Cloud Audit Logs export" \
audit_logs
Expected outcome: Dataset audit_logs exists.
Verify:
bq show YOUR_PROJECT_ID:audit_logs
Step 3: Create a Logging sink to export Admin Activity audit logs to BigQuery
We will export only Admin Activity logs to keep scope and cost small.
Create the sink:
gcloud logging sinks create auditmanager-admin-activity \
bigquery.googleapis.com/projects/YOUR_PROJECT_ID/datasets/audit_logs \
--log-filter='logName:"cloudaudit.googleapis.com%2Factivity"'
Expected outcome: Sink is created.
Verify:
gcloud logging sinks describe auditmanager-admin-activity --format="yaml"
Grant the sink permission to write to BigQuery
When you create a sink, Cloud Logging creates a writer identity (a service account). You must grant it BigQuery permissions on the dataset.
1) Get the sink writer identity:
SINK_WRITER=$(gcloud logging sinks describe auditmanager-admin-activity --format="value(writerIdentity)")
echo $SINK_WRITER
2) Grant write permission. The simplest approach for the lab is to grant the writer identity the BigQuery Data Editor role at the project level (for production, scope the grant to the dataset's access controls instead). The writerIdentity value already includes the serviceAccount: prefix:
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="${SINK_WRITER}" \
  --role="roles/bigquery.dataEditor"
Expected outcome: Sink can write to the dataset.
Verify (basic):
- In the Cloud Console, confirm the sink writer principal has a BigQuery write role (check the project IAM page or the dataset's Sharing/Permissions pane, depending on where you granted it).
- Or view the dataset's access entries:
bq show --format=prettyjson YOUR_PROJECT_ID:audit_logs
Common issue: Permission denied when exporting.
- Fix: Ensure the sink writer identity has appropriate BigQuery permissions on the dataset (or project).
Step 4: Generate an audit event (create a service account)
Creating a service account generates Admin Activity audit logs.
gcloud iam service-accounts create auditmanager-lab-sa \
--display-name="Audit Manager Lab SA"
Expected outcome: Service account is created.
Verify:
gcloud iam service-accounts list --filter="email:auditmanager-lab-sa@"
Step 5: Confirm logs arrived in BigQuery and query them
Log exports can take a few minutes. Wait 2–5 minutes, then check for new tables in the dataset.
List tables:
bq ls YOUR_PROJECT_ID:audit_logs
You should see one or more tables created by the export. The exact table naming can vary by export configuration and time. If you don’t see tables yet, wait a bit longer and try again.
Query example: find the service account creation event
In BigQuery, routed audit log entries typically store their structured fields under a column named protopayload_auditlog (Cloud Logging renames the protoPayload field on export; verify the exact schema on your tables). Use a query like the following, adjusting table and field names as needed.
1) Identify the newest table name from bq ls, then run:
bq query --use_legacy_sql=false '
SELECT
  timestamp,
  protopayload_auditlog.authenticationInfo.principalEmail AS actor,
  protopayload_auditlog.methodName AS method,
  protopayload_auditlog.resourceName AS resource,
  protopayload_auditlog.serviceName AS service
FROM `YOUR_PROJECT_ID.audit_logs.*`
WHERE protopayload_auditlog.serviceName = "iam.googleapis.com"
  AND protopayload_auditlog.methodName LIKE "%CreateServiceAccount%"
ORDER BY timestamp DESC
LIMIT 50
'
Expected outcome: You see an entry showing the principal (your user) calling a method that created the service account.
Verification tips:
- If results are empty, expand the time window by removing filters or searching for iam.googleapis.com.
- Ensure you used the correct dataset and wildcard table pattern.
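Building on the same export, a reusable evidence query for recent IAM policy changes might look like this (the protopayload_auditlog column name assumes the standard BigQuery audit-log export schema; verify it against your tables):

```shell
# Report all SetIamPolicy calls across exported tables in the last 30 days.
bq query --use_legacy_sql=false '
SELECT
  timestamp,
  protopayload_auditlog.authenticationInfo.principalEmail AS actor,
  protopayload_auditlog.methodName AS method,
  protopayload_auditlog.resourceName AS resource
FROM `YOUR_PROJECT_ID.audit_logs.*`
WHERE protopayload_auditlog.methodName = "SetIamPolicy"
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
ORDER BY timestamp DESC
'
```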
Step 6: Create a log-based metric for IAM policy changes (basic signal)
This step creates a metric that counts IAM policy changes in your project. You can later attach an alert policy.
Create the metric:
gcloud logging metrics create auditmanager_iam_policy_changes \
--description="Counts IAM policy set operations for Audit Manager signal" \
--log-filter='logName:"cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName="SetIamPolicy"'
Expected outcome: Metric exists and begins counting future matching log entries.
Verify:
gcloud logging metrics describe auditmanager_iam_policy_changes --format="yaml"
Generate an IAM policy change to test (optional). For example, grant a role to your lab service account at the project level. Be cautious with privileges; choose a minimal role:
PROJECT_ID=$(gcloud config get-value project)
SA="auditmanager-lab-sa@${PROJECT_ID}.iam.gserviceaccount.com"
gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
--member="serviceAccount:${SA}" \
--role="roles/viewer"
Wait a few minutes, then view the metric in Cloud Console:
– Logging → Log-based metrics → auditmanager_iam_policy_changes
Expected outcome: Metric shows at least one count increment after the IAM change.
Validation
Use this checklist:
- [ ] Sink exists and is enabled:
  gcloud logging sinks describe auditmanager-admin-activity --format="value(disabled)"
- [ ] BigQuery dataset exists:
  bq show YOUR_PROJECT_ID:audit_logs
- [ ] Export created tables in BigQuery:
  bq ls YOUR_PROJECT_ID:audit_logs
- [ ] Query returns audit records (service account creation and/or SetIamPolicy event).
- [ ] Log-based metric exists:
  gcloud logging metrics describe auditmanager_iam_policy_changes
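The checklist above can be scripted. A minimal sketch follows; it prints each check command for review rather than executing it, so it is safe to run anywhere and then paste into an authenticated Cloud Shell. The PROJECT_ID value is a placeholder.

```shell
#!/usr/bin/env bash
# Sketch: print each validation command from the checklist so it can be
# reviewed, then pasted into an authenticated Cloud Shell session.
set -euo pipefail

PROJECT_ID="YOUR_PROJECT_ID"  # placeholder: replace with your project

CHECKS=(
  'gcloud logging sinks describe auditmanager-admin-activity --format="value(disabled)"'
  "bq show ${PROJECT_ID}:audit_logs"
  "bq ls ${PROJECT_ID}:audit_logs"
  'gcloud logging metrics describe auditmanager_iam_policy_changes'
)

for cmd in "${CHECKS[@]}"; do
  echo "CHECK: ${cmd}"
done
```

Remove the echo loop and run the commands directly once the project ID is set.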
Troubleshooting
Issue: “No tables appear in BigQuery”
- Wait 5–10 minutes; exports are not always instant.
- Confirm sink filter matches Admin Activity:
  logName:"cloudaudit.googleapis.com%2Factivity"
- Confirm sink destination points to the right dataset.
- Confirm sink writer identity has BigQuery permissions.
Issue: “Access Denied” querying BigQuery
- Ensure your user has roles/bigquery.dataViewer (or higher) on the dataset.
- If using organization policies, check whether BigQuery access is restricted.
Issue: Metric exists but doesn’t increment
- Ensure you generated a matching event after creating the metric.
- Confirm the filter matches both conditions:
  - the Admin Activity log name, and
  - protoPayload.methodName="SetIamPolicy"
Issue: Too many results / noisy signals
- Tighten filters with resource type, project, or method name patterns.
- Add allowlists for known automation identities.
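As an example of an allowlist, the metric filter from Step 6 could exclude a known automation identity. The sketch below builds and prints such a filter; the Terraform service account email is hypothetical, and the NOT-clause syntax should be verified against the Logging query-language documentation.

```shell
#!/usr/bin/env bash
# Sketch: build a metric filter that counts SetIamPolicy calls but excludes
# a known automation identity. The service account email is a placeholder.
set -euo pipefail

FILTER='logName:"cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName="SetIamPolicy"
NOT protoPayload.authenticationInfo.principalEmail="terraform-automation@YOUR_PROJECT_ID.iam.gserviceaccount.com"'

# Review the filter, then apply it with:
#   gcloud logging metrics update auditmanager_iam_policy_changes --log-filter="$FILTER"
echo "$FILTER"
```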
Cleanup
To avoid ongoing costs, remove created resources:
1) Delete the IAM binding (optional cleanup):
PROJECT_ID=$(gcloud config get-value project)
SA="auditmanager-lab-sa@${PROJECT_ID}.iam.gserviceaccount.com"
gcloud projects remove-iam-policy-binding "${PROJECT_ID}" \
--member="serviceAccount:${SA}" \
--role="roles/viewer"
2) Delete the service account:
gcloud iam service-accounts delete "auditmanager-lab-sa@${PROJECT_ID}.iam.gserviceaccount.com" --quiet
3) Delete the log-based metric:
gcloud logging metrics delete auditmanager_iam_policy_changes --quiet
4) Delete the sink:
gcloud logging sinks delete auditmanager-admin-activity --quiet
5) Delete the BigQuery dataset (deletes tables inside):
bq rm -r -f -d YOUR_PROJECT_ID:audit_logs
11. Best Practices
Architecture best practices
- Use a dedicated central audit project separate from workload projects.
- Prefer aggregated sinks at folder/org for consistent coverage at scale.
- Export:
- BigQuery for investigation and reporting
- Cloud Storage for long-term retention/archival (especially for multi-year requirements)
- Document a clear evidence model:
- What is retained where
- For how long
- Who can access it
- How to produce reports
IAM/security best practices
- Separate duties:
- Workload project owners should not be able to disable org-level sinks.
- Use least privilege:
- Analysts get read-only access to views/datasets.
- Only a small platform team can change sinks and retention policies.
- Prefer group-based access (Google Groups / Cloud Identity) over individual users.
- Use break-glass procedures for emergencies.
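A sketch of a group-based, read-only grant follows; the group email and project ID are placeholders, and the command is printed rather than executed so it can be reviewed first.

```shell
#!/usr/bin/env bash
# Sketch: grant read-only BigQuery access to an analysts group at project
# level. Group email and project ID are placeholders.
set -euo pipefail

PROJECT_ID="YOUR_PROJECT_ID"          # placeholder
GROUP="audit-analysts@example.com"    # placeholder group

CMD=(gcloud projects add-iam-policy-binding "${PROJECT_ID}"
  --member="group:${GROUP}"
  --role="roles/bigquery.dataViewer")

# Printed for review; run the command directly once values are set.
echo "${CMD[@]}"
```

Dataset-level bindings are tighter than project-level grants; consult the BigQuery access-control documentation for dataset ACL options.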
Cost best practices
- Start with Admin Activity logs, then expand carefully.
- For Data Access logs, enable only for critical projects or specific services where required.
- Use time-bounded queries and partition filters in BigQuery.
- Apply lifecycle rules to Storage archives.
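A lifecycle-rule sketch for an archive bucket, assuming a 365-day transition to Archive storage class and deletion at roughly seven years; the ages and bucket name are placeholders to adjust to your retention requirements.

```shell
#!/usr/bin/env bash
# Sketch: lifecycle config moving audit archives to the Archive storage class
# after one year and deleting after ~7 years. Ages and bucket name are
# placeholders; adjust to your compliance requirements.
set -euo pipefail

cat > /tmp/audit-lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
      "condition": {"age": 365}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 2555}
    }
  ]
}
EOF

# Apply with: gsutil lifecycle set /tmp/audit-lifecycle.json gs://audit-archive-YOUR_PROJECT_ID
echo "Wrote /tmp/audit-lifecycle.json"
```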
Performance best practices
- Avoid exporting “everything everywhere” by default.
- Design BigQuery tables/datasets for your query patterns (time-based reporting is common).
- Build standard SQL views for common audit questions to reduce ad-hoc scanning.
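A sketch of one such standard view, answering "who changed IAM recently?"; project and dataset names are placeholders matching the earlier lab, and the DDL should be checked against current BigQuery documentation.

```shell
#!/usr/bin/env bash
# Sketch: write DDL for a standard view over the audit export.
# Project/dataset names are placeholders matching the lab setup.
set -euo pipefail

cat > /tmp/v_iam_changes.sql <<'EOF'
CREATE OR REPLACE VIEW `YOUR_PROJECT_ID.audit_logs.v_iam_changes` AS
SELECT
  timestamp,
  protoPayload.authenticationInfo.principalEmail AS actor,
  protoPayload.resourceName AS resource
FROM `YOUR_PROJECT_ID.audit_logs.*`
WHERE protoPayload.methodName = "SetIamPolicy";
EOF

# Create with: bq query --use_legacy_sql=false < /tmp/v_iam_changes.sql
echo "Wrote /tmp/v_iam_changes.sql"
```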
Reliability best practices
- Monitor sink/export errors (permissions, destination issues).
- Use multiple destinations if your compliance posture requires independent retention layers.
- Consider multi-project or multi-folder isolation if your org is large.
Operations best practices
- Maintain an “Audit Manager runbook”:
- How to validate exports
- How to respond to missing logs
- How to handle access requests
- Change control for sink filters and retention settings.
- Periodic access reviews for audit data stores.
Governance/tagging/naming best practices
- Standardize names:
  - audit-logs-* for datasets
  - audit-archive-* for buckets
  - auditmanager-* for sinks/metrics
- Use labels where supported to track owners and cost centers.
- Align resource locations to compliance boundaries (EU vs US, etc.).
12. Security Considerations
Identity and access model
- Logs and exports are controlled by IAM at multiple layers:
- Cloud Logging log buckets/views
- BigQuery datasets/tables
- Storage buckets/objects
- Sinks write using a writer identity. Treat it as a sensitive principal:
- Grant only the minimal write permissions needed.
Encryption
- Google Cloud encrypts data at rest by default.
- For stricter requirements:
- Use CMEK with Cloud KMS for BigQuery/Storage where supported and appropriate (verify current support and constraints).
- Protect KMS admin roles carefully; KMS admin compromise can undermine encryption controls.
Network exposure
- Avoid broad public access to BigQuery datasets and Storage buckets.
- If analysts must access from corporate environments, consider private access patterns and centralized identity controls.
Secrets handling
- Avoid embedding credentials in scripts.
- Use short-lived user credentials or Workload Identity where applicable.
- Store automation secrets in Secret Manager (if you build automation around reporting).
Audit/logging (meta-auditing)
- Audit the audit pipeline:
- Monitor changes to sinks, datasets, buckets, and IAM bindings.
- Set alerts on sink deletion/disablement attempts.
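As a concrete meta-audit signal, a log-based metric that counts sink deletions might be built as sketched below. The methodName value is an assumption to verify against actual audit log entries in your environment.

```shell
#!/usr/bin/env bash
# Sketch: print a command creating a metric that counts sink-deletion events.
# The methodName value is an assumption; verify it against real audit entries.
set -euo pipefail

CMD=(gcloud logging metrics create auditmanager_sink_deletions
  --description="Counts log sink deletions (meta-audit signal)"
  --log-filter='logName:"cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName="google.logging.v2.ConfigServiceV2.DeleteSink"')

# Printed for review; run directly once the filter is verified.
echo "${CMD[@]}"
```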
Compliance considerations
- Confirm:
- Retention duration and immutability requirements
- Data residency
- Who can access audit evidence
- Procedures for legal hold and eDiscovery
- Map your configuration to your chosen framework with your compliance team.
Common security mistakes
- Giving workload project owners permission to delete/disable sinks
- Exporting sensitive Data Access logs into broadly accessible datasets
- No retention policy (logs expire too early)
- No monitoring for pipeline failures
- Over-retaining in high-cost systems (e.g., multi-year BigQuery retention without justification)
Secure deployment recommendations
- Centralize evidence in a security-owned project.
- Use folder/org sinks for strong governance.
- Enforce retention via bucket retention policies and Storage retention policies where needed.
- Restrict delete permissions and use approvals/change control for audit pipeline changes.
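For the retention piece, a sketch of the gsutil commands involved; the bucket name and duration are placeholders, and the commands are printed rather than executed because retention lock is irreversible.

```shell
#!/usr/bin/env bash
# Sketch: print commands that set (and optionally lock) a bucket retention
# policy. Bucket name and duration are placeholders. Locking is IRREVERSIBLE,
# so the commands are printed for review instead of executed.
set -euo pipefail

BUCKET="gs://audit-archive-YOUR_PROJECT_ID"  # placeholder

echo "gsutil retention set 7y ${BUCKET}"
echo "gsutil retention lock ${BUCKET}  # irreversible once locked"
```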
13. Limitations and Gotchas
- No single “Audit Manager” product in Google Cloud: You must assemble capabilities from multiple services.
- Audit log coverage varies by service: Not all products emit the same data; verify per-service audit logging behavior.
- Data Access logs can be expensive/noisy: They can dramatically increase volume and cost.
- BigQuery wildcard queries can scan huge data: Always filter by time/partition and narrow fields.
- Export latency exists: Logs may take minutes to appear in destinations.
- Retention ≠ immutability: Retention policies help, but IAM permissions still matter. For Storage, consider retention policy lock (use carefully).
- Cross-project governance requires org setup: Aggregated sinks and strong separation-of-duties typically need organization-level design.
- Schema complexity: Audit log fields are nested; teams need training to query protoPayload correctly.
- Multi-region constraints: Dataset/bucket locations must match compliance requirements; moving data later can be complex.
- Alert fatigue: Poorly tuned filters lead to noisy alerts and ignored signals.
14. Comparison with Alternatives
Audit Manager (capability) can be implemented in different ways depending on toolchain and requirements.
| Option | Best For | Strengths | Weaknesses | When to Choose |
|---|---|---|---|---|
| Audit Manager pattern (Cloud Audit Logs + Logging sinks + BigQuery/Storage) | Most Google Cloud orgs needing centralized audit evidence | Native, scalable, flexible, integrates with IAM and org structure | Requires architecture and ongoing ops; not a “one-click” compliance tool | When you need Google Cloud-native evidence collection and reporting |
| Cloud Logging (without exports) | Small teams, short retention, basic investigations | Lowest setup effort | Limited long-term analytics and retention controls; harder cross-project reporting | When requirements are light and you’re early-stage |
| Security Command Center (SCC) | Posture management and findings | Consolidates security findings, supports posture visibility | Not a replacement for audit log evidence store | When you need security posture + findings plus audit logs elsewhere |
| Cloud Asset Inventory exports | Configuration evidence and inventory snapshots | Great for “state of the world” evidence | Not an activity trail | When you need asset/IAM inventory and history evidence |
| External SIEM (e.g., Splunk, Google Chronicle) | SOC operations and detection engineering | Advanced correlation, mature alerting workflows | Licensing costs, integration complexity | When you need enterprise SOC workflows beyond native tooling |
| AWS Audit Manager (other cloud) | AWS-only environments needing mapped controls | Control frameworks and evidence collection in AWS | Not Google Cloud; conceptually different product | Choose only if you are auditing AWS environments |
| Azure compliance tooling (other cloud) | Azure-only compliance workflows | Built-in compliance management in Azure | Not Google Cloud | Choose only if your audit domain is Azure |
| Self-managed ELK/OpenSearch | Teams that want full control and have ops maturity | Customizable, can be cost-effective at scale | Operational burden; scaling and retention complexity | Choose if you already operate logging/search infrastructure and accept ops overhead |
15. Real-World Example
Enterprise example (regulated SaaS with multiple teams)
- Problem: A SaaS company must pass SOC 2 and ISO 27001 audits and prove:
- Change management evidence
- Access governance evidence
- Retention of audit trails across all production projects
- Proposed architecture:
- Organization-level aggregated sink routes Admin Activity (and selected Data Access) logs into a central audit project.
- Export to:
- BigQuery for 90–180 days of analytics and investigation
- Cloud Storage archive for multi-year retention with lifecycle rules and retention policy
- Cloud Asset Inventory exports weekly snapshots (IAM policies, resources) to BigQuery.
- Log-based metrics + Monitoring alerting for:
- SetIamPolicy events
- Firewall changes
- KMS key disable events
- Why this service was chosen:
- Google Cloud-native controls, strong org-level governance, scalable analytics.
- Expected outcomes:
- Faster audit evidence production (standard queries/reports)
- Better incident response (org-wide search)
- Reduced risk of evidence tampering via separation of duties
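The alert signals in the enterprise architecture above can be expressed as Logging filters. A sketch for the firewall-change signal follows; the substring match on the method name is an assumption to verify against actual Compute Engine audit entries.

```shell
#!/usr/bin/env bash
# Sketch: build a filter matching firewall-rule changes in Admin Activity
# logs. The methodName substring match is an assumption to verify against
# real compute audit entries before use.
set -euo pipefail

FILTER='logName:"cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName:"compute.firewalls"'

# Use with: gcloud logging metrics create ... --log-filter="$FILTER"
echo "$FILTER"
```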
Startup/small-team example (single project, early compliance)
- Problem: A startup wants basic governance and to prepare for SOC 2 later.
- Proposed architecture:
- Project-level sink exports Admin Activity logs to BigQuery with 30–90 day retention.
- A small set of alerts for IAM and networking changes.
- Why this service was chosen:
- Minimal cost and complexity, but builds good habits early.
- Expected outcomes:
- Clear change tracking
- Foundational evidence trail
- A path to scale to org-level sinks later
16. FAQ
1) Is “Audit Manager” an official Google Cloud product?
Not as a commonly documented first-party product name. In Google Cloud, audit management is typically implemented using Cloud Audit Logs and Cloud Logging exports. Verify in official docs if you have a specific link or SKU.
2) What’s the difference between Cloud Audit Logs and Cloud Logging?
Cloud Audit Logs are the events (audit entries). Cloud Logging is the platform where logs are stored, searched, and routed/exported.
3) Do I need to enable Admin Activity logs?
Admin Activity logs are generally available for supported services and are foundational for governance. Verify default behavior per service in official docs.
4) Are Data Access audit logs enabled by default?
It depends on the service and log type. Data Access logs can be optional and can have cost/volume implications. Verify per-service behavior.
5) How do I centralize logs across many projects?
Use folder- or organization-level aggregated sinks to route logs into a central audit project.
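A sketch of what that looks like on the command line; the organization ID, central project, and dataset are placeholders, and the flag names should be confirmed in the current gcloud reference.

```shell
#!/usr/bin/env bash
# Sketch: print a command creating an org-level aggregated sink that routes
# Admin Activity logs from all child projects into a central dataset.
# ORG_ID and the destination are placeholders.
set -euo pipefail

ORG_ID="123456789012"  # placeholder org ID
DEST="bigquery.googleapis.com/projects/central-audit-project/datasets/org_audit_logs"

CMD=(gcloud logging sinks create org-admin-activity "${DEST}"
  --organization="${ORG_ID}"
  --include-children
  --log-filter='logName:"cloudaudit.googleapis.com%2Factivity"')

# Printed for review; remember to grant the sink's writer identity
# BigQuery write access on the destination dataset afterward.
echo "${CMD[@]}"
```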
6) How can I prevent engineers from deleting audit evidence?
Use separation of duties: centralized sinks managed by a security team, restricted IAM on sinks/destinations, retention policies, and controlled admin access.
7) Should I export to BigQuery or Cloud Storage?
BigQuery is better for querying/reporting; Storage is better for long-term archive and retention economics. Many organizations use both.
8) Can I create audit-ready reports automatically?
Yes—commonly using BigQuery saved queries/views, scheduled queries, and dashboards. The exact reporting approach depends on your audit requirements.
9) How quickly do exported logs show up in BigQuery?
Usually within minutes, but there can be latency. Design processes assuming some delay.
10) How do I alert on risky changes (like IAM policy updates)?
Create log-based metrics and alerting policies in Cloud Monitoring, or export to a SIEM for advanced correlation.
11) Are audit logs encrypted?
Yes, Google Cloud encrypts data at rest by default. For stricter controls, consider CMEK where supported.
12) Can I use this for PCI/HIPAA evidence?
Often yes as part of an overall control set, but compliance requirements vary. Work with compliance/legal teams and verify logs cover required systems and retention.
13) What is the biggest cost risk in audit logging?
High-volume Data Access logging and frequent BigQuery queries scanning large tables.
14) How do I reduce BigQuery query costs for audit investigations?
Use partition/time filters, limit wildcard scans, use views and scheduled summaries, and restrict ad-hoc queries.
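A cost-conscious query sketch that bounds the scan to the last 7 days; the dataset and table pattern match the earlier lab and are otherwise placeholders.

```shell
#!/usr/bin/env bash
# Sketch: write a time-bounded investigation query to a file for review.
# Dataset/table names are placeholders matching the earlier lab setup.
set -euo pipefail

cat > /tmp/recent_iam_changes.sql <<'EOF'
SELECT
  timestamp,
  protoPayload.authenticationInfo.principalEmail AS actor,
  protoPayload.methodName AS method
FROM `YOUR_PROJECT_ID.audit_logs.*`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND protoPayload.methodName = "SetIamPolicy"
ORDER BY timestamp DESC
EOF

# Run with: bq query --use_legacy_sql=false < /tmp/recent_iam_changes.sql
echo "Wrote /tmp/recent_iam_changes.sql"
```

Note that for date-sharded exports an additional _TABLE_SUFFIX filter prunes whole tables, while for partitioned exports the timestamp filter prunes partitions; check how your sink was configured.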
15) What’s the minimal viable Audit Manager setup?
Centralize Admin Activity logs to a protected BigQuery dataset for 30–90 days, with a small set of alerts for IAM/networking changes.
17. Top Online Resources to Learn Audit Manager
| Resource Type | Name | Why It Is Useful |
|---|---|---|
| Official documentation | Cloud Audit Logs | Core reference for audit log types and fields: https://cloud.google.com/logging/docs/audit |
| Official documentation | Cloud Logging routing and sinks | How to build centralized exports: https://cloud.google.com/logging/docs/routing/overview |
| Official documentation | Export logs (configure exports) | Destination setup details: https://cloud.google.com/logging/docs/export/configure_export_v2 |
| Official documentation | Cloud Logging storage (log buckets/retention) | Retention and bucket concepts: https://cloud.google.com/logging/docs/storage |
| Official documentation | BigQuery documentation | Querying exported logs and managing datasets: https://cloud.google.com/bigquery/docs |
| Official documentation | Cloud Asset Inventory | Asset inventory and exports for evidence: https://cloud.google.com/asset-inventory/docs/overview |
| Official documentation | Logs-based metrics | Turning logs into alertable signals: https://cloud.google.com/logging/docs/logs-based-metrics |
| Official pricing | Cloud Logging pricing | Understand ingestion/retention pricing: https://cloud.google.com/logging/pricing |
| Official pricing | BigQuery pricing | Storage/query cost model: https://cloud.google.com/bigquery/pricing |
| Official pricing | Cloud Storage pricing | Archive cost model and operations: https://cloud.google.com/storage/pricing |
| Pricing tool | Google Cloud Pricing Calculator | Build estimates by usage: https://cloud.google.com/products/calculator |
| Official product | Security Command Center | Posture/findings context (complements audit logs): https://cloud.google.com/security-command-center/docs |
| Official training | Google Cloud Skills Boost | Hands-on labs for logging/security (search within): https://www.cloudskillsboost.google/ |
| Official videos | Google Cloud Tech YouTube | Practical walkthroughs (search for Logging/Audit Logs): https://www.youtube.com/googlecloudtech |
| Community (reputable) | Google Cloud Architecture Center | Patterns for logging and security architectures: https://cloud.google.com/architecture |
18. Training and Certification Providers
| Institute | Suitable Audience | Likely Learning Focus | Mode | Website URL |
|---|---|---|---|---|
| DevOpsSchool.com | DevOps engineers, SREs, platform teams | Cloud ops + DevOps practices, may include logging and governance | Check website | https://www.devopsschool.com/ |
| ScmGalaxy.com | Beginners to intermediate DevOps learners | DevOps fundamentals, tooling, process | Check website | https://www.scmgalaxy.com/ |
| CLoudOpsNow.in | Cloud operations teams | Cloud operations practices, monitoring/logging | Check website | https://www.cloudopsnow.in/ |
| SreSchool.com | SREs and reliability-focused engineers | SRE principles, monitoring/incident response | Check website | https://www.sreschool.com/ |
| AiOpsSchool.com | Ops teams adopting AIOps | AIOps concepts, automation around ops signals | Check website | https://www.aiopsschool.com/ |
19. Top Trainers
| Platform/Site | Likely Specialization | Suitable Audience | Website URL |
|---|---|---|---|
| RajeshKumar.xyz | DevOps/cloud coaching (verify offerings) | Engineers seeking guided training | https://rajeshkumar.xyz/ |
| devopstrainer.in | DevOps training (verify offerings) | Beginners to working professionals | https://devopstrainer.in/ |
| devopsfreelancer.com | Freelance DevOps help/training (verify offerings) | Teams needing short-term support | https://devopsfreelancer.com/ |
| devopssupport.in | DevOps support/training (verify offerings) | Ops teams needing practical help | https://devopssupport.in/ |
20. Top Consulting Companies
| Company | Likely Service Area | Where They May Help | Consulting Use Case Examples | Website URL |
|---|---|---|---|---|
| cotocus.com | Cloud/DevOps consulting | Architecture, implementation support | Central audit logging design, IAM hardening, cost optimization | https://cotocus.com/ |
| DevOpsSchool.com | DevOps and cloud consulting | Delivery + enablement | Implementing centralized logging pipelines, training teams on ops governance | https://www.devopsschool.com/ |
| DEVOPSCONSULTING.IN | DevOps consulting services | Advisory and implementation | Building audit evidence pipelines, setting up alerting/runbooks | https://devopsconsulting.in/ |
21. Career and Learning Roadmap
What to learn before this service
- Google Cloud fundamentals:
- Projects, folders, organizations
- IAM principals and roles
- Cloud Logging basics:
- Log Explorer, log names, filters
- BigQuery basics:
- Datasets, tables, partitioning concepts
- Standard SQL querying
- Security basics:
- Least privilege, separation of duties, incident response
What to learn after this service
- Organization-wide governance:
- Org policies and policy-as-code approaches (where applicable)
- Security detection engineering:
- Building robust alert rules and reducing false positives
- SIEM integration:
- Export to external platforms for correlation and case management
- Compliance engineering:
- Control mapping, evidence automation, audit response processes
Job roles that use it
- Cloud Security Engineer
- Security Operations (SOC) Analyst (cloud-focused)
- Platform Engineer / SRE (governance responsibilities)
- Cloud Architect (enterprise governance)
- Compliance Engineer / GRC engineer (with technical focus)
Certification path (if available)
Google Cloud certifications that commonly align with this domain include:
– Professional Cloud Security Engineer
– Professional Cloud Architect
Always verify current certification tracks on the official site: https://cloud.google.com/learn/certification
Project ideas for practice
- Build org-level aggregated sinks in a multi-project sandbox.
- Create a BigQuery “audit evidence pack”:
- privileged role grants
- service account key creation
- firewall changes
- Add Cloud Asset Inventory exports and join them with audit logs for richer reports.
- Implement alerting for high-risk methods and build an investigation runbook.
22. Glossary
- Cloud Audit Logs: Google Cloud’s audit event logs for administrative actions and data access.
- Admin Activity logs: Audit logs for administrative changes (e.g., IAM policy updates).
- Data Access logs: Audit logs for reading/writing user data (volume can be high).
- System Event logs: System-generated events affecting resources.
- Policy Denied logs: Events where actions are denied by policy/IAM controls.
- Cloud Logging: Service for storing, searching, and routing logs.
- Log Router: Cloud Logging component that routes logs to destinations via sinks.
- Sink: A routing rule that exports logs matching a filter to a destination.
- Aggregated sink: A sink created at folder/org scope to capture logs from multiple projects.
- BigQuery: Google Cloud data warehouse used for analytics and reporting.
- Cloud Storage: Object storage used for archives and long-term retention.
- Retention policy: A rule that keeps logs/objects for a specified period.
- CMEK: Customer-managed encryption keys (Cloud KMS keys you control).
- Log-based metric: A metric derived from logs, used for alerting and dashboards.
- Separation of duties: Designing permissions so no single team can both operate workloads and erase evidence.
23. Summary
Audit Manager on Google Cloud is best understood as a Security audit evidence capability built from Cloud Audit Logs and Cloud Logging, with exports to BigQuery (for reporting and investigations) and Cloud Storage (for long-term retention). It matters because it creates a centralized, defensible record of administrative actions and (where needed) data access—supporting compliance, incident response, and operational governance.
Key cost and security points:
- Costs are driven by log volume (especially Data Access), retention, and BigQuery query patterns.
- Security depends on separation of duties, tight IAM, and retention/immutability controls.
Use this approach when you need organization-wide auditability and evidence automation. Start small (Admin Activity → BigQuery), then expand thoughtfully with retention, Storage archives, and alerting.
Next step: move from the single-project lab to an organization/folder-level aggregated sink design and build standardized BigQuery evidence queries aligned to your compliance controls.