{"id":710,"date":"2026-04-15T03:12:09","date_gmt":"2026-04-15T03:12:09","guid":{"rendered":"https:\/\/www.devopsschool.com\/tutorials\/google-cloud-mainframe-assessment-tool-tutorial-architecture-pricing-use-cases-and-hands-on-guide-for-migration\/"},"modified":"2026-04-15T03:12:09","modified_gmt":"2026-04-15T03:12:09","slug":"google-cloud-mainframe-assessment-tool-tutorial-architecture-pricing-use-cases-and-hands-on-guide-for-migration","status":"publish","type":"post","link":"https:\/\/www.devopsschool.com\/tutorials\/google-cloud-mainframe-assessment-tool-tutorial-architecture-pricing-use-cases-and-hands-on-guide-for-migration\/","title":{"rendered":"Google Cloud Mainframe Assessment Tool Tutorial: Architecture, Pricing, Use Cases, and Hands-On Guide for Migration"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\">Category<\/h2>\n\n\n\n<p>Migration<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">1. Introduction<\/h2>\n\n\n\n<p>Mainframe Assessment Tool is a Google Cloud\u2013aligned assessment capability used in <strong>Migration<\/strong> projects to understand what\u2019s running on an enterprise mainframe, how complex it is, and what it will take to modernize or migrate it.<\/p>\n\n\n\n<p>In simple terms: <strong>it helps you take inventory of mainframe applications and dependencies and produce assessment outputs<\/strong> (reports\/exports) that you can use to plan a migration roadmap\u2014before you commit to a target architecture or start refactoring.<\/p>\n\n\n\n<p>In more technical terms: Mainframe Assessment Tool is typically used early in a mainframe modernization program to <strong>collect metadata and workload characteristics<\/strong> from the mainframe estate, <strong>analyze application and data dependencies<\/strong>, and <strong>produce structured outputs<\/strong> that can be used for estimating effort, identifying candidate applications, grouping workloads into migration waves, and informing target platform choices on Google Cloud (for 
example, replatform vs refactor approaches). Exact collectors, supported mainframe components, and output formats should be confirmed in the current official documentation.<\/p>\n\n\n\n<p>The problem it solves is straightforward and common: <strong>mainframe migration planning fails when teams don\u2019t have a reliable inventory<\/strong> (applications, interfaces, batch jobs, data stores, schedules, operational constraints). Mainframe Assessment Tool is designed to reduce uncertainty and planning risk by turning \u201ctribal knowledge\u201d into artifacts you can review, validate, and operationalize.<\/p>\n\n\n\n<blockquote>\n<p>Naming note (verify in official docs): Google Cloud\u2019s mainframe modernization portfolio evolves over time. If you encounter different names in your organization (for example, assessment tooling referenced under a broader \u201cmainframe modernization\u201d umbrella), treat <strong>\u201cMainframe Assessment Tool\u201d<\/strong> as the primary service name for this tutorial and confirm current branding and workflow in the latest Google Cloud documentation.<\/p>\n<\/blockquote>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">2. 
What is Mainframe Assessment Tool?<\/h2>\n\n\n\n<p><strong>Official purpose (high level):<\/strong> Mainframe Assessment Tool is intended to support <strong>mainframe modernization planning<\/strong> by assessing the current state of mainframe workloads and producing actionable outputs for migration strategy, sizing, sequencing, and risk management.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Core capabilities (what it generally does)<\/h3>\n\n\n\n<p>Mainframe Assessment Tool commonly supports activities such as:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Application and workload inventory<\/strong>: Identify applications, modules, batch jobs, operational schedules, interfaces, and related artifacts.<\/li>\n<li><strong>Dependency discovery<\/strong>: Identify relationships between programs, data stores, files\/datasets, and upstream\/downstream integrations.<\/li>\n<li><strong>Complexity and modernization readiness insights<\/strong>: Highlight candidates for replatforming\/refactoring and flag potential risk areas (for example, tight coupling, heavy batch windows, or specialized dependencies).<\/li>\n<li><strong>Assessment outputs<\/strong>: Generate reports and structured exports that teams can use for planning, estimation, and stakeholder communication.<\/li>\n<\/ul>\n\n\n\n<p>Because Google Cloud product capabilities can change, confirm the <em>exact<\/em> supported inventory sources, analysis depth, and export formats in the current documentation.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Major components (typical)<\/h3>\n\n\n\n<p>Depending on how Google Cloud distributes the tooling, you\u2019ll typically see a combination of:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Collectors \/ scanners<\/strong> (customer-run): Deployed in the mainframe environment and\/or a connected environment to collect metadata.<\/li>\n<li><strong>Analysis and report generation<\/strong>: Produces assessment reports\/exports.<\/li>\n<li><strong>Optional cloud 
storage\/analytics path<\/strong>: Upload assessment outputs to Google Cloud for centralized sharing, governance, and analytics (for example, storing exports in Cloud Storage and analyzing with BigQuery).<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Service type<\/h3>\n\n\n\n<p>Mainframe Assessment Tool is best understood as <strong>assessment tooling aligned to Google Cloud mainframe modernization<\/strong>. In many organizations, it\u2019s operated as <strong>customer-managed tooling<\/strong> (run in your environment), with Google Cloud used as the destination for outputs and analytics. Verify whether your edition\/workflow includes any managed control plane.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Scope: regional \/ global \/ project-scoped?<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>The tool execution and scanning are typically <strong>environment-scoped<\/strong> (wherever the collectors run).<\/li>\n<li>If you store results in Google Cloud, those artifacts become <strong>project-scoped<\/strong> resources (Cloud Storage buckets, BigQuery datasets) and will also be subject to <strong>region\/location<\/strong> choices you make for storage and analytics.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">How it fits into the Google Cloud ecosystem<\/h3>\n\n\n\n<p>Mainframe Assessment Tool is commonly used with:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Cloud Storage<\/strong> for durable, shareable storage of assessment exports.<\/li>\n<li><strong>BigQuery<\/strong> for querying inventory\/metadata at scale across many applications.<\/li>\n<li><strong>Looker Studio \/ Looker<\/strong> (optional) for dashboards.<\/li>\n<li><strong>Cloud KMS<\/strong> for customer-managed encryption keys (CMEK) to protect sensitive exports.<\/li>\n<li><strong>Cloud Logging \/ Cloud Audit Logs<\/strong> for auditing access to the exported assessment artifacts.<\/li>\n<li><strong>IAM<\/strong> for least-privilege access control.<\/li>\n<li><strong>Network 
connectivity<\/strong> (Cloud VPN \/ Cloud Interconnect) if assessment outputs flow from on-prem to Google Cloud privately.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">3. Why use Mainframe Assessment Tool?<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Business reasons<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Reduce migration uncertainty<\/strong>: Executive stakeholders want credible timelines and costs; assessment outputs reduce guesswork.<\/li>\n<li><strong>Prioritize modernization work<\/strong>: Identify which applications deliver the most value when modernized first.<\/li>\n<li><strong>Support phased migration<\/strong>: Break a multi-year program into migration waves with measurable milestones.<\/li>\n<li><strong>Improve vendor and partner alignment<\/strong>: A shared inventory\/report reduces ambiguity with SI partners and internal teams.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Technical reasons<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Inventory and dependency clarity<\/strong>: Mainframes often have hidden coupling via shared datasets, scheduler dependencies, and batch chains.<\/li>\n<li><strong>Better target architecture decisions<\/strong>: Replatform vs refactor choices depend on workload patterns, dependencies, and operational constraints.<\/li>\n<li><strong>Earlier risk discovery<\/strong>: Identify \u201chard blockers\u201d (legacy interfaces, specialized data access patterns, unowned code) before build begins.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Operational reasons<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Operational readiness<\/strong>: Capture batch windows, recovery expectations, SLAs, and job scheduling complexity.<\/li>\n<li><strong>Standardize documentation<\/strong>: Convert tribal knowledge into durable artifacts usable by operations and engineering.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Security\/compliance 
reasons<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Data classification awareness<\/strong>: Even assessment metadata can be sensitive (system names, dataset names, integration endpoints).<\/li>\n<li><strong>Auditability<\/strong>: When outputs are stored in Google Cloud, you can enforce IAM, encryption, and access logging.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Scalability\/performance reasons<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Program-scale planning<\/strong>: Large enterprises may have hundreds\/thousands of jobs and many applications; an assessment-driven approach scales better than manual spreadsheets.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">When teams should choose it<\/h3>\n\n\n\n<p>Choose Mainframe Assessment Tool when:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>You\u2019re planning a <strong>Google Cloud mainframe modernization<\/strong> initiative and need a structured baseline.<\/li>\n<li>You want to <strong>quantify<\/strong> complexity and dependencies before committing to timelines or architecture.<\/li>\n<li>You need repeatable artifacts for <strong>governance<\/strong> and <strong>wave planning<\/strong>.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">When teams should not choose it<\/h3>\n\n\n\n<p>Avoid or deprioritize it when:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>You are <strong>not<\/strong> modernizing mainframe workloads to Google Cloud and the tool doesn\u2019t align with your target platform.<\/li>\n<li>Your migration is trivially small and already fully documented (rare for mainframes).<\/li>\n<li>You cannot obtain the access approvals required to run collectors or export metadata from the mainframe environment.<\/li>\n<li>You need deep source-code transformation; assessment tools inform planning but do not replace modernization engineering workstreams.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">4. 
Where is Mainframe Assessment Tool used?<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Industries<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Banking and financial services<\/li>\n<li>Insurance<\/li>\n<li>Retail and logistics<\/li>\n<li>Airlines and travel<\/li>\n<li>Telecommunications<\/li>\n<li>Government and public sector<\/li>\n<li>Manufacturing (ERP\/batch-heavy environments)<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Team types<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Enterprise architects and modernization architects<\/li>\n<li>Platform engineering teams<\/li>\n<li>Mainframe engineering teams<\/li>\n<li>Cloud center of excellence (CCoE)<\/li>\n<li>Security and compliance teams<\/li>\n<li>Program management offices (PMO)<\/li>\n<li>SRE\/operations leaders planning cutovers and SLAs<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Workloads and architectures<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Batch-heavy<\/strong> workloads with schedulers, chained jobs, and strict batch windows<\/li>\n<li><strong>Transaction processing<\/strong> workloads with upstream\/downstream integrations<\/li>\n<li>Mixed estates with:\n<ul>\n<li>Multiple application portfolios<\/li>\n<li>Shared data stores<\/li>\n<li>File transfers and message-based interfaces<\/li>\n<li>Complex operational runbooks<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Real-world deployment contexts<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Proof-of-concept assessments to estimate program scope<\/li>\n<li>Enterprise-wide discovery for a multi-year modernization roadmap<\/li>\n<li>Re-assessment after remediation (for example, decoupling applications) to validate improved readiness<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Production vs dev\/test usage<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Most often used in <strong>pre-production planning<\/strong> phases.<\/li>\n<li>Can be repeated periodically (for example, quarterly) as part of continuous 
modernization governance, but always with careful access control and change management.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">5. Top Use Cases and Scenarios<\/h2>\n\n\n\n<p>Below are realistic ways organizations use Mainframe Assessment Tool during Google Cloud Migration planning.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">1) Enterprise application inventory baseline<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> No accurate list of what applications\/jobs\/interfaces exist on the mainframe.<\/li>\n<li><strong>Why this service fits:<\/strong> Produces a consistent inventory artifact to anchor the program.<\/li>\n<li><strong>Scenario:<\/strong> A bank consolidates scattered spreadsheets into a single assessment dataset to kick off wave planning.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">2) Dependency mapping for wave grouping<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> Teams pick \u201ceasy\u201d apps first, but discover hidden shared dependencies mid-migration.<\/li>\n<li><strong>Why this service fits:<\/strong> Helps identify dependency clusters so you migrate coherent groups.<\/li>\n<li><strong>Scenario:<\/strong> An insurer identifies that three \u201csmall\u201d apps share critical datasets and must move together.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">3) Batch window and scheduling risk assessment<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> Migration fails due to batch window overruns after platform change.<\/li>\n<li><strong>Why this service fits:<\/strong> Surfaces batch workload characteristics and scheduling chains (to the extent supported).<\/li>\n<li><strong>Scenario:<\/strong> A retailer identifies the top 20 longest-running batch chains and plans performance testing early.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">4) Candidate identification for replatform vs 
refactor<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> Architecture teams need a rational method to choose modernization approaches.<\/li>\n<li><strong>Why this service fits:<\/strong> Assessment output informs which apps are better candidates for simpler moves vs deep refactors.<\/li>\n<li><strong>Scenario:<\/strong> A telecom identifies a portfolio suitable for replatform while reserving complex apps for refactor.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">5) Interface and integration discovery<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> Unknown upstream\/downstream systems cause outages during cutover.<\/li>\n<li><strong>Why this service fits:<\/strong> Highlights integration points (based on available signals).<\/li>\n<li><strong>Scenario:<\/strong> A logistics firm finds undocumented file-transfer integrations with a warehouse system.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">6) Compliance scoping and data classification planning<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> Security must understand where sensitive data is referenced before migration.<\/li>\n<li><strong>Why this service fits:<\/strong> Assessment outputs help build an initial map of data-related artifacts and access boundaries.<\/li>\n<li><strong>Scenario:<\/strong> A public sector agency uses assessment artifacts to define which workloads require CMEK and VPC Service Controls.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">7) Effort estimation and stakeholder reporting<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> Leadership needs cost\/time estimates and progress tracking.<\/li>\n<li><strong>Why this service fits:<\/strong> Provides consistent reports and exports used in governance.<\/li>\n<li><strong>Scenario:<\/strong> A finance PMO uses assessment reports to approve staffing and vendor budgets.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">8) 
M&amp;A mainframe rationalization<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> After an acquisition, two mainframe estates must be rationalized.<\/li>\n<li><strong>Why this service fits:<\/strong> Produces comparable inventories to drive consolidation decisions.<\/li>\n<li><strong>Scenario:<\/strong> A manufacturer compares portfolios and identifies duplicate batch workflows.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">9) Data migration planning for dependent datasets<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> Application migration stalls because data dependencies aren\u2019t understood.<\/li>\n<li><strong>Why this service fits:<\/strong> Helps surface dataset\/file dependencies (where supported) to plan sequencing.<\/li>\n<li><strong>Scenario:<\/strong> A bank plans phased data movement and replication based on dependency outputs.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">10) Operating model redesign<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> Cloud operations differs from mainframe operations; teams need to plan changes.<\/li>\n<li><strong>Why this service fits:<\/strong> Assessment artifacts inform runbook redesign, monitoring priorities, and ownership mapping.<\/li>\n<li><strong>Scenario:<\/strong> An airline builds SRE runbooks around the top availability-critical workloads identified in the assessment.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">11) Building a modernization backlog<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> Engineering teams need an actionable backlog, not just \u201cmigrate the mainframe\u201d.<\/li>\n<li><strong>Why this service fits:<\/strong> Outputs can be transformed into epics (decouple, interface remediation, test harness creation).<\/li>\n<li><strong>Scenario:<\/strong> A retailer turns top dependency risks into remediation work before wave 1.<\/li>\n<\/ul>\n\n\n\n<h3 
class=\"wp-block-heading\">12) Program governance and re-assessment checkpoints<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> Program drift occurs; improvements aren\u2019t measured.<\/li>\n<li><strong>Why this service fits:<\/strong> Repeat assessments create measurable checkpoints.<\/li>\n<li><strong>Scenario:<\/strong> A telecom re-runs assessment after decoupling to validate reduced dependency complexity.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">6. Core Features<\/h2>\n\n\n\n<p>Because Mainframe Assessment Tool\u2019s exact feature list can vary by release\/edition and by what signals you can extract from your environment, treat the items below as <strong>core feature categories<\/strong> and confirm specifics in official docs.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Feature 1: Structured discovery (inventory collection)<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>What it does:<\/strong> Collects metadata describing applications and operational artifacts in the mainframe estate.<\/li>\n<li><strong>Why it matters:<\/strong> Migration plans fail when the inventory is incomplete.<\/li>\n<li><strong>Practical benefit:<\/strong> A baseline inventory for scoping and sequencing.<\/li>\n<li><strong>Limitations\/caveats:<\/strong> Discovery depth depends on what you can access and what the tool supports; sensitive environments may restrict collection.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Feature 2: Dependency analysis (relationship mapping)<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>What it does:<\/strong> Identifies relationships between code modules, jobs, datasets\/files, and integrations (to the extent supported).<\/li>\n<li><strong>Why it matters:<\/strong> Dependency clusters determine migration waves and cutover risk.<\/li>\n<li><strong>Practical benefit:<\/strong> Reduces surprises late in the 
program.<\/li>\n<li><strong>Limitations\/caveats:<\/strong> Some dependencies are implicit or runtime-only and may not be fully discoverable.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Feature 3: Complexity and risk indicators<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>What it does:<\/strong> Produces indicators useful for prioritization (for example, size\/complexity proxies, coupling signals, operational criticality inputs).<\/li>\n<li><strong>Why it matters:<\/strong> Helps separate \u201cwave 1 candidates\u201d from \u201cneeds remediation first.\u201d<\/li>\n<li><strong>Practical benefit:<\/strong> Objective-ish prioritization instead of purely opinion-based ranking.<\/li>\n<li><strong>Limitations\/caveats:<\/strong> Complexity metrics are approximations; validate with SMEs.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Feature 4: Report generation for stakeholders<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>What it does:<\/strong> Generates assessment reports suitable for technical and executive audiences.<\/li>\n<li><strong>Why it matters:<\/strong> Programs require stakeholder alignment and funding approvals.<\/li>\n<li><strong>Practical benefit:<\/strong> Faster governance cycles and clearer communication.<\/li>\n<li><strong>Limitations\/caveats:<\/strong> Reports need context; don\u2019t treat them as a complete migration plan.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Feature 5: Exportable outputs for analytics and governance<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>What it does:<\/strong> Produces exports that can be loaded into analytics tools (commonly CSV\/JSON-like structures; verify exact formats).<\/li>\n<li><strong>Why it matters:<\/strong> You can consolidate, query, and trend results across portfolios and reassessments.<\/li>\n<li><strong>Practical benefit:<\/strong> Build dashboards, track progress, and integrate with ticketing\/backlog 
tools.<\/li>\n<li><strong>Limitations\/caveats:<\/strong> Schema and semantics must be understood; avoid building brittle pipelines without version control.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Feature 6: Repeatable execution for re-assessment<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>What it does:<\/strong> Supports running the assessment again to detect drift or validate remediation.<\/li>\n<li><strong>Why it matters:<\/strong> Mainframe estates change; assessments can go stale quickly.<\/li>\n<li><strong>Practical benefit:<\/strong> Better governance and fewer \u201cunknown unknowns.\u201d<\/li>\n<li><strong>Limitations\/caveats:<\/strong> Coordinate with change windows and security approvals.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Feature 7: Alignment with Google Cloud Migration planning<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>What it does:<\/strong> Helps provide inputs to a Google Cloud modernization roadmap (target architecture, wave plan, landing zone readiness).<\/li>\n<li><strong>Why it matters:<\/strong> Assessment outputs are only valuable if they translate into an actionable plan.<\/li>\n<li><strong>Practical benefit:<\/strong> More reliable planning for Google Cloud landing zone, connectivity, security, and operations.<\/li>\n<li><strong>Limitations\/caveats:<\/strong> The tool does not automatically build your landing zone or migrate workloads\u2014teams must implement.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">7. 
Architecture and How It Works<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">High-level architecture<\/h3>\n\n\n\n<p>A common pattern is:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Run Mainframe Assessment Tool collectors<\/strong> in or near the mainframe environment (customer-controlled).<\/li>\n<li>Produce <strong>assessment outputs<\/strong> (reports\/exports).<\/li>\n<li>Transfer outputs to Google Cloud, typically into a <strong>Cloud Storage<\/strong> bucket.<\/li>\n<li>Optionally load exports into <strong>BigQuery<\/strong> for analysis and dashboards.<\/li>\n<li>Apply <strong>IAM<\/strong>, <strong>encryption<\/strong>, and <strong>audit logging<\/strong> across the workflow.<\/li>\n<\/ol>\n\n\n\n<h3 class=\"wp-block-heading\">Request\/data\/control flow<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Control flow:<\/strong> You schedule\/execute the assessment and decide what to export.<\/li>\n<li><strong>Data flow:<\/strong> Metadata and reports flow from mainframe \u2192 controlled staging \u2192 Google Cloud storage\/analytics.<\/li>\n<li><strong>Security flow:<\/strong> IAM governs who can upload, read, and analyze; KMS can enforce CMEK encryption; audit logs capture access.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Integrations with related services (common)<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Cloud Storage<\/strong>: landing zone for exports.<\/li>\n<li><strong>BigQuery<\/strong>: portfolio analytics and querying.<\/li>\n<li><strong>Cloud KMS<\/strong>: CMEK for sensitive metadata.<\/li>\n<li><strong>Cloud Logging \/ Cloud Audit Logs<\/strong>: audit access and changes.<\/li>\n<li><strong>Looker Studio \/ Looker<\/strong>: dashboards for program tracking.<\/li>\n<li><strong>Cloud VPN \/ Cloud Interconnect<\/strong>: private connectivity from on-prem to Google Cloud.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Dependency services<\/h3>\n\n\n\n<p>Mainframe Assessment Tool itself may not require Google 
Cloud APIs to run (if it\u2019s customer-managed tooling), but <strong>your cloud-side pipeline<\/strong> typically depends on:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Cloud Storage API<\/li>\n<li>BigQuery API (optional)<\/li>\n<li>Cloud KMS API (optional)<\/li>\n<li>IAM \/ Cloud Resource Manager<\/li>\n<li>Cloud Logging (audit logs are enabled by default for many services)<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Security\/authentication model<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Collector side:<\/strong> Depends on your deployment. Often uses local credentials and privileged read access to collect metadata. Treat it as sensitive.<\/li>\n<li><strong>Google Cloud side:<\/strong> Use IAM with a dedicated <strong>service account<\/strong> for uploads and separate reader roles for analysts. Prefer <strong>short-lived credentials<\/strong> and <strong>no broad project owner roles<\/strong>.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Networking model<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Exports can be transferred:\n<ul>\n<li>Over the public internet using HTTPS to Cloud Storage endpoints, secured with IAM and optionally signed URLs.<\/li>\n<li>Over private connectivity using Cloud VPN or Cloud Interconnect, typically combined with private access options such as Private Google Access (availability depends on your network design; verify in official docs and with your network team).<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Monitoring\/logging\/governance considerations<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Enable and review <strong>Cloud Audit Logs<\/strong> for Cloud Storage and BigQuery.<\/li>\n<li>Use <strong>log sinks<\/strong> to centralize security logs into a dedicated logging project.<\/li>\n<li>Apply <strong>resource labels<\/strong> and standardized naming to buckets\/datasets for governance and cost tracking.<\/li>\n<li>Consider <strong>VPC Service Controls<\/strong> for reducing data exfiltration risk if your compliance model requires it (verify applicability for your 
environment).<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Simple architecture diagram (Mermaid)<\/h3>\n\n\n\n<pre><code class=\"language-mermaid\">flowchart LR\n  MF[Mainframe Environment] --&gt; MAT[\"Mainframe Assessment Tool<br\/>(collector + report\/export)\"]\n  MAT --&gt;|Export files| STAGE[\"Secure staging host or share<br\/>(optional)\"]\n  STAGE --&gt;|Upload| GCS[\"Cloud Storage bucket<br\/>(project-scoped)\"]\n  GCS --&gt; BQ[\"BigQuery dataset<br\/>(optional)\"]\n  BQ --&gt; DASH[\"Dashboards \/ Reports<br\/>(Looker Studio\/Looker)\"]\n<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Production-style architecture diagram (Mermaid)<\/h3>\n\n\n\n<pre><code class=\"language-mermaid\">flowchart TB\n  subgraph OnPrem[\"On-Prem \/ Mainframe Data Center\"]\n    MF[Mainframe]\n    MAT[\"Mainframe Assessment Tool<br\/>Collectors + Analyzer\"]\n    JH[\"Hardened Jump Host<br\/>(or CI runner)\"]\n    MF --&gt; MAT\n    MAT --&gt;|Exports| JH\n  end\n\n  subgraph Connectivity[Connectivity]\n    VPN[Cloud VPN or Interconnect]\n  end\n\n  subgraph GCP[\"Google Cloud Project(s)\"]\n    subgraph Sec[\"Security &amp; Governance\"]\n      IAM[IAM + Service Accounts]\n      KMS[\"Cloud KMS (CMEK)\"]\n      AUD[Cloud Audit Logs]\n      POL[\"Org Policy \/ VPC Service Controls<br\/>(optional)\"]\n    end\n\n    GCS[\"Cloud Storage<br\/>Assessment Landing Bucket\"]\n    BQ[\"BigQuery<br\/>Assessment Analytics\"]\n    LOG[\"Cloud Logging \/ SIEM sink<br\/>(optional)\"]\n    BI[\"Looker Studio \/ Looker<br\/>(optional)\"]\n  end\n\n  JH --&gt;|HTTPS upload| VPN\n  VPN --&gt; GCS\n  GCS --&gt; BQ\n  BQ --&gt; BI\n\n  IAM --- GCS\n  IAM --- BQ\n  KMS --- GCS\n  KMS --- BQ\n  AUD --- GCS\n  AUD --- BQ\n  AUD --&gt; LOG\n  POL --- GCS\n  POL --- BQ\n<\/code><\/pre>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">8. Prerequisites<\/h2>\n\n\n\n<p>This tutorial includes a hands-on lab that focuses on building a <strong>Google Cloud landing path for Mainframe Assessment Tool outputs<\/strong>. 
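<\/p>\n\n\n\n<p>As a quick preview of the lab\u2019s cloud-side steps, the landing path can be sketched with a few commands. This is a minimal sketch only: the project, bucket, and dataset names are placeholders (not official defaults), and you should confirm current <code>gcloud<\/code> and <code>bq<\/code> syntax in the official documentation before running anything:<\/p>\n\n\n\n<pre><code class=\"language-bash\"># Placeholder names -- replace with your own project, bucket, and region.\nexport PROJECT_ID=my-migration-project\nexport BUCKET=mat-assessment-exports-${PROJECT_ID}\n\n# Enable the APIs used by the landing path\ngcloud services enable storage.googleapis.com bigquery.googleapis.com --project=${PROJECT_ID}\n\n# Create a regional landing bucket with uniform bucket-level access\ngcloud storage buckets create gs:\/\/${BUCKET} --project=${PROJECT_ID} \\\n  --location=us-central1 --uniform-bucket-level-access\n\n# Upload a sample assessment export, then load it into BigQuery for analysis\ngcloud storage cp .\/assessment-export.csv gs:\/\/${BUCKET}\/exports\/\nbq --project_id=${PROJECT_ID} mk --dataset mat_assessment\nbq --project_id=${PROJECT_ID} load --autodetect --source_format=CSV \\\n  mat_assessment.inventory gs:\/\/${BUCKET}\/exports\/assessment-export.csv\n<\/code><\/pre>\n\n\n\n<p>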
It does not require an actual mainframe.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Account\/project requirements<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>A Google Cloud account with permission to create projects or an existing project you can use.<\/li>\n<li>Billing enabled on the project.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Permissions \/ IAM roles<\/h3>\n\n\n\n<p>For the lab, you need permissions to:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Create and manage Cloud Storage buckets<\/li>\n<li>Create service accounts and grant IAM roles<\/li>\n<li>Enable APIs<\/li>\n<li>Create BigQuery datasets\/tables<\/li>\n<\/ul>\n\n\n\n<p>Suggested roles for a lab admin (choose the least privilege that works in your environment):<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><code>roles\/storage.admin<\/code> (or narrower, if you pre-create the bucket)<\/li>\n<li><code>roles\/iam.serviceAccountAdmin<\/code> and <code>roles\/resourcemanager.projectIamAdmin<\/code> (or have an admin pre-create IAM bindings)<\/li>\n<li><code>roles\/bigquery.admin<\/code> (optional, for BigQuery steps)<\/li>\n<li><code>roles\/serviceusage.serviceUsageAdmin<\/code> to enable APIs<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Billing requirements<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Cloud Storage and BigQuery usage will incur charges if you store data and run queries beyond free allowances. 
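<\/li>\n<\/ul>\n\n\n\n<p>To keep BigQuery lab costs predictable, you can preview how much data a query would process before running it. The dataset, table, and column names below are hypothetical examples, not part of any official schema; <code>--dry_run<\/code> reports the estimated bytes processed without executing the query or incurring query charges:<\/p>\n\n\n\n<pre><code class=\"language-bash\"># Hypothetical dataset\/table\/columns -- adjust to your own export schema.\nbq query --use_legacy_sql=false --dry_run \\\n'SELECT application_id, COUNT(*) AS job_count\nFROM mat_assessment.inventory\nGROUP BY application_id'\n<\/code><\/pre>\n\n\n\n<ul class=\"wp-block-list\">\n<li>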
Keep files small for the lab.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">CLI\/SDK\/tools needed<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><a href=\"https:\/\/cloud.google.com\/sdk\/docs\/install\">Google Cloud SDK (<code>gcloud<\/code>)<\/a> (or use Cloud Shell)<\/li>\n<li><code>gsutil<\/code> (included with the Cloud SDK; the newer <code>gcloud storage<\/code> commands cover the same operations)<\/li>\n<li><code>bq<\/code> CLI (included with the Cloud SDK) for BigQuery steps<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Region availability<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Cloud Storage buckets use a <strong>location<\/strong> (region or multi-region).<\/li>\n<li>BigQuery datasets use a <strong>location<\/strong>.<\/li>\n<li>Choose locations compatible with your data residency requirements.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Quotas\/limits<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Cloud Storage: bucket\/object quotas (rarely limiting for a lab)<\/li>\n<li>BigQuery: load job and query quotas (unlikely limiting for a small lab)<\/li>\n<li>If your organization has restrictive org policies, you may need admin support.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Prerequisite services<\/h3>\n\n\n\n<p>Enable these APIs in your project (steps included in the lab):<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Cloud Storage API<\/li>\n<li>BigQuery API (optional)<\/li>\n<li>Cloud KMS API (optional)<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Mainframe-specific prerequisites (if you are doing a real assessment)<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Access approvals to collect metadata from the mainframe environment<\/li>\n<li>A secure staging environment for exports<\/li>\n<li>A data handling and classification decision for assessment outputs (even \u201cmetadata\u201d can be sensitive).<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">9. 
Pricing \/ Cost<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Pricing model (what you should expect)<\/h3>\n\n\n\n<p>Mainframe Assessment Tool pricing can vary depending on how it\u2019s provided (for example, bundled into a modernization engagement, partner-delivered, or otherwise). <strong>Do not assume a specific per-hour price<\/strong> without confirming via official documentation or your Google Cloud account team.<\/p>\n\n\n\n<p>What reliably drives cost in most real deployments is the set of <strong>Google Cloud services you use to store, process, and analyze assessment outputs<\/strong>, such as:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Cloud Storage (object storage)<\/li>\n<li>BigQuery (data warehousing\/analytics)<\/li>\n<li>Cloud KMS (key operations, if using CMEK)<\/li>\n<li>Cloud Logging (log ingestion\/retention beyond free allocations)<\/li>\n<li>Network egress\/ingress (depending on traffic patterns)<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Pricing dimensions (cloud-side)<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Cloud Storage<\/strong>\n<ul class=\"wp-block-list\">\n<li>Data stored (GB-month)<\/li>\n<li>Operations (PUT\/GET\/list requests)<\/li>\n<li>Data retrieval and network egress (varies)<\/li>\n<li>Storage class (Standard\/Nearline\/Coldline\/Archive)<\/li>\n<\/ul>\n<\/li>\n<li><strong>BigQuery<\/strong>\n<ul class=\"wp-block-list\">\n<li>Data storage (active\/long-term)<\/li>\n<li>Data processed by queries (on-demand) or capacity-based slot reservations<\/li>\n<li>Load jobs are generally not the main driver; query processing is.<\/li>\n<\/ul>\n<\/li>\n<li><strong>Cloud KMS<\/strong>\n<ul class=\"wp-block-list\">\n<li>Key versions (monthly)<\/li>\n<li>Cryptographic operations (per use)<\/li>\n<\/ul>\n<\/li>\n<li><strong>Networking<\/strong>\n<ul class=\"wp-block-list\">\n<li>Uploads into Google Cloud are typically not charged as egress, but <strong>egress out of Google Cloud is charged<\/strong><\/li>\n<li>Interconnect\/VPN have their own cost structures<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<h3 
class=\"wp-block-heading\">Free tier (if applicable)<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Google Cloud has free-tier elements for some services, but they change and vary by region.<br\/>\n  Use official references:<\/li>\n<li>Google Cloud pricing overview: https:\/\/cloud.google.com\/pricing<\/li>\n<li>Pricing calculator: https:\/\/cloud.google.com\/products\/calculator<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Cost drivers<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Size of exports (number of applications\/jobs, history depth, detail level)<\/li>\n<li>Frequency of reassessments<\/li>\n<li>BigQuery query patterns (dashboards that run frequently can process lots of data)<\/li>\n<li>Retention duration for reports\/exports and logs<\/li>\n<li>Whether you replicate data across regions for DR\/compliance<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Hidden\/indirect costs<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Operational overhead<\/strong>: secure staging hosts, CI runners, access reviews<\/li>\n<li><strong>Security tooling<\/strong>: SIEM integration, additional log sinks, DLP scanning (if used)<\/li>\n<li><strong>People costs<\/strong>: time spent validating and curating assessment outputs<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Network\/data transfer implications<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Uploading assessment outputs to Cloud Storage is usually straightforward. 
The bigger cost risks are:\n<ul class=\"wp-block-list\">\n<li>Downloading large exports repeatedly (egress)<\/li>\n<li>Cross-region replication or multi-region storage when not required<\/li>\n<li>Interconnect\/VPN monthly and throughput charges<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">How to optimize cost<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Store raw exports in Cloud Storage and move older ones to cheaper classes (Nearline\/Coldline\/Archive) via lifecycle rules.<\/li>\n<li>Load only curated\/normalized subsets into BigQuery.<\/li>\n<li>Partition\/cluster BigQuery tables (when applicable) and limit dashboard refresh rates.<\/li>\n<li>Use separate projects for \u201clanding\u201d and \u201canalytics\u201d if it helps enforce governance and cost allocation.<\/li>\n<li>Retain only what you need; set explicit retention policies for logs and exports.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Example low-cost starter estimate (no fabricated prices)<\/h3>\n\n\n\n<p>A small pilot might include:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>A Cloud Storage bucket with a few hundred MB to a few GB of exports<\/li>\n<li>A small BigQuery dataset with a few tables created from CSV loads<\/li>\n<li>Occasional ad-hoc queries<\/li>\n<\/ul>\n\n\n\n<p>To estimate cost accurately, use the Google Cloud Pricing Calculator (https:\/\/cloud.google.com\/products\/calculator) and model:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Cloud Storage GB-month in your chosen location and storage class<\/li>\n<li>BigQuery storage + estimated TB processed per month (your dashboards and queries)<\/li>\n<li>Optional KMS operations<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Example production cost considerations<\/h3>\n\n\n\n<p>In enterprise programs, costs often come from:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Repeated assessments and growing history<\/li>\n<li>Portfolio-scale analytics in BigQuery<\/li>\n<li>Organization-wide dashboards and automated governance checks<\/li>\n<li>Centralized logging and longer retention periods for audit\/compliance<\/li>\n<\/ul>\n\n\n\n<p>The right approach is to treat 
assessment outputs as a governed dataset with clear lifecycle policies.<\/p>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">10. Step-by-Step Hands-On Tutorial<\/h2>\n\n\n\n<p>This lab builds a secure, low-cost Google Cloud landing and analytics workflow for <strong>Mainframe Assessment Tool outputs<\/strong>. You\u2019ll create a bucket, lock down access, upload a sample \u201cassessment export\u201d file, and analyze it in BigQuery.<\/p>\n\n\n\n<blockquote>\n<p>Important: This lab uses a <strong>synthetic CSV<\/strong> to demonstrate the workflow. The real Mainframe Assessment Tool export schema and file names may differ. Adjust the load step to match the actual export format (verify in official docs).<\/p>\n<\/blockquote>\n\n\n\n<h3 class=\"wp-block-heading\">Objective<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Create a secure Cloud Storage bucket for Mainframe Assessment Tool exports<\/li>\n<li>Create a least-privilege service account for uploading exports<\/li>\n<li>Upload a sample export file<\/li>\n<li>Load the export into BigQuery and run validation queries<\/li>\n<li>Clean up all resources<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Lab Overview<\/h3>\n\n\n\n<p>You will implement this flow:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Enable APIs<\/li>\n<li>Create a Cloud Storage bucket with uniform bucket-level access<\/li>\n<li>Create a service account with limited permissions to upload<\/li>\n<li>Upload a sample export file to the bucket<\/li>\n<li>Load the file into BigQuery<\/li>\n<li>Query and validate<\/li>\n<li>Clean up<\/li>\n<\/ol>\n\n\n\n<h3 class=\"wp-block-heading\">Step 1: Set up environment variables and enable APIs<\/h3>\n\n\n\n<p>You can do this in <strong>Cloud Shell<\/strong> (recommended) or a local terminal with <code>gcloud<\/code> installed.<\/p>\n\n\n\n<p>1) Set variables:<\/p>\n\n\n\n<pre><code class=\"language-bash\">export PROJECT_ID=\"YOUR_PROJECT_ID\"\nexport 
REGION=\"us-central1\"\nexport BUCKET_NAME=\"${PROJECT_ID}-mat-exports\"\nexport BQ_DATASET=\"mat_assessment\"\n<\/code><\/pre>\n\n\n\n<p>2) Set your project:<\/p>\n\n\n\n<pre><code class=\"language-bash\">gcloud config set project \"${PROJECT_ID}\"\n<\/code><\/pre>\n\n\n\n<p>3) Enable APIs:<\/p>\n\n\n\n<pre><code class=\"language-bash\">gcloud services enable \\\n  storage.googleapis.com \\\n  bigquery.googleapis.com \\\n  cloudkms.googleapis.com\n<\/code><\/pre>\n\n\n\n<p><strong>Expected outcome:<\/strong> APIs are enabled for the project.<\/p>\n\n\n\n<p><strong>Verification:<\/strong><\/p>\n\n\n\n<pre><code class=\"language-bash\">gcloud services list --enabled --format=\"value(config.name)\" | egrep \"storage|bigquery|cloudkms\"\n<\/code><\/pre>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h3 class=\"wp-block-heading\">Step 2: Create a secure Cloud Storage bucket for assessment exports<\/h3>\n\n\n\n<p>1) Create the bucket (choose a location appropriate for your organization):<\/p>\n\n\n\n<pre><code class=\"language-bash\">gsutil mb -p \"${PROJECT_ID}\" -l \"${REGION}\" \"gs:\/\/${BUCKET_NAME}\"\n<\/code><\/pre>\n\n\n\n<p>2) Enable <strong>uniform bucket-level access<\/strong> (recommended for centralized IAM control):<\/p>\n\n\n\n<pre><code class=\"language-bash\">gsutil uniformbucketlevelaccess set on \"gs:\/\/${BUCKET_NAME}\"\n<\/code><\/pre>\n\n\n\n<p>3) (Recommended) Prevent accidental public access:<\/p>\n\n\n\n<pre><code class=\"language-bash\">gsutil pap set enforced \"gs:\/\/${BUCKET_NAME}\"\n<\/code><\/pre>\n\n\n\n<p>4) (Optional) Add a lifecycle rule to transition older exports to cheaper storage. 
Create a file <code>lifecycle.json<\/code>:<\/p>\n\n\n\n<pre><code class=\"language-bash\">cat &gt; lifecycle.json &lt;&lt; 'EOF'\n{\n  \"rule\": [\n    {\n      \"action\": {\"type\": \"SetStorageClass\", \"storageClass\": \"COLDLINE\"},\n      \"condition\": {\"age\": 30}\n    }\n  ]\n}\nEOF\n<\/code><\/pre>\n\n\n\n<p>Apply it:<\/p>\n\n\n\n<pre><code class=\"language-bash\">gsutil lifecycle set lifecycle.json \"gs:\/\/${BUCKET_NAME}\"\n<\/code><\/pre>\n\n\n\n<p><strong>Expected outcome:<\/strong> A private bucket exists for Mainframe Assessment Tool exports with strong IAM controls.<\/p>\n\n\n\n<p><strong>Verification:<\/strong><\/p>\n\n\n\n<pre><code class=\"language-bash\">gsutil ls -L \"gs:\/\/${BUCKET_NAME}\" | grep -E \"Location constraint|Uniform bucket-level access|Public access prevention\"\n<\/code><\/pre>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h3 class=\"wp-block-heading\">Step 3: Create a least-privilege service account for uploading exports<\/h3>\n\n\n\n<p>In real programs, you typically separate:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Uploader identity<\/strong> (write-only or limited write)<\/li>\n<li><strong>Analyst identity<\/strong> (read\/query)<\/li>\n<li><strong>Admin identity<\/strong> (manage bucket\/dataset)<\/li>\n<\/ul>\n\n\n\n<p>1) Create a service account:<\/p>\n\n\n\n<pre><code class=\"language-bash\">export SA_NAME=\"mat-uploader\"\nexport SA_EMAIL=\"${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com\"\n\ngcloud iam service-accounts create \"${SA_NAME}\" \\\n  --display-name=\"Mainframe Assessment Tool Export Uploader\"\n<\/code><\/pre>\n\n\n\n<p>2) Grant the service account permission to write objects to the bucket.<\/p>\n\n\n\n<p>For most cases, <code>roles\/storage.objectCreator<\/code> is appropriate (it can upload new objects but not overwrite\/delete existing objects):<\/p>\n\n\n\n<pre><code class=\"language-bash\">gcloud storage buckets add-iam-policy-binding \"gs:\/\/${BUCKET_NAME}\" \\\n  --member=\"serviceAccount:${SA_EMAIL}\" \\\n  
--role=\"roles\/storage.objectCreator\"\n<\/code><\/pre>\n\n\n\n<p><strong>Expected outcome:<\/strong> The service account can upload exports but cannot read or delete existing objects.<\/p>\n\n\n\n<p><strong>Verification:<\/strong><\/p>\n\n\n\n<pre><code class=\"language-bash\">gcloud storage buckets get-iam-policy \"gs:\/\/${BUCKET_NAME}\" --format=\"json\" | grep -n \"${SA_EMAIL}\"\n<\/code><\/pre>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h3 class=\"wp-block-heading\">Step 4: Create and upload a sample \u201cassessment export\u201d file<\/h3>\n\n\n\n<p>This CSV is a stand-in for Mainframe Assessment Tool exports. In a real scenario, replace it with actual exported files.<\/p>\n\n\n\n<p>1) Create a sample CSV:<\/p>\n\n\n\n<pre><code class=\"language-bash\">cat &gt; mat_export_sample.csv &lt;&lt; 'EOF'\napplication_id,application_name,domain,criticality,language,notes\nAPP001,BillingCore,Finance,High,COBOL,Example record\nAPP002,ClaimsBatch,Insurance,High,COBOL,Example record\nAPP003,CustomerInquiry,CRM,Medium,PL\/I,Example record\nAPP004,ReportGen,BI,Low,JCL,Example record\nEOF\n<\/code><\/pre>\n\n\n\n<p>2) Upload it:<\/p>\n\n\n\n<pre><code class=\"language-bash\">gsutil cp mat_export_sample.csv \"gs:\/\/${BUCKET_NAME}\/exports\/mat_export_sample.csv\"\n<\/code><\/pre>\n\n\n\n<p><strong>Expected outcome:<\/strong> The export file is stored in Cloud Storage under a clear prefix (for example <code>exports\/<\/code>).<\/p>\n\n\n\n<p><strong>Verification:<\/strong><\/p>\n\n\n\n<pre><code class=\"language-bash\">gsutil ls \"gs:\/\/${BUCKET_NAME}\/exports\/\"\ngsutil stat \"gs:\/\/${BUCKET_NAME}\/exports\/mat_export_sample.csv\"\n<\/code><\/pre>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h3 class=\"wp-block-heading\">Step 5: Create a BigQuery dataset and load the export file<\/h3>\n\n\n\n<p>1) Create the dataset in the same location as your bucket (or according to your policy):<\/p>\n\n\n\n<pre><code class=\"language-bash\">bq 
--location=\"${REGION}\" mk -d \"${PROJECT_ID}:${BQ_DATASET}\"\n<\/code><\/pre>\n\n\n\n<p>2) Load the CSV into a BigQuery table:<\/p>\n\n\n\n<pre><code class=\"language-bash\">bq --location=\"${REGION}\" load \\\n  --source_format=CSV \\\n  --skip_leading_rows=1 \\\n  --autodetect \\\n  \"${PROJECT_ID}:${BQ_DATASET}.applications\" \\\n  \"gs:\/\/${BUCKET_NAME}\/exports\/mat_export_sample.csv\"\n<\/code><\/pre>\n\n\n\n<p><strong>Expected outcome:<\/strong> A BigQuery table exists with rows from the export.<\/p>\n\n\n\n<p><strong>Verification:<\/strong><\/p>\n\n\n\n<pre><code class=\"language-bash\">bq --location=\"${REGION}\" show --schema \"${PROJECT_ID}:${BQ_DATASET}.applications\"\nbq --location=\"${REGION}\" query --use_legacy_sql=false \\\n\"SELECT COUNT(*) AS row_count FROM \\`${PROJECT_ID}.${BQ_DATASET}.applications\\`\"\n<\/code><\/pre>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h3 class=\"wp-block-heading\">Step 6: Run useful validation queries (portfolio-style)<\/h3>\n\n\n\n<p>These example queries show how teams often use assessment outputs.<\/p>\n\n\n\n<p>1) Applications by criticality:<\/p>\n\n\n\n<pre><code class=\"language-bash\">bq --location=\"${REGION}\" query --use_legacy_sql=false \\\n\"SELECT criticality, COUNT(*) AS apps\n FROM \\`${PROJECT_ID}.${BQ_DATASET}.applications\\`\n GROUP BY criticality\n ORDER BY apps DESC\"\n<\/code><\/pre>\n\n\n\n<p>2) Applications by language:<\/p>\n\n\n\n<pre><code class=\"language-bash\">bq --location=\"${REGION}\" query --use_legacy_sql=false \\\n\"SELECT language, COUNT(*) AS apps\n FROM \\`${PROJECT_ID}.${BQ_DATASET}.applications\\`\n GROUP BY language\n ORDER BY apps DESC\"\n<\/code><\/pre>\n\n\n\n<p>3) Candidate wave-1 filter (example heuristic):\n&#8211; Low\/Medium criticality\n&#8211; Not in the highest-risk domains (purely an example)<\/p>\n\n\n\n<pre><code class=\"language-bash\">bq --location=\"${REGION}\" query --use_legacy_sql=false \\\n\"SELECT application_id, application_name, domain, 
criticality, language\n FROM \\`${PROJECT_ID}.${BQ_DATASET}.applications\\`\n WHERE criticality IN ('Low','Medium')\n ORDER BY criticality, application_id\"\n<\/code><\/pre>\n\n\n\n<p><strong>Expected outcome:<\/strong> You can query and segment the inventory, a foundation for migration wave planning.<\/p>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h3 class=\"wp-block-heading\">Validation<\/h3>\n\n\n\n<p>Confirm:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Bucket exists and is private:\n<ul class=\"wp-block-list\">\n<li>Uniform bucket-level access: <strong>On<\/strong><\/li>\n<li>Public access prevention: <strong>Enforced<\/strong><\/li>\n<\/ul>\n<\/li>\n<li>File is present in <code>gs:\/\/BUCKET\/exports\/<\/code><\/li>\n<li>BigQuery dataset and table exist<\/li>\n<li>Queries return expected counts (4 rows in the sample)<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h3 class=\"wp-block-heading\">Troubleshooting<\/h3>\n\n\n\n<p>Common issues and fixes:<\/p>\n\n\n\n<p>1) <strong><code>AccessDeniedException<\/code> \/ 403 when uploading<\/strong><br\/>\nCause: Missing IAM permission on the bucket.<br\/>\nFix: Ensure the identity you use has permissions, or use the service account with correct role bindings. Re-check the bucket IAM policy:<\/p>\n\n\n\n<pre><code class=\"language-bash\">gcloud storage buckets get-iam-policy \"gs:\/\/${BUCKET_NAME}\"\n<\/code><\/pre>\n\n\n\n<p>2) <strong>BigQuery dataset location mismatch<\/strong><br\/>\nCause: The BigQuery dataset location differs from your job location or organizational policy.<br\/>\nFix: Create the dataset in the correct location and use <code>--location=...<\/code> consistently.<\/p>\n\n\n\n<p>3) <strong><code>bq load<\/code> fails due to schema<\/strong><br\/>\nCause: Autodetect issues or a non-CSV export format.<br\/>\nFix: Provide an explicit schema, or transform exports first. For real Mainframe Assessment Tool exports, consult the official export schema (verify in official docs).<\/p>\n\n\n\n<p>4) <strong>Org policy prevents service account key creation<\/strong><br\/>\nThis lab does not require keys. 
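<\/p>\n\n\n\n<p>As a minimal sketch of one keyless pattern (reusing this lab\u2019s <code>${BUCKET_NAME}<\/code> and <code>${SA_EMAIL}<\/code> variables, and assuming your user has <code>roles\/iam.serviceAccountTokenCreator<\/code> on the service account), you can upload with short-lived impersonated credentials instead of a key file:<\/p>\n\n\n\n
```bash
# Upload via service account impersonation: gcloud mints a short-lived
# token for the service account, so no long-lived key file is created.
# Assumes roles/iam.serviceAccountTokenCreator on ${SA_EMAIL}.
gcloud storage cp mat_export_sample.csv \
  "gs://${BUCKET_NAME}/exports/mat_export_sample.csv" \
  --impersonate-service-account="${SA_EMAIL}"
```
\n\n\n\n<p>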
If your pipeline needs non-interactive uploads, prefer Workload Identity Federation or controlled runtime identities rather than long-lived keys.<\/p>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h3 class=\"wp-block-heading\">Cleanup<\/h3>\n\n\n\n<p>To avoid ongoing charges, remove created resources.<\/p>\n\n\n\n<p>1) Delete BigQuery dataset (and contents):<\/p>\n\n\n\n<pre><code class=\"language-bash\">bq --location=\"${REGION}\" rm -r -f -d \"${PROJECT_ID}:${BQ_DATASET}\"\n<\/code><\/pre>\n\n\n\n<p>2) Delete objects and bucket:<\/p>\n\n\n\n<pre><code class=\"language-bash\">gsutil -m rm -r \"gs:\/\/${BUCKET_NAME}\/**\"\ngsutil rb \"gs:\/\/${BUCKET_NAME}\"\n<\/code><\/pre>\n\n\n\n<p>3) Delete service account:<\/p>\n\n\n\n<pre><code class=\"language-bash\">gcloud iam service-accounts delete \"${SA_EMAIL}\" --quiet\n<\/code><\/pre>\n\n\n\n<p>4) (Optional) Delete the project if it was created only for this lab.<\/p>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">11. 
Best Practices<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Architecture best practices<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Treat assessment outputs as a governed dataset:\n<ul class=\"wp-block-list\">\n<li>Define a landing bucket structure like <code>gs:\/\/...\/exports\/&lt;assessment-run-id&gt;\/...<\/code><\/li>\n<li>Store raw exports separately from curated\/normalized datasets.<\/li>\n<\/ul>\n<\/li>\n<li>Separate projects when appropriate:\n<ul class=\"wp-block-list\">\n<li>A \u201clanding\u201d project for ingestion<\/li>\n<li>An \u201canalytics\u201d project for BigQuery dashboards<\/li>\n<\/ul>\n<\/li>\n<li>Build a repeatable pipeline:\n<ul class=\"wp-block-list\">\n<li>Version the transformation scripts<\/li>\n<li>Track assessment run metadata (run date, scope, tool version)<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">IAM\/security best practices<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Use least privilege:\n<ul class=\"wp-block-list\">\n<li>Uploader: <code>roles\/storage.objectCreator<\/code> on the bucket<\/li>\n<li>Analysts: <code>roles\/storage.objectViewer<\/code> + <code>roles\/bigquery.dataViewer<\/code> + <code>roles\/bigquery.jobUser<\/code> as needed<\/li>\n<\/ul>\n<\/li>\n<li>Avoid long-lived service account keys; prefer:\n<ul class=\"wp-block-list\">\n<li>Workload Identity Federation<\/li>\n<li>Short-lived credentials in controlled runtime environments<\/li>\n<\/ul>\n<\/li>\n<li>Enforce uniform bucket-level access and public access prevention.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Cost best practices<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Set lifecycle rules to transition older exports to cheaper storage classes.<\/li>\n<li>Avoid loading everything into BigQuery \u201cjust because\u201d\u2014load curated subsets.<\/li>\n<li>Control dashboard refresh rates to limit BigQuery query processing.<\/li>\n<li>Establish retention policies for exports, derived tables, and logs.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Performance best practices<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Use BigQuery partitioning\/clustering for large-scale exports (for example by assessment 
run date, app domain).<\/li>\n<li>Pre-aggregate common dashboards (counts by criticality\/domain) into summary tables.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Reliability best practices<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Keep raw exports immutable:\n<ul class=\"wp-block-list\">\n<li>Use object versioning if required (note: versioning increases storage usage).<\/li>\n<li>Write once, read many; avoid overwriting historical runs.<\/li>\n<\/ul>\n<\/li>\n<li>Validate uploads with checksums and consistent naming conventions.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Operations best practices<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Centralize Cloud Audit Logs via log sinks.<\/li>\n<li>Create runbooks for:\n<ul class=\"wp-block-list\">\n<li>Upload failures<\/li>\n<li>Schema changes in exports<\/li>\n<li>Access review and periodic permission audits<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Governance\/tagging\/naming best practices<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Use labels on buckets\/datasets:\n<ul class=\"wp-block-list\">\n<li><code>env=prod|nonprod<\/code><\/li>\n<li><code>data_classification=sensitive|internal<\/code><\/li>\n<li><code>owner=mainframe-modernization<\/code><\/li>\n<\/ul>\n<\/li>\n<li>Standardize dataset\/table naming:\n<ul class=\"wp-block-list\">\n<li><code>mat_assessment.applications<\/code><\/li>\n<li><code>mat_assessment.dependencies<\/code> (example; confirm actual schemas)<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">12. 
Security Considerations<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Identity and access model<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Cloud-side access is controlled by <strong>IAM<\/strong>:\n<ul class=\"wp-block-list\">\n<li>Bucket-level IAM for Cloud Storage<\/li>\n<li>Dataset\/table IAM for BigQuery<\/li>\n<\/ul>\n<\/li>\n<li>Enforce separation of duties:\n<ul class=\"wp-block-list\">\n<li>Writers (uploaders) should not be readers<\/li>\n<li>Analysts should not be bucket admins<\/li>\n<li>Security\/audit roles should be centralized<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Encryption<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Cloud Storage and BigQuery encrypt data at rest by default.<\/li>\n<li>For regulated environments, consider <strong>CMEK<\/strong> using <strong>Cloud KMS<\/strong>:\n<ul class=\"wp-block-list\">\n<li>Use CMEK for the bucket and\/or BigQuery dataset (where supported)<\/li>\n<li>Restrict who can use\/decrypt with KMS IAM<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Network exposure<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Prevent public access at the bucket level (Public Access Prevention).<\/li>\n<li>Consider private connectivity patterns for on-prem uploads:\n<ul class=\"wp-block-list\">\n<li>Cloud VPN or Interconnect<\/li>\n<li>Organization policies and egress controls<\/li>\n<\/ul>\n<\/li>\n<li>If your threat model includes data exfiltration, evaluate <strong>VPC Service Controls<\/strong> for Storage\/BigQuery (verify applicability and design carefully).<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Secrets handling<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Avoid embedding credentials in scripts.<\/li>\n<li>Prefer federated identity or short-lived tokens.<\/li>\n<li>If secrets are required (not recommended), store them in Secret Manager and restrict access tightly.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Audit\/logging<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Use <strong>Cloud Audit Logs<\/strong> to track:<\/li>\n<li>Object access (as available)<\/li>\n<li>IAM policy changes<\/li>\n<li>BigQuery 
dataset\/table access and job execution<\/li>\n<li>Export logs to a centralized logging project or SIEM if required.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Compliance considerations<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Treat assessment outputs as potentially sensitive:\n<ul class=\"wp-block-list\">\n<li>Application names, dataset names, interface endpoints, and operational schedules can be high-value to attackers.<\/li>\n<\/ul>\n<\/li>\n<li>Apply data classification and retention policies.<\/li>\n<li>Ensure region\/location choices match data residency requirements.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Common security mistakes<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Storing exports in a bucket with legacy ACLs or accidental public exposure<\/li>\n<li>Over-granting roles (for example, <code>Storage Admin<\/code> to everyone)<\/li>\n<li>Sharing exports over email or unmanaged file shares<\/li>\n<li>Keeping long-lived service account keys on jump hosts<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Secure deployment recommendations<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Use uniform bucket-level access + public access prevention.<\/li>\n<li>Use CMEK where mandated.<\/li>\n<li>Centralize audit logs and implement periodic IAM reviews.<\/li>\n<li>Store raw exports immutably; derive curated datasets for broad sharing.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">13. 
Limitations and Gotchas<\/h2>\n\n\n\n<p>Because Mainframe Assessment Tool depends heavily on your environment and access model, expect practical constraints.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Environment access constraints:<\/strong> You may not be able to scan everything due to least-privilege, separation of duties, or mainframe security controls.<\/li>\n<li><strong>Incomplete dependency visibility:<\/strong> Some dependencies are runtime-only or external; assessment tooling may not detect them fully.<\/li>\n<li><strong>Schema drift:<\/strong> Export formats can change between tool versions; build ingestion with versioning and validation.<\/li>\n<li><strong>False certainty risk:<\/strong> Assessment metrics help planning but do not replace SME validation and hands-on modernization spikes.<\/li>\n<li><strong>Location constraints:<\/strong> BigQuery dataset location and Cloud Storage bucket location must align with organizational policies; mismatches cause job failures.<\/li>\n<li><strong>Cost surprises:<\/strong>\n<ul class=\"wp-block-list\">\n<li>BigQuery dashboards can process large amounts of data repeatedly<\/li>\n<li>Long retention of many assessment runs can accumulate storage costs<\/li>\n<\/ul>\n<\/li>\n<li><strong>Governance overhead:<\/strong> Without a clear owner and lifecycle policies, exports become a \u201cdata swamp.\u201d<\/li>\n<li><strong>Security surprises:<\/strong> Metadata can expose sensitive internal system details\u2014treat it as sensitive even if it\u2019s \u201cnot customer data.\u201d<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">14. Comparison with Alternatives<\/h2>\n\n\n\n<p>Mainframe Assessment Tool is one part of a mainframe modernization toolkit. 
Alternatives exist in Google Cloud, other clouds, and third-party ecosystems.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Comparison table<\/h3>\n\n\n\n<figure class=\"wp-block-table\"><table>\n<thead>\n<tr>\n<th>Option<\/th>\n<th>Best For<\/th>\n<th>Strengths<\/th>\n<th>Weaknesses<\/th>\n<th>When to Choose<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td><strong>Google Cloud \u2013 Mainframe Assessment Tool<\/strong><\/td>\n<td>Planning Google Cloud mainframe modernization<\/td>\n<td>Google Cloud\u2013aligned assessment outputs; integrates naturally with Cloud Storage\/BigQuery governance<\/td>\n<td>Exact supported sources\/outputs depend on edition\/workflow; still requires SME validation<\/td>\n<td>You are modernizing to Google Cloud and want a structured assessment baseline<\/td>\n<\/tr>\n<tr>\n<td><strong>Google Cloud \u2013 General data\/portfolio analytics (Storage + BigQuery) without MAT<\/strong><\/td>\n<td>Teams that already have exports\/inventory<\/td>\n<td>Flexible; you control schema and ingestion<\/td>\n<td>You must build\/maintain discovery and normalization yourself<\/td>\n<td>You already have reliable inventory sources and only need cloud analytics<\/td>\n<\/tr>\n<tr>\n<td><strong>AWS \u2013 Mainframe modernization assessment approaches<\/strong><\/td>\n<td>Planning migrations to AWS<\/td>\n<td>Integrated into AWS modernization ecosystem<\/td>\n<td>Not Google Cloud\u2013aligned; different target architecture patterns<\/td>\n<td>Your target is AWS and your program uses AWS tooling<\/td>\n<\/tr>\n<tr>\n<td><strong>Azure \u2013 Mainframe migration\/modernization partner tooling<\/strong><\/td>\n<td>Planning migrations to Azure<\/td>\n<td>Ecosystem of partners and assessments<\/td>\n<td>Not Google Cloud\u2013aligned<\/td>\n<td>Your target is Azure and your org standardizes on Microsoft tooling<\/td>\n<\/tr>\n<tr>\n<td><strong>IBM \/ ISV analyzers (various)<\/strong><\/td>\n<td>Deep mainframe codebase analysis<\/td>\n<td>Can be very deep for specific 
languages\/compilers and mainframe artifacts<\/td>\n<td>Licensing and integration complexity; may not map cleanly to Google Cloud plans<\/td>\n<td>You need deeper static analysis and already license IBM\/ISV tools<\/td>\n<\/tr>\n<tr>\n<td><strong>Open-source discovery + custom scripts<\/strong><\/td>\n<td>Small teams with strong engineering and limited budgets<\/td>\n<td>Highly customizable; no vendor lock-in in analysis layer<\/td>\n<td>Hard to achieve completeness and repeatability; security risk if ad hoc<\/td>\n<td>Narrow scope migrations where you can invest engineering time instead of tool licensing<\/td>\n<\/tr>\n<\/tbody>\n<\/table><\/figure>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">15. Real-World Example<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Enterprise example: Global insurer modernizing batch + online workloads<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> Hundreds of batch jobs and multiple customer-facing applications run on a mainframe. Documentation is inconsistent; dependencies are unclear. 
Leadership needs a multi-year modernization roadmap targeting Google Cloud.<\/li>\n<li><strong>Proposed architecture:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Run Mainframe Assessment Tool to generate inventory and dependency exports<\/li>\n<li>Store raw exports in a CMEK-protected Cloud Storage bucket<\/li>\n<li>Load curated datasets into BigQuery for portfolio analysis<\/li>\n<li>Build dashboards by domain\/criticality\/wave<\/li>\n<li>Use findings to prioritize \u201cwave 1\u201d candidates and plan network connectivity, landing zone controls, and operational model<\/li>\n<\/ul>\n<\/li>\n<li><strong>Why this service was chosen:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Provides a structured assessment artifact aligned with Google Cloud modernization planning<\/li>\n<li>Reduces reliance on tribal knowledge<\/li>\n<li>Accelerates wave planning and governance reporting<\/li>\n<\/ul>\n<\/li>\n<li><strong>Expected outcomes:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Clear wave plan grouped by dependency clusters<\/li>\n<li>Reduced cutover surprises through early interface discovery<\/li>\n<li>Better cost\/timeline estimates and improved stakeholder alignment<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Startup\/small-team example: Fintech with a small inherited mainframe footprint<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Problem:<\/strong> A fintech acquires a small portfolio that includes a mainframe-based billing component. 
The team is cloud-native and needs to understand the scope quickly to migrate to Google Cloud.<\/li>\n<li><strong>Proposed architecture:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Run a focused assessment (limited scope) to inventory the billing app, key batch jobs, and interfaces<\/li>\n<li>Store exports in Cloud Storage with strict access control<\/li>\n<li>Use BigQuery to identify the minimal set of dependent artifacts required for a phased cutover<\/li>\n<\/ul>\n<\/li>\n<li><strong>Why this service was chosen:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Faster clarity than manual interviews alone<\/li>\n<li>Produces artifacts the cloud team can use directly in planning<\/li>\n<\/ul>\n<\/li>\n<li><strong>Expected outcomes:<\/strong>\n<ul class=\"wp-block-list\">\n<li>A small, realistic backlog for modernization<\/li>\n<li>A migration plan with clear assumptions and identified unknowns<\/li>\n<li>Reduced security risk by keeping artifacts in governed cloud storage instead of ad hoc sharing<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">16. FAQ<\/h2>\n\n\n\n<p>1) <strong>Is Mainframe Assessment Tool a managed Google Cloud service or something I run myself?<\/strong><br\/>\nIt is commonly used as <strong>customer-operated assessment tooling<\/strong> aligned to Google Cloud modernization, with outputs stored\/analyzed in Google Cloud. Confirm the exact operational model in official docs for your version\/edition.<\/p>\n\n\n\n<p>2) <strong>Do I need a Google Cloud project to use Mainframe Assessment Tool?<\/strong><br\/>\nIf you want to store results in Google Cloud (recommended for governance and collaboration), yes\u2014you\u2019ll use a project for Cloud Storage\/BigQuery. The assessment execution itself may run in your environment.<\/p>\n\n\n\n<p>3) <strong>What mainframe components does it support (languages, schedulers, subsystems)?<\/strong><br\/>\nSupport varies by release\/edition and environment constraints. 
<strong>Verify in official docs<\/strong> and validate with a pilot scan in your environment.<\/p>\n\n\n\n<p>4) <strong>Does it migrate code automatically?<\/strong><br\/>\nAssessment tools generally <strong>do not<\/strong> migrate code; they provide planning artifacts. Modernization\/migration requires separate engineering workstreams and tooling.<\/p>\n\n\n\n<p>5) <strong>How should we treat assessment outputs from a security standpoint?<\/strong><br\/>\nTreat them as <strong>sensitive internal data<\/strong>. They can reveal application names, interfaces, and operational details valuable to attackers.<\/p>\n\n\n\n<p>6) <strong>Where should we store exports in Google Cloud?<\/strong><br\/>\nStart with a dedicated <strong>Cloud Storage bucket<\/strong> with uniform bucket-level access, public access prevention, and (if required) CMEK.<\/p>\n\n\n\n<p>7) <strong>Should we load assessment outputs into BigQuery?<\/strong><br\/>\nLoad curated outputs into BigQuery if you need portfolio analytics, dashboards, and repeatable reporting. Keep raw exports in Cloud Storage.<\/p>\n\n\n\n<p>8) <strong>How do we control who can upload vs who can read?<\/strong><br\/>\nUse separate IAM roles and identities:\n&#8211; Uploader: <code>roles\/storage.objectCreator<\/code>\n&#8211; Readers: <code>roles\/storage.objectViewer<\/code>\nApply least privilege and avoid broad admin roles.<\/p>\n\n\n\n<p>9) <strong>Can we enforce customer-managed encryption keys (CMEK)?<\/strong><br\/>\nYes, typically via <strong>Cloud KMS<\/strong> for Cloud Storage (and in some cases BigQuery). 
Confirm the exact CMEK support for your storage\/analytics setup in official docs.<\/p>\n\n\n\n<p>10) <strong>How do we avoid schema drift breaking our pipeline?<\/strong><br\/>\nVersion your ingestion:\n&#8211; Store exports under run IDs\n&#8211; Validate file formats and schemas\n&#8211; Keep transformations in source control\n&#8211; Build compatibility layers when tool versions change<\/p>\n\n\n\n<p>11) <strong>How often should we run assessments?<\/strong><br\/>\nAt minimum:\n&#8211; Once for baseline planning<br\/>\nThen again:\n&#8211; After major remediation work<br\/>\n&#8211; Before major migration waves<br\/>\nFrequency depends on how quickly the mainframe estate changes.<\/p>\n\n\n\n<p>12) <strong>What\u2019s the biggest \u201cgotcha\u201d teams hit?<\/strong><br\/>\nAssuming the assessment is \u201ccomplete truth.\u201d Use it to guide discovery, but always validate critical dependencies and operational constraints with SMEs and targeted testing.<\/p>\n\n\n\n<p>13) <strong>How do we estimate Google Cloud costs for the assessment phase?<\/strong><br\/>\nUse the Pricing Calculator and model:\n&#8211; Cloud Storage GB-month for exports\n&#8211; BigQuery storage + query processing<br\/>\nExact costs depend on data size and query frequency.<\/p>\n\n\n\n<p>14) <strong>Can we run this workflow in a restricted enterprise org with policies?<\/strong><br\/>\nYes, but you may need:\n&#8211; Approved regions\/locations\n&#8211; Restricted IAM role sets\n&#8211; Centralized logging requirements<br\/>\nCoordinate early with your cloud governance team.<\/p>\n\n\n\n<p>15) <strong>What is a good first milestone for a mainframe modernization program using this tool?<\/strong><br\/>\nA practical milestone is:\n&#8211; Completed assessment baseline<br\/>\n&#8211; Inventory validated by SMEs<br\/>\n&#8211; Initial wave plan and target architecture options documented<br\/>\n&#8211; Security and landing zone requirements captured<\/p>\n\n\n\n<hr class=\"wp-block-separator\" 
\/>\n\n\n\n<h2 class=\"wp-block-heading\">17. Top Online Resources to Learn Mainframe Assessment Tool<\/h2>\n\n\n\n<p>Links can change; if a specific page is reorganized, navigate from the product root.<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table>\n<thead>\n<tr>\n<th>Resource Type<\/th>\n<th>Name<\/th>\n<th>Why It Is Useful<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>Official product overview<\/td>\n<td>https:\/\/cloud.google.com\/mainframe-modernization<\/td>\n<td>Entry point to Google Cloud\u2019s mainframe modernization portfolio and terminology<\/td>\n<\/tr>\n<tr>\n<td>Official documentation<\/td>\n<td>https:\/\/cloud.google.com\/mainframe-modernization\/docs<\/td>\n<td>Canonical documentation hub (look for Mainframe Assessment Tool sections)<\/td>\n<\/tr>\n<tr>\n<td>Pricing overview<\/td>\n<td>https:\/\/cloud.google.com\/pricing<\/td>\n<td>Understand cloud-side service pricing models used with assessment outputs<\/td>\n<\/tr>\n<tr>\n<td>Pricing calculator<\/td>\n<td>https:\/\/cloud.google.com\/products\/calculator<\/td>\n<td>Build an estimate for Cloud Storage\/BigQuery\/KMS used in assessment workflows<\/td>\n<\/tr>\n<tr>\n<td>Cloud Storage docs<\/td>\n<td>https:\/\/cloud.google.com\/storage\/docs<\/td>\n<td>Secure bucket configuration, IAM, lifecycle, encryption, best practices<\/td>\n<\/tr>\n<tr>\n<td>BigQuery docs<\/td>\n<td>https:\/\/cloud.google.com\/bigquery\/docs<\/td>\n<td>Loading CSV\/JSON, schema design, partitioning, cost controls<\/td>\n<\/tr>\n<tr>\n<td>Cloud KMS docs<\/td>\n<td>https:\/\/cloud.google.com\/kms\/docs<\/td>\n<td>CMEK setup and key governance<\/td>\n<\/tr>\n<tr>\n<td>Audit logs docs<\/td>\n<td>https:\/\/cloud.google.com\/logging\/docs\/audit<\/td>\n<td>How to use Cloud Audit Logs for storage and analytics access tracking<\/td>\n<\/tr>\n<tr>\n<td>Architecture Center<\/td>\n<td>https:\/\/cloud.google.com\/architecture<\/td>\n<td>Reference architectures and governance patterns relevant to migration 
programs<\/td>\n<\/tr>\n<tr>\n<td>Google Cloud YouTube<\/td>\n<td>https:\/\/www.youtube.com\/@GoogleCloudTech<\/td>\n<td>Talks and webinars (search within channel for mainframe modernization topics)<\/td>\n<\/tr>\n<tr>\n<td>Official GitHub org<\/td>\n<td>https:\/\/github.com\/GoogleCloudPlatform<\/td>\n<td>Samples and reference implementations (verify availability of MAT-specific samples)<\/td>\n<\/tr>\n<tr>\n<td>Community learning<\/td>\n<td>https:\/\/stackoverflow.com\/questions\/tagged\/google-cloud-platform<\/td>\n<td>Practical troubleshooting patterns for Cloud Storage\/BigQuery pipelines<\/td>\n<\/tr>\n<\/tbody>\n<\/table><\/figure>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">18. Training and Certification Providers<\/h2>\n\n\n\n<p>The following are third-party training providers. Confirm current course availability and delivery modes on their websites.<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table>\n<thead>\n<tr>\n<th>Institute<\/th>\n<th>Suitable Audience<\/th>\n<th>Likely Learning Focus<\/th>\n<th>Mode<\/th>\n<th>Website URL<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>DevOpsSchool.com<\/td>\n<td>Engineers, architects, DevOps\/SRE<\/td>\n<td>Cloud\/DevOps practices, migration fundamentals, hands-on labs<\/td>\n<td>Check website<\/td>\n<td>https:\/\/www.devopsschool.com<\/td>\n<\/tr>\n<tr>\n<td>ScmGalaxy.com<\/td>\n<td>Beginners to intermediate<\/td>\n<td>DevOps, SCM, cloud fundamentals<\/td>\n<td>Check website<\/td>\n<td>https:\/\/www.scmgalaxy.com<\/td>\n<\/tr>\n<tr>\n<td>CloudOpsNow.in<\/td>\n<td>Ops, SRE, cloud engineers<\/td>\n<td>Cloud operations, reliability, monitoring, cost awareness<\/td>\n<td>Check website<\/td>\n<td>https:\/\/www.cloudopsnow.in<\/td>\n<\/tr>\n<tr>\n<td>SreSchool.com<\/td>\n<td>SREs, platform teams<\/td>\n<td>SRE principles, SLIs\/SLOs, operations readiness<\/td>\n<td>Check website<\/td>\n<td>https:\/\/www.sreschool.com<\/td>\n<\/tr>\n<tr>\n<td>AiOpsSchool.com<\/td>\n<td>Ops\/SRE 
leaders, platform teams<\/td>\n<td>AIOps concepts, automation, observability<\/td>\n<td>Check website<\/td>\n<td>https:\/\/www.aiopsschool.com<\/td>\n<\/tr>\n<\/tbody>\n<\/table><\/figure>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">19. Top Trainers<\/h2>\n\n\n\n<p>These are trainer-related platforms\/sites to explore for coaching or training services. Verify specific Google Cloud Mainframe Assessment Tool coverage directly.<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table>\n<thead>\n<tr>\n<th>Platform\/Site<\/th>\n<th>Likely Specialization<\/th>\n<th>Suitable Audience<\/th>\n<th>Website URL<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>RajeshKumar.xyz<\/td>\n<td>DevOps\/cloud coaching (verify scope)<\/td>\n<td>Individuals and teams seeking guided learning<\/td>\n<td>https:\/\/www.rajeshkumar.xyz<\/td>\n<\/tr>\n<tr>\n<td>devopstrainer.in<\/td>\n<td>DevOps training (verify scope)<\/td>\n<td>Beginners to intermediate DevOps learners<\/td>\n<td>https:\/\/www.devopstrainer.in<\/td>\n<\/tr>\n<tr>\n<td>devopsfreelancer.com<\/td>\n<td>Freelance DevOps services\/training (verify scope)<\/td>\n<td>Teams needing short-term guidance<\/td>\n<td>https:\/\/www.devopsfreelancer.com<\/td>\n<\/tr>\n<tr>\n<td>devopssupport.in<\/td>\n<td>DevOps support\/training (verify scope)<\/td>\n<td>Teams needing operational help and mentoring<\/td>\n<td>https:\/\/www.devopssupport.in<\/td>\n<\/tr>\n<\/tbody>\n<\/table><\/figure>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">20. Top Consulting Companies<\/h2>\n\n\n\n<p>These organizations may provide consulting services. 
Validate their specific Google Cloud mainframe migration experience and references during procurement.<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table>\n<thead>\n<tr>\n<th>Company<\/th>\n<th>Likely Service Area<\/th>\n<th>Where They May Help<\/th>\n<th>Consulting Use Case Examples<\/th>\n<th>Website URL<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>cotocus.com<\/td>\n<td>Cloud\/DevOps consulting (verify scope)<\/td>\n<td>Discovery, landing zones, DevOps pipelines, cloud operations<\/td>\n<td>Secure Cloud Storage\/BigQuery setup for assessment outputs; governance and IAM reviews<\/td>\n<td>https:\/\/www.cotocus.com<\/td>\n<\/tr>\n<tr>\n<td>DevOpsSchool.com<\/td>\n<td>Training + consulting (verify scope)<\/td>\n<td>Enablement, implementation support<\/td>\n<td>Building assessment analytics pipelines; operational readiness for migration<\/td>\n<td>https:\/\/www.devopsschool.com<\/td>\n<\/tr>\n<tr>\n<td>DEVOPSCONSULTING.IN<\/td>\n<td>DevOps consulting (verify scope)<\/td>\n<td>Delivery support, automation, operations<\/td>\n<td>Automation for ingestion and reporting; environment hardening<\/td>\n<td>https:\/\/www.devopsconsulting.in<\/td>\n<\/tr>\n<\/tbody>\n<\/table><\/figure>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">21. 
Career and Learning Roadmap<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">What to learn before Mainframe Assessment Tool (recommended)<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Google Cloud fundamentals<\/strong><\/li>\n<li>Projects, billing, IAM basics<\/li>\n<li>Regions\/locations and data residency<\/li>\n<li><strong>Cloud Storage<\/strong><\/li>\n<li>Buckets, IAM, uniform bucket-level access, lifecycle rules, encryption<\/li>\n<li><strong>BigQuery basics<\/strong> (if you will analyze exports)<\/li>\n<li>Loading data, schemas, querying, cost model<\/li>\n<li><strong>Security foundations<\/strong><\/li>\n<li>Least privilege IAM<\/li>\n<li>Cloud KMS basics<\/li>\n<li>Audit logs and governance<\/li>\n<li><strong>Migration fundamentals<\/strong><\/li>\n<li>Discovery \u2192 assessment \u2192 landing zone \u2192 waves \u2192 cutover<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">What to learn after<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Mainframe modernization architecture patterns<\/strong> on Google Cloud (verify the official guidance for your chosen approach)<\/li>\n<li><strong>Landing zone and governance at scale<\/strong><\/li>\n<li>Organization policy, centralized logging, VPC Service Controls (where required)<\/li>\n<li><strong>Operational readiness<\/strong><\/li>\n<li>Monitoring strategy, SLOs, incident response, DR planning<\/li>\n<li><strong>Data migration and integration patterns<\/strong><\/li>\n<li>Secure transfer, replication, batch modernization, interface modernization<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Job roles that use it<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Cloud architect \/ migration architect<\/li>\n<li>Platform engineer \/ SRE<\/li>\n<li>Mainframe modernization architect<\/li>\n<li>Security engineer (governance and data protection)<\/li>\n<li>Technical program manager for modernization<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Certification path (if 
available)<\/h3>\n\n\n\n<p>There is not necessarily a Mainframe Assessment Tool\u2013specific certification. Common Google Cloud certifications that help:\n&#8211; Associate Cloud Engineer\n&#8211; Professional Cloud Architect\n&#8211; Professional Cloud Security Engineer\n&#8211; Professional Cloud DevOps Engineer<br\/>\nVerify current Google Cloud certification tracks: https:\/\/cloud.google.com\/learn\/certification<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Project ideas for practice<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Build a governed \u201cassessment lake\u201d:<\/li>\n<li>Landing bucket + lifecycle + CMEK + audit logging<\/li>\n<li>Create a BigQuery analytics model:<\/li>\n<li>Tables for applications, dependencies, batch chains (if you have data)<\/li>\n<li>Dashboards for wave planning and risk scoring<\/li>\n<li>Implement policy-as-code:<\/li>\n<li>IAM checks, bucket policy checks, automated drift detection<\/li>\n<li>Build a schema versioning strategy:<\/li>\n<li>Support multiple export versions without breaking dashboards<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">22. 
Glossary<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Assessment (migration):<\/strong> A discovery and analysis phase to understand current workloads, dependencies, risks, and effort before migration.<\/li>\n<li><strong>CMEK (Customer-Managed Encryption Keys):<\/strong> Encryption keys you control in Cloud KMS, used to encrypt data in services like Cloud Storage.<\/li>\n<li><strong>Cloud Storage:<\/strong> Google Cloud object storage used to store files (exports, reports, artifacts).<\/li>\n<li><strong>BigQuery:<\/strong> Google Cloud data warehouse used for analytics on structured\/semi-structured data.<\/li>\n<li><strong>Collector:<\/strong> A component that gathers metadata from an environment for assessment.<\/li>\n<li><strong>Dependency mapping:<\/strong> Identifying relationships between applications, jobs, data stores, and integrations.<\/li>\n<li><strong>IAM (Identity and Access Management):<\/strong> Google Cloud access control system for resources.<\/li>\n<li><strong>Landing zone:<\/strong> A governed cloud foundation (projects, networking, IAM, logging) for workloads and data.<\/li>\n<li><strong>Least privilege:<\/strong> Granting only the minimum access required to perform a task.<\/li>\n<li><strong>Lifecycle policy (Storage):<\/strong> Rules to automatically transition\/delete objects based on age or other conditions.<\/li>\n<li><strong>Org Policy:<\/strong> Google Cloud constraints applied at organization\/folder\/project level for governance.<\/li>\n<li><strong>Public access prevention:<\/strong> Cloud Storage setting that blocks public access to buckets\/objects.<\/li>\n<li><strong>Reassessment:<\/strong> Running assessment again after changes to measure progress and detect drift.<\/li>\n<li><strong>Wave plan:<\/strong> A phased migration sequence grouping workloads to manage risk and dependencies.<\/li>\n<li><strong>Workload Identity Federation:<\/strong> A method to access Google Cloud without long-lived service account keys by 
federating external identities.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">23. Summary<\/h2>\n\n\n\n<p>Mainframe Assessment Tool supports <strong>Google Cloud Migration<\/strong> programs by helping teams <strong>assess and inventory mainframe estates<\/strong>, understand <strong>dependencies and complexity<\/strong>, and produce planning outputs that reduce risk before modernization begins.<\/p>\n\n\n\n<p>It matters because mainframe modernization fails most often due to incomplete discovery and underestimated coupling; assessment outputs provide a credible baseline for wave planning, target architecture decisions, and stakeholder alignment.<\/p>\n\n\n\n<p>In Google Cloud, Mainframe Assessment Tool fits naturally with a secure analytics workflow: store exports in <strong>Cloud Storage<\/strong>, protect them with <strong>IAM<\/strong> and optional <strong>Cloud KMS (CMEK)<\/strong>, audit access with <strong>Cloud Audit Logs<\/strong>, and analyze them with <strong>BigQuery<\/strong> to drive dashboards and governance.<\/p>\n\n\n\n<p>Key cost and security points:\n&#8211; Costs are usually dominated by Cloud Storage retention and BigQuery query processing\u2014not by the concept of \u201crunning an assessment.\u201d\n&#8211; Treat assessment metadata as sensitive; enforce private buckets, least privilege, and audit logging.<\/p>\n\n\n\n<p>Use Mainframe Assessment Tool when you need a structured, repeatable assessment baseline for a Google Cloud mainframe modernization roadmap. 
Next, implement a governed landing path for exports (like the lab), then expand into curated analytics models and dashboards that support wave planning and ongoing program governance.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Migration<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[51,46],"tags":[],"class_list":["post-710","post","type-post","status-publish","format-standard","hentry","category-google-cloud","category-migration"],"_links":{"self":[{"href":"https:\/\/www.devopsschool.com\/tutorials\/wp-json\/wp\/v2\/posts\/710","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.devopsschool.com\/tutorials\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.devopsschool.com\/tutorials\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.devopsschool.com\/tutorials\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.devopsschool.com\/tutorials\/wp-json\/wp\/v2\/comments?post=710"}],"version-history":[{"count":0,"href":"https:\/\/www.devopsschool.com\/tutorials\/wp-json\/wp\/v2\/posts\/710\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.devopsschool.com\/tutorials\/wp-json\/wp\/v2\/media?parent=710"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.devopsschool.com\/tutorials\/wp-json\/wp\/v2\/categories?post=710"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.devopsschool.com\/tutorials\/wp-json\/wp\/v2\/tags?post=710"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}