{"id":38149,"date":"2023-08-12T07:32:20","date_gmt":"2023-08-12T07:32:20","guid":{"rendered":"https:\/\/www.devopsschool.com\/blog\/?p=38149"},"modified":"2023-09-22T07:34:25","modified_gmt":"2023-09-22T07:34:25","slug":"what-is-google-cloud-dataflow-and-use-cases-of-google-cloud-dataflow","status":"publish","type":"post","link":"https:\/\/www.devopsschool.com\/blog\/what-is-google-cloud-dataflow-and-use-cases-of-google-cloud-dataflow\/","title":{"rendered":"What is Google Cloud Dataflow and use cases of Google Cloud Dataflow?"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\">What is Google Cloud Dataflow?<\/h2>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-327-1024x536.png\" alt=\"\" class=\"wp-image-38155\" style=\"width:737px;height:386px\" width=\"737\" height=\"386\" srcset=\"https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-327-1024x536.png 1024w, https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-327-300x157.png 300w, https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-327-768x402.png 768w, https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-327.png 1200w\" sizes=\"auto, (max-width: 737px) 100vw, 737px\" \/><figcaption class=\"wp-element-caption\"><strong><em>What is Google Cloud Dataflow<\/em><\/strong><\/figcaption><\/figure>\n<\/div>\n\n\n<p>Google Cloud Dataflow is a fully managed data processing service provided by Google Cloud Platform. It allows you to design, deploy, and manage data processing pipelines for both batch and stream processing tasks. 
Dataflow offers a unified programming model that supports both batch processing (processing data in fixed-size chunks) and stream processing (processing data as it arrives).<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Top 10 Use Cases of Google Cloud Dataflow:<\/h2>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Real-Time Analytics:<\/strong> Perform real-time analytics on streaming data, extracting insights and patterns as data arrives.<\/li>\n\n\n\n<li><strong>ETL (Extract, Transform, Load):<\/strong> Ingest, transform, and load data from various sources into a target data store for analysis.<\/li>\n\n\n\n<li><strong>Data Enrichment:<\/strong> Enrich streaming or batch data with additional information from external sources, such as APIs or reference datasets.<\/li>\n\n\n\n<li><strong>Fraud Detection:<\/strong> Analyze transaction data in real-time to detect fraudulent activities and anomalies.<\/li>\n\n\n\n<li><strong>Clickstream Analysis:<\/strong> Analyze user clickstream data to gain insights into user behavior and website performance.<\/li>\n\n\n\n<li><strong>Log Analysis and Monitoring:<\/strong> Process logs from applications, servers, and devices in real-time to identify issues and troubleshoot problems.<\/li>\n\n\n\n<li><strong>IoT Data Processing:<\/strong> Process and analyze data from IoT devices, sensors, and connected devices in real-time.<\/li>\n\n\n\n<li><strong>Recommendation Engines:<\/strong> Build recommendation systems that provide personalized recommendations to users based on their preferences and behavior.<\/li>\n\n\n\n<li><strong>Market Basket Analysis:<\/strong> Analyze customer purchasing behavior to identify associations between products for cross-selling and upselling.<\/li>\n\n\n\n<li><strong>Data Quality and Cleansing:<\/strong> Cleanse and validate data in real-time or in batches to ensure data quality and accuracy.<\/li>\n<\/ol>\n\n\n\n<p>These use cases highlight the versatility of Google Cloud Dataflow in handling a wide range 
of data processing scenarios, whether they involve real-time streaming data or batch processing of large datasets. Dataflow&#8217;s ability to handle both modes of processing simplifies development and deployment for data engineering tasks.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">What are the features of Google Cloud Dataflow?<\/h2>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-329.png\" alt=\"\" class=\"wp-image-38157\" style=\"width:840px;height:279px\" width=\"840\" height=\"279\" srcset=\"https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-329.png 1000w, https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-329-300x100.png 300w, https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-329-768x255.png 768w\" sizes=\"auto, (max-width: 840px) 100vw, 840px\" \/><figcaption class=\"wp-element-caption\"><strong><em>Features of Google Cloud Dataflow<\/em><\/strong><\/figcaption><\/figure>\n<\/div>\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Unified Model:<\/strong> Dataflow provides a unified programming model for both batch and stream processing, simplifying development and reducing code duplication.<\/li>\n\n\n\n<li><strong>Auto-Scaling:<\/strong> Dataflow automatically scales up or down based on the processing requirements, ensuring optimal resource utilization and performance.<\/li>\n\n\n\n<li><strong>Managed Service:<\/strong> It&#8217;s a fully managed service, which means Google handles infrastructure provisioning, monitoring, and maintenance.<\/li>\n\n\n\n<li><strong>Windowing:<\/strong> For stream processing, Dataflow supports windowing, allowing you to group and analyze data within specific time intervals.<\/li>\n\n\n\n<li><strong>Exactly-Once Processing:<\/strong> Dataflow offers exactly-once processing semantics, ensuring that
data is processed reliably without duplication or loss.<\/li>\n\n\n\n<li><strong>Integration:<\/strong> Seamlessly integrates with other Google Cloud services like BigQuery, Pub\/Sub, Cloud Storage, and more.<\/li>\n\n\n\n<li><strong>Flexible Sinks and Sources:<\/strong> Supports various data sources and sinks, making it easy to ingest and export data to\/from different systems.<\/li>\n\n\n\n<li><strong>Monitoring and Logging:<\/strong> Provides comprehensive monitoring, logging, and debugging tools to help you understand and optimize your data pipelines.<\/li>\n\n\n\n<li><strong>Custom Transformations:<\/strong> You can create custom transformations and functions to perform complex data processing.<\/li>\n\n\n\n<li><strong>Templates:<\/strong> Dataflow allows you to create reusable templates for common processing patterns, simplifying pipeline deployment.<\/li>\n<\/ol>\n\n\n\n<h2 class=\"wp-block-heading\">How Google Cloud Dataflow Works and Architecture?<\/h2>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-326.png\" alt=\"\" class=\"wp-image-38154\" style=\"width:702px;height:395px\" width=\"702\" height=\"395\" srcset=\"https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-326.png 850w, https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-326-300x169.png 300w, https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-326-768x432.png 768w, https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-326-355x199.png 355w\" sizes=\"auto, (max-width: 702px) 100vw, 702px\" \/><figcaption class=\"wp-element-caption\"><strong><em>Google Cloud Dataflow Works and Architecture<\/em><\/strong><\/figcaption><\/figure>\n<\/div>\n\n\n<p>Google Cloud Dataflow processes data using a directed acyclic graph (DAG) of transformations. 
Here&#8217;s a simplified outline of how it works:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Pipeline Definition:<\/strong> You define your data processing pipeline using the Apache Beam SDK. This includes defining data sources, transformations, and data sinks.<\/li>\n\n\n\n<li><strong>Distributed Execution:<\/strong> Dataflow takes the pipeline definition and dynamically optimizes and distributes the work across a cluster of virtual machines.<\/li>\n\n\n\n<li><strong>Data Processing:<\/strong> Each element of input data goes through a series of transformations defined in the pipeline. Transformations can include mapping, filtering, aggregating, and more.<\/li>\n\n\n\n<li><strong>Parallelism and Scaling:<\/strong> Dataflow automatically scales up or down based on the processing needs. It breaks down data into smaller chunks and processes them in parallel to achieve efficient processing.<\/li>\n\n\n\n<li><strong>Windowing (Stream Processing):<\/strong> For stream processing, Dataflow supports windowing, allowing you to group data into time-based intervals for analysis.<\/li>\n\n\n\n<li><strong>Data Sinks:<\/strong> Processed data is sent to defined data sinks, which can be storage systems like Google Cloud Storage, BigQuery, or external systems.<\/li>\n\n\n\n<li><strong>Exactly-Once Processing:<\/strong> Dataflow ensures exactly-once processing by tracking the state of each element processed, making it resilient to failures.<\/li>\n\n\n\n<li><strong>Optimization:<\/strong> Dataflow optimizes the execution plan to minimize data shuffling and resource usage.<\/li>\n\n\n\n<li><strong>Monitoring and Debugging:<\/strong> Dataflow provides tools for monitoring pipeline progress and performance and for identifying bottlenecks or errors.<\/li>\n<\/ol>\n\n\n\n<p>Dataflow&#8217;s architecture abstracts much of the complexity of distributed data processing, allowing you to focus on defining your processing logic and transformations.
Under the hood, it uses Google&#8217;s internal data processing technology to efficiently manage resources and deliver reliable processing capabilities.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">How to Install Google Cloud Dataflow?<\/h2>\n\n\n\n<p>To install Google Cloud Dataflow, you will need to:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Create a Google Cloud Platform project and enable the Dataflow API.<\/li>\n\n\n\n<li>Install the Apache Beam SDK for your programming language.<\/li>\n\n\n\n<li>Create a Cloud Storage bucket to store your data and output files.<\/li>\n\n\n\n<li>Write your Dataflow pipeline code.<\/li>\n\n\n\n<li>Submit your Dataflow pipeline to the Dataflow service.<\/li>\n<\/ol>\n\n\n\n<p>Here are the detailed steps on how to install Google Cloud Dataflow:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Create a Google Cloud Platform project and enable the Dataflow API.<\/li>\n<\/ol>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Open the Google Cloud Platform Console: https:\/\/console.cloud.google.com\/.<\/li>\n\n\n\n<li>Click the <strong>Create Project<\/strong> button.<\/li>\n\n\n\n<li>Enter a project name and click the <strong>Create<\/strong> button.<\/li>\n\n\n\n<li>Click the <strong>APIs &amp; Services<\/strong> tab.<\/li>\n\n\n\n<li>Search for &#8220;Dataflow&#8221; and click the <strong>Enable<\/strong> button.<\/li>\n<\/ul>\n\n\n\n<p>    2. Install the Apache Beam SDK for your programming language.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>The Apache Beam SDK is available for a variety of programming languages, including Java, Python, Go, and JavaScript.<\/li>\n\n\n\n<li>You can download the SDK for your programming language from the Apache Beam website: https:\/\/beam.apache.org\/releases\/.<\/li>\n<\/ul>\n\n\n\n<p>    3. 
Create a Cloud Storage bucket to store your data and output files.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>A Cloud Storage bucket is a place to store your data and output files.<\/li>\n\n\n\n<li>You can create a Cloud Storage bucket from the Cloud Storage Console: https:\/\/console.cloud.google.com\/storage\/.<\/li>\n<\/ul>\n\n\n\n<p>    4. Write your Dataflow pipeline code.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Your Dataflow pipeline code is a program that describes how to process your data.<\/li>\n\n\n\n<li>You can write your Dataflow pipeline code in any programming language that supports the Apache Beam SDK.<\/li>\n\n\n\n<li>There are many examples of Dataflow pipeline code available online.<\/li>\n<\/ul>\n\n\n\n<p>    5. Submit your Dataflow pipeline to the Dataflow service.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Once you have written your Dataflow pipeline code, you can submit it to the Dataflow service.<\/li>\n\n\n\n<li>To do this, you will need to use the gcloud command-line tool: https:\/\/cloud.google.com\/sdk\/gcloud\/.<\/li>\n\n\n\n<li>For more information on how to submit a Dataflow pipeline, please see the Dataflow documentation: https:\/\/cloud.google.com\/dataflow\/docs\/quickstarts\/create-pipeline-java.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\">Basic Tutorials of Google Cloud Dataflow: Getting Started<\/h2>\n\n\n\n<p>Here are some step-by-step basic tutorials of Google Cloud Dataflow:<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-328.png\" alt=\"\" class=\"wp-image-38156\" style=\"width:669px;height:405px\" width=\"669\" height=\"405\" srcset=\"https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-328.png 1024w, https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-328-300x182.png 300w, 
https:\/\/www.devopsschool.com\/blog\/wp-content\/uploads\/2023\/08\/image-328-768x465.png 768w\" sizes=\"auto, (max-width: 669px) 100vw, 669px\" \/><figcaption class=\"wp-element-caption\"><strong><em>Basic Tutorials of Google Cloud Dataflow<\/em><\/strong><\/figcaption><\/figure>\n<\/div>\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Dataflow quickstart using Python:<\/strong> This tutorial shows you how to create a simple Dataflow pipeline using the Python SDK.<ol><li>Create a Google Cloud Platform project and enable the Dataflow API.<\/li><li>Install the Apache Beam SDK for Python.<\/li><li>Create a Cloud Storage bucket to store your data and output files.<\/li><li>Write your Dataflow pipeline code.<\/li><li>Submit your Dataflow pipeline to the Dataflow service.<\/li><\/ol>\n<ul class=\"wp-block-list\">\n<li>Here is an example of a simple Dataflow pipeline in Python:<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n<pre class=\"wp-block-code\"><code>import apache_beam as beam\n\n# The with block runs the pipeline automatically on exit,\n# so no separate pipeline.run() call is needed.\nwith beam.Pipeline() as pipeline:\n  (pipeline\n    | 'Read data' &gt;&gt; beam.io.ReadFromText('gs:\/\/my-bucket\/my-data.txt')\n    | 'Count words' &gt;&gt; beam.Map(lambda line: len(line.split()))\n    | 'Write results' &gt;&gt; beam.io.WriteToText('gs:\/\/my-bucket\/my-results.txt'))<\/code><\/pre>\n\n\n\n<p>This code reads the data from the file <code>gs:\/\/my-bucket\/my-data.txt<\/code>, counts the number of words in each line, and writes the results to the file <code>gs:\/\/my-bucket\/my-results.txt<\/code>.<\/p>\n\n\n\n<p>To submit the pipeline to the Dataflow service, run the pipeline file with the Dataflow runner and your project details (the <code>gcloud dataflow jobs run<\/code> command is only for template-based jobs):<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>python my_pipeline.py --runner DataflowRunner --project my-project --region us-central1 --temp_location gs:\/\/my-bucket\/temp<\/code><\/pre>\n\n\n\n<p>This command creates a Dataflow job that executes the pipeline defined in <code>my_pipeline.py<\/code>.<\/p>\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Dataflow quickstart using Java:<\/strong> This tutorial shows you how to create a simple Dataflow pipeline using the Java SDK.\n<ul class=\"wp-block-list\">\n<li>The steps are similar to the Python tutorial, but the code is written in Java.<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Dataflow quickstart using Go:<\/strong> This tutorial shows you how to create a simple Dataflow pipeline using the Go SDK.\n<ul class=\"wp-block-list\">\n<li>The steps are similar to the Python tutorial, but the code is written in Go.<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Dataflow quickstart using a template:<\/strong> This tutorial shows you how to create a Dataflow pipeline using a template.\n<ul class=\"wp-block-list\">\n<li>A template is a pre-written Dataflow pipeline that you can use as a starting point for your own pipelines.<\/li>\n\n\n\n<li>Google provides a library of prebuilt templates, and you can also create your own custom templates.<\/li>\n\n\n\n<li>You can customize a template&#8217;s behavior through its runtime parameters.<\/li>\n\n\n\n<li>Finally, you submit the template to the Dataflow service to run it.<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Dataflow with
Cloud Pub\/Sub:<\/strong> This tutorial shows you how to use Dataflow to process streaming data from Cloud Pub\/Sub.\n<ul class=\"wp-block-list\">\n<li>Cloud Pub\/Sub is a messaging service that can be used to send and receive streaming data.<\/li>\n\n\n\n<li>Dataflow can be used to process streaming data from Cloud Pub\/Sub in real time.<\/li>\n\n\n\n<li>To use Dataflow with Cloud Pub\/Sub, you first need to create a Cloud Pub\/Sub topic.<\/li>\n\n\n\n<li>Once you have created a topic, you can send data to the topic using the Cloud Pub\/Sub API.<\/li>\n\n\n\n<li>Dataflow can then be used to process the data from the topic in real time.<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Dataflow with BigQuery:<\/strong> This tutorial shows you how to use Dataflow to load data into BigQuery.\n<ul class=\"wp-block-list\">\n<li>BigQuery is a cloud data warehouse that can be used to store and analyze huge amounts of data.<\/li>\n\n\n\n<li>Dataflow can be used to load data into BigQuery in a batch or streaming fashion.<\/li>\n\n\n\n<li>To use Dataflow with BigQuery, you first need to create a BigQuery dataset.<\/li>\n\n\n\n<li>Once you have created a dataset, you can use Dataflow to load data into the dataset.<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>What is Google Cloud Dataflow? Google Cloud Dataflow is a fully managed data processing service provided by Google Cloud Platform.
It allows you to design, deploy, and manage data processing&#8230; <\/p>\n","protected":false},"author":25,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_joinchat":[],"footnotes":""},"categories":[2],"tags":[],"class_list":["post-38149","post","type-post","status-publish","format-standard","hentry","category-uncategorised"],"_links":{"self":[{"href":"https:\/\/www.devopsschool.com\/blog\/wp-json\/wp\/v2\/posts\/38149","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.devopsschool.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.devopsschool.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.devopsschool.com\/blog\/wp-json\/wp\/v2\/users\/25"}],"replies":[{"embeddable":true,"href":"https:\/\/www.devopsschool.com\/blog\/wp-json\/wp\/v2\/comments?post=38149"}],"version-history":[{"count":3,"href":"https:\/\/www.devopsschool.com\/blog\/wp-json\/wp\/v2\/posts\/38149\/revisions"}],"predecessor-version":[{"id":38162,"href":"https:\/\/www.devopsschool.com\/blog\/wp-json\/wp\/v2\/posts\/38149\/revisions\/38162"}],"wp:attachment":[{"href":"https:\/\/www.devopsschool.com\/blog\/wp-json\/wp\/v2\/media?parent=38149"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.devopsschool.com\/blog\/wp-json\/wp\/v2\/categories?post=38149"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.devopsschool.com\/blog\/wp-json\/wp\/v2\/tags?post=38149"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}