<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Mohamed Rasvi</title>
    <description>The latest articles on DEV Community by Mohamed Rasvi (@mohamed_rasvi_9f19a0ec9c9).</description>
    <link>https://dev.to/mohamed_rasvi_9f19a0ec9c9</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2167137%2F5797c591-1f65-4950-a81a-ad6001e8ccbe.jpg</url>
      <title>DEV Community: Mohamed Rasvi</title>
      <link>https://dev.to/mohamed_rasvi_9f19a0ec9c9</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/mohamed_rasvi_9f19a0ec9c9"/>
    <language>en</language>
    <item>
      <title>Deploy Kafka connector on GKE cluster</title>
      <dc:creator>Mohamed Rasvi</dc:creator>
      <pubDate>Mon, 14 Oct 2024 17:12:27 +0000</pubDate>
      <link>https://dev.to/mohamed_rasvi_9f19a0ec9c9/deploy-kafka-connector-on-gke-cluster-4g0k</link>
      <guid>https://dev.to/mohamed_rasvi_9f19a0ec9c9/deploy-kafka-connector-on-gke-cluster-4g0k</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fae1qe8cafm18o0oiydqn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fae1qe8cafm18o0oiydqn.png" alt="Kafkta connector deploy on GKE and sync with pubsub" width="800" height="502"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here are some scenarios in which you might use the Pub/Sub Group Kafka Connector:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;You are migrating a Kafka-based architecture to Google Cloud.&lt;/li&gt;
&lt;li&gt;You have a frontend system that stores events in Kafka outside of Google Cloud, but you also use Google Cloud to run some of your backend services, which need to receive the Kafka events.&lt;/li&gt;
&lt;li&gt;You collect logs from an on-premises Kafka solution and send them to Google Cloud for data analytics.&lt;/li&gt;
&lt;li&gt;You have a frontend system that uses Google Cloud, but you also store data on-premises using Kafka.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;As with Kafka, you can use Pub/Sub to communicate between components in your cloud architecture.&lt;/p&gt;

&lt;p&gt;The Pub/Sub Group Kafka Connector allows you to integrate these two systems. The following connectors are packaged in the Connector JAR:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The sink connector reads records from one or more Kafka topics and publishes them to Pub/Sub.&lt;/li&gt;
&lt;li&gt;The source connector reads messages from a Pub/Sub topic and publishes them to Kafka.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In this document, we walk through how to set up the &lt;strong&gt;sink connector&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://cloud.google.com/pubsub/docs/connect_kafka#config-options" rel="noopener noreferrer"&gt;More information&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This section walks you through the following tasks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Configure the Pub/Sub Group Kafka Connector.&lt;/li&gt;
&lt;li&gt;Send events from Kafka to Pub/Sub.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Authenticate&lt;/strong&gt;&lt;br&gt;
The Pub/Sub Group Kafka Connector must authenticate with Pub/Sub in order to publish messages. To set up authentication, perform the following steps:&lt;/p&gt;

&lt;p&gt;Grant the required IAM role, &lt;code&gt;roles/pubsub.admin&lt;/code&gt;, to your Google service account.&lt;br&gt;
&lt;/p&gt;
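&lt;p&gt;As an illustrative sketch only, granting the role with the gcloud CLI could look like this; the project ID and service-account email are placeholders, not values from this setup:&lt;/p&gt;

```shell
# Placeholders: replace PROJECT_ID and the service-account email with your own.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:kafka-connector@PROJECT_ID.iam.gserviceaccount.com" \
    --role="roles/pubsub.admin"
```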

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git clone https://github.com/googleapis/java-pubsub-group-kafka-connector.git
cd java-pubsub-group-kafka-connector
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Download the connector JAR, then copy the connector's configuration files into your Kafka installation:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://repo1.maven.org/maven2/com/google/cloud/pubsub-group-kafka-connector/0.1.2/pubsub-group-kafka-connector-0.1.2.jar" rel="noopener noreferrer"&gt;Download JAR&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cp config/* [path to Kafka installation]/config/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Update your Kafka Connect configuration&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Navigate to your Kafka directory.&lt;/li&gt;
&lt;li&gt;Open the file named config/connect-standalone.properties in a text editor.&lt;/li&gt;
&lt;li&gt;If the plugin.path property is commented out, uncomment it.&lt;/li&gt;
&lt;li&gt;Update the plugin.path property to include the path to the connector JAR. Example:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;plugin.path=/home/PubSubKafkaConnector/pubsub-group-kafka-connector-1.0.0.jar
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Set the offset.storage.file.filename property to a local file name. In standalone mode, Kafka uses this file to store offset data.&lt;br&gt;
Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;offset.storage.file.filename=/tmp/connect.offsets
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Forward events from Kafka to Pub/Sub&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open the file &lt;code&gt;config/cps-sink-connector.properties&lt;/code&gt; in a text editor. Add values for the following properties, which are marked "TODO" in the comments:
&lt;ul&gt;
&lt;li&gt;&lt;code&gt;topics=KAFKA_TOPICS&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;cps.project=PROJECT_ID&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;cps.topic=PUBSUB_TOPIC&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;gcp.credentials.file.path=PATH&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;gcp.credentials.json=JSON_FILE&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Replace the following:&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;KAFKA_TOPICS: A comma-separated list of Kafka topics to read from.&lt;/li&gt;
&lt;li&gt;PROJECT_ID: The Google Cloud project that contains your Pub/Sub topic.&lt;/li&gt;
&lt;li&gt;PUBSUB_TOPIC: The Pub/Sub topic that receives the messages from Kafka.&lt;/li&gt;
&lt;li&gt;PATH: Optional. The path to a file that stores Google Cloud credentials for authenticating with Pub/Sub (the official doc mentions Pub/Sub Lite, but the option applies to the Pub/Sub connector as well).&lt;/li&gt;
&lt;li&gt;JSON_FILE: Optional. A JSON blob that contains Google Cloud credentials for authenticating with Pub/Sub (again, documented for Pub/Sub Lite, but it applies to the Pub/Sub connector as well).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/pubsub/docs/connect_kafka#config-options" rel="noopener noreferrer"&gt;More info about settings &lt;/a&gt;&lt;br&gt;
File: cps-sink-connector.properties&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Copyright 2022 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Unique name for the Pub/Sub sink connector.
name=CPSSinkConnector
# The Java class for the Pub/Sub sink connector.
connector.class=com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
# The maximum number of tasks that should be created for this connector.
tasks.max=10
# Set the key converter for the Pub/Sub sink connector.
key.converter=org.apache.kafka.connect.storage.StringConverter
# Set the value converter for the Pub/Sub sink connector.
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
# A comma-separated list of Kafka topics to use as input for the connector.
# TODO (developer): update to your Kafka topic name(s).
topics=scotia-wealth
# TODO (developer): update to your Google Cloud project ID.
cps.project=PROJECT-ID
# TODO (developer): update to your Pub/Sub topic ID, i.e.
# where data should be written.
cps.topic=kafka-topic-consumer
# Optional. A JSON credentials file path, or an inline JSON blob, for authenticating with Pub/Sub.
gcp.credentials.file.path=PATH
gcp.credentials.json=JSON_FILE
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
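&lt;p&gt;Before starting the connector, the Pub/Sub topic named in &lt;code&gt;cps.topic&lt;/code&gt; (and a subscription for reading the results back) must already exist. A sketch with placeholder names:&lt;/p&gt;

```shell
# Placeholder names: the topic must match cps.topic in the properties file.
gcloud pubsub topics create PUBSUB_TOPIC
gcloud pubsub subscriptions create PUBSUB_SUBSCRIPTION --topic=PUBSUB_TOPIC
```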



&lt;ol&gt;
&lt;li&gt;From the Kafka directory, run the following command:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;bin/connect-standalone.sh \
  config/connect-standalone.properties \
  config/cps-sink-connector.properties
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Follow the steps in the Apache Kafka quickstart to write some events to your Kafka topic.&lt;/li&gt;
&lt;li&gt;Use the gcloud CLI to read the events from Pub/Sub: &lt;code&gt;gcloud pubsub subscriptions pull PUBSUB_SUBSCRIPTION --auto-ack&lt;/code&gt;&lt;/li&gt;
&lt;/ol&gt;
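&lt;p&gt;For step 1, the quickstart's console producer is enough; for example (the topic name is a placeholder, assuming a broker on localhost:9092):&lt;/p&gt;

```shell
# Type test events; each line becomes a Kafka record the sink connector forwards to Pub/Sub.
bin/kafka-console-producer.sh --topic KAFKA_TOPIC --bootstrap-server localhost:9092
```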

&lt;p&gt;&lt;a href="https://kafka.apache.org/quickstart" rel="noopener noreferrer"&gt;more info about kafka&lt;/a&gt;&lt;br&gt;
GCP repo :&lt;a href="https://github.com/googleapis/java-pubsub-group-kafka-connector" rel="noopener noreferrer"&gt;https://github.com/googleapis/java-pubsub-group-kafka-connector&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fya2nkdj4ne57z9q368i7.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fya2nkdj4ne57z9q368i7.gif" alt="Yes, we have tested the Kafka connector locally with Pub/Sub." width="306" height="284"&gt;&lt;/a&gt;&lt;br&gt;
OK, we have tested locally. Now it is time to containerize the Kafka connector and deploy it on GKE.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Deploy the Kafka connector on GKE as follows&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Build the image from the following base image:
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://github.com/bitnami/containers/tree/main/bitnami/kafka
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Build a custom image that includes the Pub/Sub Group Kafka Connector JAR.&lt;br&gt;
Example Dockerfile:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM bitnami/kafka:3.4.0-debian-11-r15 AS build-stage
COPY YOUR_pubsubKafkaConnector_FOLDER /opt/bitnami/kafka/
COPY setup /opt/bitnami/kafka/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Create a Pub/Sub topic and a subscription.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Deploy the Helm chart:&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Installing the Chart

Build the image from Dockerfile
## Prerequisites
https://cloud.google.com/artifact-registry/docs/docker/pushing-and-pulling

docker build -t LOCATION-docker.pkg.dev/PROJECT-ID/REPOSITORY/IMAGE:TAG .
docker push LOCATION-docker.pkg.dev/PROJECT-ID/REPOSITORY/IMAGE:TAG

## Helm installs the Kafka instance with a pub/sub connector from a custom image (a custom image that we have built previously)
helm repo add bitnami https://charts.bitnami.com/bitnami

change the docker image in kafka chart (kafka/value.yaml)

## Bitnami Kafka image version
## ref: https://hub.docker.com/r/bitnami/kafka/tags/
## @param image.registry Kafka image registry
## @param image.repository Kafka image repository
## @param image.tag Kafka image tag (immutable tags are recommended)
## @param image.digest Kafka image digest in the way sha256:aa.... Please note this parameter, if set, will override the tag
## @param image.pullPolicy Kafka image pull policy
## @param image.pullSecrets Specify docker-registry secret names as an array
## @param image.debug Specify if debug values should be set
##
image:
  registry: LOCATION-docker.pkg.dev/PROJECT-ID/REPOSITORY
  repository: IMAGE
  tag: TAG
  digest: ""
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After deploying the Helm chart, your GKE cluster looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;1. Statefulset
     Kafka instance 
     Kafka zookeeper

2. K8s service 
     Exposed GKE service to publish and consume (kafka topic and messages)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Impersonate the GCP service account via Workload Identity&lt;/p&gt;

&lt;p&gt;Use a service account that has permission to publish messages; the Kubernetes SA will impersonate the GCP SA via Workload Identity. &lt;a href="https://cloud.google.com/kubernetes-engine/docs/how-to/workload-identity" rel="noopener noreferrer"&gt;more info&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;## Workload identity to impersonate gcp SA
gcloud iam service-accounts add-iam-policy-binding $GSA \
    --role roles/iam.workloadIdentityUser \
    --member "serviceAccount:$PROJECT_ID.svc.id.goog[kafka/kafka]"


## Annotate k8s serviceaccount kafka
kubectl annotate serviceaccount kafka \
    --namespace kafka \
    iam.gke.io/gcp-service-account=$GSA
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Create the ConfigMap from the previously downloaded configuration files.&lt;/li&gt;
&lt;li&gt;Following the example, we save the configuration files in a folder named &lt;code&gt;config&lt;/code&gt;:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;kubectl create configmap pubsub-connector --from-file=config/ -n kafka
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
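&lt;p&gt;A minimal sketch of mounting that ConfigMap into the connector pod; the container name and mount path here are illustrative, not taken from the Bitnami chart:&lt;/p&gt;

```yaml
# Hypothetical pod spec fragment mounting the pubsub-connector ConfigMap.
spec:
  containers:
    - name: kafka
      volumeMounts:
        - name: connector-config
          mountPath: /opt/bitnami/kafka/config/connector
  volumes:
    - name: connector-config
      configMap:
        name: pubsub-connector
```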



</description>
      <category>distributedsystems</category>
      <category>eventdriven</category>
      <category>kafka</category>
    </item>
    <item>
      <title>AI powered chat agent to manage public cloud</title>
      <dc:creator>Mohamed Rasvi</dc:creator>
      <pubDate>Wed, 09 Oct 2024 20:17:23 +0000</pubDate>
      <link>https://dev.to/mohamed_rasvi_9f19a0ec9c9/ai-powered-chat-agent-to-manage-public-cloud-i3l</link>
      <guid>https://dev.to/mohamed_rasvi_9f19a0ec9c9/ai-powered-chat-agent-to-manage-public-cloud-i3l</guid>
      <description>&lt;p&gt;I'm thinking to launch a product, AI powered chat aget to manage public cloud infrastructure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://mohamedrasvi.pythonanywhere.com/" rel="noopener noreferrer"&gt;AI powered chat agent to manage public cloud&lt;/a&gt;&lt;br&gt;
Join Our Waiting List!&lt;br&gt;
Be the first to know when our product launches. Sign up now!&lt;br&gt;
&lt;a href="https://mohamedrasvi.pythonanywhere.com/" rel="noopener noreferrer"&gt;AI powered chat agent to manage public cloud&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>chatagent</category>
      <category>googlecloud</category>
      <category>devops</category>
    </item>
    <item>
      <title>Deploying a stateless container on cloud run</title>
      <dc:creator>Mohamed Rasvi</dc:creator>
      <pubDate>Mon, 07 Oct 2024 21:43:21 +0000</pubDate>
      <link>https://dev.to/mohamed_rasvi_9f19a0ec9c9/deploying-a-stateless-container-in-cloud-run-2j11</link>
      <guid>https://dev.to/mohamed_rasvi_9f19a0ec9c9/deploying-a-stateless-container-in-cloud-run-2j11</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fae1vhb4di8q7mzdsq3ad.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fae1vhb4di8q7mzdsq3ad.png" alt="Deploying a stateless container on cloud run"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I will demonstrate how to deploy a simple container on Cloud Run.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Cloud Run is a fully managed platform that enables you to run your code directly on top of Google’s scalable infrastructure. Cloud Run is simple, automated, and designed to make you more productive.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;ol&gt;
&lt;li&gt;Create a simple hello world application using the FastAPI library (Python)&lt;/li&gt;
&lt;li&gt;Containerize the application&lt;/li&gt;
&lt;li&gt;Configure the workflow with GCP&lt;/li&gt;
&lt;li&gt;Deploy the container to the Cloud Run service via a GitHub workflow&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I followed the official FastAPI &lt;a href="https://fastapi.tiangolo.com/deployment/docker/" rel="noopener noreferrer"&gt;doc&lt;/a&gt; to spin up a hello world app.&lt;br&gt;
Create a &lt;strong&gt;&lt;u&gt;requirements.txt&lt;/u&gt;&lt;/strong&gt; file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;fastapi[standard]
pydantic&amp;gt;=2.7.0,&amp;lt;3.0.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Create an app directory and enter it&lt;/li&gt;
&lt;li&gt;Create an empty file &lt;code&gt;__init__.py&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Create a main.py file with:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from typing import Union

from fastapi import FastAPI

app = FastAPI()


@app.get("/")
def read_root():
    return {"Hello": "World"}


@app.get("/items/{item_id}")
def read_item(item_id: int, q: Union[str, None] = None):
    return {"item_id": item_id, "q": q}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create a Dockerfile&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM python:3.9

WORKDIR /code

COPY ./requirements.txt /code/requirements.txt

RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt

COPY ./app /code/app

CMD ["fastapi", "run", "app/main.py", "--port", "80"]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
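&lt;p&gt;To try the image locally before wiring up the workflow (the tag name is illustrative), something like:&lt;/p&gt;

```shell
# Build and run locally; the app listens on port 80 inside the container.
docker build -t fastapi-hello .
docker run --rm -p 8080:80 fastapi-hello
# Then open http://localhost:8080/
```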



&lt;p&gt;&lt;strong&gt;GitHub Action&lt;/strong&gt;&lt;br&gt;
In order for the GitHub Actions process to pick up the YAML file, there's a specific location for it to live: each repository using Actions requires a directory structure called &lt;code&gt;/.github/workflows&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Configure this workflow with GCP (&lt;a href="https://github.com/actions/starter-workflows/blob/main/deployments/google-cloudrun-docker.yml#L5" rel="noopener noreferrer"&gt;more info&lt;/a&gt;)&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# This workflow build and push a Docker container to Google Artifact Registry
# and deploy it on Cloud Run when a commit is pushed to the $default-branch
# branch.
#
# To configure this workflow:
#
# 1. Enable the following Google Cloud APIs:
#
#    - Artifact Registry (artifactregistry.googleapis.com)
#    - Cloud Run (run.googleapis.com)
#    - IAM Credentials API (iamcredentials.googleapis.com)
#
#    You can learn more about enabling APIs at
#    https://support.google.com/googleapi/answer/6158841.
#
# 2. Create and configure a Workload Identity Provider for GitHub:
#    https://github.com/google-github-actions/auth#preferred-direct-workload-identity-federation.
#
#    Depending on how you authenticate, you will need to grant an IAM principal
#    permissions on Google Cloud:
#
#    - Artifact Registry Administrator (roles/artifactregistry.admin)
#    - Cloud Run Developer (roles/run.developer)
#
#    You can learn more about setting IAM permissions at
#    https://cloud.google.com/iam/docs/manage-access-other-resources
#
# 3. Change the values in the "env" block to match your values.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create a file google-cloudrun-docker.yml&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: 'Build and Deploy to Cloud Run'

on:
  push:
    branches:
      - '$default-branch'

env:
  PROJECT_ID: 'my-project' # TODO: update to your Google Cloud project ID
  REGION: 'us-central1' # TODO: update to your region
  SERVICE: 'my-service' # TODO: update to your service name
  WORKLOAD_IDENTITY_PROVIDER: 'projects/123456789/locations/global/workloadIdentityPools/my-pool/providers/my-provider' # TODO: update to your workload identity provider

jobs:
  deploy:
    runs-on: 'ubuntu-latest'

    permissions:
      contents: 'read'
      id-token: 'write'

    steps:
      - name: 'Checkout'
        uses: 'actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332' # actions/checkout@v4

      # Configure Workload Identity Federation and generate an access token.
      #
      # See https://github.com/google-github-actions/auth for more options,
      # including authenticating via a JSON credentials file.
      - id: 'auth'
        name: 'Authenticate to Google Cloud'
        uses: 'google-github-actions/auth@f112390a2df9932162083945e46d439060d66ec2' # google-github-actions/auth@v2
        with:
          workload_identity_provider: '${{ env.WORKLOAD_IDENTITY_PROVIDER }}'

      # BEGIN - Docker auth and build
      #
      # If you already have a container image, you can omit these steps.
      - name: 'Docker Auth'
        uses: 'docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567' # docker/login-action@v3
        with:
          username: 'oauth2accesstoken'
          password: '${{ steps.auth.outputs.auth_token }}'
          registry: '${{ env.REGION }}-docker.pkg.dev'

      - name: 'Build and Push Container'
        run: |-
          DOCKER_TAG="${{ env.REGION }}-docker.pkg.dev/${{ env.PROJECT_ID }}/${{ env.SERVICE }}:${{ github.sha }}"
          docker build --tag "${DOCKER_TAG}" .
          docker push "${DOCKER_TAG}"
      - name: 'Deploy to Cloud Run'

        # END - Docker auth and build

        uses: 'google-github-actions/deploy-cloudrun@33553064113a37d688aa6937bacbdc481580be17' # google-github-actions/deploy-cloudrun@v2
        with:
          service: '${{ env.SERVICE }}'
          region: '${{ env.REGION }}'
          # NOTE: If using a pre-built image, update the image name below:

          image: '${{ env.REGION }}-docker.pkg.dev/${{ env.PROJECT_ID }}/${{ env.SERVICE }}:${{ github.sha }}'
      # If required, use the Cloud Run URL output in later steps
      - name: 'Show output'
        run: |2-

          echo ${{ steps.deploy.outputs.url }}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Directory Structure&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;You should now have a directory structure like:&lt;br&gt;
&lt;/p&gt;
&lt;/blockquote&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;├── app
│   ├── __init__.py
│   └── main.py
├── Dockerfile
└── requirements.txt
└── requirements.txt
├── .github
│   ├── workflows
         ├── google-cloudrun-docker.yml


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
&amp;gt; 1. Create a new repo in gitHUb
&amp;gt; 2. Push your exisisting code to new repository on default branch
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>devops</category>
      <category>python</category>
      <category>fastapi</category>
      <category>containerapps</category>
    </item>
    <item>
      <title>Google identity Platform</title>
      <dc:creator>Mohamed Rasvi</dc:creator>
      <pubDate>Fri, 04 Oct 2024 17:01:34 +0000</pubDate>
      <link>https://dev.to/mohamed_rasvi_9f19a0ec9c9/google-idenity-platform-40gn</link>
      <guid>https://dev.to/mohamed_rasvi_9f19a0ec9c9/google-idenity-platform-40gn</guid>
      <description>&lt;p&gt;This post I would like to explain how google identity platform works in a simple manner. &lt;/p&gt;

&lt;p&gt;Most of the time, Google Identity Platform works as a proxy in front of configured identity providers. It is a little different from Azure AD, in that you can create your own SAML and OIDC identity providers.&lt;/p&gt;

&lt;p&gt;For user authentication in this article, we will use Google as a federated identity provider.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Few625rupla7zfd7ltqf0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Few625rupla7zfd7ltqf0.png" alt="Image description" width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The user flow above, in detail:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The user accesses the web application&lt;/li&gt;
&lt;li&gt;The application checks whether the user is logged in&lt;/li&gt;
&lt;li&gt;If the user is not authenticated, the application redirects to the federated identity provider: Google&lt;/li&gt;
&lt;li&gt;Once the user is authenticated successfully, Google redirects to the &lt;strong&gt;/__/auth/handler&lt;/strong&gt; endpoint; this service is hosted by Google&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;/__/auth/handler&lt;/strong&gt; service redirects back to the application&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;We need to understand the five main concepts behind Google Identity Platform.&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;ol&gt;
&lt;li&gt;Authentication&lt;/li&gt;
&lt;li&gt;Users&lt;/li&gt;
&lt;li&gt;Admin Auth API&lt;/li&gt;
&lt;li&gt;Multi-tenancy&lt;/li&gt;
&lt;li&gt;Differences between Identity Platform and Firebase Authentication&lt;/li&gt;
&lt;/ol&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;I will explain only the most important of these concepts in this post, to keep the article short and simple.&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Authentication&lt;/strong&gt;&lt;br&gt;
Identity Platform allows users to authenticate to your apps and services, like multi-tenant SaaS apps, mobile/web apps, games, APIs and more. Identity Platform provides secure, easy-to-use authentication if you're building a service on Google Cloud, on your own backend or on another platform.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How does it work?&lt;/strong&gt;&lt;br&gt;
To sign a user into your app, you first get authentication credentials from the user. These credentials can be the user's &lt;br&gt;
email address and password, a SAML assertion, or an OAuth token from a federated identity provider. &lt;br&gt;
In the case of federated identity providers, the providers return those tokens to Identity Platform's authentication handler on the &lt;strong&gt;/__/auth/handler&lt;/strong&gt; endpoint. This service is hosted by Google, so you don't have to receive and validate the authentication artifact. After the tokens are received, our backend services will verify them and return a response to the client.&lt;br&gt;
After a successful sign in, you can access the user's basic profile information, and you can control the user's access to data stored in Google Cloud or other products. You can also use the provided authentication token to verify the identity of users in your own backend services.&lt;/p&gt;
&lt;/blockquote&gt;
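&lt;p&gt;As a side note, the token the handler works with is a JWT: three base64url segments (header, payload, signature). Here is a minimal Python sketch of inspecting a payload; the toy token below is fabricated purely for illustration, and real code must verify the signature (the Identity Platform SDKs do that for you):&lt;/p&gt;

```python
import base64
import json

def decode_jwt_payload(token):
    """Decode the payload segment of a JWT WITHOUT verifying the signature.

    For illustration only; production code must verify the signature.
    """
    payload_b64 = token.split(".")[1]
    # Restore the base64url padding that JWT encoding strips.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def _b64(obj):
    # base64url-encode a dict, stripping padding as JWTs do.
    return base64.urlsafe_b64encode(json.dumps(obj).encode()).rstrip(b"=").decode()

# A toy token (header.payload.signature) fabricated for this example.
toy_token = ".".join([_b64({"alg": "RS256"}), _b64({"sub": "user-123"}), "sig"])
print(decode_jwt_payload(toy_token))  # {'sub': 'user-123'}
```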

&lt;p&gt;You can read about all of the above concepts &lt;a href="https://cloud.google.com/identity-platform/docs/concepts" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Setting up a Google IdP on Google Identity Platform&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to Google Identity Platform&lt;/li&gt;
&lt;li&gt;Click Add A Provider and select Google&lt;/li&gt;
&lt;li&gt;Enter your Google Web Client ID and Web Secret. If you don't already have an ID and secret, you can obtain them from the &lt;a href="https://console.cloud.google.com/apis/credentials?_ga=2.238208832.11633201.1728050152-1979371586.1726089943" rel="noopener noreferrer"&gt;APIs &amp;amp; Services&lt;/a&gt; page.&lt;/li&gt;
&lt;li&gt;Configure the URI listed under Configure Google as a valid OAuth redirect URI for your Google app. If you configured a custom domain in Identity Platform, update the redirect URI in your Google app configuration to use the custom domain instead of the default domain. For example, change &lt;a href="https://myproject.firebaseapp.com/__/auth/handler" rel="noopener noreferrer"&gt;https://myproject.firebaseapp.com/__/auth/handler&lt;/a&gt; to &lt;a href="https://auth.myownpersonaldomain.com/__/auth/handler" rel="noopener noreferrer"&gt;https://auth.myownpersonaldomain.com/__/auth/handler&lt;/a&gt;.
This is a point of confusion in the Google Cloud documentation: because we often skip the concepts, we assume our application must host this callback URL, &lt;strong&gt;/__/auth/handler&lt;/strong&gt;, but we don't need to, because &lt;strong&gt;/__/auth/handler&lt;/strong&gt; is hosted by Google; it is a Google-hosted backend service that does the magic (creating the JWT, and so on).&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Register your app's domains by clicking Add Domain under Authorized Domains. For development purposes, localhost is already enabled by default.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click Save.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I'm not going to show the entire code; here are just a few example snippets.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { getAuth, signInWithRedirect, getRedirectResult, GoogleAuthProvider } from "firebase/auth";

const provider = new GoogleAuthProvider();
const auth = getAuth();
signInWithRedirect(auth, provider);
getRedirectResult(auth)
  .then((result) =&amp;gt; {
    // This gives you a Google Access Token. You can use it to access Google APIs.
    const credential = GoogleAuthProvider.credentialFromResult(result);
    const token = credential.accessToken;
    // The signed-in user info.
    const user = result.user;
    // IdP data available using getAdditionalUserInfo(result)
    // ...
  }).catch((error) =&amp;gt; {
    // Handle Errors here.
    const errorCode = error.code;
    const errorMessage = error.message;
    // The email of the user's account used.
    const email = error.customData.email;
    // The AuthCredential type that was used.
    const credential = GoogleAuthProvider.credentialFromError(error);
    // ...
  });
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/identity-platform/docs/web/google#signing_in_users_with_the" rel="noopener noreferrer"&gt;Signing in users with the Client SDK&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let me know if you want to dive deep into the Google identity platform.&lt;/p&gt;

&lt;p&gt;For example, integration with gateways like Kong. &lt;/p&gt;

</description>
      <category>googlecloud</category>
      <category>identity</category>
      <category>oauth</category>
      <category>devops</category>
    </item>
  </channel>
</rss>
