Alvaro David
GCP Cloud Logging: The Basis

Recently the Google Cloud Platform Blog posted an interesting article, Tips and tricks for using new RegEx support in Cloud Logging. RegEx support is a powerful feature, but many developers are not even familiar with Cloud Logging itself!

Logging and monitoring are an important part of any project: you have to know what is going on inside your application, and I'm not just talking about a print(), console.log(), or System.out.println().

Rich log content (anonymized, of course, and compliant with data protection laws) helps us understand the behavior of our applications and improve them.

So, this post is about how to create and manage logs on GCP Cloud Logging in a simple, developer-friendly way.

Architecture

This architecture shows a basic pipeline for our logs.


Architecture description:

  • Cloud Run: The example app is packaged as a Docker container, and Cloud Run is the easiest way to deploy a container on GCP.

  • Cloud Logging: Here is where we are going to manage our logs.

  • Logs export (Sink): One advantage of using Cloud Logging is the ability to export our logs to:

    • Pub/Sub: Basically, we can export our logs wherever we want; just publish the log and we are done.
    • BigQuery: Logs can also be analyzed to get insights about our applications, and we can even create ML models with BigQuery ML.
    • Cloud Storage: Store our logs cheaply; this can be a good option for system audits.
  • Create Metrics: We can create dashboards and alerts based on logs. For example, if we get too many 403 errors, we can trigger an alert that sends us an email or a notification.
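To give a taste of what a sink looks like, here is a sketch of exporting logs to BigQuery with gcloud (the dataset name `logs_export` and sink name `my-bq-sink` are just examples):

```shell
# Create a BigQuery dataset to receive the logs (name is an example).
bq mk logs_export

# Create the sink; only entries matching the filter are exported.
gcloud logging sinks create my-bq-sink \
  bigquery.googleapis.com/projects/$PROJECT_ID/datasets/logs_export \
  --log-filter='resource.type="cloud_run_revision"'

# The command prints a service account identity; grant it
# BigQuery Data Editor on the dataset so the sink can write.
```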

This post focuses only on Cloud Logging; for the other resources I'll leave documentation links at the end of the post, or write more posts about them if you want :)

GCP Resources and permissions

The Cloud Logging client can be used anywhere (on-premise, for example), and the best way to represent that is with Docker. We do, however, need to enable the Cloud Logging API and create a Service Account with the logWriter role for our application.



```bash
# Get project information
export PROJECT_ID=$(gcloud config list \
  --format 'value(core.project)')
export PROJECT_NUMBER=$(gcloud projects list \
  --filter="$PROJECT_ID" \
  --format="value(PROJECT_NUMBER)")

# Enable services on GCP project
gcloud services enable logging.googleapis.com

# Creating a Service Account for our API
gcloud iam service-accounts create logging-basis-cred \
  --description "Service Account for Logging Basis Service" \
  --display-name "Logging Basis Service Account"

gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member=serviceAccount:logging-basis-cred@$PROJECT_ID.iam.gserviceaccount.com \
  --role="roles/logging.logWriter"

gcloud iam service-accounts keys create logging-basis-cred.json \
  --iam-account logging-basis-cred@$PROJECT_ID.iam.gserviceaccount.com

# For CI/CD you can encrypt this service account key
# using KMS (https://cloud.google.com/security-key-management).

# NEVER commit a service account key to a repository.
```
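When running outside GCP (locally or on-premise), the client library finds the key through the standard GOOGLE_APPLICATION_CREDENTIALS environment variable. A minimal sketch, assuming the key file created above sits in the current directory:

```shell
# Point the client library at the key for local / on-prem runs.
export GOOGLE_APPLICATION_CREDENTIALS="$PWD/logging-basis-cred.json"

# For a container, mount the key and set the same variable, e.g.:
#   docker run -v "$PWD/logging-basis-cred.json:/creds/key.json" \
#     -e GOOGLE_APPLICATION_CREDENTIALS=/creds/key.json \
#     gcr.io/$PROJECT_ID/logging-basis:v0.1
```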

The Code

I use Python in this example, but you can use whichever client library you prefer; the concepts are the same.

main.py



```python
from flask import Flask
from google.cloud import logging
from google.cloud.logging.resource import Resource
import random
import os

app = Flask(__name__)

# resource_type: "cloud_run_revision" (from Dockerfile),
# the GCP resource that we use.
# If not set, this log can be found in the Cloud Logging
# console under 'Custom Logs', or using the less efficient
# "global" restriction.
resource_type = os.environ['RESOURCE_TYPE']

# service_name: "logging-basis" (from Dockerfile),
# service_name is a Cloud Run property
service_name = os.environ['SERVICE']

# region: "us-east1" (from Dockerfile)
region = os.environ['REGION']

# Log name
log_name = 'choose_side'

# Cloud Logging config (this could be a class)
logging_client = logging.Client()
logger = logging_client.logger(log_name)
resource = Resource(
    type=resource_type,
    labels={
        "service_name": service_name,
        "location": region
    })


@app.route("/", methods=['GET'])
def choose_side():
    """
    This method chooses a random side and returns a simple message.
    :return: String message to the client
    """

    # Getting the side
    side_random = random.randrange(2)
    side = 'Dark side' if side_random == 0 else 'Light side'
    struct = {
        'sideRandom': side_random,
        'side': side,
    }

    # Sending the structured log to Cloud Logging
    logger.log_struct(struct, resource=resource, severity='INFO')

    # Response
    return "You're the {} [{}]".format(side, side_random)
```
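If you want to poke at the payload without a GCP client, the struct-building logic can be exercised on its own. A small sketch (`choose_side_payload` is just a name I'm using here, not part of main.py):

```python
import random


def choose_side_payload():
    """Build the same structured payload that the service logs."""
    side_random = random.randrange(2)  # 0 or 1
    side = 'Dark side' if side_random == 0 else 'Light side'
    return {'sideRandom': side_random, 'side': side}


payload = choose_side_payload()
print(payload)  # e.g. {'sideRandom': 0, 'side': 'Dark side'}
```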

Dockerfile, requirements, and other files are in the GitHub repository for this post: https://github.com/AlvarDev/Stackdriver-Loggin-Basis
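For reference, a minimal Dockerfile for this service could look like the sketch below; the exact file is in the repository. The ENV values mirror the variables main.py reads, and using gunicorn on port 8080 (Cloud Run's default) is an assumption on my part:

```dockerfile
FROM python:3.8-slim

# Values read by main.py via os.environ
ENV RESOURCE_TYPE=cloud_run_revision
ENV SERVICE=logging-basis
ENV REGION=us-east1

WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY main.py .

# Cloud Run sends traffic to port 8080 by default.
CMD exec gunicorn --bind :8080 main:app
```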

Build and deploy

This part is not required by Cloud Logging; I just wanted to show you how easy it is to deploy a container on GCP :)



```bash
gcloud auth configure-docker

docker build -t gcr.io/$PROJECT_ID/logging-basis:v0.1 .

docker push gcr.io/$PROJECT_ID/logging-basis:v0.1

gcloud run deploy logging-basis \
  --image gcr.io/$PROJECT_ID/logging-basis:v0.1 \
  --region us-east1 \
  --platform managed \
  --allow-unauthenticated

# Expected response:
# Service [logging-basis] revision [logging-basis-00001-tij]
# has been deployed and is serving 100 percent of traffic at
# https://logging-basis-[your-cloud-run-hash]-ue.a.run.app
```




Test

The code is working:

Dark side
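To hit the service from a terminal, something along these lines works (the URL comes from the deploy output; here I look it up with gcloud instead of pasting it):

```shell
SERVICE_URL=$(gcloud run services describe logging-basis \
  --region us-east1 --platform managed \
  --format 'value(status.url)')

curl "$SERVICE_URL"
```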

Logging

This is why we did all that work: now we can query our logs and export them if needed.

On the GCP Console, in the Logs Viewer section, we can run the following query:

Logging query
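In case the screenshot doesn't render, the query is along these lines (the resource type, service name, and log name come from the code above; replace YOUR_PROJECT_ID with your own):

```
resource.type="cloud_run_revision"
resource.labels.service_name="logging-basis"
logName="projects/YOUR_PROJECT_ID/logs/choose_side"
```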

And this is the result...

Query result

There are our logs! In this example each log entry is a jsonPayload, but it could be just a string if you prefer.

Now we can query our logs; the post that inspired this tutorial is a good example of that. As I mentioned, it's possible to export these logs to different resources. I think monitoring is a mandatory one, because setting an alert for undesired events will help us a lot.
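For instance, a logs-based metric counting Dark side picks could be created like this (the metric name `dark_side_count` is mine); an alerting policy can then be attached to it in Cloud Monitoring:

```shell
gcloud logging metrics create dark_side_count \
  --description "Counts Dark side log entries" \
  --log-filter='logName:"choose_side" AND jsonPayload.side="Dark side"'
```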

Hope it helps.


Some links to complement this post:
