<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Paul Rondeau</title>
    <description>The latest articles on DEV Community by Paul Rondeau (@parondeau).</description>
    <link>https://dev.to/parondeau</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F357748%2F69942058-9bfa-4530-8398-94bd4b2a1fa9.jpeg</url>
      <title>DEV Community: Paul Rondeau</title>
      <link>https://dev.to/parondeau</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/parondeau"/>
    <language>en</language>
    <item>
      <title>Server Side Authentication with Firebase and Next.js</title>
      <dc:creator>Paul Rondeau</dc:creator>
      <pubDate>Mon, 27 Apr 2020 14:27:05 +0000</pubDate>
      <link>https://dev.to/parondeau/server-side-authentication-with-firebase-and-next-js-bag</link>
      <guid>https://dev.to/parondeau/server-side-authentication-with-firebase-and-next-js-bag</guid>
      <description>&lt;p&gt;In this tutorial I will go over how to set up server-side authentication for an application using Firebase Auth and Next.js. The motivation behind server-side authentication was to permit &lt;a href="https://nextjs.org/docs/basic-features/pages#server-side-rendering"&gt;server-side rendering&lt;/a&gt; (SSR), which is one of the huge benefits of using a framework like Next.js, for an application that requires rendering some user data.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;tl;dr Store the Firebase user &lt;code&gt;idToken&lt;/code&gt; as a cookie. Send that cookie in &lt;code&gt;getInitialProps&lt;/code&gt; to an API function to validate it with the Firebase Admin SDK. If valid, query your database for the data you require on page load and pass that data to the app as a prop.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This does not work out of the box when using Firebase Auth because all of their login logic happens on the client-side when using their &lt;a href="https://github.com/firebase/firebase-js-sdk"&gt;JavaScript SDK&lt;/a&gt;. The typical flow of an application that uses Firebase Auth is as follows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The user logs in for the first time, and the Firebase SDK writes a cookie that only the Firebase SDK can access.&lt;/li&gt;
&lt;li&gt;When the user returns to the site, the Firebase SDK automatically verifies a valid token or refreshes an expired one, thereby logging the user in automatically.&lt;/li&gt;
&lt;li&gt;Depending on the app, the UI may flash once the user has been logged in and taken into the application.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Where this breaks down
&lt;/h3&gt;

&lt;p&gt;If you are using a framework such as Next.js, you most likely want to utilize the benefits of SSR (no screen flash or client-side JS execution). For performance reasons, Next.js will try to render as much of a page as it can on the server-side, and send the resulting HTML to the browser. However, if your application has a blocking requirement for user data, then you cannot pre-render any of the application until the Firebase SDK logs the user in.&lt;/p&gt;

&lt;h3&gt;
  
  
  Introducing Server Side Authentication
&lt;/h3&gt;

&lt;p&gt;If, during the initial page load, we can log a user in on the server and query the DB for that user info, we should be able to pre-render the site for the user (no screen flash or client-side JS execution). How can we verify if a page request is coming from a logged in user? Cookies!&lt;/p&gt;

&lt;p&gt;If we write our own cookie with some Firebase-verifiable user token, we should be able to verify a user is who they say they are on the server. Let's dive into the details.&lt;/p&gt;

&lt;h3&gt;
  
  
  User Login/Logout Flow
&lt;/h3&gt;

&lt;p&gt;Once the user logs in using the Firebase SDK, we will write a cookie that can be accessed by the server. Using a simple library called &lt;a href="https://github.com/js-cookie/js-cookie"&gt;js-cookie&lt;/a&gt;, we can read/write cookies to the browser for a user. Now the actual cookie we write needs to be something that Firebase can verify is a valid user credential. For this, we will use the &lt;code&gt;idToken&lt;/code&gt; retrieved from a Firebase user object. This is a short-lived user token that can be passed to a server and validated by the Firebase Admin SDK. You can read more about the specifics &lt;a href="https://firebase.google.com/docs/auth/admin/verify-id-tokens"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import cookie from 'js-cookie';
import firebase from 'firebase/app';
import 'firebase/auth';

const tokenName = 'tokenName';

firebase.auth().onAuthStateChanged(async (user: firebase.User | null) =&amp;gt; {
  if (user) {
    const token = await user.getIdToken();
    cookie.set(tokenName, token, { expires: 1 });
    this.setState({ user });
  } else {
    cookie.remove(tokenName);
    this.setState({ user: null, userData: null });
  }
});
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;h3&gt;
  
  
  User Returns to Site
&lt;/h3&gt;

&lt;p&gt;Here is where the magic happens. We need two things. First, update the &lt;code&gt;_app.tsx&lt;/code&gt; to attempt to retrieve the cookie we set during login. Second, validate that cookie using the &lt;a href="https://firebase.google.com/docs/auth/admin/verify-id-tokens#verify_id_tokens_using_the_firebase_admin_sdk"&gt;Firebase Admin SDK&lt;/a&gt;. We will be using a small library called &lt;a href="https://www.npmjs.com/package/next-cookies"&gt;next-cookies&lt;/a&gt; to retrieve the cookie for us. The great thing about &lt;code&gt;next-cookies&lt;/code&gt; is the ability to run on either the client or the server-side. In the App's &lt;code&gt;getInitialProps&lt;/code&gt; method we will make an API request to validate the specified token and, if valid, populate the props for the app. Here's the code in &lt;code&gt;_app.tsx&lt;/code&gt; to get said cookie and make a request to the API function &lt;code&gt;/api/validate&lt;/code&gt;.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import fetch from 'isomorphic-unfetch';

class MyApp extends App {

  constructor(props) {
    super(props);
    this.state = {
      user: this.props.user,
      userData: this.props.userData,
    }
  }

  render() {
    const { Component, pageProps } = this.props;
    const { user, userData } = this.state;
    return &amp;lt;Component {...pageProps} user={user} userData={userData} /&amp;gt;;
  }

  static async getInitialProps(appContext: AppContext) {
    const { ctx } = appContext;
    // calls page's `getInitialProps` and fills `appProps.pageProps`
    const appProps = await App.getInitialProps(appContext);

    // only run on server-side, user should be auth'd if on client-side
    if (typeof window === 'undefined') {
      const { tokenName } = nextCookie(ctx);

      // if a token was found, try to do SSA
      if (tokenName) {
        try {
          const headers: HeadersInit = {
            'Content-Type': 'application/json',
            Authorization: JSON.stringify({ token: tokenName }),
          };
          // note: when fetching on the server you need an absolute URL
          const result = await fetch('/api/validate', { headers });
          const data = await result.json();
          return { ...data, ...appProps };
        } catch (e) {
          // let exceptions fail silently
          // could be invalid token, just let client-side deal with that
        }
      }
    }
    return { ...appProps };
  }
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Using the Firebase Admin SDK we can validate that token securely. After we validate the token, we can query some backend data store to retrieve the initial props we want to populate our app with; in this case it is user data retrieved from Firestore. We will need an API function defined in &lt;code&gt;/pages/api/validate.tsx&lt;/code&gt;.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import * as admin from 'firebase-admin';

interface UserData {
  // arbitrary user data you are storing in your DB
}

interface ValidateResponse {
  user: {
    uid: string;
    email: string;
  };
  userData: UserData
}

admin.initializeApp({
  // your admin app creds
});

const validate = async (token: string) =&amp;gt; {
  const decodedToken = await admin.auth().verifyIdToken(token, true);
  console.log('Valid token.');

  // get user data from your DB store
  const data = (
    await admin
      .firestore()
      .doc(`/users/${decodedToken.uid}`)
      .get()
  ).data() as UserData;

  const user = await admin.auth().getUser(decodedToken.uid);
  const result = {
    user: {
      uid: user.uid,
      email: user.email,
    },
    userData: data,
  };
  return result;
};

export default async (req: NextApiRequest, res: NextApiResponse) =&amp;gt; {
  console.log('Validating token...');
  try {
    const { token } = JSON.parse(req.headers.authorization || '{}');
    if (!token) {
      return res.status(403).send({
        errorCode: 403,
        message: 'Auth token missing.'
      });
    }
    const result = await validate(token);
    return res.status(200).send(result);
  } catch (err) {
    // Firebase errors carry string codes, so fall back to a 500 status
    return res.status(500).send({
      errorCode: err.code,
      message: err.message,
    });
  }
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Now that Next has the initial props we want to render, it can go ahead and SSR the entire app for this specific user with all of their data populated in it. Success!&lt;/p&gt;

&lt;p&gt;There's quite a lot to unpack in those snippets, but there are a few key parts. In the &lt;code&gt;getInitialProps&lt;/code&gt; method, if no token is found, the validate API call is skipped entirely. If there is a token, we make an HTTP request to &lt;code&gt;/api/validate&lt;/code&gt; with a header that includes the token. If that API request returns data, then the token was validated and the props can be populated with the result of the validate call. If the API call returns a rejected promise, we render the page without any user data. Since the token is short-lived, it may have once been valid but has since expired; in that case, the API function returns a rejected promise and we cannot SSR the page for the user. Luckily, on the client-side, the Firebase SDK will refresh the user's credentials, at which point a new cookie will be saved for future requests.&lt;/p&gt;
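&lt;p&gt;To make that header handshake concrete, here is a minimal, self-contained sketch of the round trip the token makes; the function names are illustrative and not part of the snippets above.&lt;/p&gt;

```javascript
// Sketch of the Authorization header round trip described above.
// buildAuthHeaders and parseAuthHeader are illustrative names only.
function buildAuthHeaders(token) {
  return {
    'Content-Type': 'application/json',
    Authorization: JSON.stringify({ token: token }),
  };
}

// Server side: recover the token, defaulting to an empty object when the
// header is missing, as the API function does with req.headers.authorization.
function parseAuthHeader(authorization) {
  const parsed = JSON.parse(authorization || '{}');
  return parsed.token;
}

const headers = buildAuthHeaders('some-id-token');
console.log(parseAuthHeader(headers.Authorization)); // 'some-id-token'
console.log(parseAuthHeader(undefined)); // undefined
```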

&lt;p&gt;You may have noticed I used &lt;code&gt;state&lt;/code&gt; instead of &lt;code&gt;props&lt;/code&gt; to pass data into the components. This is to handle changing data. If, for instance, your user logs out, you probably want to update your local state and stop rendering user data in the child components for that user.&lt;/p&gt;

&lt;p&gt;I hope you enjoyed this tutorial and that it helps elevate your Next.js and Firebase project!&lt;/p&gt;

&lt;p&gt;If you have any questions, hit me up on Twitter.&lt;/p&gt;




&lt;p&gt;This article was originally published on &lt;a href="https://parondeau.com/blog/server-auth-firebase-nextjs"&gt;parondeau.com&lt;/a&gt; on 2020-04-09.&lt;/p&gt;

</description>
      <category>tutorial</category>
      <category>serverless</category>
      <category>firebase</category>
      <category>nextjs</category>
    </item>
    <item>
      <title>GCP Credentials &amp; Next.js</title>
      <dc:creator>Paul Rondeau</dc:creator>
      <pubDate>Tue, 14 Apr 2020 18:58:47 +0000</pubDate>
      <link>https://dev.to/parondeau/gcp-credentials-next-js-3a0d</link>
      <guid>https://dev.to/parondeau/gcp-credentials-next-js-3a0d</guid>
      <description>&lt;p&gt;Let's begin with the problem. You're running a Next.js app on ZEIT Now, you've created some API functions, and now you want to make an authenticated call to some GCP service using their client libraries (Firebase, GCS, BigQuery, etc). For apps not running on GCP, you must provide your own set of credentials to make authenticated requests to your GCP services; however, we don't want to just go and store those credentials in our repo as plaintext (🚨 dangerous 🚨)! We &lt;em&gt;should&lt;/em&gt; use some secure data-store for this. Luckily, if you're deploying on Now, it has support for &lt;a href="https://zeit.co/docs/v2/serverless-functions/env-and-secrets#adding-secrets"&gt;secrets&lt;/a&gt;, but they only allow the secret value to be a string.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ now secrets add &amp;lt;secret-name&amp;gt; &amp;lt;secret-value&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Wouldn't it be nice if we could store our service account JSON  in that secret? Turns out we can with the power of &lt;code&gt;base64&lt;/code&gt; encoding. With a few commands we can turn our JSON key into a secret that can be consumed in our API functions. &lt;/p&gt;
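&lt;p&gt;To see why this works, here is a small Node sketch of that round trip; the service account object below is a made-up stand-in, not a real credential.&lt;/p&gt;

```javascript
// Round trip a credential object through base64, mirroring what the shell
// command and the API function do. This object is a fake stand-in.
const serviceAccount = {
  project_id: 'demo-project',
  client_email: 'sa@demo-project.iam.gserviceaccount.com',
};

// Encode: the equivalent of running the JSON key file through `base64`.
const secretValue = Buffer.from(JSON.stringify(serviceAccount)).toString('base64');

// Decode: what our API function will do with the environment variable later.
const decoded = JSON.parse(Buffer.from(secretValue, 'base64').toString());

console.log(decoded.project_id); // 'demo-project'
```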

&lt;h3&gt;
  
  
  Here are the steps we'll need to do
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Create a GCP service account with the appropriate permissions.&lt;/li&gt;
&lt;li&gt;Download a JSON credential for that service account.&lt;/li&gt;
&lt;li&gt;Convert that service account into a &lt;code&gt;base64&lt;/code&gt; encoded string and save it as a Now Secret.&lt;/li&gt;
&lt;li&gt;Configure the build processes (remote and local) of Now to access this secret and store it as an environment variable.&lt;/li&gt;
&lt;li&gt;Configure Next.js to expose this environment variable to the application.&lt;/li&gt;
&lt;li&gt;Read the environment variable in our API function, &lt;code&gt;base64&lt;/code&gt; decode it and create a Google Credential from that key.&lt;/li&gt;
&lt;li&gt;Make authenticated requests to your GCP services like Firebase, GCS, BigQuery, etc. &lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Creating and managing the Service account
&lt;/h3&gt;

&lt;p&gt;First you'll need to create a service account in GCP with the appropriate permissions granted to it, Google has a &lt;a href="https://cloud.google.com/iam/docs/creating-managing-service-accounts"&gt;simple guide&lt;/a&gt; explaining how to do this. Next, &lt;a href="https://cloud.google.com/iam/docs/creating-managing-service-account-keys"&gt;download a JSON key&lt;/a&gt; of this account. Once you have your JSON key, you can rename it if you'd like to match the following shell command. Now that we have our JSON key, we can create a Now secret using the data in it.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ now secret add &amp;lt;secret-name&amp;gt; $(cat service-account.json | base64)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;h3&gt;
  
  
  ZEIT Now configuration
&lt;/h3&gt;

&lt;p&gt;Now that your service account is &lt;code&gt;base64&lt;/code&gt; encoded and stored in Now, we need to set up a few more things in the build process for that secret to be accessible by your API function. We have two cases to cover: first, your local development build will need to read that secret, and second, the remote deployment of Next.js on Now. Using &lt;a href="https://zeit.co/docs/configuration#project/build-env"&gt;Now build configuration&lt;/a&gt; we will tell the Now deployment to mount that secret into our Next app configuration so we can access the secret as an environment variable.&lt;/p&gt;

&lt;p&gt;For the local case, we will create a new file called &lt;code&gt;.env.build&lt;/code&gt; at your project root. You will need to copy the base64 encoded secret into this file. Be sure to add this file to your &lt;code&gt;.gitignore&lt;/code&gt; or else your secret may become public!&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ echo GOOGLE_APPLICATION_CREDENTIALS=$(cat service-account.json | base64) &amp;gt;&amp;gt; .env.build
$ echo .env.build &amp;gt;&amp;gt; .gitignore
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Now, instead of starting up your service with &lt;code&gt;npm run dev&lt;/code&gt; or &lt;code&gt;yarn dev&lt;/code&gt;, you will need to start using &lt;code&gt;now dev&lt;/code&gt;; check this &lt;a href="https://zeit.co/blog/now-dev"&gt;blog post&lt;/a&gt; for more info.&lt;/p&gt;

&lt;p&gt;For the remote case, you will need to create a file in the root called &lt;code&gt;now.json&lt;/code&gt; and populate it as follows.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "build": {
    "env": {
      "GOOGLE_APPLICATION_CREDENTIALS": "@secret-name"
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Be sure to note the "@" symbol; this tells Now to use a secret of this name instead of the raw string.&lt;/p&gt;

&lt;h3&gt;
  
  
  Next.js Configuration
&lt;/h3&gt;

&lt;p&gt;Next up we want to configure Next to expose this environment variable to the application. To do so, modify your &lt;code&gt;next.config.js&lt;/code&gt;. If you don't already have one, create an empty file at the root again and name it &lt;code&gt;next.config.js&lt;/code&gt;. Add the following to that file. Check out the &lt;a href="https://nextjs.org/docs/api-reference/next.config.js/introduction"&gt;Next docs&lt;/a&gt; for more info on using a custom &lt;code&gt;next.config.js&lt;/code&gt;.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module.exports = {
  env: {
    GOOGLE_APPLICATION_CREDENTIALS: process.env.GOOGLE_APPLICATION_CREDENTIALS,
  },
};
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;h3&gt;
  
  
  Accessing the Service Account in the API Function
&lt;/h3&gt;

&lt;p&gt;We have one final step before we can make authenticated calls to GCP. That is reading the environment variable where our secret is stored, and turning it back (&lt;code&gt;base64&lt;/code&gt; decode) into a credential that can be consumed by the various GCP SDKs. Once we do this, we can make all the authenticated requests we like!&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const credential = JSON.parse(
    Buffer.from(process.env.GOOGLE_APPLICATION_CREDENTIALS, 'base64').toString()
);

// Authenticate with the GCS SDK
import { Storage } from '@google-cloud/storage';

const storage = new Storage({
    projectId: '&amp;lt;gcp-project-id&amp;gt;',
    credentials: credential,
});

// Authenticate with the Firebase Admin SDK
import * as admin from 'firebase-admin';

admin.initializeApp({
    ...appOptions,
    credential: admin.credential.cert(credential),
});
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;That just about sums it up. Hope this helps you on your Next project!&lt;/p&gt;

&lt;p&gt;If you have any questions, hit me up on &lt;a href="https://twitter.com/parondeau_"&gt;Twitter&lt;/a&gt;.&lt;/p&gt;




&lt;p&gt;This article was originally published on &lt;a href="https://parondeau.com/blog/gcp-credentials-nextjs"&gt;parondeau.com&lt;/a&gt; on 2020-04-03.&lt;/p&gt;

</description>
      <category>tutorial</category>
      <category>gcp</category>
      <category>nextjs</category>
      <category>javascript</category>
    </item>
    <item>
      <title>Self-Hosting Ghost on GCP</title>
      <dc:creator>Paul Rondeau</dc:creator>
      <pubDate>Thu, 09 Apr 2020 18:04:55 +0000</pubDate>
      <link>https://dev.to/parondeau/self-hosting-ghost-on-gcp-497o</link>
      <guid>https://dev.to/parondeau/self-hosting-ghost-on-gcp-497o</guid>
      <description>&lt;p&gt;As most software engineers will attest, I think there's some law that every programmer must completely redo their blog/website every 2 years. So this is me, paying my dues.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;tldr; Run a Ghost docker image on Cloud Run, configure a custom domain and connect it to Cloud SQL MySQL for ~$5/month.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;In this post I will try and explain how this site was configured and served. I had a few goals when redesigning my website.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;I wanted the cleanest writing experience possible&lt;/li&gt;
&lt;li&gt;I wanted to run this as cheaply as possible&lt;/li&gt;
&lt;li&gt;I wanted minimal maintenance once configured&lt;/li&gt;
&lt;li&gt;I wanted the ability to tweak the UI to my heart's content&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;With these restrictions, I started searching for a blogging platform. Previously I had a static-site generated blog set up using Markdown, but I found, as with many DIY blog platforms, the editing experience to be poor and deployment to be cumbersome.&lt;/p&gt;

&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;p&gt;There are a few things this tutorial assumes you have already completed.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;code&gt;docker&lt;/code&gt; installed&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;gcloud&lt;/code&gt; configured&lt;/li&gt;
&lt;li&gt;Google Container Registry, Cloud Run and Cloud SQL APIs initialized&lt;/li&gt;
&lt;li&gt;GCP Billing configured&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Helpful links &lt;a href="https://cloud.google.com/run/docs/setup" rel="noopener noreferrer"&gt;[1]&lt;/a&gt;, &lt;a href="https://cloud.google.com/billing/docs/how-to/manage-billing-account" rel="noopener noreferrer"&gt;[2]&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Enter Ghost Stage Right
&lt;/h3&gt;

&lt;p&gt;As Ghost describes itself, it is "&lt;em&gt;...a powerful platform for creating an online blog or publication&lt;/em&gt;". After playing around with the product I found that Ghost certainly passes goals (1) and (3) with flying colours, and does a decent job at (4), depending on how you configure it. But when we look at (2), Ghost Pro doesn't do so well. Coming in at $29/month, this was quite a bit more expensive than I was willing to pay for a personal blog that I make $0/month on. After diving through their developer documentation I found a few options available to self-host Ghost, which could significantly lower the costs.&lt;/p&gt;

&lt;p&gt;There exists a sanctioned but community-run &lt;a href="https://ghost.org/docs/install/docker/" rel="noopener noreferrer"&gt;unofficial Docker image&lt;/a&gt; containing Ghost and the necessary dependencies to run the service. After playing around with this locally I decided it was time to try putting this &lt;a href="https://www.youtube.com/watch?v=jxvdsdGavq0" rel="noopener noreferrer"&gt;on the line&lt;/a&gt;. Coming from a ton of experience in GCP, the easiest way for me to get the docker image out there was to use &lt;a href="https://cloud.google.com/run" rel="noopener noreferrer"&gt;Cloud Run&lt;/a&gt;. Simply pass in a docker image and Cloud Run manages it for you. It will scale it up and down and manage access via GCP IAM. You can even connect it to a private domain you own.&lt;/p&gt;

&lt;h3&gt;
  
  
  Configuring Cloud Run
&lt;/h3&gt;

&lt;p&gt;You can only deploy images on Cloud Run that you have created and stored in Google Container Registry (GCR). So let's grab that Ghost docker image and upload it to our own GCP. &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker pull ghost:3.12.0
docker tag ghost:3.12.0 gcr.io/&amp;lt;GCP_PROJECT_NAME&amp;gt;/ghost:3.12.0
docker push gcr.io/&amp;lt;GCP_PROJECT_NAME&amp;gt;/ghost:3.12.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Fantastic, we now have our own docker image stored on GCR. Let's set up Cloud Run to deploy this image. You can follow along this &lt;a href="https://cloud.google.com/run/docs/deploying" rel="noopener noreferrer"&gt;Google tutorial&lt;/a&gt; and these screenshots, but it is fairly straightforward.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-run.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-run.png" alt="Cloud Run Tutorial 1"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Follow the prompts and be sure to select:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Cloud Run (fully managed)&lt;/li&gt;
&lt;li&gt;A region that suits your needs&lt;/li&gt;
&lt;li&gt;Allow unauthenticated invocations&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Once you make it to the second screen, select the newly created docker image in your GCR and be sure to set the container port to &lt;code&gt;2368&lt;/code&gt;; this is the default port Ghost will be listening on. I kept most of the defaults, but make sure the memory allocated is at least 256 MiB or else you run the risk of Ghost running out of memory and crashing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-run-0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-run-0.png" alt="Cloud Run Tutorial 2"&gt;&lt;/a&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-run-2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-run-2.png" alt="Cloud Run Tutorial 3"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click create and you're well on your way to running your own blog. If you'd like to connect your own domain you can follow this &lt;a href="https://cloud.google.com/run/docs/mapping-custom-domains" rel="noopener noreferrer"&gt;Google tutorial&lt;/a&gt;. It is fairly trivial and just involves updating a few DNS records.&lt;/p&gt;

&lt;p&gt;Now navigate to the newly created URL (or the custom url if you configured that) to see your &lt;em&gt;brand spankin' new&lt;/em&gt; Ghost blog self-hosted for a fraction of the cost of Ghost Pro. To access the admin panel navigate to &lt;code&gt;&amp;lt;blog-url&amp;gt;.com/ghost&lt;/code&gt;. Here you will configure the first admin user, be sure to pick a strong password! You can now write posts and have them appear on &lt;code&gt;&amp;lt;blog-url&amp;gt;.com&lt;/code&gt;.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Narrator: Little did they know...&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Enter the First Problem Stage Left
&lt;/h3&gt;

&lt;p&gt;I noticed this one the hard way: after re-deploying my Ghost image a few times to update some config, I found that I was forced to create a new admin user every time and no post that I had written would persist. After some digging, I found the problem. By default, the Ghost docker image uses a local installation of SQLite to store all of the data. When you deploy the image on Cloud Run, it instantiates the SQLite DB; however, whenever we turn the container off or start a new one, the database is effectively re-created from scratch, thereby deleting all of your data.&lt;/p&gt;

&lt;p&gt;This is not good.&lt;/p&gt;

&lt;p&gt;What if we had some external database that our Ghost image could connect to on startup? This would certainly deal with our data persistence problem.&lt;/p&gt;

&lt;h3&gt;
  
  
  Enter Cloud SQL from Above
&lt;/h3&gt;

&lt;p&gt;Since we're already so deep in Google land, why don't we delve a bit deeper? Google offers a &lt;a href="https://cloud.google.com/sql" rel="noopener noreferrer"&gt;Cloud SQL&lt;/a&gt; service which effectively runs a relational database (PostgreSQL, MySQL, or SQL Server) on the internet for you with a ton of security, backup and integrations to make your life easy.&lt;/p&gt;

&lt;p&gt;Now after poking around the Ghost documentation I found that they support a variety of &lt;a href="https://ghost.org/docs/concepts/config/#database" rel="noopener noreferrer"&gt;DB connections&lt;/a&gt;. If we can configure a Cloud SQL instance to run MySQL and connect our Cloud Run instance to that instance we should have all the pieces we need for a self-hosted Ghost blog that actually persists your data.&lt;/p&gt;

&lt;p&gt;Cloud SQL is easy to configure, follow the prompts like so and be sure to generate a strong password for your instance. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Note: When selecting a machine type, I chose the &lt;strong&gt;db-f1-micro&lt;/strong&gt; to save costs. Be sure to select a machine type that will support your needs.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-sql-0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-sql-0.png" alt="Cloud SQL Tutorial 1"&gt;&lt;/a&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-sql-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-sql-1.png" alt="Cloud SQL Tutorial 2"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click create and there you have it: a new Cloud SQL instance for all your SQL needs. Now comes the tricky part: connecting Cloud Run to Cloud SQL.&lt;/p&gt;

&lt;h3&gt;
  
  
  Connecting the Dots
&lt;/h3&gt;

&lt;p&gt;Lucky for us, Cloud Run has first-party support for connecting to a Cloud SQL instance so we do not even need to really think about Cloud SQL Proxy. Again, props to Google because they have a tutorial for just about all of these steps, to connect your Cloud Run instance to Cloud SQL follow &lt;a href="https://cloud.google.com/sql/docs/mysql/connect-run" rel="noopener noreferrer"&gt;this one&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-sql-5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-sql-5.png" alt="Cloud SQL Tutorial 3"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you add the Cloud SQL connection you will need to do one final step. Unlucky for me (but lucky for you), I had to figure out how to tell Ghost to connect to a remote SQL instance. We can do this by passing the database connection info into the docker image via Environment Variables. For security purposes, Cloud SQL can only be accessed over a Unix Socket. However, in probably the biggest gap in Ghost's documentation, they do not explain how to connect to a DB via Unix Socket whatsoever. Luckily, after poking around the &lt;a href="https://github.com/TryGhost/Ghost/blob/master/core/server/data/db/connection.js#L33" rel="noopener noreferrer"&gt;DB connection code&lt;/a&gt; in Ghost (+1 for open source), I found the magic solution. They blindly pass any configuration defined in the &lt;code&gt;database&lt;/code&gt; config block to &lt;a href="https://knexjs.org/" rel="noopener noreferrer"&gt;knex&lt;/a&gt;, and lucky for us again, knex supports Unix Sockets! So all we need to do is set up the database configuration to use the Unix Socket that Cloud SQL is running on. Add the following Environment Variables to your Cloud Run instance. Make sure you replace &lt;code&gt;&amp;lt;DB_PASSWORD&amp;gt;&lt;/code&gt;, &lt;code&gt;&amp;lt;CLOUD_SQL_INSTANCE&amp;gt;&lt;/code&gt; and &lt;code&gt;&amp;lt;BLOG_URL&amp;gt;&lt;/code&gt; with the appropriate values that you've previously defined.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-sql-2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fstorage.googleapis.com%2Fparondeau.com%2Fblog%2Fphotos%2Fcloud-sql-2.png" alt="Cloud SQL Tutorial 3"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once that is set up you can deploy a new revision of your Cloud Run instance and voilà!&lt;/p&gt;

&lt;p&gt;Navigate to the Cloud Run URL and you should see your new blog running, backed by a Cloud SQL database. You can verify this by logging in, creating an admin account, adding a test post, then creating a new revision (with the same configuration) of your Cloud Run instance. You should be able to log back in with the same admin credentials and see the old posts you've written.&lt;/p&gt;

&lt;h3&gt;
  
  
  –– End Scene ––
&lt;/h3&gt;

&lt;p&gt;Now that your instances are configured and running, you can start to configure your Ghost installation as you'd please. You can find some &lt;a href="https://ghost.org/marketplace/" rel="noopener noreferrer"&gt;themes&lt;/a&gt; and templates that you like.&lt;/p&gt;

&lt;p&gt;Well that wraps up this tutorial. I hope you were able to follow along and didn't come across too many hurdles. Happy blogging!&lt;/p&gt;

&lt;p&gt;If you have any questions, hit me up on &lt;a href="https://twitter.com/parondeau_" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt;.&lt;/p&gt;




&lt;p&gt;This article was originally published on &lt;a href="https://parondeau.com/blog/self-hosting-ghost-gcp" rel="noopener noreferrer"&gt;parondeau.com&lt;/a&gt; on 2020-04-05.&lt;/p&gt;

</description>
      <category>tutorial</category>
      <category>gcp</category>
      <category>ghost</category>
      <category>blog</category>
    </item>
  </channel>
</rss>
