Kenichiro Nakamura

Azure vs GCP part 6: Storage (GCP)

In the previous article, I explained what Azure Storage offers and how I use it for a simple Web API. In this article, I try the GCP equivalent: Google Cloud Storage.

GCP Storage

GCP Storage is linked to a project, the same as any other resource. "Storage" in GCP can also refer to Cloud SQL or Datastore, where you store structured data, but in this blog I only talk about Cloud Storage, which stores unstructured data.


Cloud Storage requires a "bucket" to store data, so I need to create one.

GCP Storage Classes

There are several storage classes, which determine the service level and pricing.

  • Multi-Regional Storage: 99.95% availability SLA, geo-redundant. For frequently accessed objects served around the world.
  • Regional Storage: 99.9% availability SLA within a single region. For frequently accessed objects in one region.
  • Nearline Storage: 99% availability SLA. For objects accessed around once per month.
  • Coldline Storage: 99% availability SLA. For objects accessed around once per year.

This time, as I need to access my data frequently, I use either Multi-Regional or Regional. But it's a good idea to use the other classes for backup or archive data that is accessed far less often.

See the details at Comparison of storage classes.

Bucket Location

Once you decide which class to use, you need to decide where to store your data. See Bucket Locations for more detail. Both choices come together when you create a bucket, as the sketch below shows.
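
As a minimal sketch (the project ID and bucket name below are hypothetical placeholders, and the class and location are just examples), the Google.Cloud.Storage.V1 client library I use later in this article can set both at creation time:

using Google.Cloud.Storage.V1;
using Google.Apis.Storage.v1.Data;

// Sketch: create a Regional bucket in a specific location from code.
// "my-project-id" and the bucket name are placeholders.
var client = StorageClient.Create();
var bucket = client.CreateBucket("my-project-id", new Bucket
{
    Name = "my-example-bucket",
    Location = "us-east1",
    StorageClass = "REGIONAL"
});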

Let's code!!

Okay, enough talk!! Let's write some code, as that's what I am interested in.

1. First of all, I create a bucket. There are several ways to do it, but I use the CLI in the portal. Go to the Google Cloud Console and select "Cloud Shell".

2. Run the following command to create a bucket. I have been using the "gcloud" CLI so far, but for storage I need to use "gsutil". This creates Multi-Regional storage, as I set the location to "US".

gsutil mb -l us gs://cloudcomparestorage20180306


3. Now I need to authenticate to the service. There seem to be several ways to do it, but I authenticate directly by running the following command locally. If you don't have the gcloud CLI yet, install it from here.

The command launches the browser, so just authenticate there.

gcloud auth application-default login
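
This sets up Application Default Credentials. To confirm the client library can see them, a minimal console sketch (using the bucket created above) just lists the bucket's contents:

using System;
using Google.Cloud.Storage.V1;

// StorageClient.Create() without arguments uses Application Default
// Credentials, which the gcloud command above just configured.
var client = StorageClient.Create();
foreach (var obj in client.ListObjects("cloudcomparestorage20180306"))
{
    Console.WriteLine(obj.Name);
}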

4. Now that the bucket is created and authentication is done, open Visual Studio and create a new project. Select "ASP.NET Core Web Application" and create it.

5. Select "Web API" and click OK.

6. Add the "Google.Cloud.Storage.V1" NuGet package.

7. Rename the existing "ValuesController.cs" to "StorageController.cs", then paste the following code. Replace the bucket name with your own.

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using System;
using System.Threading.Tasks;
using Google.Cloud.Storage.V1;
using System.IO;

namespace Storage.Controllers
{
    [Route("api/[controller]")]
    public class StorageController : Controller
    {
        // StorageClient.Create() picks up Application Default Credentials.
        StorageClient storageClient = StorageClient.Create();
        static string bucketName = "cloudcomparestorage20180306";
        static string folderName = "images";

        // GET api/storage/filename
        [HttpGet("{filename}")]
        public async Task<IActionResult> Get(string filename)
        {
            try
            {
                // Download the object into memory, then return it as a file.
                MemoryStream image = new MemoryStream();
                await storageClient.DownloadObjectAsync(bucketName, $"{folderName}/{filename}", image);
                image.Seek(0, SeekOrigin.Begin);
                return File(image, "application/octet-stream");
            }
            catch (Exception)
            {
                return NotFound();
            }
        }

        // POST api/storage
        [HttpPost]
        public async Task<IActionResult> Post(IFormFile file)
        {
            try
            {
                // Stream the uploaded form file straight into the bucket.
                await storageClient.UploadObjectAsync(bucketName, $"{folderName}/{file.FileName}", file.ContentType, file.OpenReadStream());
                return Ok();
            }
            catch (Exception)
            {
                return StatusCode(503);
            }
        }

        // DELETE api/storage/filename
        [HttpDelete("{filename}")]
        public async Task<IActionResult> Delete(string filename)
        {
            try
            {
                await storageClient.DeleteObjectAsync(bucketName, $"{folderName}/{filename}");
                return Ok();
            }
            catch (Exception)
            {
                return NotFound();
            }
        }
    }
}
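
One note on the design: Get buffers the whole object into a MemoryStream before returning it, which is simple but keeps the entire file in memory; for large objects you would rather stream directly to the response.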

Test it

1. Hit F5 to debug the application. Check the address and port.

2. Open Postman, select "POST" as the verb, enter the endpoint address, then click "Body".

3. Select "File" from the key dropdown.

4. Enter "file" as the key, choose any image file from your local PC, and hit "Send". I selected "apple-touch-icon.png".

5. Switch the verb from "POST" to "GET", then add "/apple-touch-icon.png" (the name of the file you uploaded) to the address.

6. Go to the Google Cloud Console and navigate to Storage to confirm the file exists.

7. Click "Share publicly" and you can access the file from anywhere at a URL of this form:

https://storage.googleapis.com/[bucket name]/[folder name]/[file name]

In my case: https://storage.googleapis.com/cloudcomparestorage20180306/images/apple-touch-icon.png

8. In Postman, change the verb to "DELETE" and hit "Send".
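
If you prefer code to Postman, here is a minimal sketch that exercises all three endpoints with HttpClient (it assumes the app listens on http://localhost:5000 and that apple-touch-icon.png exists locally; adjust both to your setup):

using System;
using System.IO;
using System.Net.Http;

// Assumption: the Web API from above is running on this address.
var http = new HttpClient { BaseAddress = new Uri("http://localhost:5000") };

// POST: upload a local image as multipart/form-data under the key "file".
using (var form = new MultipartFormDataContent())
using (var stream = File.OpenRead("apple-touch-icon.png"))
{
    form.Add(new StreamContent(stream), "file", "apple-touch-icon.png");
    var post = await http.PostAsync("/api/storage", form);
    Console.WriteLine($"POST: {post.StatusCode}");
}

// GET: download the file we just uploaded.
var get = await http.GetAsync("/api/storage/apple-touch-icon.png");
Console.WriteLine($"GET: {get.StatusCode}");

// DELETE: clean up.
var del = await http.DeleteAsync("/api/storage/apple-touch-icon.png");
Console.WriteLine($"DELETE: {del.StatusCode}");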

Deploy to each platform

Let's deploy to Azure Web Apps and GCP App Engine. See part 1 for the details of how to deploy.

GCP App Engine

Yes, it works as expected; it's GCP, after all.

Azure Web Apps

Oops, all the requests fail with an internal server error. The reason is authentication. GCP handles authentication to Storage automatically when the app runs inside GCP; otherwise, I need to create a service account.

Add authentication

At the moment, I just use my own credentials to test the application. That is fine while testing, but when I deploy the application, I need to set up proper authentication.

1. Run the following commands to create a service account. I gave it the project owner role, but you can grant a narrower role (for example, roles/storage.objectAdmin) depending on your scenario.

serviceName=storageserviceaccount
projectId=<your project id>
gcloud iam service-accounts create ${serviceName}
gcloud projects add-iam-policy-binding ${projectId} --member "serviceAccount:${serviceName}@${projectId}.iam.gserviceaccount.com" --role "roles/owner"

2. Then run the following command to generate a key file.

gcloud iam service-accounts keys create keys.json --iam-account ${serviceName}@${projectId}.iam.gserviceaccount.com

3. Once the file is generated, copy it to your local machine and add it to the Visual Studio 2017 project root. Make sure it is included in the output when you build and publish, so the app can find it at runtime.


4. Change the StorageClient initialization code to pass the key. This needs an extra using directive: using Google.Apis.Auth.OAuth2;

StorageClient storageClient = StorageClient.Create(GoogleCredential.FromFile("keys.json"));

5. Re-deploy to Azure.

6. Test again. This time, it works.

When we consider security, this is not a good idea for sure: the key file is sitting in the project, and anyone who gets the file can access the storage. We may want to keep it in a more secure environment, but I will leave that for a future article.
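
As a small improvement in the meantime (a sketch, not a full solution), you can avoid hardcoding the key file name. GOOGLE_APPLICATION_CREDENTIALS is the environment variable the Google client libraries check by default; STORAGE_KEY_PATH below is a hypothetical variable of our own:

using System;
using Google.Apis.Auth.OAuth2;
using Google.Cloud.Storage.V1;

// Option 1: if GOOGLE_APPLICATION_CREDENTIALS points at keys.json,
// the parameterless call picks it up via Application Default Credentials.
StorageClient storageClient = StorageClient.Create();

// Option 2: read the key path from a custom environment variable
// (hypothetical name) and fall back to the local file while debugging.
string keyPath = Environment.GetEnvironmentVariable("STORAGE_KEY_PATH") ?? "keys.json";
StorageClient storageClient2 = StorageClient.Create(GoogleCredential.FromFile(keyPath));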

Summary

Comparing Azure Storage and GCP Cloud Storage, they are very similar in terms of development experience. Authentication is slightly different, but most features are alike, including SLAs, locations, and storage classes. I should look into the authentication part more deeply.

Ken
