DEV Community

Yulia Zvyagelskaya


Your First Job In The Cloud

Chapter 1. The First Job In The Cloud


On a cloud I saw a child,
And he laughing said to me: “...
— William Blake

Nowadays one would have to live in a cave on an uninhabited island lost in the Arctic Ocean to have never heard of “Artificial Intelligence,” “Machine Learning,” “NLP” and the rest of the buzzword family. Having a master's in Data Science, I feel a bit less excited about tomorrow's AI revolution. That does not mean DS is boring or pointless; rather, it requires a lot of effort, and I really like that feeling of doing stuff on the bleeding edge.

As a relatively new industry, ML has not settled its processes yet. I have heard the opposite about Google and Facebook, but in small businesses we are still considered nerds, playing the role developers used to play twenty years ago. It is great to see more and more people getting into ML, whether excited by Google slides at a recent conference or simply curious whether neural nets can indeed distinguish between cats and dogs in a photo.

Big corporations prepare and share (thank God it's the XXI century) huge datasets, trained models and everything a junior data scientist might need to play in the sandbox. After we have made sure that models trained on Google or Facebook data somehow work and even predict things (in some cases under some very eccentric circumstances, but it's still so thrilling), we usually want to train our own model ourselves. It takes hours on our own laptop, even when the dataset is limited to tweets from our forty-two friends for the last single year. Results usually look promising, but unsatisfactory. There is no way the laptop could process the whole tweet feed for the last decade without filling up the SSD and blowing up.

That is when we arrive at the magical words: cloud computing. Or whatever you call it. Let Google's servers explode instead of our lovely laptops, right? Right. Our next job will be in the cloud. Pun intended.

There are not that many resources explaining how one might stop procrastinating, staring at the laptop monitor while the model is being built, and start reaping the benefits of living in 2018 AD. There are Google ML Engine, Amazon SageMaker, and Azure Machine Learning Studio, but the documentation everywhere was written by developers for gray-bearded geeks. There is an enormous threshold to executing the very first job in the cloud. This writing is supposed to bridge that gap.

It is not rocket science and there is nothing really complex here: just a few steps to take and several things to consider. It is a breathtaking journey, and once it is done, subsequent trips will seem a cakewalk. Let's go.

Everything below is written for Google ML Engine, but it can be applied to any cloud computing system almost as is. I will try not to go deep into details, concentrating more on the whats than on the hows.

Before We Start

First of all, I want to reference the article that helped me a lot in moving my job into the cloud. The TensorFlow beginner guide on Fuyang Liu's blog is almost perfect, save that it does not cover pitfalls and does not suggest shortcuts where they could have made sense.

Google also has documentation on ML Engine; I wish I were smart enough to use it as a guide. We still need it, though, to quickly look up this and that.

First we need to set up our cloud environment. I refer to the Google guide here because things tend to change over time, and I hope they will keep this info up to date.

After we have the account enabled for ML, we should set up our local environment. I strongly advise using Linux; macOS is more or less robust; Windows will make you cry. Since we are going to run jobs in the cloud, I assume you already have Python installed and configured. What we need to install is the Google Cloud SDK. It is pretty straightforward: download it from the linked page and install it.
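Before touching the cloud, it is worth checking that the local interpreter matches what the job will run under. A minimal sanity sketch (the version requirement mirrors the Python 3 setting we will put in the job config later):

```python
import sys

# The job will request Python 3 on ML Engine, so the local
# interpreter used for dry runs should be Python 3 as well.
assert sys.version_info.major == 3, "please run under Python 3"
print("Python {}.{} detected".format(*sys.version_info[:2]))
```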

Now we need to set up our credentials; gcloud init should do.

Let’s check it works as expected:

$ gcloud ml-engine models list
Listed 0 items.

Wow. We are all set.

Our First Job

This is the important part. Don't try to upload and run your fancy latest project. It will fail and you will get frustrated. Let's enter the cold water slowly. Let's get your first job completed successfully, showing a fascinating green light icon when you check your job status.

The cloud expects a Python package to be uploaded and the main module that executes it to be specified. So let's go with a pretty simple Python package. Let's assume the main module is named and resides in a directory named test1.

# coding: utf-8
import logging
import argparse

if __name__ == "__main__":
  parser = argparse.ArgumentParser()
  parser.add_argument(
    '--job-dir',
    help='GCS job directory (required by GoogleML)',
    required=True)
  parser.add_argument(
    '--arg',
    help='Test argument',
    default='none')
  arguments = parser.parse_args().__dict__
  job_dir = arguments.pop('job_dir')
  arg = arguments.pop('arg')

  logging.warning("Hey, ML Engine, you are not scary!")
  logging.warning("Argument received: {}.".format(arg))

We use logging because, unlike plain stdout, the logs are available through the web interface.
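To illustrate why logging beats a bare print here: the logging module attaches a severity level to every record and routes records through handlers, which is exactly what the cloud log viewer relies on. A minimal local sketch (the in-memory buffer is only for demonstration; in the cloud the same records land in the web viewer):

```python
import io
import logging

# Capture log records in a buffer to show what a handler receives.
buffer = io.StringIO()
handler = logging.StreamHandler(buffer)
handler.setFormatter(logging.Formatter('%(levelname)s %(message)s'))

logger = logging.getLogger('test1')
logger.addHandler(handler)
logger.warning("Hey, ML Engine, you are not scary!")

print(buffer.getvalue())  # WARNING Hey, ML Engine, you are not scary!
```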

You will also need a cloud configuration file on your local machine. It can be placed anywhere; I prefer to have a config file per project. Put test1.yaml in the same directory:

trainingInput:
  scaleTier: CUSTOM
  # 1 GPU
  masterType: standard_gpu
  # 4 GPUs:
  # masterType: complex_model_m_gpu
  runtimeVersion: "1.9"
  pythonVersion: "3.5"

I am not sure who made that decision, but the default Python version on ML Engine is 2.7; that is why the last two lines are mandatory.

You will also need to create a file containing the description of our package. It will be processed by the Google SDK.

from setuptools import find_packages
from setuptools import setup

setup(
    name='test1',
    packages=find_packages(),
    description='My First Job')

Well, that is it. Let's try it ( should be at the same level as the package folder, not inside it).


export BUCKET_NAME=foo-bar-baz-your-bucket-name
export REGION=us-east1
export JOB_NAME="test1_$(date +%Y%m%d_%H%M%S)"

gcloud ml-engine jobs submit training $JOB_NAME \
    --staging-bucket gs://$BUCKET_NAME \
    --job-dir gs://$BUCKET_NAME/$JOB_NAME \
    --region $REGION \
    --runtime-version 1.9 \
    --module-name test1.test1 \
    --package-path ./test1 \
    --config=test1/test1.yaml \
    --

NB! You have to specify your own bucket name, and you might need to change the region as well.

I strongly advise creating a shell script to run (schedule/queue) a job from the very beginning. It is much easier to deal with when it comes to modifications.

There are three ‘subsections’ of arguments there: the first four are generic and remain mostly unchanged from job to job; the second contains job-specific settings; the third one (after --) contains arguments that will be passed to the __main__ of your package.
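The third subsection can be simulated locally: whatever follows the bare -- in the gcloud command arrives in the submitted module's sys.argv, on top of the --job-dir that ML Engine forwards. A sketch with hypothetical values:

```python
import argparse

# Mirror the arguments the test1 module declares.
parser = argparse.ArgumentParser()
parser.add_argument('--job-dir', help='GCS job directory')
parser.add_argument('--arg', help='Test argument', default='none')

# Hypothetical argv, as it would look inside the cloud job:
# --job-dir is forwarded from the submit flags, --arg came after `--`.
ns = parser.parse_args(['--job-dir', 'gs://my-bucket/test1_job',
                        '--arg', 'hello'])
assert ns.job_dir == 'gs://my-bucket/test1_job'
assert ns.arg == 'hello'
```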

Go try it:


Job [test1_20180818_085812] submitted successfully.
Your job is still active. You may view the status of your job with the command

  $ gcloud ml-engine jobs describe test1_20180818_085812

or continue streaming the logs with the command

  $ gcloud ml-engine jobs stream-logs test1_20180818_085812
jobId: test1_20180818_085812
state: QUEUED

Now you can execute gcloud ml-engine jobs describe ... as suggested. It will spit out another portion of text. Copy the last link and paste it into your browser's address bar. You should see...

What should you see there I will describe in the next chapter. Happy clouding!

Top comments (4)

Jason R Tibbetts

If you think that there's no process to ML now, imagine what we had to work with in the late 90s & early 2000s! I spent more than a decade doing NLP/ML/AI, and I actually left that area because I was disillusioned by how little the technology matured in that decade. Now that I've left it behind, it's exploded in popularity. Ah, well.

And I just want to add how excited I was to see a William Blake quote lead off a DEV article. :)

Yulia Zvyagelskaya

Thank you, Jason.
I think I know exactly what you are talking about, since I started with NLP in early 2000s. Well, it was computational linguistics. I am just very persistent (or stubborn :)).

Réda Housni Alaoui

I am not in the ML field.

Is it mandatory to rely on the cloud service of a GAFA to build a model in a reasonable amount time? If yes, I find this very concerning.
Also don’t you think that your model training could be saved by Google that can reuse it if it pleases it without you knowing about it?

Ben Halpern

Super great first post Yulia!