
Cris Crawford


Deploying the Model from BigQuery

This didn't go so well. Well, actually it did! I'm editing this post to say that I found a Docker image that worked on my MacBook Pro with an M1 chip. First of all, I had to copy the machine learning model that I created in the previous post from BigQuery to a Google Cloud Storage bucket. In the video, the instructor said to use gcloud auth login and then bq --project_id taxi-rides-ny extract -m nytaxi.tip_model gs://taxi_ml_model/tip_model (his project, model, and bucket). I did this, but I got the error message:

BigQuery error in extract operation: Access Denied: Project data-engineering-2024: User does not have bigquery.jobs.create permission in project data-engineering-2024.

After trying for about ten minutes to grant myself bigquery.jobs.create permission, I gave up, went to the BigQuery console, selected the model, and exported it to cris-crawfords-week3-bucket/tip_model.
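In hindsight, that permission is included in the roles/bigquery.jobUser role, so granting that role and re-running the extract with my own project and bucket would probably have worked. Here's a sketch I haven't actually run; the email is a placeholder, and the nytaxi.tip_model dataset/model name is carried over from the instructor's example:

# grant the role that contains bigquery.jobs.create (the email is a placeholder)
gcloud projects add-iam-policy-binding data-engineering-2024 --member="user:YOUR_EMAIL" --role="roles/bigquery.jobUser"

# then retry the export to my own bucket (dataset/model name assumed from the instructor's example)
bq --project_id data-engineering-2024 extract -m nytaxi.tip_model gs://cris-crawfords-week3-bucket/tip_model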

Now the fun began. I set up a directory for the model on my computer, following the commands the instructor used.

% mkdir /tmp/model
% gsutil cp -r gs://cris-crawfords-week3-bucket/tip_model /tmp/model
% mkdir -p serving_dir/tip_model/1
% cp -r /tmp/model/tip_model/* serving_dir/tip_model/1
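At this point it's worth a quick sanity check that the export really is a TensorFlow SavedModel, which is what BigQuery ML produces for this kind of model. A minimal sketch (I didn't run this at the time):

# the version directory should contain at least saved_model.pb and a variables/ folder
ls -R serving_dir/tip_model/1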

So far so good. Now I needed TensorFlow Serving. I pulled the official tensorflow/serving image from Docker Hub and tried to run it:

docker pull tensorflow/serving

docker run -p 8500:8500 -p 8501:8501 --mount type=bind,source=`pwd`/serving_dir/tip_model,target=/model -e MODEL_NAME=tip_model -t tensorflow/serving &
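Comparing the host's CPU architecture with the image's would have shown the problem right away; this is a check I only learned about later, and the expected outputs in the comments are my assumption:

# an Apple-silicon Mac reports arm64
uname -m

# the official tensorflow/serving image is published for amd64 (Intel) only
docker image inspect tensorflow/serving --format '{{.Architecture}}'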

This didn't work, because the tensorflow/serving image is built for Intel (amd64) chips, and my Mac has an Apple (arm64) chip. After about an hour of frustration I asked on the Slack channel what I could do. I found another image I could pull from Docker Hub: bitnami/tensorflow-serving. I did that, but got more errors, this time about the model directory: the bitnami image expects the mount target to be model-data, not model. Finally I got it to run:

docker pull bitnami/tensorflow-serving:latest

docker run -p 8500:8500 -p 8501:8501 --mount type=bind,source=`pwd`/serving_dir/tip_model,target=/bitnami/model-data -e MODEL_NAME=tip_model -t bitnami/tensorflow-serving &
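Before blaming the request itself, one thing I could have checked was the container log, to see whether TensorFlow Serving inside the bitnami container had actually loaded tip_model. A sketch I didn't think to run at the time:

# find the running container and look at the last few lines of its log
docker logs $(docker ps -q --filter ancestor=bitnami/tensorflow-serving) 2>&1 | tail -20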

The next and final step was to run a command that would ask the model for a prediction.

curl -d '{"instances": [{"passenger_count":1, "trip_distance":12.2, "pu_location_id":"193", "do_location_id":"264", "payment_type":"2","fare_amount":20.4,"tolls_amount":0.0}]}' -X POST http://localhost:8501/v1/bitnami/model-data/tip_model:predict

This led to the error "Malformed request: GET /v1/bitnami/model-data/tip_model". After struggling for about two hours, I gave up. I looked at the model, and everything seemed to be there. For a linear regression, the model is essentially just the coefficients of the inputs, so it should be small, and it was. The field names in my request matched the input variables in the model. I looked at the example on Google Cloud, and it looked correct. It matched what the instructor had in the video.
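One way to inspect the model's expected inputs, which is essentially what I was doing by eye, is the saved_model_cli tool that ships with the tensorflow pip package. A sketch, assuming tensorflow is installed locally:

# print the serving signature: input names, dtypes, and shapes
saved_model_cli show --dir serving_dir/tip_model/1 --tag_set serve --signature_def serving_default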

I learned:

  • You can create an ML model in BigQuery from a dataset.
  • You can deploy the model using TensorFlow Serving.
  • The official tensorflow/serving image doesn't work on Macs with an Apple chip (which all of the new Macs have).
  • I could pull an image from Docker Hub that did run TensorFlow Serving on a Mac with an Apple chip.

For now, I'm admitting defeat. But I learned a lot. If I have time, I'll go back and try to figure out what went wrong.

So the good people on the Slack for DataTalks.Club's Data Engineering Zoomcamp pointed me to the image emacski/tensorflow-serving. Because I already had my serving_dir directory set up with the model in it, all I had to do was run docker pull emacski/tensorflow-serving and then docker run. Here's my modified command:

docker run -p 8500:8500 -p 8501:8501 --mount type=bind,source=`pwd`/serving_dir/tip_model,target=/emacski/model -e MODEL_NAME=tip_model -t emacski/tensorflow-serving &
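Once the container is up, the model's status can be checked with TensorFlow Serving's standard REST route. This is a sketch, assuming the emacski build keeps stock TensorFlow Serving's URL layout:

# should report the model version as AVAILABLE
curl http://localhost:8501/v1/models/tip_model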

Then I ran the curl command:

curl -d '{"instances": [{"passenger_count":1, "trip_distance":12.2, "pu_location_id":"193", "do_location_id":"264", "payment_type":"2","fare_amount":20.4,"tolls_amount":0.0}]}' -X POST http://localhost:8501/v1/emacski/model/tip_model:predict

And it worked! I got a prediction for the tip amount. So I remain undefeated!
