<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Bozhao</title>
    <description>The latest articles on DEV Community by Bozhao (@yubozhao).</description>
    <link>https://dev.to/yubozhao</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F150107%2Ffe393fee-e66d-4330-8866-de12849f7be8.jpeg</url>
      <title>DEV Community: Bozhao</title>
      <link>https://dev.to/yubozhao</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/yubozhao"/>
    <language>en</language>
    <item>
      <title>BentoML: From model in Jupyter Notebook to deployment in 5 mins</title>
      <dc:creator>Bozhao</dc:creator>
      <pubDate>Sun, 14 Apr 2019 22:01:00 +0000</pubDate>
      <link>https://dev.to/yubozhao/bentoml-from-model-in-jupyter-notebook-to-deployment-in-5-mins-3030</link>
      <guid>https://dev.to/yubozhao/bentoml-from-model-in-jupyter-notebook-to-deployment-in-5-mins-3030</guid>
      <description>&lt;p&gt;Hi guys, I want to share an &lt;a href="//www.github.com/bentoml/bentoml"&gt;open source project&lt;/a&gt; that we made for data scientists.  It is a machine learning toolkit for packaging and deploying models.&lt;/p&gt;

&lt;p&gt;BentoML is a Python library for packaging and deploying machine learning models. It does two things without changing your model training workflow:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Standardize how you package your ML model for production, including its preprocessing/feature-fetching code, dependencies, and configuration.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Easily distribute your ML model as a PyPI package, an API server (in a Docker image), a command-line tool, or a Spark/Flink UDF.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We built BentoML because we think there should be a much simpler way for machine learning teams to ship models to production. They should not have to wait for engineering teams to re-implement their models for the production environment, or build complex feature pipelines for experimental models.&lt;/p&gt;

&lt;p&gt;Our vision is to empower machine learning scientists to build and ship their own models end-to-end as production services, just like software engineers do. BentoML is essentially the missing 'build tool' for machine learning projects.&lt;/p&gt;

&lt;p&gt;With that in mind, here are the top design goals for BentoML:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Multiple framework support - BentoML supports a wide range of ML frameworks out of the box, including TensorFlow, PyTorch, Scikit-Learn, and XGBoost, and can be easily extended to work with new or custom frameworks.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Best practices built in - BentoML users can easily customize telemetry and logging for their models, making it easy to integrate with production systems.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Streamlined deployment workflows - BentoML supports deploying models as REST API endpoints with Docker, Kubernetes, AWS EC2, ECS, Google Cloud Platform, AWS SageMaker, and Azure ML.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Custom model runtimes - easily integrate your Python code with high-performance model serving backends (e.g. TensorFlow Serving, TensorRT Inference Server) for real-time model serving.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here is a quick example:&lt;/p&gt;

&lt;p&gt;We have a very simple model from Scikit-learn:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from sklearn import svm
from sklearn import datasets

clf = svm.SVC(gamma='scale')
iris = datasets.load_iris()
X, y = iris.data, iris.target
clf.fit(X, y)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
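&lt;p&gt;As a quick sanity check (not part of the original workflow), you can verify the trained classifier works before packaging it. Note that scikit-learn's &lt;code&gt;predict&lt;/code&gt; expects a 2D array, so a single row is sliced with &lt;code&gt;X[0:1]&lt;/code&gt;:&lt;/p&gt;

```python
from sklearn import svm, datasets

# Train the same toy model as above
iris = datasets.load_iris()
X, y = iris.data, iris.target
clf = svm.SVC(gamma='scale')
clf.fit(X, y)

# Slice a single row as a 2D array and predict its class
pred = clf.predict(X[0:1])
print(pred)  # first iris sample is class 0 (setosa)
```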



&lt;p&gt;To package this model with BentoML, you will need to create a new BentoService by subclassing it, and provide artifact and environment definitions for it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;%%writefile iris_classifier.py
from bentoml import BentoService, api, env, artifacts
from bentoml.artifact import PickleArtifact
from bentoml.handlers import DataframeHandler

@artifacts([PickleArtifact('model')])
@env(conda_dependencies=["scikit-learn"])
class IrisClassifier(BentoService):

    @api(DataframeHandler)
    def predict(self, df):
        return self.artifacts.model.predict(df)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Now, to save your trained model for production use, simply import your BentoService class and pack it with the required artifacts:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from iris_classifier import IrisClassifier

svc = IrisClassifier.pack(model=clf)

svc.save('./saved_bento', version='v0.0.1') # Saving archive to ./saved_bento/IrisClassifier/v0.0.1/
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;That's it. You have now created your first BentoArchive: a directory containing all the source code, data, and configuration files required to run this model in production.&lt;/p&gt;

&lt;h4&gt;How to use the packaged archive&lt;/h4&gt;

&lt;h5&gt;Loading a BentoArchive in Python&lt;/h5&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import bentoml

bento_svc = bentoml.load('./saved_bento/IrisClassifier/v0.0.1/')
bento_svc.predict(X[0:1])  # sklearn expects a 2D array
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;h5&gt;Installing a BentoArchive as a PyPI package&lt;/h5&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install ./saved_bento/IrisClassifier/v0.0.1/
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Then import it and use it as a regular Python module:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from IrisClassifier import IrisClassifier

installed_svc = IrisClassifier()
installed_svc.predict(X[0:1])
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
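&lt;p&gt;As a final illustration (not covered above), the same archive could also be served as a REST API. The commands below are a sketch: they assume BentoML's CLI and default port, and the exact command names and request format may differ across versions:&lt;/p&gt;

```shell
# Illustrative only: serve the saved archive as a local REST API server
bentoml serve ./saved_bento/IrisClassifier/v0.0.1/

# Then query the DataframeHandler endpoint with a JSON-encoded row
curl -X POST http://localhost:5000/predict \
  -H "Content-Type: application/json" \
  -d '[[5.1, 3.5, 1.4, 0.2]]'
```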



&lt;p&gt;We would love to hear your feedback and thoughts!&lt;/p&gt;

&lt;p&gt;Cheers&lt;/p&gt;

</description>
      <category>python</category>
      <category>machinelearning</category>
      <category>datascience</category>
    </item>
  </channel>
</rss>
