<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Yhary Arias</title>
    <description>The latest articles on DEV Community by Yhary Arias (@yharyarias).</description>
    <link>https://dev.to/yharyarias</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1407560%2F9d83461c-37a4-4204-9abd-28fb6db4de02.jpeg</url>
      <title>DEV Community: Yhary Arias</title>
      <link>https://dev.to/yharyarias</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/yharyarias"/>
    <language>en</language>
    <item>
      <title>Orchestrating Models: Machine Learning with Docker Compose</title>
      <dc:creator>Yhary Arias</dc:creator>
      <pubDate>Tue, 01 Oct 2024 01:35:47 +0000</pubDate>
      <link>https://dev.to/yharyarias/orquestando-modelos-machine-learning-con-docker-compose-i31</link>
      <guid>https://dev.to/yharyarias/orquestando-modelos-machine-learning-con-docker-compose-i31</guid>
      <description>&lt;p&gt;&lt;strong&gt;Docker Compose&lt;/strong&gt; is a powerful tool for easily and efficiently defining and managing multi-container Docker applications. In this article, we will explore the basic concepts of Docker Compose and how you can start using it to orchestrate your applications.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is Docker Compose?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Docker Compose is a tool that allows you to define and run multi-container Docker applications using a YAML file to configure your application's services. Then, with a single command, you can create and run all the defined containers. This simplifies the creation and configuration of complex development and production environments where multiple services need to interact with each other.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Optional&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;Installing Docker Compose Before getting started, make sure you have Docker Compose installed on your machine. You can install it by following the &lt;a href="https://docs.docker.com/compose/install/" rel="noopener noreferrer"&gt;official Docker instructions&lt;/a&gt;. &lt;br&gt;
If you are using a Mac, you can install Docker Compose with the following command. Before running it, make sure you have Docker Desktop installed on your machine.&lt;br&gt;
&lt;code&gt;$ brew install docker-compose&lt;/code&gt; &lt;br&gt;
Now, verify the version you have installed: &lt;br&gt;
&lt;code&gt;$ docker-compose --version&lt;/code&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Before trying out Docker Compose with a Machine Learning project, let's clarify the difference between Docker Compose and Kubernetes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Docker Compose:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What it's for:&lt;/strong&gt; Docker Compose is a tool that lets you run multiple containers together. It's mainly designed for development and testing. It's ideal if you want to quickly spin up multiple services on your machine, like a database, an API, and a web app, all running locally.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How it works:&lt;/strong&gt; You use a file called docker-compose.yml to define which containers you are going to use and how they connect to each other. For example, you can say, "I want to launch my application and connect it to a database." Compose will take care of that for you with a single command.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Ideal for:&lt;/strong&gt; Small projects or development environments where you don't need a very complex system and just want to quickly test on your machine.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Kubernetes:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What it's for:&lt;/strong&gt; Kubernetes is much larger and more powerful than Docker Compose. It not only helps you launch containers but also helps you manage applications in production, on real servers, efficiently and at scale.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How it works:&lt;/strong&gt; Kubernetes ensures that your application is always running, has enough resources, and can handle many users. If one of your containers fails, Kubernetes will automatically replace it. It can also scale (increase or decrease the number of containers) based on what your application needs at any given moment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Ideal for:&lt;/strong&gt; Large production applications that need to be always available and handle high traffic. Large companies or projects that plan to grow significantly often use Kubernetes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;In summary:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Docker Compose is for quickly launching and managing multiple containers on your local machine, ideal for development or testing. Kubernetes is for managing large production applications that need more control, stability, and scalability. Compose is like a small engine that helps you work on your project. Kubernetes is like a large machine that keeps your application running smoothly, even with lots of users and traffic.&lt;/p&gt;

&lt;p&gt;Now, let's get to what we're here for 😎&lt;br&gt;
I’ll show you how to apply Docker Compose in an ML project. We’re going to create a simple application that trains a machine learning model and exposes a web service for making predictions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Project Goal&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Our project will consist of two services:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;ML Service:&lt;/strong&gt; A machine learning model trained using scikit-learn, exposed through a web API using Flask.&lt;br&gt;
&lt;strong&gt;Database Service:&lt;/strong&gt; A PostgreSQL database to store prediction results.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Project Structure&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The basic file structure will be as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ml_project/
│
├── docker-compose.yml
├── ml_service/
│   ├── Dockerfile
│   ├── app.py
│   ├── model.py
│   ├── requirements.txt
└── db/
    ├── init.sql
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;1. Define the docker-compose.yml&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The first step is to define the services in a &lt;code&gt;docker-compose.yml&lt;/code&gt; file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: '3'
services:
  ml_service:
    build: ./ml_service
    ports:
      - "5000:5000"
    depends_on:
      - db
  db:
    image: postgres
    environment:
      POSTGRES_DB: ml_results
      POSTGRES_USER: ml_user
      POSTGRES_PASSWORD: ml_password
    volumes:
      - ./db/init.sql:/docker-entrypoint-initdb.d/init.sql
    ports:
      - "5432:5432"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
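&lt;p&gt;One caveat worth knowing: &lt;code&gt;depends_on&lt;/code&gt; only controls start order; it does not wait for Postgres to actually accept connections, so &lt;code&gt;ml_service&lt;/code&gt; can fail on a cold start if it connects before the database is ready. A sketch of one common mitigation, using a healthcheck (same service names and credentials as above):&lt;/p&gt;

```yaml
# Sketch: make ml_service wait until Postgres answers pg_isready
services:
  ml_service:
    build: ./ml_service
    ports:
      - "5000:5000"
    depends_on:
      db:
        condition: service_healthy
  db:
    image: postgres
    environment:
      POSTGRES_DB: ml_results
      POSTGRES_USER: ml_user
      POSTGRES_PASSWORD: ml_password
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ml_user -d ml_results"]
      interval: 2s
      timeout: 3s
      retries: 10
    volumes:
      - ./db/init.sql:/docker-entrypoint-initdb.d/init.sql
    ports:
      - "5432:5432"
```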



&lt;p&gt;&lt;strong&gt;2. Create the Machine Learning Service&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Inside the &lt;code&gt;ml_service/&lt;/code&gt; folder, we create a &lt;code&gt;Dockerfile&lt;/code&gt; that will install the necessary Python dependencies, train a model, and expose the service.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Dockerfile&lt;/strong&gt; (for the ML service)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM python:3.8

WORKDIR /app

COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt

COPY . .

# Train the model at build time so model.pkl exists before app.py starts
RUN python model.py

CMD ["python", "app.py"]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;requirements.txt&lt;/code&gt;&lt;br&gt;
Here we define the dependencies we will use, such as Flask for creating the web server and scikit-learn for the ML model.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Flask==2.1.0
scikit-learn==1.0.2
psycopg2-binary==2.9.3  # To connect to PostgreSQL
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;model.py&lt;/code&gt;&lt;br&gt;
This file contains the code to train the machine learning model. We’ll use a simple classification model like Logistic Regression.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;sklearn.datasets&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;load_iris&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;sklearn.linear_model&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;LogisticRegression&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;pickle&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;train_model&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="c1"&gt;# Cargar dataset de ejemplo
&lt;/span&gt;    &lt;span class="n"&gt;iris&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;load_iris&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;iris&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;iris&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;target&lt;/span&gt;

    &lt;span class="c1"&gt;# Entrenar modelo
&lt;/span&gt;    &lt;span class="n"&gt;model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;LogisticRegression&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# Guardar el modelo en un archivo
&lt;/span&gt;    &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;model.pkl&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;wb&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;pickle&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dump&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;__name__&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;__main__&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;train_model&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This code trains a simple classification model and saves it to a &lt;code&gt;model.pkl&lt;/code&gt; file.&lt;/p&gt;
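&lt;p&gt;As a quick sanity check, the saved artifact can be loaded back exactly the way &lt;code&gt;app.py&lt;/code&gt; will load it. A minimal self-contained sketch (it retrains the same Iris model rather than assuming &lt;code&gt;model.pkl&lt;/code&gt; already exists, and raises &lt;code&gt;max_iter&lt;/code&gt; only to avoid a convergence warning):&lt;/p&gt;

```python
import pickle

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train and save, mirroring model.py
iris = load_iris()
model = LogisticRegression(max_iter=200)
model.fit(iris.data, iris.target)
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# Load it back the way app.py does, then predict on one sample
with open("model.pkl", "rb") as f:
    loaded = pickle.load(f)

print(loaded.predict([[5.1, 3.5, 1.4, 0.2]])[0])  # class 0 (setosa)
```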

&lt;p&gt;&lt;code&gt;app.py&lt;/code&gt;&lt;br&gt;
This is the Flask file that creates the API to make predictions using the trained model. We will also store predictions in the database.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;flask&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Flask&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;jsonify&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;pickle&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;psycopg2&lt;/span&gt;

&lt;span class="n"&gt;app&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Flask&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;__name__&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Cargar el modelo de ML entrenado
&lt;/span&gt;&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;model.pkl&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;rb&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pickle&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;load&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Conectar con la base de datos PostgreSQL
&lt;/span&gt;&lt;span class="n"&gt;conn&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;psycopg2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;connect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;dbname&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ml_results&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;user&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ml_user&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;password&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ml_password&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;host&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;db&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;cur&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;conn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cursor&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="nd"&gt;@app.route&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;/predict&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;methods&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;POST&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;predict&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="n"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;json&lt;/span&gt;
    &lt;span class="n"&gt;X_new&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;features&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]]&lt;/span&gt;

    &lt;span class="c1"&gt;# Hacer predicción
&lt;/span&gt;    &lt;span class="n"&gt;prediction&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;predict&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_new&lt;/span&gt;&lt;span class="p"&gt;)[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;

    &lt;span class="c1"&gt;# Guardar predicción en la base de datos
&lt;/span&gt;    &lt;span class="n"&gt;cur&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;execute&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;INSERT INTO predictions (input, result) VALUES (%s, %s)&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_new&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="nf"&gt;int&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;prediction&lt;/span&gt;&lt;span class="p"&gt;)))&lt;/span&gt;
    &lt;span class="n"&gt;conn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;commit&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;jsonify&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;prediction&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;int&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;prediction&lt;/span&gt;&lt;span class="p"&gt;)})&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;__name__&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;__main__&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;run&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;host&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;0.0.0.0&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;port&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;5000&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This Flask service listens on port 5000 and, upon receiving a POST request with input features, returns a prediction and saves the result in PostgreSQL.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Set Up the Database&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the &lt;code&gt;db/&lt;/code&gt; directory, we create an &lt;code&gt;init.sql&lt;/code&gt; file to initialize the database with a table for storing predictions.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;init.sql&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CREATE TABLE predictions (
    id SERIAL PRIMARY KEY,
    input TEXT,
    result INTEGER
);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This script will automatically run when the PostgreSQL container starts and will create a table named &lt;code&gt;predictions&lt;/code&gt;.&lt;/p&gt;
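&lt;p&gt;Once the stack is running and a few predictions have been made, you can verify what was stored from inside the container (for example via &lt;code&gt;docker-compose exec db psql -U ml_user ml_results&lt;/code&gt;). A small query sketch against the &lt;code&gt;predictions&lt;/code&gt; table defined above:&lt;/p&gt;

```sql
-- Show the most recent predictions saved by the API
SELECT id, input, result
FROM predictions
ORDER BY id DESC
LIMIT 10;
```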

&lt;p&gt;&lt;strong&gt;4. Running the Project&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now that everything is set up, we can run the entire project using Docker Compose. From the project’s root directory, run the following command: &lt;code&gt;$ docker-compose up&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Docker Compose will then:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Build the machine learning service image.&lt;/li&gt;
&lt;li&gt;Start the ML and database containers.&lt;/li&gt;
&lt;li&gt;Run the ML model and the Flask web server on port 5000.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;5. Testing the API&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To make a prediction, you can send a POST request to &lt;code&gt;http://localhost:5000/predict&lt;/code&gt; with a JSON body containing the dataset features.&lt;/p&gt;

&lt;p&gt;Example &lt;code&gt;curl&lt;/code&gt; command: &lt;code&gt;$ curl -X POST http://localhost:5000/predict -H "Content-Type: application/json" -d '{"features": [5.1, 3.5, 1.4, 0.2]}'&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;If everything is configured correctly, you will receive a response like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "prediction": 0
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
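&lt;p&gt;The same request can be made from Python with no extra dependencies. A minimal client sketch using only the standard library (the URL and &lt;code&gt;features&lt;/code&gt; payload match the &lt;code&gt;curl&lt;/code&gt; example above):&lt;/p&gt;

```python
import json
import urllib.request

def build_request(features, url="http://localhost:5000/predict"):
    """Build the POST request that the /predict endpoint expects."""
    payload = json.dumps({"features": features}).encode("utf-8")
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )

def predict(features):
    """POST the features to the running service and return the predicted class."""
    with urllib.request.urlopen(build_request(features)) as resp:
        return json.loads(resp.read())["prediction"]

# With the stack up: predict([5.1, 3.5, 1.4, 0.2]) should return 0 for this sample
```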



&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;With Docker Compose, we’ve created a machine learning project that includes a web service for making predictions with an ML model, as well as a PostgreSQL database to store results. Docker Compose simplifies managing these services in both development and production environments, allowing you to work with multiple containers in a coordinated way.&lt;/p&gt;

&lt;p&gt;This example is just a starting point, and you can expand it by adding more services, connecting other machine learning models, or integrating tools like Redis for caching or Celery for asynchronous tasks.&lt;/p&gt;

&lt;p&gt;Now you're ready to use Docker Compose in more complex machine learning projects!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Author:&lt;/strong&gt; Yhary Arias.&lt;br&gt;
&lt;strong&gt;LinkedIn:&lt;/strong&gt; &lt;a class="mentioned-user" href="https://dev.to/yharyarias"&gt;@yharyarias&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Instagram:&lt;/strong&gt; @ia.fania&lt;/p&gt;

</description>
      <category>docker</category>
      <category>api</category>
      <category>flask</category>
      <category>machinelearning</category>
    </item>
  </channel>
</rss>
