<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Sven Varkel</title>
    <description>The latest articles on DEV Community by Sven Varkel (@svenvarkel).</description>
    <link>https://dev.to/svenvarkel</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F240074%2Fbeca9f3e-1e72-4c8a-9da8-60603909e465.jpeg</url>
      <title>DEV Community: Sven Varkel</title>
      <link>https://dev.to/svenvarkel</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/svenvarkel"/>
    <language>en</language>
    <item>
      <title>Async/await MongoDB in Python</title>
      <dc:creator>Sven Varkel</dc:creator>
      <pubDate>Wed, 13 Nov 2019 14:32:21 +0000</pubDate>
      <link>https://dev.to/svenvarkel/async-await-mongodb-in-python-25bk</link>
      <guid>https://dev.to/svenvarkel/async-await-mongodb-in-python-25bk</guid>
      <description>&lt;h1&gt;
  
  
  Intro
&lt;/h1&gt;

&lt;p&gt;By now almost everyone knows, or at least has heard about, the "&lt;em&gt;async/await&lt;/em&gt;" paradigm in JavaScript/&lt;a href="https://nodejs.org/en/about"&gt;Node.js&lt;/a&gt;. But &lt;em&gt;async&lt;/em&gt; processing seems to be a lot less common in &lt;a href="https://www.python.org/"&gt;Python&lt;/a&gt;. Python feels like an "old school" language compared to JavaScript - it really does feel old sometimes. But it's very clear and concise, and developers can be really productive in it. It's also quite similar to JS with its somewhat loose-but-not-so-loose typing, variable assignment etc.&lt;br&gt;
One big difference is syntax: in JS whitespace in the code (almost) doesn't matter, while in Python it is significant.&lt;/p&gt;

&lt;p&gt;However, I'll try to keep this post short and focus on a specific task:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- how to use MongoDB asynchronously in Python&lt;/strong&gt;&lt;/p&gt;
&lt;h1&gt;
  
  
  &lt;em&gt;Async/await&lt;/em&gt; - why bother at all?
&lt;/h1&gt;

&lt;p&gt;The first and foremost reason for me is concurrency. With &lt;em&gt;async&lt;/em&gt; processing a developer can run things concurrently without too much hassle or tinkering with threads or processes. It has its limits and risks, but for simple cases it does the job. &lt;em&gt;Async/await&lt;/em&gt; comes in extra handy when there are a lot of I/O operations involved, be it disk reads/writes or accessing a database. Here we are talking about accessing the database, namely MongoDB.&lt;/p&gt;
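&lt;p&gt;To see the benefit concretely, here's a minimal sketch using only the standard library (no database involved): two simulated I/O waits run concurrently on the event loop, so the total time is roughly that of the slowest one, not the sum.&lt;/p&gt;

```python
import asyncio
import time

async def fake_io(name, seconds):
    # stands in for an I/O-bound operation, e.g. a database query
    await asyncio.sleep(seconds)
    return name

async def main():
    started = time.monotonic()
    # both "queries" wait on the event loop at the same time
    results = await asyncio.gather(fake_io("first", 0.1), fake_io("second", 0.1))
    elapsed = time.monotonic() - started
    return results, elapsed

results, elapsed = asyncio.run(main())
print(results)        # ['first', 'second']
print(0.2 > elapsed)  # True - concurrent, not serial
```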
&lt;h1&gt;
  
  
  MongoDB and Python
&lt;/h1&gt;

&lt;p&gt;MongoDB itself feels very "javascriptish", so to say. If you're a seasoned Node.js developer like myself, it feels very comfortable and cozy. The MongoDB shell speaks the same language your code runs in - JavaScript. You can copy-paste code and objects between your IDE/project and the MongoDB shell or a MongoDB IDE. And so on. &lt;br&gt;
For Python-only developers it may be a bit hard to switch between language contexts, but it's doable. My brain still has 2 halves, but they are no longer called the "left half" and "right half" - they're the "JS half" and the "Python half" ;) Half a brain still does the job, haaa:)&lt;/p&gt;

&lt;p&gt;There's a good old Python API for MongoDB called &lt;a href="https://api.mongodb.com/python/current/"&gt;PyMongo&lt;/a&gt;. It gets the job done and then some - but where's the fun in all that old school serial, non-&lt;em&gt;async&lt;/em&gt; processing? &lt;/p&gt;
&lt;h1&gt;
  
  
  Motor to the rescue
&lt;/h1&gt;

&lt;p&gt;There's a really good &lt;em&gt;async&lt;/em&gt; driver API for MongoDB: &lt;a href="https://motor.readthedocs.io/en/stable/"&gt;Motor&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here are some examples.&lt;/p&gt;
&lt;h2&gt;
  
  
  Connecting to database
&lt;/h2&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from motor.motor_asyncio import AsyncIOMotorClient

uri = "mongodb://dev:dev@localhost:27017/mydatabase?authSource=admin"
client = AsyncIOMotorClient(uri)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h3&gt;
  
  
  The same with some connection args
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from motor.motor_asyncio import AsyncIOMotorClient

uri = "mongodb://dev:dev@localhost:27017/?authSource=admin"
connection_args = {
    "zlibCompressionLevel": 7,
    "compressors": "zlib"
}
client = AsyncIOMotorClient(uri, **connection_args)

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h2&gt;
  
  
  Getting database and collection instances from the client
&lt;/h2&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;db = client.get_database("mydatabase")
collection = db.get_collection("mycollection")
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h2&gt;
  
  
  Most common database operations
&lt;/h2&gt;
&lt;h3&gt;
  
  
  Find
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;async def find():
    """
    This method finds items in a MongoDB collection and
    processes them using another asynchronous method
    :return: None
    """
    collection = db.get_collection("mycollection")

    filter_ = {
        "someField": "someValue"
    }
    projection_ = {
        "_id": False  # don't return the _id
    }
    cursor = collection.find(filter=filter_, projection=projection_)

    # this is where it gets interesting: iterate over the cursor asynchronously
    async for item in cursor:
        await do_something_in_an_async_worker_method(item)


async def find_cursor_to_list():
    """
    This method finds items in a MongoDB collection and
    asynchronously converts the cursor to a list of items
    :return: list of items
    """
    collection = db.get_collection("mycollection")

    filter_ = {
        "someField": "someValue"
    }
    projection_ = {
        "_id": False  # don't return the _id
    }
    cursor = collection.find(filter=filter_, projection=projection_)
    # Convert the cursor to a list of items right away.
    # NB! Dangerous with large result sets
    items = await cursor.to_list(length=500)
    return items

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
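&lt;p&gt;The &lt;em&gt;async for&lt;/em&gt; part works because Motor's cursor implements the async iterator protocol. You can try the same consumption pattern without a database by using a plain async generator as a stand-in cursor (the names here are illustrative, not Motor's API):&lt;/p&gt;

```python
import asyncio

async def fake_cursor():
    # stand-in for a Motor cursor: any async iterable works with "async for"
    for item in [{"someField": "someValue", "n": 1},
                 {"someField": "someValue", "n": 2}]:
        yield item

async def process(item):
    # stand-in for do_something_in_an_async_worker_method()
    await asyncio.sleep(0)  # give control back to the event loop
    return item["n"]

async def main():
    results = []
    async for item in fake_cursor():
        results.append(await process(item))
    return results

print(asyncio.run(main()))  # [1, 2]
```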

&lt;h3&gt;
  
  
  Update / upsert
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from datetime import datetime


async def update():
    """
    This method updates data in MongoDB asynchronously
    :return: 
    """
    collection = db.get_collection("mycollection")

    filter_ = {
        "someField": "someValue"
    }
    data = {
        "someNewField": "aNewValue"
    }

    update_ = {
        "$set": data,
        "$currentDate": {
            "updatedAt": True  # set field updatedAt to current date automagically. Good practice ;)
        },
        "$setOnInsert": {
            "createdAt": datetime.utcnow()
            # set field createdAt to current date automagically ONLY IF it's a new record
        }

    }
    # if upsert=True and record is not found then it is created
    await collection.update_one(filter_, update_, upsert=True)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
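&lt;p&gt;The split between &lt;em&gt;$set&lt;/em&gt; and &lt;em&gt;$setOnInsert&lt;/em&gt; is worth spelling out: &lt;em&gt;$setOnInsert&lt;/em&gt; only takes effect when the upsert actually inserts a new document. Here's a deliberately simplified pure-Python simulation of that rule - a sketch for intuition, not the real server logic:&lt;/p&gt;

```python
def simulate_upsert(store, filter_, update_):
    # find a document matching all equality conditions in filter_
    doc = next((d for d in store
                if all(d.get(k) == v for k, v in filter_.items())), None)
    inserting = doc is None
    if inserting:
        # a new document starts from the filter's equality fields
        doc = dict(filter_)
        store.append(doc)
    doc.update(update_.get("$set", {}))
    if inserting:
        # $setOnInsert only applies when the document is created
        doc.update(update_.get("$setOnInsert", {}))
    return doc

store = []
first = simulate_upsert(store, {"someField": "someValue"},
                        {"$set": {"someNewField": "aNewValue"},
                         "$setOnInsert": {"createdAt": "2019-11-13"}})
second = simulate_upsert(store, {"someField": "someValue"},
                         {"$set": {"x": 1},
                          "$setOnInsert": {"createdAt": "never"}})
print(first["createdAt"], second["createdAt"])  # 2019-11-13 2019-11-13
```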

&lt;h2&gt;
  
  
  Aggregation
&lt;/h2&gt;

&lt;p&gt;It's also possible to run an aggregation pipeline with Motor.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;async def aggregate():
    """
    This method runs an async aggregation
    :return: 
    """
    collection = db.get_collection("mycollection")

    pipeline = [
        {
            "$match": {
                "foo": "bar"
            }
        },
        {
            "$group": {
                "_id": None,
                "total": {"$sum": 1}
            }
        },
        {
            "$sort": {"total": -1}
        }
    ]
    # mind the use of the **allowDiskUse** argument. It's necessary for all
    # bigger result sets that need sorting
    cursor = collection.aggregate(pipeline=pipeline, allowDiskUse=True)
    results = await cursor.to_list(length=1000)
    return results

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Bulk writes
&lt;/h2&gt;

&lt;p&gt;Motor can also be used for bulk writes, which improve performance&lt;br&gt;
when storing many items, e.g. big lists produced by some kind of processing.&lt;/p&gt;

&lt;p&gt;Here's an example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from pymongo import UpdateOne
from datetime import datetime


async def bulk_save(very_big_list: list):
    """

    :param very_big_list: List with a lot of items. Or an item generator ... ? 
    :return: 
    """
    collection = db.get_collection("mycollection")

    # this defines batch size that will be added to database at once
    BATCH_SIZE = 500

    updates = list()

    for item in very_big_list:
        filter_ = {
            "foo": item.get("foo")
        }
        update_ = {
            "$set": item,
            "$currentDate": {
                "updatedAt": True
            },
            "$setOnInsert": {
                "createdAt": datetime.utcnow()
            }
        }
        u = UpdateOne(filter=filter_, update=update_, upsert=True)
        updates.append(u)

        # if list of updates is filled up to BATCH_SIZE push data to database
        if len(updates) &amp;gt;= BATCH_SIZE:
            await collection.bulk_write(updates, ordered=False)
            # re-initialize list of updates
            updates = list()

    # add all remaining items to database
    if len(updates) &amp;gt; 0:
        await collection.bulk_write(updates, ordered=False)

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
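&lt;p&gt;The flush-every-BATCH_SIZE logic above is easy to get subtly wrong (forgetting the final partial batch is a classic). It can be isolated into a small generator and tested without any database; the names and the chunk size are just illustrative:&lt;/p&gt;

```python
def batches(items, batch_size):
    # yields successive batches; the last one may be smaller than batch_size
    batch = []
    for item in items:
        batch.append(item)
        if len(batch) >= batch_size:
            yield batch
            batch = []
    if batch:  # don't drop the remainder
        yield batch

sizes = [len(b) for b in batches(range(1201), 500)]
print(sizes)  # [500, 500, 201]
```

&lt;p&gt;With Motor, each yielded batch of &lt;em&gt;UpdateOne&lt;/em&gt; operations would then go into a single &lt;em&gt;await collection.bulk_write(batch, ordered=False)&lt;/em&gt; call.&lt;/p&gt;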



&lt;h1&gt;
  
  
  Summary
&lt;/h1&gt;

&lt;p&gt;Since Python is "synchronous" by default and not an event-loop-based language, it may take a while to get accustomed to the async/await pattern in Python.&lt;br&gt;
In this article I gave an overview of how to access MongoDB asynchronously in Python, with examples of connecting to the database with the Motor client library, querying and updating data, and running aggregations and bulk updates.&lt;/p&gt;

&lt;p&gt;Thanks for reading!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Dockerized SailsJS/ReactJS/MongoDB/Redis/RabbitMQ/Nginx denvironment</title>
      <dc:creator>Sven Varkel</dc:creator>
      <pubDate>Fri, 11 Oct 2019 21:04:23 +0000</pubDate>
      <link>https://dev.to/svenvarkel/dockerized-sailsjs-reactjs-mongodb-redis-rabbitmq-nginx-denvironment-325n</link>
      <guid>https://dev.to/svenvarkel/dockerized-sailsjs-reactjs-mongodb-redis-rabbitmq-nginx-denvironment-325n</guid>
      <description>&lt;p&gt;This post describes steps to set up expendable full stack &lt;strong&gt;denvironment&lt;/strong&gt;. What's a &lt;em&gt;denvironment&lt;/em&gt;, you may ask? It's &lt;em&gt;development environment&lt;/em&gt;. That is just tooooo long to say and write:)&lt;/p&gt;

&lt;p&gt;Take time and prepare your dev machine if you want to play along right away.&lt;/p&gt;

&lt;h1&gt;
  
  
  Description of the project
&lt;/h1&gt;

&lt;p&gt;This project, with the made-up name "&lt;em&gt;World's largest bass players database&lt;/em&gt;", consists of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;ReactJS frontend&lt;/li&gt;
&lt;li&gt;SailsJS JSON API &lt;/li&gt;
&lt;li&gt;MongoDB for database&lt;/li&gt;
&lt;li&gt;RabbitMQ for queue and async processing&lt;/li&gt;
&lt;li&gt;Redis for cache&lt;/li&gt;
&lt;li&gt;Nginx for reverse proxy that fronts the API.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let's call it "&lt;em&gt;players&lt;/em&gt;", for short.&lt;/p&gt;

&lt;p&gt;Let this project's main git repository be at &lt;a href="https://github.com/svenvarkel/players"&gt;https://github.com/svenvarkel/players&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;(it's time to create yours, now).&lt;/p&gt;

&lt;h1&gt;
  
  
  Pre-requisites
&lt;/h1&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Create 2 names in your /etc/hosts file.&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# /etc/hosts

127.0.0.1 api.players.local #for the API
127.0.0.1 app.players.local #for the web APP
&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Install Docker Desktop&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Get it &lt;a href="https://hub.docker.com/editions/community/docker-ce-desktop-mac"&gt;from here&lt;/a&gt; and follow the instructions.&lt;/p&gt;

&lt;h1&gt;
  
  
  Directory layout
&lt;/h1&gt;

&lt;p&gt;The directory layout reflects the stack. At the top level there are all-familiar names that help the developer navigate to a component quickly and not waste time searching for things in obscurely named subfolders or elsewhere. Also, each component is a real component: self-contained and complete. All output files, config files and anything else a component needs are placed in that component's directory. &lt;/p&gt;

&lt;p&gt;The root folder of your development project is shown as / below.&lt;/p&gt;

&lt;p&gt;So here is the layout:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/
/api
    /sails bits and pieces
    /.dockerignore
    /Dockerfile
/mongodb
/nginx
    /Dockerfile
    /conf.d/
        /api.conf
        /app.conf
/rabbitmq
/redis
/web
    /react bits and pieces
    /.dockerignore
    /Dockerfile
/docker-compose.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;It is all set up as an umbrella git repository with api and web as git submodules. Nginx, MongoDB, Redis and RabbitMQ don't need their own repositories. &lt;/p&gt;

&lt;p&gt;From now on you have a choice: either clone my demo repository or create your own.&lt;/p&gt;

&lt;p&gt;If you decide to use my example repository, run these commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git clone git@github.com:svenvarkel/players.git
cd players
git submodule init
git submodule update
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  Steps
&lt;/h1&gt;

&lt;h2&gt;
  
  
  First step - create &lt;em&gt;docker-compose.yml&lt;/em&gt;
&lt;/h2&gt;

&lt;p&gt;In &lt;em&gt;docker-compose.yml&lt;/em&gt; you define your stack in full.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: "3.7"
services:
  rabbitmq:
    image: rabbitmq:3-management
    environment:
      RABBITMQ_DEFAULT_VHOST: "/players"
      RABBITMQ_DEFAULT_USER: "dev"
      RABBITMQ_DEFAULT_PASS: "dev"
    volumes:
      - type: volume
        source: rabbitmq
        target: /var/lib/rabbitmq/mnesia
    ports:
      - "5672:5672"
      - "15672:15672"
    networks:
      - local
  redis:
    image: redis:5.0.5
    volumes:
      - type: volume
        source: redis
        target: /data
    ports:
      - "6379:6379"
    command: redis-server --appendonly yes
    networks:
      - local
  mongodb:
    image: mongo:4.2
    ports:
      - "27017:27017"
    environment:
      MONGO_INITDB_DATABASE: "admin"
      MONGO_INITDB_ROOT_USERNAME: "root"
      MONGO_INITDB_ROOT_PASSWORD: "root"
    volumes:
      - type: bind
        source: ./mongodb/docker-entrypoint-initdb.d
        target: /docker-entrypoint-initdb.d
      - type: volume
        source: mongodb
        target: /data
    networks:
      - local
  api:
    build: ./api
    image: players-api:latest
    ports:
      - 1337:1337
      - 9337:9337
    environment:
      PORT: 1337
      DEBUG_PORT: 9337
      WAIT_HOSTS: rabbitmq:5672,mongodb:27017,redis:6379
      NODE_ENV: development
      MONGODB_URL: mongodb://dev:dev@mongodb:27017/players?authSource=admin
    volumes:
      - type: bind
        source: ./api/api
        target: /var/app/current/api
      - type: bind
        source: ./api/config
        target: /var/app/current/config
    networks:
      - local
    depends_on:
      - "rabbitmq"
      - "mongodb"
      - "redis"
  web:
    build: ./web
    image: players-web:latest
    ports:
      - 3000:3000
    environment:
      REACT_APP_API_URL: http://api.players.local
    volumes:
      - type: bind
        source: ./web/src
        target: /var/app/current/src
      - type: bind
        source: ./web/public
        target: /var/app/current/public
    networks:
      - local
    depends_on:
      - "api"
  nginx:
    build: nginx
    image: nginx-wait:latest
    restart: on-failure
    environment:
      WAIT_HOSTS: api:1337,web:3000
    volumes:
      - type: bind
        source: ./nginx/conf.d
        target: /etc/nginx/conf.d
      - type: bind
        source: ./nginx/log
        target: /var/log/nginx
    ports:
      - 80:80
    networks:
      - local
    depends_on:
      - "api"
      - "web"
networks:
  local:
    driver: bridge  # NB! the overlay driver requires swarm mode; plain docker-compose needs bridge

volumes:
  rabbitmq:
  redis:
  mongodb:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  A few comments on the features and tricks used here
&lt;/h3&gt;

&lt;p&gt;My favorite Docker trick, which I learnt just a few days ago, is the use of &lt;a href="https://github.com/ufoscout/docker-compose-wait"&gt;wait&lt;/a&gt;. You will see it in the api and nginx Dockerfiles. It's a small helper that makes a container wait until its dependencies are actually accepting connections on their ports. Docker's own "depends_on" is good, but it only waits until the dependency &lt;em&gt;container&lt;/em&gt; is running, not until the actual service inside it has started. For example, rabbitmq is quite slow to start, and it may cause the API to behave erratically if it starts up before rabbitmq or mongodb have fully started.&lt;/p&gt;

&lt;p&gt;The second trick you'll see in &lt;em&gt;docker-compose.yml&lt;/em&gt; is the use of &lt;a href="https://docs.docker.com/compose/compose-file/#volumes"&gt;bind mounts&lt;/a&gt;. The source code on the dev machine is mounted as a folder inside the Docker container. It's great for rapid development: whenever the source code changes in the editor on the developer machine, the SailsJS application (or actually, nodemon) in the container detects the changes and restarts the application. More details about setting up the SailsJS app will follow in future posts, I hope.&lt;/p&gt;

&lt;h1&gt;
  
  
  Second step - create API and add it as git submodule
&lt;/h1&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sails new api --fast
cd api
git init
git remote add origin &amp;lt;your api repo origin&amp;gt;
git add .
git push -u origin master
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then create Dockerfile for API project:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM node:10

ADD https://github.com/ufoscout/docker-compose-wait/releases/download/2.6.0/wait /wait
RUN chmod +x /wait
RUN mkdir -p /var/app/current

# Copy application sources
COPY . /var/app/current

WORKDIR /var/app/current

RUN npm i

RUN chown -R node:node /var/app/current
USER node

# Set the workdir /var/app/current

EXPOSE 1337

# Start the application
CMD /wait &amp;amp;&amp;amp; npm run start
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then move up and add it as your main project's submodule&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd ..
git submodule add &amp;lt;your api repo origin&amp;gt; api
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  Third step - create web app and add it as git submodule
&lt;/h1&gt;

&lt;p&gt;This step is almost a copy of step 2, but it's necessary.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npx create-react-app web
cd web
git init
git remote add origin &amp;lt;your web repo origin&amp;gt;
git add .
git push -u origin master
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then create Dockerfile for WEB project:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM node:10

ADD https://github.com/ufoscout/docker-compose-wait/releases/download/2.6.0/wait /wait
RUN chmod +x /wait
RUN mkdir -p /var/app/current

# Copy application sources
COPY . /var/app/current

WORKDIR /var/app/current

RUN npm i

RUN chown -R node:node /var/app/current
USER node

# Set the workdir /var/app/current

EXPOSE 3000

# Start the application
CMD /wait &amp;amp;&amp;amp; npm run start
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you can see the Dockerfiles for api and web are almost identical. Only the port number is different.&lt;/p&gt;

&lt;p&gt;Then move up and add it as your main project's submodule&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd ..
git submodule add &amp;lt;your web repo origin&amp;gt; web
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;For both projects, api and web, it's also advisable to create a .dockerignore file with just two lines:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;node_modules
package-lock.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We want the npm modules inside the container to be built fresh every time we rebuild the Docker image.&lt;/p&gt;

&lt;h1&gt;
  
  
  It's time for our first smoke test!
&lt;/h1&gt;

&lt;p&gt;Run docker-compose:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker-compose up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After Docker grinds away for a while you should have a working stack! It doesn't do much yet, but it's there.&lt;/p&gt;

&lt;p&gt;Check with docker-compose:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker-compose ps
   Name                     Command               State                                                                   Ports
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
players_api_1        docker-entrypoint.sh /bin/ ...   Up      0.0.0.0:1337-&amp;gt;1337/tcp, 0.0.0.0:9337-&amp;gt;9337/tcp
players_mongodb_1    docker-entrypoint.sh mongod      Up      0.0.0.0:27017-&amp;gt;27017/tcp
players_nginx_1      /bin/sh -c /wait &amp;amp;&amp;amp; exec n ...   Up      0.0.0.0:80-&amp;gt;80/tcp
players_rabbitmq_1   docker-entrypoint.sh rabbi ...   Up      0.0.0.0:15671-&amp;gt;15671/tcp, 0.0.0.0:15672-&amp;gt;15672/tcp, 0.0.0.0:25672-&amp;gt;25672/tcp, 4369/tcp, 0.0.0.0:5671-&amp;gt;5671/tcp, 0.0.0.0:5672-&amp;gt;5672/tcp
players_redis_1      docker-entrypoint.sh redis ...   Up      0.0.0.0:6379-&amp;gt;6379/tcp
players_web_1        docker-entrypoint.sh /bin/ ...   Up      0.0.0.0:3000-&amp;gt;3000/tcp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;As you can see you have:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;API running on port 1337 (9337 also exposed for debugging)&lt;/li&gt;
&lt;li&gt;MongoDB running on port 27017&lt;/li&gt;
&lt;li&gt;RabbitMQ running on many ports, of which AMQP port 5672 is the one we're interested in. 15672 is for management - check it out &lt;a href="http://localhost:15672/"&gt;in your browser&lt;/a&gt; (use &lt;em&gt;dev&lt;/em&gt; as both username and password)!&lt;/li&gt;
&lt;li&gt;Redis running on port 6379&lt;/li&gt;
&lt;li&gt;Web app running on port 3000&lt;/li&gt;
&lt;li&gt;Nginx running on port 80. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Nginx proxies both API and web app. So now it's time to give it a look in your browser.&lt;/p&gt;

&lt;p&gt;Open &lt;a href="http://api.players.local"&gt;http://api.players.local&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There it is!&lt;/p&gt;

&lt;p&gt;Open &lt;a href="http://app.players.local"&gt;http://app.players.local&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And there is the ReactJS app.&lt;/p&gt;

&lt;p&gt;With this post we won't go into depths of the applications but we focus rather on stack and integration. &lt;/p&gt;

&lt;p&gt;So how can services access each other in this Docker setup, you may ask.&lt;/p&gt;

&lt;p&gt;Right - it's very straightforward: the services reach each other over a common shared network, addressing one another by exactly the same names that are defined in &lt;em&gt;docker-compose.yml&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Redis is at "redis:6379", MongoDB is at "mongodb:27017" etc.&lt;/p&gt;

&lt;p&gt;See &lt;em&gt;docker-compose.yml&lt;/em&gt; for a tip on how to connect your SailsJS API to MongoDB. &lt;/p&gt;

&lt;h1&gt;
  
  
  A note about storage
&lt;/h1&gt;

&lt;p&gt;You may be wondering "where is the MongoDB data stored?". There are 3 volumes defined in &lt;em&gt;docker-compose.yml&lt;/em&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mongodb
redis
rabbitmq
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These are special Docker volumes that hold the data for each component. It's a convenient way of storing data outside the application container, but still under Docker's control and management.&lt;/p&gt;

&lt;h1&gt;
  
  
  A word of warning
&lt;/h1&gt;

&lt;p&gt;There's something I learnt the hard way (not that hard, though) during my endeavour towards a full stack dev env. I used the command&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;docker-compose up&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;lightly, and it created the temptation to use the command&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;docker-compose down&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;just as lightly, because "what goes up must come down", right? Not so fast! Beware that &lt;em&gt;docker-compose down&lt;/em&gt; &lt;strong&gt;will destroy&lt;/strong&gt; your stack's containers and networks - and, with the &lt;em&gt;-v&lt;/em&gt; flag, &lt;strong&gt;also your data volumes&lt;/strong&gt;. So be careful and read the docker-compose manuals first. Use &lt;strong&gt;docker-compose start&lt;/strong&gt;, &lt;strong&gt;stop&lt;/strong&gt; and &lt;strong&gt;restart&lt;/strong&gt; instead.&lt;/p&gt;

&lt;h1&gt;
  
  
  Wrapping it up
&lt;/h1&gt;

&lt;p&gt;More details could follow in similar posts in the future if there's interest in such guides. Shall I continue with more examples of how to integrate RabbitMQ and Redis into such a stack, perhaps? Let me know.&lt;/p&gt;

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;This post is a step-by-step guide on how to set up a &lt;strong&gt;full stack SailsJS/ReactJS application denvironment&lt;/strong&gt; (development environment) using Docker. The denvironment consists of multiple components integrated with the API: database, cache and queue. The user-facing applications are fronted by an Nginx reverse proxy.&lt;/p&gt;

</description>
      <category>docker</category>
      <category>sailsjs</category>
      <category>mongodb</category>
      <category>react</category>
    </item>
  </channel>
</rss>
