<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: enbis</title>
    <description>The latest articles on DEV Community by enbis (@enbis).</description>
    <link>https://dev.to/enbis</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F260725%2Ff9782940-e0dc-4bf9-bdf8-b903240d904a.jpeg</url>
      <title>DEV Community: enbis</title>
      <link>https://dev.to/enbis</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/enbis"/>
    <language>en</language>
    <item>
      <title>Automatically switching Git Identities and SSH Keys on the same machine</title>
      <dc:creator>enbis</dc:creator>
      <pubDate>Mon, 24 Nov 2025 20:18:23 +0000</pubDate>
      <link>https://dev.to/enbis/automatically-switching-git-identities-and-ssh-keys-on-the-same-machine-75n</link>
      <guid>https://dev.to/enbis/automatically-switching-git-identities-and-ssh-keys-on-the-same-machine-75n</guid>
      <description>&lt;p&gt;When you need to use multiple Git accounts on the same computer (to simplify for the purpose of this post let's use two accounts that we will call &lt;em&gt;personal&lt;/em&gt; and &lt;em&gt;work&lt;/em&gt;), you may need Git to automatically use different identities depending on which repository you are working on. Git makes this possible through conditional configuration using the &lt;code&gt;includeIf&lt;/code&gt; directive in the &lt;code&gt;.gitconfig&lt;/code&gt; file.&lt;/p&gt;

&lt;p&gt;This guide explains how to set up multiple Git accounts, assign each account to a specific directory, and ensure that the correct SSH key and identity are used automatically.&lt;/p&gt;

&lt;h2&gt;
  
  
  Do's &amp;amp; Don'ts
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Don't define multiple &lt;code&gt;[user]&lt;/code&gt; blocks in a single .gitconfig file.&lt;/li&gt;
&lt;li&gt;Don't reuse the same SSH key for multiple accounts.&lt;/li&gt;
&lt;li&gt;Do avoid manual &lt;code&gt;git config user.email&lt;/code&gt; overrides per repository.&lt;/li&gt;
&lt;li&gt;Do keep personal and work repository directories cleanly separated.&lt;/li&gt;
&lt;li&gt;Do use the &lt;code&gt;includeIf&lt;/code&gt; directive in the .gitconfig file.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The setup I will present here is fully supported by Git and scales well if you later add more accounts.  &lt;/p&gt;

&lt;h2&gt;
  
  
  Where to start: the &lt;code&gt;.gitconfig&lt;/code&gt; file
&lt;/h2&gt;

&lt;p&gt;As noted above, you should not keep multiple &lt;code&gt;[user]&lt;/code&gt; blocks in the same &lt;code&gt;.gitconfig&lt;/code&gt; file (the last one would simply win). The first step is to add two more Git configuration files and call them &lt;em&gt;personal&lt;/em&gt; and &lt;em&gt;work&lt;/em&gt;: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;~/.gitconfig-personal&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[user]
    name = Your Personal Name
    email = your.personal.email@example.com
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;~/.gitconfig-work&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[user]
    name = Your Work Name
    email = your.work.email@example.com
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, update the global &lt;code&gt;.gitconfig&lt;/code&gt; to load the appropriate account configuration based on the directory that contains the repository you are working in. For simplicity, let's say your &lt;em&gt;personal&lt;/em&gt; projects live in the &lt;code&gt;~/personal&lt;/code&gt; directory and your &lt;em&gt;work&lt;/em&gt;-related projects live in the &lt;code&gt;~/work&lt;/code&gt; directory.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;~/.gitconfig&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[includeIf "gitdir:~/personal/"]
    path = ~/.gitconfig-personal

[includeIf "gitdir:~/work/"]
    path = ~/.gitconfig-work
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Important note&lt;/em&gt;: do not forget that the path must end with &lt;code&gt;/&lt;/code&gt;, otherwise subdirectories are not matched. &lt;/p&gt;

&lt;h2&gt;
  
  
  And now the SSH keys
&lt;/h2&gt;

&lt;p&gt;Every SSH key pair consists of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A private key&lt;/li&gt;
&lt;li&gt;A public key&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let's generate two separate SSH key pairs, &lt;em&gt;personal&lt;/em&gt; and &lt;em&gt;work&lt;/em&gt;. This is possible thanks to the &lt;code&gt;-f&lt;/code&gt; option of the &lt;code&gt;ssh-keygen&lt;/code&gt; command, which sets the output file name. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;personal&lt;/em&gt; key
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ssh-keygen -t ed25519 -C "your_personal_email@example.com" -f ~/.ssh/id_ed25519_personal
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;work&lt;/em&gt; key
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ssh-keygen -t ed25519 -C "your_work_email@example.com" -f ~/.ssh/id_ed25519_work
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now we have the keys: the &lt;code&gt;.pub&lt;/code&gt; file is the one that should be uploaded to GitLab/GitHub/etc., while the private one stays local. &lt;br&gt;
One last step: we need to force Git to use a specific private key depending on the directory your repo lives in. Git supports a custom SSH command via the &lt;code&gt;sshCommand&lt;/code&gt; setting in the &lt;code&gt;[core]&lt;/code&gt; section, which must be added to each of the gitconfig files we previously created:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;~/.gitconfig-personal&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[user]
    name = Your Personal Name
    email = your.personal.email@example.com
[core]
    sshCommand = ssh -i ~/.ssh/id_ed25519_personal
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;~/.gitconfig-work&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[user]
    name = Your Work Name
    email = your.work.email@example.com
[core]
    sshCommand = ssh -i ~/.ssh/id_ed25519_work
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h2&gt;
  
  
  Verifying the correct account is active
&lt;/h2&gt;

&lt;p&gt;Inside each project directory, run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git config --list
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should see either your &lt;em&gt;personal&lt;/em&gt; or &lt;em&gt;work&lt;/em&gt; identity depending on the directory.&lt;/p&gt;
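&lt;p&gt;If you want to see the conditional include working end to end, here is a throwaway experiment you can run safely: it uses a temporary &lt;code&gt;HOME&lt;/code&gt; so your real configuration is untouched (paths and the email value are illustrative, not part of the setup above).&lt;/p&gt;

```shell
# Sketch: prove that includeIf picks the personal identity inside ~/personal.
# Everything happens in a temporary HOME; illustrative values only.
export HOME="$(mktemp -d)"
mkdir -p "$HOME/personal/demo"

# per-account file
printf '[user]\n\tname = Your Personal Name\n\temail = your.personal.email@example.com\n' \
  > "$HOME/.gitconfig-personal"

# global file with the conditional include (note the trailing slash)
printf '[includeIf "gitdir:~/personal/"]\n\tpath = ~/.gitconfig-personal\n' \
  > "$HOME/.gitconfig"

git init -q "$HOME/personal/demo"
git -C "$HOME/personal/demo" config user.email   # prints your.personal.email@example.com
```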

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;Setting up multiple Git accounts is easy and clean using Git’s conditional configuration:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use one main &lt;code&gt;.gitconfig&lt;/code&gt; with &lt;code&gt;[includeIf]&lt;/code&gt; sections to load per-account config files.&lt;/li&gt;
&lt;li&gt;Define the account-specific user info and &lt;code&gt;[core]&lt;/code&gt; SSH command in separate files.&lt;/li&gt;
&lt;li&gt;Generate a separate SSH key per account for better security, traceability and clean separation.
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>git</category>
      <category>productivity</category>
      <category>tutorial</category>
      <category>programming</category>
    </item>
    <item>
      <title>my git tricks</title>
      <dc:creator>enbis</dc:creator>
      <pubDate>Thu, 12 Jan 2023 21:25:27 +0000</pubDate>
      <link>https://dev.to/enbis/my-git-tricks-1fd7</link>
      <guid>https://dev.to/enbis/my-git-tricks-1fd7</guid>
      <description>&lt;h1&gt;
  
  
  intro
&lt;/h1&gt;

&lt;p&gt;one of the things I've learned since stepping back from coding is that git never gives up on you. having a solid foundation in git is always necessary when working in IT, whatever your role is. I decided to start this page (maybe pages in the future) mainly for my personal use. if it comes in useful to others, even better.&lt;/p&gt;

&lt;h1&gt;
  
  
  what's on the menu today
&lt;/h1&gt;

&lt;p&gt;we can say it's something that is normally not recommended, but that may be useful in some specific cases.&lt;br&gt;
let's say you are progressing on your tasks, and you have a list of commits ready to be pushed to the git server. at this point you realize there is something wrong in one of them. yes, of course you can add the fix on top, stage it and create a new commit, but in some cases you need to keep the history as clean as possible, and modifying the wrong commit is the only solution you have.&lt;/p&gt;
&lt;h2&gt;
  
  
  how to modify the content of one commit
&lt;/h2&gt;

&lt;p&gt;the easiest case is editing the last commit of the history. this is a simple job and it can be done with a soft reset. &lt;/p&gt;

&lt;p&gt;&lt;code&gt;git reset --soft HEAD~&lt;/code&gt; &lt;/p&gt;

&lt;p&gt;the command will send the last commit back to staging, and you are now able to apply the changes required.&lt;/p&gt;
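&lt;p&gt;here is the whole move in a throwaway repo (names and files are just examples), so you can see that the commit disappears from the history while its changes stay staged:&lt;/p&gt;

```shell
# sketch: soft-reset the last commit in a scratch repo; example values only
repo="$(mktemp -d)"
cd "$repo"
git init -q
git -c user.name=me -c user.email=me@example.com commit -q --allow-empty -m "first"
echo hello > note.txt
git add note.txt
git -c user.name=me -c user.email=me@example.com commit -q -m "second"

git reset --soft HEAD~
git log --oneline       # only "first" is left
git status --short      # prints "A  note.txt": the change is back in staging
```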

&lt;p&gt;what if the commit is not the last one, but one in the middle?&lt;/p&gt;

&lt;p&gt;we can consider the following history, with three very simple commits.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ git log --oneline 
8e7390d (HEAD -&amp;gt; main) third commit
26f3ff5 second commit
85c87d4 first commit
7acf898 (origin/main, origin/HEAD) Initial commit
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;if you need to edit the second one, on the condition of not adding a new commit on top of the history, the interactive rebase is for you!&lt;/p&gt;

&lt;h2&gt;
  
  
  git rebase interactively
&lt;/h2&gt;

&lt;p&gt;&lt;code&gt;git rebase -i&lt;/code&gt; &lt;/p&gt;

&lt;p&gt;in the first place, the command (rebasing onto the configured upstream, &lt;code&gt;origin/main&lt;/code&gt; in this history; you can also pass the base commit explicitly) shows the commits history&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pick 85c87d4 first commit
pick 26f3ff5 second commit
pick 8e7390d third commit
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and once you find the commit you want to modify, the only thing to do is to change its prefix from &lt;code&gt;pick&lt;/code&gt; to &lt;code&gt;edit&lt;/code&gt;, then save and close the todo list. you will end up with this&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ git status 
interactive rebase in progress; onto 7acf898
Last commands done (2 commands done):
   pick 85c87d4 first commit
   edit 26f3ff5 second commit
Next command to do (1 remaining command):
   pick 8e7390d third commit
  (use "git rebase --edit-todo" to view and edit)
You are currently editing a commit while rebasing branch 'main' on '7acf898'.
  (use "git commit --amend" to amend the current commit)
  (use "git rebase --continue" once you are satisfied with your changes)

nothing to commit, working tree clean
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;now you are able to update your code and apply the changes to the &lt;em&gt;second commit&lt;/em&gt;: stage the file &lt;/p&gt;

&lt;p&gt;&lt;code&gt;git add &amp;lt;file&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;amend the commit you stopped on&lt;/p&gt;

&lt;p&gt;&lt;code&gt;git commit --amend --no-edit&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;and finalize the rebase:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;git rebase --continue&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;one point of focus here: in case the changes affect code touched by other commits, you risk one or more conflicts to resolve. no panic: once fixed, mark them as resolved with &lt;code&gt;git add/rm &amp;lt;conflicted_files&amp;gt;&lt;/code&gt;, then you can execute &lt;code&gt;git rebase --continue&lt;/code&gt; safely.&lt;/p&gt;

&lt;p&gt;at the end of the rebase, the &lt;em&gt;second commit&lt;/em&gt; contains the changes and the git history is intact.&lt;/p&gt;

&lt;p&gt;alternatively, there is another workflow to get the same result. you can implement the changes directly on top of the branch and stage them. thanks to the fixup and the rebase (how can you not love the rebase) it's possible to place the changes into one specific commit. let's see how&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git add &amp;lt;files_changed&amp;gt;
git commit --fixup=&amp;lt;commit_hash&amp;gt;
git rebase -i origin/&amp;lt;branch&amp;gt; --autosquash
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;since the fixup creates a new commit, we need to take the patch of the changes introduced by the fixup commit and re-apply it on top of the original commit (the &lt;em&gt;second commit&lt;/em&gt;). this is why we again use the rebase with --autosquash (from the git-rebase page: the commit marked for squashing comes right after the commit to be modified, and the action of the moved commit changes from pick to &lt;del&gt;squash or&lt;/del&gt; fixup).    &lt;/p&gt;

&lt;p&gt;this was the trick of the day. and you, have you already used one of these two workflows? which one do you prefer? any other suggestions?   &lt;/p&gt;

</description>
      <category>git</category>
      <category>devops</category>
    </item>
    <item>
      <title>how to use build tags to control GO testing with a GitLab CI use case</title>
      <dc:creator>enbis</dc:creator>
      <pubDate>Sun, 03 Oct 2021 20:09:39 +0000</pubDate>
      <link>https://dev.to/enbis/how-to-use-build-tags-to-control-go-testing-with-a-gitlab-ci-use-case-584b</link>
      <guid>https://dev.to/enbis/how-to-use-build-tags-to-control-go-testing-with-a-gitlab-ci-use-case-584b</guid>
      <description>&lt;h2&gt;
  
  
  abstract
&lt;/h2&gt;

&lt;p&gt;The importance of testing code in programming is undoubtedly a fact. There cannot be good quality development without proper test coverage. Sometimes, however, it is necessary to distinguish the tests or even launch them in groups rather than all at once. This is the purpose of these few lines: to present a valid solution for GO developers who want to run their tests separately. &lt;/p&gt;

&lt;p&gt;The use case I prepared involves a dummy GO project with test files. It has been pushed to a GitLab repository to run Continuous Integration while diversifying the tests. This is the link to inspect the solution directly on &lt;a href="https://gitlab.com/enbis/testex" rel="noopener noreferrer"&gt;GitLab&lt;/a&gt;. &lt;/p&gt;

&lt;h3&gt;
  
  
  prerequisites
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;basic knowledge of GO programming and testing&lt;/li&gt;
&lt;li&gt;basic knowledge of the GitLab-CI&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  the GO project
&lt;/h2&gt;

&lt;p&gt;Let's start our tour by presenting the pilot GO project used to evaluate the usefulness of &lt;strong&gt;build tags&lt;/strong&gt;. It is a simple Web Server with the sole purpose of handling http requests based on a URL query string like &lt;code&gt;/?value=10&lt;/code&gt;. Assuming the value is numeric, the web server returns a JSON payload with a success status and the value increased by one unit.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{"status":"success","result":"11"}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In case of an invalid request it returns a JSON payload with a fail status and the error.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{"status":"fail","error":"strconv.Atoi: parsing \"10a\": invalid syntax"}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  the project layout
&lt;/h3&gt;

&lt;p&gt;In this scenario, the GO project has two packages besides the main: &lt;code&gt;handler&lt;/code&gt; and &lt;code&gt;helper&lt;/code&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://gitlab.com/enbis/testex/-/blob/main/handler/handler.go" rel="noopener noreferrer"&gt;handler&lt;/a&gt;: decodes the http request and returns the answer after processing the data received&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://gitlab.com/enbis/testex/-/blob/main/helper/helper.go" rel="noopener noreferrer"&gt;helper&lt;/a&gt;: exposes the functions to manipulate the data &lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  the test files
&lt;/h3&gt;

&lt;p&gt;The layout presented allows me to separate the context for the testing stage. &lt;br&gt;
The &lt;code&gt;handler&lt;/code&gt; can be tested using the &lt;code&gt;net/http/httptest&lt;/code&gt; GO std library package. The &lt;a href="https://gitlab.com/enbis/testex/-/blob/main/handler/handler_internal_test.go" rel="noopener noreferrer"&gt;handler_internal_test&lt;/a&gt; contains the httptest functions NewRequest and NewRecorder to prepare the http.Request and evaluate the http.Response. This could be identified as a sort of &lt;strong&gt;integration test&lt;/strong&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;func TestHandler(t *testing.T) {
    var response Response

    req := httptest.NewRequest("GET", "/?value=10", nil)
    w := httptest.NewRecorder()
    Handler(w, req)

    res := w.Result()
    defer res.Body.Close()

    assert.Equal(t, http.StatusOK, res.StatusCode)

    data, err := ioutil.ReadAll(res.Body)
    assert.NoError(t, err)

    err = json.Unmarshal(data, &amp;amp;response)
    assert.NoError(t, err)

    assert.Equal(t, "11", response.Result)
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;helper&lt;/code&gt; package contains the functions used to process the data. Those functions can be tested as pure &lt;strong&gt;unit tests&lt;/strong&gt;. This is the &lt;a href="https://gitlab.com/enbis/testex/-/blob/main/helper/helper_internal_test.go" rel="noopener noreferrer"&gt;helper_internal_test&lt;/a&gt; created.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;func TestConvert(t *testing.T) {
    tables := []struct {
        in  string
        out int
    }{
        {"1", 1},
        {"100", 100},
    }

    for _, table := range tables {
        converted, err := Convert(table.in)
        assert.NoError(t, err)
        assert.Equal(t, table.out, converted)
    }
}

func TestIncrement(t *testing.T) {
    tables := []struct {
        in  int
        out int
    }{
        {1, 2},
        {100, 101},
    }

    for _, table := range tables {
        converted := Increment(table.in)
        assert.Equal(t, table.out, converted)
    }
}

func TestToString(t *testing.T) {
    tables := []struct {
        in  int
        out string
    }{
        {1, "1"},
        {100, "100"},
    }

    for _, table := range tables {
        str := ToString(table.in)
        assert.Equal(t, table.out, str)
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  the build tags ( or build constraints )
&lt;/h2&gt;

&lt;p&gt;What we produced so far is the Web Server and its test files. As a GO developer I just have to run the command &lt;code&gt;go test ./...&lt;/code&gt; and my whole list of tests is executed. &lt;br&gt;
The point is that I cannot distinguish between &lt;code&gt;integration test&lt;/code&gt; and &lt;code&gt;unit test&lt;/code&gt;, or have a separate report in a GitLab pipeline.&lt;br&gt;
Here is where &lt;strong&gt;build tags&lt;/strong&gt; become useful. On top of that, they are easy to integrate into the code. &lt;br&gt;
The syntax is a comment on top of the test file, with the special word &lt;em&gt;+build&lt;/em&gt; followed by the keyword that identifies the tag. The constraints are passed to GO using the &lt;em&gt;-tags&lt;/em&gt; flag of the test command. For more details, here is the &lt;a href="https://pkg.go.dev/go/build" rel="noopener noreferrer"&gt;doc&lt;/a&gt;.&lt;br&gt;
Let's see what happens in practical terms:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;integration test: the chosen keyword is integration, so I need to add the comment &lt;code&gt;// +build integration&lt;/code&gt; on top of the handler_internal_test file and run the command &lt;code&gt;go test ./... -tags=integration&lt;/code&gt; to execute it.
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// +build integration

package handler

import (
    "encoding/json"
    "io/ioutil"
    "net/http"
    "net/http/httptest"
    "testing"

    "github.com/stretchr/testify/assert"
)

func TestHandler(t *testing.T) {
....
....
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;unit test: the keyword in this case is unit, so I need to add the comment &lt;code&gt;// +build unit&lt;/code&gt; on top of the helper_internal_test file and run the command &lt;code&gt;go test ./... -tags=unit&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// +build unit

package helper

import (
    "testing"

    "github.com/stretchr/testify/assert"
)

func TestConvert(t *testing.T) {
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;unit and integration test: the tag syntax allows AND and OR combinations, as well as negated expressions. To run both at the same time, the command to use is &lt;code&gt;go test ./... -tags=integration,unit&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;
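&lt;p&gt;As a reference, the boolean forms of the constraint comment look like this (from Go 1.17 the toolchain also understands the equivalent &lt;code&gt;//go:build&lt;/code&gt; lines, but the older syntax below still works):&lt;/p&gt;

```go
// built only when BOTH tags are passed (comma means AND)
// +build integration,unit

// built when EITHER tag is passed (space means OR)
// +build integration unit

// excluded from integration builds (negation)
// +build !integration
```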
&lt;h2&gt;
  
  
  the GitLab CI pipeline
&lt;/h2&gt;

&lt;p&gt;Last but not least, the integration of the build tags into the GitLab CI pipeline. In this case the purpose could be to see the result of the tests grouped by tags. The case presented here contains only two tags: integration and unit. The &lt;a href="https://gitlab.com/enbis/testex/-/blob/main/.gitlab-ci.yml" rel="noopener noreferrer"&gt;gitlab-ci.yml&lt;/a&gt; file will have two stages, each with the proper script.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;stages:
  - unit_test
  - integration_test

unit:
  stage: unit_test
  script:
    - go test -v ./... -tags=unit

integration:
  stage: integration_test
  script:
    - go test -v ./... -tags=integration

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Which brings this result:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faftj1522ugiwchoe29t6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faftj1522ugiwchoe29t6.png" alt="Alt Text" width="612" height="263"&gt;&lt;/a&gt;  &lt;/p&gt;

&lt;p&gt;Thanks to the build tags in the GitLab pipeline presented above I'm able to distinguish tests by context, run them separately and inspect the result individually. &lt;/p&gt;

</description>
      <category>testing</category>
      <category>tutorial</category>
      <category>test</category>
      <category>go</category>
    </item>
    <item>
      <title>Setting up GitHub Actions Workflows to automate the API tests</title>
      <dc:creator>enbis</dc:creator>
      <pubDate>Fri, 08 Jan 2021 12:20:29 +0000</pubDate>
      <link>https://dev.to/enbis/setting-up-github-actions-workflows-to-automate-the-api-tests-56m8</link>
      <guid>https://dev.to/enbis/setting-up-github-actions-workflows-to-automate-the-api-tests-56m8</guid>
      <description>&lt;p&gt;In the previous "episode", I presented a quick solution to &lt;a href="https://dev.to/enbis/automate-the-api-test-in-gitlab-ci-with-postman-newman-docker-image-455n"&gt;automate the API Tests using GitLab pipelines&lt;/a&gt;. Here, I'd like to discuss the same solution working on GitHub repository instead. GitHub provides built-in support for CI/CD as well as GitLab, so I expect that by applying the same principles I should have no problems (or at least it shouldn't be a complicated task).&lt;/p&gt;

&lt;h2&gt;
  
  
  What we have at our disposal
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;A GitHub repository to perform the continuous integration. We are going to use Newman, the command-line collection runner for Postman.&lt;/li&gt;
&lt;li&gt;The extracted Postman Collection. Here is the same &lt;a href="https://gitlab.com/enbis/generic-api-gitlabci/-/blob/master/test/HttpbinForNewman.json" rel="noopener noreferrer"&gt;HttpbinForNewman.json&lt;/a&gt; file used in the GitLab case. The purpose is to compare the results obtained previously in GitLab with the ones we are going to see in GitHub.
&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Continuous Integration on GitHub
&lt;/h2&gt;

&lt;p&gt;As described in the &lt;a href="https://docs.github.com/en" rel="noopener noreferrer"&gt;GitHub Docs&lt;/a&gt;, it is possible to create a CI workflow directly in the GitHub repository using &lt;strong&gt;GitHub Actions&lt;/strong&gt;. A workflow is a collection of jobs that can perform continuous integration tasks. Each job consists of smaller operations called steps. The workflow can be configured to run right after a certain type of event, for instance when a push is made to a specific branch, and it can run directly on GitHub-hosted virtual machines.  &lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up GitHub Actions
&lt;/h2&gt;

&lt;p&gt;A custom workflow can be created by opening the Actions pane of the repository. The main page contains a list of popular workflow / CI templates; clicking the &lt;em&gt;Configure a workflow yourself&lt;/em&gt; button creates a new commit in the repository with a new workflow file.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F8dm8ghl0bfoz6rqmlwt7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F8dm8ghl0bfoz6rqmlwt7.png" alt="Alt Text" width="800" height="705"&gt;&lt;/a&gt;  &lt;/p&gt;

&lt;p&gt;All workflows must reside inside a specific path of the GitHub repository: &lt;strong&gt;.github/workflows&lt;/strong&gt;. Inside that folder lives the YAML file that describes the jobs. The name of that file follows no particular convention, but its structure is well specified. Below, I will present two different solutions, with different approaches: &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;with postman/newman docker image&lt;/li&gt;
&lt;li&gt;without postman/newman docker image&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  The structure of the YAML file
&lt;/h2&gt;

&lt;p&gt;As a first step, right after the name of the workflow (CI), it is necessary to define the event that will trigger the Action. In this case it is set to fire on each push event on the &lt;em&gt;master&lt;/em&gt; branch.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: CI

on:
  push:
    branches: 
    - master
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you can see below, the name of the workflow corresponds to the attribute specified in the yaml file. Each job executed is linked to the commit message, in this case &lt;em&gt;docker run&lt;/em&gt; &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fvaeji340cia22zd85zhk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fvaeji340cia22zd85zhk.png" alt="Alt Text" width="800" height="230"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we need to define the list of &lt;em&gt;jobs&lt;/em&gt; that will be executed: the first sub-line is intended for the name of the job (test), while the &lt;em&gt;runs-on&lt;/em&gt; field specifies the selected runner, which for this solution is an Ubuntu GitHub-hosted virtual machine.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;jobs:
  test:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@master
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;em&gt;uses&lt;/em&gt; field specifies a predefined task that can run inside the virtual machine; there are thousands of such actions available on the GitHub Marketplace for free, and later we will meet other examples. In the case shown above the &lt;em&gt;Action&lt;/em&gt; checks out the latest commit on the master branch, in order to have an updated version of the source code for the desired branch. This is, in general, the very first step to perform before proceeding with the other steps. Now it's time to approach the implementation of the two solutions mentioned before. &lt;/p&gt;

&lt;h3&gt;
  
  
  Solution with postman/newman docker image
&lt;/h3&gt;

&lt;p&gt;Super easy: it's possible to run the docker command directly inside the VM, using the &lt;em&gt;run&lt;/em&gt; field. When you create the step you just have to be careful to mount the correct volume into the container, the one containing the json file with the Postman Collection.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;- name: run docker command on VM
  run: |
    docker run -v $(pwd)/test:/etc/newman -t postman/newman:latest run "HttpbinForNewman.json" --reporters="cli"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Solution without postman/newman docker image
&lt;/h3&gt;

&lt;p&gt;In this case the solution is a bit more articulated: to run the Newman command, Node.js &amp;gt;= v10 needs to be installed on the VM first, and a specific &lt;em&gt;GitHub Action&lt;/em&gt; exists for that: &lt;em&gt;actions/setup-node@v1&lt;/em&gt;. Using the &lt;em&gt;with&lt;/em&gt; field you can set options to configure the action. Then you have to install Newman; the easiest way is using NPM. The last step is to run the collection: the newman run command lets you specify the collection to execute, and the -r flag selects the preferred reporter, in this case cli.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;- name: Install Node
  uses: actions/setup-node@v1
  with: 
    node-version: '12.x'

- name: Install newman
  run: |
    npm install -g newman

- name: Run POSTMAN collection
  run: |
    newman run ./test/HttpbinForNewman.json -r cli
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  The result
&lt;/h2&gt;

&lt;p&gt;From the Actions pane of the GitHub page, it is possible to easily inspect the list of results from the different workflows. The picture below is related to the one just discussed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ffniqlnluj82riy9nb1xp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ffniqlnluj82riy9nb1xp.png" alt="Alt Text" width="800" height="214"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By expanding the &lt;em&gt;Docker for newman&lt;/em&gt; step, it is possible to inspect the logs and evaluate the success of the tests performed through Newman, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fb04yvmcg110ystn3pdki.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fb04yvmcg110ystn3pdki.png" alt="Alt Text" width="800" height="569"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is the YAML file used to obtain that result.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: CI

on:
  push:
    branches: 
    - main

jobs:
  test-api:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@main

    - name: Docker for newman
      run: |
        docker run -v $(pwd)/test:/etc/newman -t postman/newman:latest run "HttpbinForNewman.json" --reporters="cli"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the end, there are not many differences between the &lt;a href="https://gitlab.com/enbis/generic-api-gitlabci/-/jobs/936491838" rel="noopener noreferrer"&gt;GitLab pipeline&lt;/a&gt; and GitHub Actions when using Newman to automate API tests. The only aspect to take into account is the different structure of the YAML file. &lt;/p&gt;

</description>
      <category>tutorial</category>
      <category>devops</category>
      <category>github</category>
      <category>testing</category>
    </item>
    <item>
      <title>Automate the API tests in Gitlab CI with postman/newman docker image</title>
      <dc:creator>enbis</dc:creator>
      <pubDate>Wed, 30 Dec 2020 10:47:39 +0000</pubDate>
      <link>https://dev.to/enbis/automate-the-api-test-in-gitlab-ci-with-postman-newman-docker-image-455n</link>
      <guid>https://dev.to/enbis/automate-the-api-test-in-gitlab-ci-with-postman-newman-docker-image-455n</guid>
      <description>&lt;p&gt;The purpose of this article is to present a simple procedure to automatically test the responses of an API service. The solution leverages the Continuous Integration tool built into GitLab. The following lines present the agents involved and how to integrate each part: first the &lt;em&gt;Postman Collection&lt;/em&gt; and its test tab, then the dockerized &lt;em&gt;Newman command&lt;/em&gt; to run all the requests automatically in sequence, and finally the integration with GitLab, preparing an ad-hoc &lt;em&gt;.gitlab-ci.yml&lt;/em&gt; file so that the CI pipeline executes all the tests after every pushed change. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgcezy2367yl7f6qsdymm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgcezy2367yl7f6qsdymm.png" alt="Alt Text" width="800" height="362"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That's it, let's start then! &lt;/p&gt;

&lt;h2&gt;
  
  
  TL;DR
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Postman: the desktop API client. The tool is used to send requests and inspect the responses, in order to easily debug the behavior of the API service during its development.
&lt;/li&gt;
&lt;li&gt;Newman: a command-line agent that runs multiple requests sequentially. It takes as input the Postman Collection exported as a JSON file and returns the responses along with the results of each test, if any. &lt;/li&gt;
&lt;li&gt;CI: GitLab has its own section for Continuous Integration. For every change submitted, GitLab can run a pipeline of scripts to build, test and validate the code changes. In this case, the CI is devoted to automatically executing the requests contained in the exported Postman Collection, evaluating the results of each test. The status of each job reflects the outcome of the tests.
&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Project reference
&lt;/h2&gt;

&lt;p&gt;For those who are interested, a dedicated GitLab repository has been created to explore the solution presented here more thoroughly. &lt;br&gt;
Feel free to take inspiration from it if you find it useful. &lt;br&gt;
&lt;a href="https://gitlab.com/enbis/generic-api-gitlabci" rel="noopener noreferrer"&gt;generic-api-gitlabci&lt;/a&gt;. &lt;br&gt;
Without a real API service to test, the response has been simulated using &lt;a href="https://httpbin.org/" rel="noopener noreferrer"&gt;httpbin&lt;/a&gt; as the endpoint. In particular, the REST request &lt;em&gt;&lt;a href="https://httpbin.org/anything" rel="noopener noreferrer"&gt;https://httpbin.org/anything&lt;/a&gt;&lt;/em&gt; returns a predictable response: its payload is built by extracting the details of the received request.&lt;/p&gt;
&lt;h2&gt;
  
  
  First step: &lt;em&gt;postman&lt;/em&gt;
&lt;/h2&gt;

&lt;p&gt;Those who, like me, have used Postman to send a REST request and analyze the response will be happy to know that there is much more to discover inside that tool. The test tab lets you develop specific test/assertion cases for each request; the guidelines are well documented on the &lt;a href="https://www.postmanlabs.com/postman-collection/index.html" rel="noopener noreferrer"&gt;Postman Collection SDK&lt;/a&gt; page. This is a Node.js module through which the developer can manipulate each request in the collection. There are many possibilities and useful functions to prepare a full test case and investigate each response in depth; for simplicity I selected the easiest test/assertion cases, as you can see in the &lt;a href="https://gitlab.com/enbis/generic-api-gitlabci/-/blob/master/test/HttpbinForNewman.json" rel="noopener noreferrer"&gt;HttpbinForNewman.json&lt;/a&gt; file. The list of tests to be performed is included inside that JSON file; it can easily be downloaded and imported into the Postman tool as a new Collection to evaluate its content. As endpoint I used httpbin, which has the advantage of providing a predictable response each time a request is sent. &lt;/p&gt;
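&lt;p&gt;As a minimal sketch of what a script in the Postman test tab can look like (these assertions are illustrative, not the exact ones contained in HttpbinForNewman.json), using Postman's standard &lt;code&gt;pm&lt;/code&gt; API:&lt;/p&gt;

```javascript
// Runs inside Postman's sandbox after the response arrives.
// Illustrative assertions for the httpbin /anything endpoint.
pm.test("status code is 200", function () {
    pm.response.to.have.status(200);
});

pm.test("response echoes the query arguments", function () {
    var body = pm.response.json();
    pm.expect(body.args.first).to.eql("a");
    pm.expect(body.args.second).to.eql("b");
});
```

&lt;p&gt;Scripts like this run automatically for each request when the collection is executed, whether from the Postman UI or from Newman.&lt;/p&gt;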
&lt;h2&gt;
  
  
  Second step: &lt;em&gt;newman&lt;/em&gt;
&lt;/h2&gt;

&lt;p&gt;Newman is a command-line agent that runs the entire list of requests contained inside the Postman Collection, executing the tests, if any. The easiest solution is to trigger it using the &lt;strong&gt;postman/newman&lt;/strong&gt; Docker image. Be careful to pass the proper path and the proper Postman Collection, exported as a JSON file, to the docker command. These two parameters are respectively named &lt;em&gt;pwd_path_collection&lt;/em&gt; and &lt;em&gt;name_of_collection_extracted&lt;/em&gt; in the command below. In this case, the result will be reported via cli.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -v &amp;lt;pwd_path_collection&amp;gt;:/etc/newman -t postman/newman:latest run "name_of_collection_extracted.json" --reporters="cli"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;At this stage you can test the Newman command directly from your local machine: the outcome of the command will be a table with the details of each request launched.&lt;/p&gt;

&lt;h2&gt;
  
  
  Third step: &lt;em&gt;gitlab-ci&lt;/em&gt;
&lt;/h2&gt;

&lt;p&gt;The last step is to integrate the Newman run into the GitLab CI pipeline, again exploiting the Docker image. The purpose is an automated test setup that runs each time a new change is pushed to the remote repository. This is achieved by adding the &lt;strong&gt;.gitlab-ci.yml&lt;/strong&gt; file to the project's root, as you can see &lt;a href="https://gitlab.com/enbis/generic-api-gitlabci/-/blob/master/.gitlab-ci.yml" rel="noopener noreferrer"&gt;here&lt;/a&gt;. To run a script with the docker command inside GitLab, it is necessary to use the docker-in-docker (dind) image. The Newman solution can then be easily integrated inside the test stage of the CI. Since this is the core aspect of the solution, the content of the &lt;em&gt;.gitlab-ci.yml&lt;/em&gt; file is copied directly here.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;image: docker:dind

services:
  - docker:dind

postman_tests:
  script:
  - docker run -v $(pwd)/test:/etc/newman -t postman/newman:latest run "HttpbinForNewman.json" --reporters="cli"
  stage: test
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For convenience I have chosen to keep the collection file inside the &lt;em&gt;test&lt;/em&gt; folder of the project; for that reason the Docker volume argument mounts the host's &lt;em&gt;test&lt;/em&gt; directory inside the container at &lt;em&gt;/etc/newman&lt;/em&gt;.  &lt;/p&gt;
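&lt;p&gt;Assuming the layout of the linked repository, the relevant files sit roughly like this (a sketch, not an exhaustive listing):&lt;/p&gt;

```
.
├── .gitlab-ci.yml
└── test
    └── HttpbinForNewman.json
```

&lt;p&gt;With this layout, &lt;code&gt;$(pwd)/test&lt;/code&gt; on the runner maps to &lt;code&gt;/etc/newman&lt;/code&gt;, the directory where the postman/newman image looks for the collection file.&lt;/p&gt;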

&lt;h2&gt;
  
  
  Test cases
&lt;/h2&gt;

&lt;p&gt;I have prepared two requests representing the test cases:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;GET -&amp;gt; &lt;em&gt;&lt;a href="https://httpbin.org/anything?first=a&amp;amp;second=b" rel="noopener noreferrer"&gt;https://httpbin.org/anything?first=a&amp;amp;second=b&lt;/a&gt;&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;POST -&amp;gt; &lt;em&gt;&lt;a href="https://httpbin.org/anything" rel="noopener noreferrer"&gt;https://httpbin.org/anything&lt;/a&gt;&lt;/em&gt; with a json body&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Both requests have their own tailor-made list of tests. &lt;br&gt;
Once the Collection has been imported into the Postman tool, it is sufficient to send a request to see its tests executed. This is valid for both requests in the Collection.&lt;/p&gt;

&lt;p&gt;As soon as the request is executed, the result of the tests is presented immediately. Here, for example, is the screenshot obtained by launching the GET request. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgw3nx1qtdcjq3pc5z6vm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgw3nx1qtdcjq3pc5z6vm.png" alt="Alt Text" width="625" height="604"&gt;&lt;/a&gt;   &lt;/p&gt;

&lt;p&gt;The outcome is rather simple to understand, but has the limitation of requiring manual execution of each request. The integration with GitLab CI has been introduced to overcome this. Let's see the outcome of the CI pipeline job. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ffpqjn0lg8c4zlocf1tku.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ffpqjn0lg8c4zlocf1tku.png" alt="Alt Text" width="711" height="754"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The image above has been taken from &lt;a href="https://gitlab.com/enbis/generic-api-gitlabci/-/jobs/936491838" rel="noopener noreferrer"&gt;here&lt;/a&gt;. From the list of finished jobs it is possible to immediately see the status of each job, whether passed or failed. The status reflects the outcome of the tests.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F3su95vbu37ysptkv28tt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F3su95vbu37ysptkv28tt.png" alt="Alt Text" width="800" height="159"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusions
&lt;/h2&gt;

&lt;p&gt;The great advantage of integrating a solution like the one presented here is the creation of an automated process that tests the integrity of the API service after each change introduced in the repository. This can save developers time and trouble, at the cost of spending a little time preparing the list of requests that the API service must always satisfy. Thanks to the GitLab &lt;a href="https://gitlab.com/enbis/generic-api-gitlabci/-/pipelines" rel="noopener noreferrer"&gt;CI pipeline&lt;/a&gt; integration, the developer gets immediate feedback if the last pushed change has broken anything in the responses of the API service.&lt;/p&gt;

</description>
      <category>tutorial</category>
      <category>devops</category>
      <category>testing</category>
    </item>
    <item>
      <title>Go package: a brief tour into the world of Go Module</title>
      <dc:creator>enbis</dc:creator>
      <pubDate>Sun, 26 Apr 2020 20:37:03 +0000</pubDate>
      <link>https://dev.to/enbis/go-package-a-brief-tour-into-the-world-of-go-module-5hf1</link>
      <guid>https://dev.to/enbis/go-package-a-brief-tour-into-the-world-of-go-module-5hf1</guid>
      <description>&lt;p&gt;Starting from Go 1.11, the Go team introduced a new dependency management system: Modules.&lt;/p&gt;

&lt;p&gt;A Module is a collection of Go Packages with explicit version information. The go.mod file is used to store that information and it is located at the root of the project tree. The idea behind this concept is similar to what npm is for Node.&lt;/p&gt;

&lt;p&gt;In this post, we will see how the main project interacts with the go.mod file and how it handles the different package versions available. First of all, we need a package with some releases, let's develop it.&lt;/p&gt;

&lt;h2&gt;
  
  
  go package
&lt;/h2&gt;

&lt;p&gt;The idea behind the package is to build something extremely simple, whose only purpose is to print its version on the terminal. I followed the semantic versioning convention: &lt;code&gt;v&amp;lt;Major&amp;gt;.&amp;lt;Minor&amp;gt;.&amp;lt;Patch&amp;gt;&lt;/code&gt;. I've already deployed the project on my personal GitHub account &lt;a href="https://github.com/enbis/versioningPack" rel="noopener noreferrer"&gt;github.com/enbis/versioningPack&lt;/a&gt;, but nothing stops you from making your own. Below is the tree of the project folder.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;~$ tree
.
├── go.mod
├── README.md
├── versioning.go
└── versioning_test.go
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  the first release - v1.0.0
&lt;/h3&gt;

&lt;p&gt;As a first release, it's natural to use v1.0.0 as the version identifier. So, let's write a few lines of code to achieve that. &lt;/p&gt;

&lt;p&gt;The versioning.go file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;&lt;span class="k"&gt;package&lt;/span&gt; &lt;span class="n"&gt;versioning&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="s"&gt;"fmt"&lt;/span&gt;

&lt;span class="k"&gt;func&lt;/span&gt; &lt;span class="n"&gt;GetVersion&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;fmt&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Println&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Version 1.0.0 pulled from remote repository"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The versioning_test.go:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;&lt;span class="k"&gt;package&lt;/span&gt; &lt;span class="n"&gt;versioning&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="s"&gt;"testing"&lt;/span&gt;

&lt;span class="k"&gt;func&lt;/span&gt; &lt;span class="n"&gt;TestVersioning&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;t&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="n"&gt;testing&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;T&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;GetVersion&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The go.mod file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;&lt;span class="n"&gt;module&lt;/span&gt; &lt;span class="n"&gt;github&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;com&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;enbis&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;versioningPack&lt;/span&gt;

&lt;span class="k"&gt;go&lt;/span&gt; &lt;span class="m"&gt;1.13&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Well, everything seems to work as expected. The GetVersion function prints the version on the terminal and we can be satisfied with our job. It's time to push the changes and tag the first release with the v1.0.0 version.&lt;/p&gt;

&lt;p&gt;Git versioning:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$ &lt;/span&gt;git add &lt;span class="nb"&gt;.&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;git commit &lt;span class="nt"&gt;-m&lt;/span&gt; &lt;span class="s1"&gt;'first release is ready'&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;git push origin master
&lt;span class="nv"&gt;$ &lt;/span&gt;git tag v1.0.0
&lt;span class="nv"&gt;$ &lt;/span&gt;git push &lt;span class="nt"&gt;--tag&lt;/span&gt; origin master
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  the second release - v1.1.0
&lt;/h3&gt;

&lt;p&gt;Assumption: minor changes required. So, let's start working on the new feature (in this specific case, just changing the printed string). As soon as the string is modified (and the result tested), we are ready to push the changes and tag them in order to create a new release: &lt;code&gt;v1.1.0&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The versioning.go file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;&lt;span class="k"&gt;package&lt;/span&gt; &lt;span class="n"&gt;versioning&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="s"&gt;"fmt"&lt;/span&gt;

&lt;span class="k"&gt;func&lt;/span&gt; &lt;span class="n"&gt;GetVersion&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;fmt&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Println&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Version 1.1.0 pulled from remote repository"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Git versioning:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$ &lt;/span&gt;git add &lt;span class="nb"&gt;.&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;git commit &lt;span class="nt"&gt;-m&lt;/span&gt; &lt;span class="s1"&gt;'second release is ready'&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;git push origin master
&lt;span class="nv"&gt;$ &lt;/span&gt;git tag v1.1.0
&lt;span class="nv"&gt;$ &lt;/span&gt;git push &lt;span class="nt"&gt;--tag&lt;/span&gt; origin master
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  the first major release - v2.0.0
&lt;/h3&gt;

&lt;p&gt;Assumption: breaking changes required. This time the changes modify the package so much that the new version will no longer be backward compatible. It's time to create a new branch, called v2, to prevent incompatibility problems. That's not enough: since we are developing a package, the module path in the go.mod file must also be aligned with the branch.&lt;/p&gt;

&lt;p&gt;The versioning.go file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;&lt;span class="k"&gt;package&lt;/span&gt; &lt;span class="n"&gt;versioning&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="s"&gt;"fmt"&lt;/span&gt;

&lt;span class="k"&gt;func&lt;/span&gt; &lt;span class="n"&gt;GetVersion&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;fmt&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Println&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Version 2.0.0 pulled from remote repository"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The go.mod file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;&lt;span class="n"&gt;module&lt;/span&gt; &lt;span class="n"&gt;github&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;com&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;enbis&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;versioningPack&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;v2&lt;/span&gt;

&lt;span class="k"&gt;go&lt;/span&gt; &lt;span class="m"&gt;1.13&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Git versioning (and branching):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$ &lt;/span&gt;git checkout &lt;span class="nt"&gt;-b&lt;/span&gt; v2
&lt;span class="nv"&gt;$ &lt;/span&gt;git add &lt;span class="nb"&gt;.&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;git commit &lt;span class="nt"&gt;-m&lt;/span&gt; &lt;span class="s1"&gt;'new release is ready'&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;git push origin v2
&lt;span class="nv"&gt;$ &lt;/span&gt;git tag v2.0.0
&lt;span class="nv"&gt;$ &lt;/span&gt;git push &lt;span class="nt"&gt;--tag&lt;/span&gt; origin master
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  all the releases
&lt;/h3&gt;

&lt;p&gt;The job with that package seems finished. Let's recap the versions available.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;two releases on branch master: v1.0.0 and v1.1.0&lt;/li&gt;
&lt;li&gt;one release on branch v2: v2.0.0
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;* 633ad5d (HEAD -&amp;gt; master, tag: v1.1.0, origin/master, origin/HEAD) version 1.1.0 ready
* 215b94a (tag: v1.0.0) rename
* 9e6c026 go mod master
| * 3aa6e2c (tag: v2.0.0, origin/v2, v2) rename
| * dc39bee go mod v2
| * 39f1fad v2.0.0
|/  
* 75ddd41 versioning
* 49d2c42 first commit
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  the project
&lt;/h2&gt;

&lt;p&gt;It's time to create a project in order to use the newly developed (and versioned) package. The purpose of this project is to test all the available versions of github.com/enbis/versioningPack through go.mod. As a last aspect, we will try to customize the behavior of the package locally and reference it instead of a released version. &lt;/p&gt;

&lt;p&gt;First, we need to create the development environment, outside the $GOPATH. Select the path you prefer and run the commands below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;~&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;mkdir &lt;/span&gt;Projects/testVersioning
~&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;cd &lt;/span&gt;Projects/testVersioning/
~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;touch &lt;/span&gt;main.go
~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;go mod init versPack
~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;ll
    totale 12
    drwxr-xr-x 2 enrico enrico 4096 apr 26 10:45 ./
    drwxr-xr-x 5 enrico enrico 4096 apr 26 10:42 ../
    &lt;span class="nt"&gt;-rw-r--r--&lt;/span&gt; 1 enrico enrico   25 apr 26 10:45 go.mod
    &lt;span class="nt"&gt;-rw-r--r--&lt;/span&gt; 1 enrico enrico    0 apr 26 10:43 main.go
~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;cat &lt;/span&gt;go.mod 
    module versPack

    go 1.13
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  let's use the latest available version of the package
&lt;/h3&gt;

&lt;p&gt;Using the latest version of the package is pretty easy: just pull it, and go.mod will resolve the latest tag on the master branch.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;go get github.com/enbis/versioningPack
    go: finding github.com v1.1.0
    go: finding github.com/enbis v1.1.0
~/Projects/other/gomod&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;cat &lt;/span&gt;go.mod 
    module versPack

    go 1.13

    require github.com/enbis/versioningPack v1.1.0 // indirect
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's try the package's feature in the main function.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;&lt;span class="k"&gt;package&lt;/span&gt; &lt;span class="n"&gt;main&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;v1&lt;/span&gt; &lt;span class="s"&gt;"github.com/enbis/versioningPack"&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;func&lt;/span&gt; &lt;span class="n"&gt;main&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;v1&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;GetVersion&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The generated output confirms that go.mod is referring to v1.1.0.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;go run main.go 
    Version 1.1.0 pulled from remote repository
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  let's use version v1.0.0 of the package
&lt;/h2&gt;

&lt;p&gt;Now, we'd like to work with the previous version of versioningPack. We start by deleting both the go.mod and go.sum files; then we init go.mod again and pull a specific version of the github.com/enbis/versioningPack package. The trick lies in how the request is formatted when you pull the package: by adding &lt;code&gt;@&lt;/code&gt; followed by a version, you can request a specific version among the tags on the master branch.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;rm &lt;/span&gt;go.mod go.sum
~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;go mod init versPack
~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;go get github.com/enbis/versioningPack@v1.0.0
    go: finding github.com v1.0.0
    go: finding github.com/enbis v1.0.0
~/Projects/other/gomod&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;cat &lt;/span&gt;go.mod 
    module versPack

    go 1.13

    require github.com/enbis/versioningPack v1.0.0 // indirect
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As expected, we pulled v1.0.0. This is confirmed by running the main function.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;go run main.go 
    Version 1.0.0 pulled from remote repository
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  let's use version v2.0.0 of the package
&lt;/h2&gt;

&lt;p&gt;What happened to our new major version? How can we pull it? If we try to get it using the request &lt;code&gt;go get github.com/enbis/versioningPack@v2.0.0&lt;/code&gt;, an error occurs: invalid module version. That is because v2.0.0 resides on the v2 branch, so we need to specify it when we launch the request. Let's try &lt;code&gt;go get github.com/enbis/versioningPack/v2&lt;/code&gt;; the suffix matches the branch name.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;go get github.com/enbis/versioningPack/v2
~/Projects/other/gomod&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;cat &lt;/span&gt;go.mod 
    module versPack

    go 1.13

    require github.com/enbis/versioningPack/v2 v2.0.0 // indirect
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;All we need to do is update the import path, using the proper reference contained inside the go.mod.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;&lt;span class="k"&gt;package&lt;/span&gt; &lt;span class="n"&gt;main&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="n"&gt;v2&lt;/span&gt; &lt;span class="s"&gt;"github.com/enbis/versioningPack/v2"&lt;/span&gt;

&lt;span class="k"&gt;func&lt;/span&gt; &lt;span class="n"&gt;main&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;v2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;GetVersion&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Trying it, the result is fairly obvious.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;go run main.go 
    Version 2.0.0 pulled from remote repository
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  let's use two different versions at the same time
&lt;/h2&gt;

&lt;p&gt;Now we are going to use both major versions in the same file. This is rarely needed in practice, but it is useful to show the power of Go modules. The two major versions can coexist in the same program, as long as they reside on two different branches and have different module paths in their respective go.mod files.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;go get github.com/enbis/versioningPack/v2
~/Projects/other/gomod&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;cat &lt;/span&gt;go.mod 
    module versPack

    go 1.13

    require &lt;span class="o"&gt;(&lt;/span&gt;
        github.com/enbis/versioningPack v1.1.0 // indirect
        github.com/enbis/versioningPack/v2 v2.0.0 // indirect
    &lt;span class="o"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As usual, two aliases are required to refer to the two versions of the same package.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;&lt;span class="k"&gt;package&lt;/span&gt; &lt;span class="n"&gt;main&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;v1&lt;/span&gt; &lt;span class="s"&gt;"github.com/enbis/versioningPack"&lt;/span&gt;
    &lt;span class="n"&gt;v2&lt;/span&gt; &lt;span class="s"&gt;"github.com/enbis/versioningPack/v2"&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;func&lt;/span&gt; &lt;span class="n"&gt;main&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;v1&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;GetVersion&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;v2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;GetVersion&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is the output.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;~/Projects/testVersioning&lt;span class="nv"&gt;$ &lt;/span&gt;go run main.go 
    Version 1.1.0 pulled from remote repository
    Version 2.0.0 pulled from remote repository
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  testing a new feature of the package before versioning it
&lt;/h2&gt;

&lt;p&gt;So far we have seen how to use different versions of the same package and how to make two different versions coexist. Great, but there is more. You can replace the path of the package inside the go.mod file in order to point to your own copy of the same package. Just add the keyword &lt;code&gt;replace&lt;/code&gt; followed by the path of the package you want to replace.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;~/Projects/other/gomod&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;cat &lt;/span&gt;go.mod 
    module versPack

    go 1.13

    require &lt;span class="o"&gt;(&lt;/span&gt;
        github.com/enbis/versioningPack v1.1.0 // indirect
        github.com/enbis/versioningPack/v2 v2.0.0 // indirect
    &lt;span class="o"&gt;)&lt;/span&gt;

    replace github.com/enbis/versioningPack &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; ../versioningPack
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The idea is straightforward: I copied the package into a folder one level above the project folder. The last line, after the "require" block, states that the original versioningPack is now replaced with the local package. Let's make some changes to the local package.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;&lt;span class="k"&gt;package&lt;/span&gt; &lt;span class="n"&gt;versioning&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="s"&gt;"fmt"&lt;/span&gt;

&lt;span class="k"&gt;func&lt;/span&gt; &lt;span class="n"&gt;GetVersion&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;fmt&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Println&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Version 1.1.1 from local folder"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And run the main function of the project.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;~/Projects/testVersioning$ go run main.go 
    Version 1.1.1 from local folder
    Version 2.0.0 pulled from remote repository
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  conclusion
&lt;/h2&gt;

&lt;p&gt;That's just the tip of the iceberg regarding Go packages and modules, but I hope you found this analysis interesting despite the simplicity of the solution. Enjoy your time developing your new Go packages, which I'm sure will be much more useful than mine: github.com/enbis/versioningPack.&lt;/p&gt;

</description>
      <category>go</category>
      <category>tutorial</category>
      <category>versioning</category>
    </item>
    <item>
      <title>AMQP Exchange type comparison, using GO RabbitMQ client.</title>
      <dc:creator>enbis</dc:creator>
      <pubDate>Wed, 22 Apr 2020 09:43:39 +0000</pubDate>
      <link>https://dev.to/enbis/amqp-exchange-type-comparison-using-go-rabbitmq-client-39p7</link>
      <guid>https://dev.to/enbis/amqp-exchange-type-comparison-using-go-rabbitmq-client-39p7</guid>
      <description>&lt;p&gt;RabbitMQ is a popular and powerful open-source message broker. I don't want to bother who will read it writing details of the technology behind RabbitMQ, prons and cons or mainly use. I prefer to deepen the features of the Advanced Message Queuing Protocol (AMQP) exploring the differences regarding the Exchange type by using some easy examples.&lt;/p&gt;

&lt;h2&gt;
  
  
  requirements
&lt;/h2&gt;

&lt;p&gt;You can trust my word, or you can try the examples below yourself. In the latter case, this is what you need to run them:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;linux distribution&lt;/li&gt;
&lt;li&gt;docker&lt;/li&gt;
&lt;li&gt;Go installed (I used 1.13.3)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  tl;dr
&lt;/h2&gt;

&lt;p&gt;RabbitMQ works as a mediator, or middleware, that allows software components to communicate. The Producer and the Consumer are, respectively, the party that creates a message and the party that receives it; they don't have to reside on the same network, and the interoperability between them allows the use of different technologies. Here, for simplicity, we will work in the same network and everything will be developed in Go. &lt;br&gt;
The Message Broker is responsible for delivering the message from Producer to Consumer, guaranteeing its receipt. The atomic unit of the system is the Message, which carries the transmitted data as binary in its body. The body can hold different formats such as text, XML, or JSON, and because it is stored as binary the data is interpreted unambiguously.&lt;/p&gt;
&lt;h3&gt;
  
  
  Other components involved
&lt;/h3&gt;

&lt;p&gt;We have already met the Producer, the Consumer and the Message Broker, but the communication process described above is an oversimplification. Inside RabbitMQ, several components cooperate to complete the procedure. The message flow starts when the Producer creates the message and sends it to the Broker. The first component the message encounters is the Exchange, which routes it into zero or more Message Queues according to the Exchange type and the match between the routing key of the message and the binding keys of the Bindings. A Binding is a link, or relationship, between a Queue and an Exchange. The Message Queue stores messages in memory or on disk and delivers them to the Consumer.&lt;br&gt;
So the Exchange is the element responsible for delivering the messages by applying a routing algorithm. The core of this post is to explain how the Exchange behavior changes across these types: &lt;strong&gt;fanout&lt;/strong&gt;, &lt;strong&gt;direct&lt;/strong&gt; and &lt;strong&gt;topic&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ftzy7ie9io8euktg14oto.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ftzy7ie9io8euktg14oto.png" alt="Alt Text" width="800" height="415"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  core of the post
&lt;/h2&gt;

&lt;p&gt;Let's try to apply RabbitMQ to an application that can be easily explained, or at least the easiest one I could imagine.&lt;br&gt;
The environment is a home automation system able to control its peripherals (like a light bulb) by receiving messages through the RabbitMQ broker. So the Broker acts as an interface between some apps (Producer) and the actuators of the house (Consumer). &lt;/p&gt;
&lt;h3&gt;
  
  
  project tree
&lt;/h3&gt;

&lt;p&gt;Here is the link to the repository.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;https://github.com/enbis/learning-rabbitmq&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;The project tree is shown below. The four expanded folders are the ones covering the cases discussed in this post; the other folders are side experiments.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.
├── acknowledgment
├── direct-exchange
│   ├── config.yml
│   ├── consumer
│   │   ├── consumer.go
│   │   └── consumer_test.go
│   ├── makefile
│   ├── producer
│   │   ├── producer.go
│   │   └── producer_test.go
│   └── README.md
├── durability
├── fair-dispatch
├── fanout-exchange
│   ├── config.yml
│   ├── consumer
│   │   ├── consumer.go
│   │   └── consumer_test.go
│   ├── makefile
│   ├── producer
│   │   ├── producer.go
│   │   └── producer_test.go
│   └── README.md
├── global
│   ├── config
│   │   └── config.yml
│   ├── connection
│   │   └── handler.go
│   ├── models
│   │   └── models.go
│   └── utils
│       └── utils.go
├── README.md
├── round-robin-queuing
├── start-messaging
└── topic-exchange
    ├── config.yml
    ├── consumer
    │   ├── consumer.go
    │   └── consumer_test.go
    ├── makefile
    ├── producer
    │   ├── producer.go
    │   └── producer_test.go
    └── README.md
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;All the examples rely on the global folder for commonly used functions such as the connection handler and the utils. Within each folder you will find the makefile, the config file, and a README with instructions to run the example.&lt;/p&gt;

&lt;h3&gt;
  
  
  docker
&lt;/h3&gt;

&lt;p&gt;Make sure you pull the correct version of the Docker image, so you can inspect the handy RabbitMQ management UI at localhost:15672.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;docker run --rm --name myrabbitmq -p 5672:5672 -p 15672:15672 rabbitmq:3-management&lt;/code&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  fanout-exchange
&lt;/h3&gt;

&lt;p&gt;The fanout Exchange type only broadcasts: it ignores the binding key and sends messages to every Queue it knows about. So the messages sent by the Producer arrive at the Exchange, which forwards them to every Message Queue unconditionally.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fjvgwgc747ixs36qnoiha.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fjvgwgc747ixs36qnoiha.png" alt="Alt Text" width="800" height="415"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Applying it to the home automation example, we can imagine two Queues operating two distinct rooms, with a Consumer acting as a light actuator inside each room. As soon as the Producer sends the commands to the Exchange, the messages are forwarded to both Queues, so the light is switched on or off simultaneously in both rooms.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;To run the test&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;From within the fanout-exchange folder&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Terminal 0 -&amp;gt; &lt;code&gt;make test/consumer/room0&lt;/code&gt; -&amp;gt; It runs the Consumer#0, as room0.&lt;/li&gt;
&lt;li&gt;Terminal 1 -&amp;gt; &lt;code&gt;make test/consumer/room1&lt;/code&gt; -&amp;gt; It runs the Consumer#1, as room1.&lt;/li&gt;
&lt;li&gt;Terminal 2 -&amp;gt; &lt;code&gt;make test/producer&lt;/code&gt; -&amp;gt; It runs the Producer, that sends messages to switch on / off the light bulb.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;It will produce the following results&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Terminal 2 -&amp;gt; The Producer&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;0: Light bulb On
1: Light bulb Off
2: Light bulb On
3: Light bulb Off
4: Light bulb On
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Terminal 0 -&amp;gt; The Consumer#0 as a Room0&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Waiting messages
message received: Light bulb On
message received: Light bulb Off
message received: Light bulb On
message received: Light bulb Off
message received: Light bulb On
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Terminal 1 -&amp;gt; The Consumer#1 as a Room1&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Waiting messages
message received: Light bulb On
message received: Light bulb Off
message received: Light bulb On
message received: Light bulb Off
message received: Light bulb On
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  direct-exchange
&lt;/h3&gt;

&lt;p&gt;The direct Exchange handles a simple routing algorithm: a message goes to the Queues whose binding key exactly matches the routing key of the message. This type of Exchange can forward data selectively; on the other hand, it can't route based on multiple criteria. So the Exchange receives a message with routing key R and sends it to a Queue bound to the Exchange with binding key B only if R and B are equal.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6lsx5e2hpi60gx84wuhl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6lsx5e2hpi60gx84wuhl.png" alt="Alt Text" width="800" height="415"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Applying it to the home automation example, again we imagine two Queues operating two distinct rooms. The Queue for the first room is bound to the Exchange with binding keys &lt;strong&gt;all_rooms&lt;/strong&gt; and &lt;strong&gt;first_room&lt;/strong&gt;; the Queue for the second room is bound with binding keys &lt;strong&gt;all_rooms&lt;/strong&gt; and &lt;strong&gt;second_room&lt;/strong&gt;. Messages with routing key all_rooms are routed to both Queues, whereas messages with routing key first_room or second_room are routed respectively to the first or the second Queue. In our imaginary home automation system, we can send the command to switch the lights on with routing key all_rooms, and switch them off selectively using the other two routing keys. &lt;/p&gt;
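The exact-match rule can be sketched in a few lines of Go (an illustrative model of the routing decision only; `directRoute` is a hypothetical helper, not part of any RabbitMQ client API):

```go
package main

import "fmt"

// directRoute mimics a direct exchange: a message is delivered only to
// the queues bound with a key exactly equal to the message's routing key.
func directRoute(bindings map[string][]string, routingKey string) []string {
	return bindings[routingKey]
}

func main() {
	// Queue0 is bound with all_rooms and first_room,
	// Queue1 with all_rooms and second_room, as in the example above.
	bindings := map[string][]string{
		"all_rooms":   {"queue0", "queue1"},
		"first_room":  {"queue0"},
		"second_room": {"queue1"},
	}
	fmt.Println(directRoute(bindings, "all_rooms"))  // [queue0 queue1]
	fmt.Println(directRoute(bindings, "first_room")) // [queue0]
}
```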

&lt;p&gt;&lt;strong&gt;To run the test&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;From within the direct-exchange folder&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Terminal 0 -&amp;gt; &lt;code&gt;make test/consumer/room0&lt;/code&gt; -&amp;gt; It runs the Consumer#0, bound with the all_rooms and first_room binding keys.&lt;/li&gt;
&lt;li&gt;Terminal 1 -&amp;gt; &lt;code&gt;make test/consumer/room1&lt;/code&gt; -&amp;gt; It runs the Consumer#1, bound with the all_rooms and second_room binding keys.&lt;/li&gt;
&lt;li&gt;Terminal 2 -&amp;gt; &lt;code&gt;make test/producer&lt;/code&gt; -&amp;gt; It runs the Producer. It sends 3 messages: Bulb On, Bulb Off and Bulb Off. Each message is routed to the proper Queue, depending on its routing key.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;It will produce the following results&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Terminal 2 -&amp;gt; The Producer&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;0: Light bulb On to routingKey all_rooms
1: Light bulb Off to routingKey first_room
2: Light bulb Off to routingKey second_room
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Terminal 0 -&amp;gt; The Consumer#0 as a Room0&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Binding to  all_rooms
Binding to  first_room
Waiting messages

message received: Light bulb On
message received: Light bulb Off
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Terminal 1 -&amp;gt; The Consumer#1 as a Room1&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Binding to all_rooms
Binding to second_room
Waiting messages

message received: Light bulb On
message received: Light bulb Off
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  topic-exchange
&lt;/h3&gt;

&lt;p&gt;The topic Exchange handles a more complex routing algorithm, leveraging routing keys composed of a list of words delimited by dots. The logic behind the topic Exchange is not so different from the direct Exchange: a message carrying a routing key is delivered to all Queues bound to the Exchange with a matching binding-key pattern. The difference with respect to the previous case is that special characters allow a much wider range of combinations.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;*&lt;/code&gt; star substitutes exactly one word of the key pattern&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;#&lt;/code&gt; hash substitutes zero or more words of the key pattern&lt;/li&gt;
&lt;/ul&gt;
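These two wildcard rules can be sketched as a small recursive matcher in Go (an illustrative model only, not RabbitMQ's actual implementation; the `matches` helper is hypothetical):

```go
package main

import (
	"fmt"
	"strings"
)

// match reports whether a dot-separated routing key matches a binding
// pattern: "*" matches exactly one word, "#" matches zero or more words.
func match(pattern, key []string) bool {
	if len(pattern) == 0 {
		return len(key) == 0
	}
	switch pattern[0] {
	case "#":
		// "#" may consume zero words...
		if match(pattern[1:], key) {
			return true
		}
		// ...or one more word, then try again.
		if len(key) == 0 {
			return false
		}
		return match(pattern, key[1:])
	case "*":
		// "*" consumes exactly one word, whatever it is.
		if len(key) == 0 {
			return false
		}
		return match(pattern[1:], key[1:])
	default:
		// A literal word must be present and equal.
		if len(key) == 0 || pattern[0] != key[0] {
			return false
		}
		return match(pattern[1:], key[1:])
	}
}

func matches(pattern, key string) bool {
	return match(strings.Split(pattern, "."), strings.Split(key, "."))
}

func main() {
	fmt.Println(matches("#.light", "house.room1.light"))              // true
	fmt.Println(matches("*.garage.*", "house.garage.door"))           // true
	fmt.Println(matches("#.light", "house.garage.light.desktopLamp")) // false
	fmt.Println(matches("*.garage.*", "house.backyard.irrigation"))   // false
}
```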

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F80fekptmx351kj74dyyb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F80fekptmx351kj74dyyb.png" alt="Alt Text" width="800" height="415"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Applying it to the home automation example, we have a broader case study to work on. First we need to specify the categories used to define the pattern. Thinking of a list of three words, we can identify the categories as follows: &amp;lt;building&amp;gt;.&amp;lt;room&amp;gt;.&amp;lt;target&amp;gt;. Now we can imagine the Queues being addressed to groups of different actuators: the first one acts on the lights, the second on everything related to the garage area (lights included). Back to the binding patterns: the first Queue handles the pattern &lt;strong&gt;#.light&lt;/strong&gt;, meaning it is interested in everything related to the lights, while the second Queue receives all commands related to the garage: &lt;strong&gt;*.garage.*&lt;/strong&gt;. In this situation, every message whose routing key ends with &lt;strong&gt;light&lt;/strong&gt; is forwarded to the first Queue, while every message with &lt;strong&gt;garage&lt;/strong&gt; as the second word of its routing key is intended for the second Queue. In some cases, a message is forwarded to both Queues.  &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;This is the list of messages tested&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;routing key &lt;code&gt;house.room1.light&lt;/code&gt;, with body On -&amp;gt; delivered to the lights Consumer only&lt;/li&gt;
&lt;li&gt;routing key &lt;code&gt;house.garage.light&lt;/code&gt;, with body On -&amp;gt; delivered to both Consumers&lt;/li&gt;
&lt;li&gt;routing key &lt;code&gt;house.garage.door&lt;/code&gt;, with body Open -&amp;gt; delivered to the garage Consumer only&lt;/li&gt;
&lt;li&gt;routing key &lt;code&gt;house.backyard.irrigation&lt;/code&gt;, with body On -&amp;gt; neither of the two Consumers receives the message&lt;/li&gt;
&lt;li&gt;routing key &lt;code&gt;house.garage.light.desktopLamp&lt;/code&gt;, with body On -&amp;gt; neither of the two Consumers receives the message, since the four-word key matches neither pattern.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The Consumer uses the simple &lt;code&gt;strings.Split()&lt;/code&gt; function to read the routing key and determine the action contained in the body of the message.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;go func() {
    for msg := range msgs {
        s := strings.Split(string(msg.RoutingKey), ".")
        fmt.Println("Where: " + s[1])
        fmt.Println("What: " + s[2])
        fmt.Println(string(msg.Body))
        fmt.Println("----------")
    }

}()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;To run the test&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;From within the topic-exchange folder&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Terminal 0 -&amp;gt; &lt;code&gt;make test/consumer/lights&lt;/code&gt; -&amp;gt; It runs the Consumer#0, that handles this binding pattern: &lt;code&gt;#.light&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Terminal 1 -&amp;gt; &lt;code&gt;make test/consumer/garage&lt;/code&gt; -&amp;gt; It runs the Consumer#1, that handles this binding pattern: &lt;code&gt;*.garage.*&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Terminal 2 -&amp;gt; &lt;code&gt;make test/producer&lt;/code&gt; -&amp;gt; It runs the Producer, sending 5 messages with different routing keys.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;It will produce the following results&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Terminal 2 -&amp;gt; The Producer&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;0: value 0 Action: On to routingKey house.room1.light
1: value 1 Action: On to routingKey house.garage.light
2: value 2 Action: Open to routingKey house.garage.door
3: value 3 Action: On to routingKey house.backyard.irrigation
4: value 4 Action: On to routingKey house.garage.light.desktopLamp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Terminal 0 -&amp;gt; The Consumer#0&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;qbinding  #.light
Waiting messages

Where: room1
What: light
0 Action: On
----------
Where: garage
What: light
1 Action: On
----------
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Terminal 1 -&amp;gt; The Consumer#1&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;qbinding  *.garage.*
Waiting messages

Where: garage
What: light
1 Action: On
----------
Where: garage
What: door
2 Action: Open
----------
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  conclusion
&lt;/h2&gt;

&lt;p&gt;After all this, my personal advice is to keep using your hands to turn the house lights on and off 🙌💡&lt;/p&gt;

</description>
      <category>amqp</category>
      <category>go</category>
      <category>rabbitmq</category>
    </item>
    <item>
      <title>Generate and run WebAssembly code using Go</title>
      <dc:creator>enbis</dc:creator>
      <pubDate>Sun, 22 Mar 2020 19:38:54 +0000</pubDate>
      <link>https://dev.to/enbis/generate-and-run-webassembly-code-using-go-4fbp</link>
      <guid>https://dev.to/enbis/generate-and-run-webassembly-code-using-go-4fbp</guid>
      <description>&lt;p&gt;Dive in the features of WebAssembly is not the focus along these few lines and for sure I'm not the most suitable person to do that. My purpose is to talk about how WebAssembly and Go can work together, since I'm interested in all the possible applications of Go language and recently I discovered this new solution that I'll try to show here step by step.&lt;br&gt;
For those who have never heard WebAssembly I think could be fair a brief presentation of about what that standard is and what is used for.&lt;/p&gt;
&lt;h2&gt;
  
  
  A few words on WebAssembly
&lt;/h2&gt;

&lt;p&gt;Probably the best way to start is to take a look at the official website &lt;a href="https://webassembly.org/" rel="noopener noreferrer"&gt;https://webassembly.org/&lt;/a&gt;.&lt;br&gt;
&lt;code&gt;&lt;br&gt;
WebAssembly is a binary instruction format for a stack-based virtual machine. Wasm is designed as a portable target for compilation of high-level languages like C/C++/Rust, enabling deployment on the web for client and server applications.&lt;br&gt;
&lt;/code&gt;&lt;br&gt;
Let's analyze the keywords.&lt;br&gt;
The first thing to know is that WebAssembly code is compiled to a binary format; the file generated by the build has the &lt;code&gt;.wasm&lt;/code&gt; extension. &lt;br&gt;
The JavaScript engine decodes and interprets the .wasm file once the web page loads it. To execute the code, a stack-based virtual machine keeps a pointer to the position of the executing code, and a virtual control tracks the blocks within the code as a stack. &lt;br&gt;
Portability means the wasm file can be executed efficiently on a variety of operating systems and architectures: WebAssembly is not tied to a specific environment, and generating the binary from source code written in high-level languages is easy.&lt;br&gt;&lt;br&gt;
WebAssembly in a nutshell:  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;useful for developers who want to run code in a web browser without needing third-party plugins; the major browsers natively support WebAssembly&lt;/li&gt;
&lt;li&gt;it is fast: WebAssembly code runs at a speed pretty close to native speed&lt;/li&gt;
&lt;li&gt;the .wasm binary can be generated from a wide range of languages like C, C++ and Rust, and now even Go&lt;/li&gt;
&lt;li&gt;the browser's JavaScript engine interprets the .wasm file and executes its functions &lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Build the WebAssembly binary file using GO code
&lt;/h2&gt;

&lt;p&gt;The WebAssembly compile target is available starting from Go 1.11. &lt;br&gt;
Go 1.12 added some breaking changes in &lt;code&gt;syscall/js&lt;/code&gt;, the package that will be briefly explored below. To proceed with this post without wasting time fixing errors, make sure you have Go 1.12 or higher installed. You can quickly check the installed version with the command &lt;code&gt;go version&lt;/code&gt;. &lt;br&gt;
Now we can spend a few minutes exploring the algorithm we will use to achieve our purpose. Thinking of the easiest solution, we can read two integers received as parameters from the web page and print out their sum. Let's take a look at the code before moving to the build process for the .wasm file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;package main

import (
    "fmt"
    "strconv"
    "syscall/js"
)

var done = make(chan struct{})

func main() {
    callback := js.FuncOf(printResult)
    defer callback.Release()
    setResult := js.Global().Get("setResult")
    setResult.Invoke(callback)
    &amp;lt;-done
}

func printResult(value js.Value, args []js.Value) interface{} {
    value1 := args[0].String()
    v1, err := strconv.Atoi(value1)
    if err != nil {
        fmt.Printf("error %s\n", err.Error())
        return err
    }
    value2 := args[1].String()
    v2, err := strconv.Atoi(value2)
    if err != nil {
        fmt.Printf("error %s\n", err.Error())
        return err
    }

    fmt.Printf("%d\n", v1+v2)
    done &amp;lt;- struct{}{}
    return nil
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;printResult&lt;/code&gt; function does nothing more than get the values from the JavaScript arguments, parse them to int, and print the sum. The channel is used to signal that the callback has been called.&lt;br&gt;&lt;br&gt;
The main function holds perhaps the most obscure aspects of this simple example. First of all, let's look at the imported package: &lt;code&gt;syscall/js&lt;/code&gt;; here is the link for more information &lt;a href="https://golang.org/pkg/syscall/js/" rel="noopener noreferrer"&gt;https://golang.org/pkg/syscall/js/&lt;/a&gt;.&lt;br&gt;
&lt;code&gt;&lt;br&gt;
The package gives access to the WebAssembly host environment when using the js/wasm architecture. Its API is based on JavaScript semantics.&lt;br&gt;
&lt;/code&gt;&lt;br&gt;
&lt;code&gt;js.FuncOf(printResult)&lt;/code&gt; wraps the Go function &lt;code&gt;printResult&lt;/code&gt; as a callback whose job is the sum operation. The JavaScript values are passed to that function through the &lt;code&gt;js.Value&lt;/code&gt; array. The &lt;code&gt;defer callback.Release()&lt;/code&gt; frees the callback's resources as soon as the function returns. &lt;code&gt;setResult&lt;/code&gt; is the JavaScript property used as a resolver of the promise that waits for the result of the wrapped Go callback. We will look at this aspect more closely later. &lt;br&gt;
Now it's time to build the wasm file. It's an easy extension of the build command we already use when compiling Go code; just be careful to specify the js/wasm architecture using the GOOS and GOARCH environment variables.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;GOOS=js GOARCH=wasm go build -o main.wasm toWasm.go
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;GOOS specifies the target operating system; here the default value is overridden with js &lt;/li&gt;
&lt;li&gt;GOARCH specifies the processor architecture; here it is overridden with the WebAssembly value, wasm.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The GOOS and GOARCH values specified in the command tell Go to create a WebAssembly file; without them the .wasm file won't be created. If everything went well, we will find the &lt;code&gt;main.wasm&lt;/code&gt; file in our directory.  &lt;/p&gt;

&lt;h2&gt;
  
  
  Load the build and launch the HTML file
&lt;/h2&gt;

&lt;p&gt;The first step is to copy the freshly generated main.wasm into the directory chosen to hold the HTML code. After that we need to copy the &lt;code&gt;wasm_exec.js&lt;/code&gt; file into the same directory; this file ships with the Go installation. &lt;br&gt;
Assuming you are already in the working directory, copy and paste the following command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cp "$(go env GOROOT)/misc/wasm/wasm_exec.js" .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now that you have both &lt;code&gt;main.wasm&lt;/code&gt; and &lt;code&gt;wasm_exec.js&lt;/code&gt; in the same path, there is nothing left to do but add the &lt;code&gt;index.html&lt;/code&gt;. This is the tree output of my working dir.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ tree
.
├── index.html
├── main.wasm
└── wasm_exec.js

0 directories, 3 files
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
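One practical note: `WebAssembly.instantiateStreaming` fetches the wasm file over HTTP, so opening index.html directly from the filesystem generally won't work; the three files have to be served by a web server. A minimal sketch in Go (the :8080 port is an arbitrary choice):

```go
package main

import (
	"log"
	"net/http"
)

func main() {
	// Serve index.html, main.wasm and wasm_exec.js from the current directory.
	log.Fatal(http.ListenAndServe(":8080", http.FileServer(http.Dir("."))))
}
```

Run it from the working directory, then open http://localhost:8080 in the browser.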



&lt;p&gt;Given what we need to provide to the &lt;code&gt;printResult&lt;/code&gt; function, it is necessary to add two &lt;code&gt;&amp;lt;input&amp;gt;&lt;/code&gt; elements to get the values from the web page and a &lt;code&gt;&amp;lt;button&amp;gt;&lt;/code&gt; to trigger the sum. The result will be printed to the browser console.&lt;/p&gt;

&lt;p&gt;Beyond that, the &lt;code&gt;index.html&lt;/code&gt; &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;includes the &lt;code&gt;wasm_exec.js&lt;/code&gt; file we copied before, which allows us to use the &lt;code&gt;Go()&lt;/code&gt; constructor&lt;/li&gt;
&lt;li&gt;loads the &lt;code&gt;main.wasm&lt;/code&gt; as source using the &lt;code&gt;WebAssembly.instantiateStreaming()&lt;/code&gt; method. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The return values of &lt;code&gt;instantiateStreaming()&lt;/code&gt; are &lt;code&gt;mod = result.module&lt;/code&gt;, the compiled WebAssembly module, and &lt;code&gt;inst = result.instance&lt;/code&gt;, the object that contains the exported functions. &lt;br&gt;
The &lt;code&gt;const go = new Go()&lt;/code&gt; instance is needed to run the exported code.&lt;br&gt;
Finally, &lt;code&gt;run()&lt;/code&gt; is an async function that waits for the result of the promise. The job consists of passing the input values to the Go code. As soon as the promise makes the result available, the code is free to reset the instance with &lt;code&gt;inst = await WebAssembly.instantiate(mod, go.importObject)&lt;/code&gt; so it is ready for the next run.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;html&amp;gt;
    &amp;lt;head&amp;gt;
        &amp;lt;meta charset="utf-8"&amp;gt;
        &amp;lt;title&amp;gt;Go WebAssembly&amp;lt;/title&amp;gt;
    &amp;lt;/head&amp;gt;

    &amp;lt;body&amp;gt;
        &amp;lt;script src="wasm_exec.js"&amp;gt;&amp;lt;/script&amp;gt;
        &amp;lt;script&amp;gt;
            if (!WebAssembly.instantiateStreaming) {
                WebAssembly.instantiateStreaming = async (resp, importObject) =&amp;gt; {
                    const source = await(await resp).arrayBuffer();
                    return await WebAssembly.instantiate(source, importObject);
                };
            }
            const go = new Go()
            let mod, inst;
            WebAssembly.instantiateStreaming(fetch("main.wasm"), go.importObject).then((result) =&amp;gt; {
                mod = result.module;
                inst = result.instance;
            }).catch((err) =&amp;gt;{
                console.error(err)
            });

            var setResult
            async function run() {

                const printResultPromise = new Promise(resolve =&amp;gt; {
                    setResult = resolve
                })
                const run = go.run(inst)

                const printResult = await printResultPromise

                printResult(document.querySelector('#value1').value, document.querySelector('#value2').value)
                await run 

                inst = await WebAssembly.instantiate(mod, go.importObject)
            }
        &amp;lt;/script&amp;gt;
        &amp;lt;button onClick="run()" id="runButton"&amp;gt;Run&amp;lt;/button&amp;gt;
        &amp;lt;input id="value1" type="text"&amp;gt;
        &amp;lt;input id="value2" type="text"&amp;gt;
    &amp;lt;/body&amp;gt;

&amp;lt;/html&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now that the code review is done, let's try the solution by loading the &lt;code&gt;index.html&lt;/code&gt; file in the browser; this will be the outcome.  &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fej1r9k3tj6lsq9j3s4hx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fej1r9k3tj6lsq9j3s4hx.png" alt="Alt Text" width="800" height="292"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is just a silly example, but the point is that producing WebAssembly code from Go is easy thanks to Go's cross-compilation capabilities. That opens new possibilities for leveraging browser support, no longer for the exclusive benefit of JavaScript.&lt;/p&gt;

</description>
      <category>go</category>
      <category>webassembly</category>
    </item>
    <item>
      <title>How udev rules can help us to recognize a usb-to-serial device over /dev/tty interface</title>
      <dc:creator>enbis</dc:creator>
      <pubDate>Wed, 22 Jan 2020 22:52:13 +0000</pubDate>
      <link>https://dev.to/enbis/how-udev-rules-can-help-us-to-recognize-a-usb-to-serial-device-over-dev-tty-interface-pbk</link>
      <guid>https://dev.to/enbis/how-udev-rules-can-help-us-to-recognize-a-usb-to-serial-device-over-dev-tty-interface-pbk</guid>
<description>&lt;p&gt;This is the story of how I found an easy solution to distinguish two identical devices connected to the pc via a usb-to-serial interface. At first I tried to understand how the pc allocates the &lt;code&gt;/dev/ttyUSB&lt;/code&gt; interfaces, searching for the logic behind their incremental allocation. In particular, I had problems when the pc boots with both usb ports connected: in that situation, figuring out which &lt;code&gt;ttyUSB&lt;/code&gt; interface a device is mapped to can be tricky. Luckily &lt;strong&gt;udev rules&lt;/strong&gt; helped me break the deadlock, and now I'll explain how, starting from the beginning.  &lt;/p&gt;

&lt;p&gt;I was working on a service dedicated to processing input data coming from two identical usb-to-serial devices. On the code side it was a very simple service written in &lt;em&gt;Go&lt;/em&gt;, whose purpose is to open two goroutines and manage the data streams.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;var reader *bufio.Reader
c := &amp;amp;serial.Config{Name: "COM_name", Baud: 115200}
s, err := serial.OpenPort(c)
if err != nil {
    fmt.Printf("Open port error: %s\n", err)
    os.Exit(1)
}
reader = bufio.NewReader(s)
scanner := bufio.NewScanner(reader)
for scanner.Scan() {
    ...
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These few lines of code open a serial connection via a COM port and start reading the stream coming from the connected device. There is one parameter that can be troublesome to manage at runtime, especially with more than one device to be acknowledged: the &lt;em&gt;COM_name&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;At boot time, the machine assigns an incremental number to each plugged ttyUSB device, apparently without following any logical process. In my case, the two devices were generally assigned the values 0 and 1, but I couldn't control the order of assignment. This means that on some boots the ttyUSB0 port was associated with device A and ttyUSB1 with device B, and on others vice versa. To avoid this problem I had to refer to a &lt;strong&gt;symbolic link&lt;/strong&gt; for each device, one that remains valid across reboots. The udev rules can do that for me.&lt;/p&gt;

&lt;h2&gt;
  
  
  Checklist of the process
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;identify the environment variable needed to distinguish the devices, using the &lt;strong&gt;udevadm&lt;/strong&gt; command &lt;/li&gt;
&lt;li&gt;write the &lt;strong&gt;udev rule&lt;/strong&gt; that uses the environment variable to create the specific SYMLINK&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;reload the rules&lt;/strong&gt; in order to apply the changes&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Conventions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;# Linux command to be executed with root privileges. &lt;/li&gt;
&lt;li&gt;$ Linux command to be executed as a regular non-privileged user&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  1. udevadm
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;udev&lt;/strong&gt; is a device manager for the Linux kernel that manages the device nodes in the /dev directory.&lt;br&gt;
So, first of all you need to find out which /dev/tty interface the machine assigned to the devices. In my case, I already knew that Linux grouped my devices under the default UART-over-USB name, &lt;em&gt;ttyUSBx&lt;/em&gt; ( x stands for the incremental index ). Other possibilities could be ttyS or ttyACM.&lt;br&gt;&lt;br&gt;
To list the assigned interfaces, just run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ ls -al /dev/ttyUSB*

crw-rw---- 1 root dialout 188, 0 gen 21 12:24 /dev/ttyUSB0
crw-rw---- 1 root dialout 188, 1 gen 21 12:24 /dev/ttyUSB1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now you have the devpath you need to learn more about your devices, so it's time to run the &lt;strong&gt;udevadm info&lt;/strong&gt; command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ udevadm info -a -n /dev/ttyUSB0

Udevadm info starts with the device specified by the devpath and then
walks up the chain of parent devices. It prints for every device
found, all possible attributes in the udev rules key format.
A rule to match, can be composed by the attributes of the device
and the attributes from one single parent device.

looking at device 
'/devices/pci0000:00/0000:00:14.0/usb1/1-8/1-8.1/1-8.1:1.0/ttyUSB0/tty/ttyUSB0':
    KERNEL=="ttyUSB0"
    SUBSYSTEM=="tty"
    DRIVER==""

looking at parent device 
'/devices/pci0000:00/0000:00:14.0/usb1/1-8/1-8.1/1-8.1:1.0/ttyUSB0':
    KERNELS=="ttyUSB0"
    SUBSYSTEMS=="usb-serial"
    DRIVERS=="ftdi_sio"
    ATTRS{latency_timer}=="16"
    ATTRS{port_number}=="0"

looking at parent device 
'/devices/pci0000:00/0000:00:14.0/usb1/1-8/1-8.1/1-8.1:1.0':
    KERNELS=="1-8.1:1.0"
    SUBSYSTEMS=="usb"
    DRIVERS=="ftdi_sio"
    ATTRS{authorized}=="1"
    ATTRS{bAlternateSetting}==" 0"
    ATTRS{bInterfaceClass}=="ff"
    ATTRS{bInterfaceNumber}=="00"
    ATTRS{bInterfaceProtocol}=="ff"
    ATTRS{bInterfaceSubClass}=="ff"
    ATTRS{bNumEndpoints}=="02"
    ATTRS{interface}=="brd3"
    ATTRS{supports_autosuspend}=="1"
    ....
    ....
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;At the top, that command prints the attributes of the specified device, then walks up the chain of parent devices. Scrolling through the information, you can spot differences between two identical devices ( by comparing the ATTRS{...} values ). Unfortunately, those attributes were not available for udev matching rules, so you have to extract the &lt;strong&gt;environment variables&lt;/strong&gt; instead. Taking note of the device devpath ( careful: the first devpath, not the parent device's ), you can extract the differences between the environment variables with the &lt;strong&gt;udevadm test&lt;/strong&gt; command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ udevadm test 
'/devices/pci0000:00/0000:00:14.0/usb1/1-8/1-8.1/1-8.1:1.0/ttyUSB0/tty/ttyUSB0' &amp;gt;u0

$ udevadm test 
'/devices/pci0000:00/0000:00:14.0/usb1/1-8/1-8.3/1-8.3:1.0/ttyUSB1/tty/ttyUSB1' &amp;gt;u1

$ diff -u u0 u1

 .ID_PORT=0
 ACTION=add
-DEVLINKS=/dev/serial/by-id/usb-LABS_brd3_A95B4RL5-if00-port0 /dev/serial/by-path/pci-0000:00:14.0-usb-0:8.3:1.0-port0
-DEVNAME=/dev/ttyUSB0
-DEVPATH=/devices/pci0000:00/0000:00:14.0/usb1/1-8/1-8.3/1-8.3:1.0/ttyUSB0/tty/ttyUSB0
+DEVLINKS=/dev/serial/by-id/usb-LABS_brd4_A93TPMCI-if00-port0 /dev/serial/by-path/pci-0000:00:14.0-usb-0:8.2:1.0-port0
+DEVNAME=/dev/ttyUSB1
+DEVPATH=/devices/pci0000:00/0000:00:14.0/usb1/1-8/1-8.2/1-8.2:1.0/ttyUSB1/tty/ttyUSB1
 ID_BUS=usb
 ID_MM_CANDIDATE=1
-ID_MODEL=brd3
-ID_MODEL_ENC=brd3
+ID_MODEL=brd4
+ID_MODEL_ENC=brd4
 ID_MODEL_FROM_DATABASE=FT232 Serial (UART) IC
-ID_PATH=pci-0000:00:14.0-usb-0:8.3:1.0
-ID_PATH_TAG=pci-0000_00_14_0-usb-0_8_3_1_0
+ID_PATH=pci-0000:00:14.0-usb-0:8.2:1.0
+ID_PATH_TAG=pci-0000_00_14_0-usb-0_8_2_1_0
-ID_SERIAL_SHORT=A95B4RL5
+ID_SERIAL_SHORT=A93TPMCI
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;From this list I chose &lt;strong&gt;ID_MODEL&lt;/strong&gt; as the variable to recognize and differentiate the two devices. Now I can write the &lt;strong&gt;.rules&lt;/strong&gt; file that creates and persists the SYMLINK.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. udev rules
&lt;/h3&gt;

&lt;p&gt;Udev rules are defined in files with the .rules extension. The location reserved for custom rules is &lt;code&gt;/etc/udev/rules.d/&lt;/code&gt;. By convention these files are named with a number prefix, then the name of the rule and the .rules extension, and they are processed in lexical order. A udev rule is composed of a &lt;em&gt;match section&lt;/em&gt;, which defines the conditions, and an &lt;em&gt;action section&lt;/em&gt;, which performs the action. Since both of my devices are based on an FTDI adapter, it is a good idea to start the rule from the idVendor and idProduct attributes, which are the same for the two interfaces. The comparison above showed that the devices differ in the ID_MODEL environment variable, so I can use that value to create the symbolic link.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#cat &amp;gt;/etc/udev/rules.d/99-usb-serial.rules &amp;lt;&amp;lt;'EOT'
SUBSYSTEM=="tty", ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6001",
SYMLINK+="tty%E{ID_MODEL}"
EOT
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That is the rule; now I just have to reload the rules to activate it.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. reload the rules
&lt;/h3&gt;

&lt;p&gt;Reloading the rules makes udev pick up the new file; a reboot ensures the change is applied cleanly.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# udevadm control --reload
# reboot
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As soon as the computer is turned on, and on every subsequent boot, you will see the &lt;strong&gt;symbolic link&lt;/strong&gt; created by the .rules file. This means you will always know which COM port corresponds to each device plugged into the pc, since the link is based on its ID_MODEL.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ ls -al /dev/tty* | grep USB
lrwxrwxrwx 1 root   root          7 gen 21 12:24 /dev/ttybrd3 -&amp;gt; ttyUSB0
lrwxrwxrwx 1 root   root          7 gen 21 12:24 /dev/ttybrd4 -&amp;gt; ttyUSB1
crw-rw---- 1 root   dialout 188,  0 gen 21 12:24 /dev/ttyUSB0
crw-rw---- 1 root   dialout 188,  1 gen 21 12:24 /dev/ttyUSB1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The random index the machine assigns to the /dev/ttyUSB device interfaces is no longer a problem.&lt;/p&gt;
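&lt;p&gt;Back in the Go service, the stable symlink can replace the hard-coded &lt;em&gt;COM_name&lt;/em&gt;. As a minimal sketch (the &lt;code&gt;devPath&lt;/code&gt; helper is illustrative, not part of the original service), the rule &lt;code&gt;SYMLINK+="tty%E{ID_MODEL}"&lt;/code&gt; maps each ID_MODEL to a fixed path:&lt;/p&gt;

```go
package main

import "fmt"

// devPath mirrors the udev rule SYMLINK+="tty%E{ID_MODEL}": for a
// given ID_MODEL, the stable link lives at /dev/tty followed by the
// model name. Illustrative helper, not from the original post.
func devPath(idModel string) string {
	return "/dev/tty" + idModel
}

func main() {
	// The service can now open this stable name instead of /dev/ttyUSB0.
	fmt.Println(devPath("brd3")) // /dev/ttybrd3
}
```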

</description>
      <category>linux</category>
      <category>go</category>
      <category>tutorial</category>
    </item>
  </channel>
</rss>
