<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Fabrizio Guglielmino</title>
    <description>The latest articles on DEV Community by Fabrizio Guglielmino (@guglielmino).</description>
    <link>https://dev.to/guglielmino</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F124224%2F2fd0ddfe-75f7-48f1-bbe0-6b5b0b4a52b3.png</url>
      <title>DEV Community: Fabrizio Guglielmino</title>
      <link>https://dev.to/guglielmino</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/guglielmino"/>
    <language>en</language>
    <item>
      <title>Refactor to Microservices with gRPC</title>
      <dc:creator>Fabrizio Guglielmino</dc:creator>
      <pubDate>Sun, 26 May 2019 17:20:33 +0000</pubDate>
      <link>https://dev.to/guglielmino/refactor-to-microservices-with-grpc-33id</link>
      <guid>https://dev.to/guglielmino/refactor-to-microservices-with-grpc-33id</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;A developer's job is hard; we are often literally exposed to infernal conditions (I have a tendency to dramatize :-) ). Some time ago it was &lt;a href="https://en.wikipedia.org/wiki/DLL_Hell"&gt;DLL Hell&lt;/a&gt;, more recently &lt;a href="http://callbackhell.com/"&gt;callback hell&lt;/a&gt;, but the one I fear the most is &lt;strong&gt;THE LEGACY CODE HELL&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--x6E5nhMf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://24t9d72kcs873my15o9hr1pu-wpengine.netdna-ssl.com/wp-content/uploads/2014/12/06-programming-coding-is-hell.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--x6E5nhMf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://24t9d72kcs873my15o9hr1pu-wpengine.netdna-ssl.com/wp-content/uploads/2014/12/06-programming-coding-is-hell.png" alt="Legacy code Hell"&gt;&lt;/a&gt;&lt;br&gt;
credits toggl.com&lt;/p&gt;

&lt;p&gt;In a perfect world you create projects from scratch, choosing the architectural patterns and tools that fit what you want to achieve. Since we're not in a perfect world, more often than not we need to work on legacy code. In my career this has happened many times, and I'm surely not alone; that's why psychologists earn so much money.&lt;/p&gt;

&lt;h2&gt;
  
  
  Splitting the Hell
&lt;/h2&gt;

&lt;p&gt;A piece of Hell is better than the whole Hell, or at least that's how it's supposed to be. What I'm going to describe is an approach for splitting a legacy application into small pieces while still ruling them as one application.&lt;/p&gt;

&lt;p&gt;The subject of this article is a monolithic Python 2.7 application. The approach is to create a proof of concept to validate the progressive porting of a monolithic codebase to a microservices architecture.&lt;br&gt;
Microservice is an abused term, a buzzword if you like, but it's an interesting architectural pattern with a lot of benefits, if adopted with pragmatism. For example, migrating the "monolith" codebase from Python 2.7 to Python 3.x could be a pain. Instead, splitting the project into small components (or services) and letting them communicate with each other can be a lot easier: divide et impera, folks! The foundation for splitting a project this way is an efficient way to manage service-to-service communication. It must be simple, fast, scalable and battle tested; the name for that kind of thing is an RPC system (Remote Procedure Call).&lt;/p&gt;

&lt;h2&gt;
  
  
  RPC
&lt;/h2&gt;

&lt;p&gt;Remote Procedure Call is quite an old idea: RPC systems have been implemented since the very first computer networks started to spread. RPC is normally based on a request/response pattern; there are many RPC systems around, often implemented in very different ways. Even so, the idea is always the same: a process &lt;em&gt;A&lt;/em&gt; makes a request to a process &lt;em&gt;B&lt;/em&gt;, which can send a response back to &lt;em&gt;A&lt;/em&gt;. Those processes can run on the same host or on different ones, assuming they are able to communicate with each other through the network. This is a simplified view but, from a logical standpoint, it solves our requirement. Of course, there is a lot more to take into consideration when choosing the right RPC system; specifically, it should be:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Resilient&lt;/li&gt;
&lt;li&gt;Performant&lt;/li&gt;
&lt;li&gt;Secure&lt;/li&gt;
&lt;li&gt;Language agnostic&lt;/li&gt;
&lt;/ul&gt;
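&lt;p&gt;The request/response idea described above can be sketched with nothing but the Python standard library. The following toy example is my own illustration (it uses stdlib &lt;code&gt;xmlrpc&lt;/code&gt;, not gRPC, and simulates the two processes with a background thread): a "process B" exposes a procedure, and a "process A" calls it over the network as if it were a local function.&lt;/p&gt;

```python
# A minimal request/response RPC sketch using only Python's standard
# library (xmlrpc, not gRPC). "Process B" is simulated here with a
# background thread serving the procedure; "process A" is the caller.
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer


def send_notification(destination, message):
    # Process B: the remote procedure implementation.
    return {"status": True, "echo": "%s to %s" % (message, destination)}


# Bind to port 0 so the OS picks a free port for us.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(send_notification)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Process A: the call looks like an ordinary local function call,
# but the request and response travel over the network.
client = ServerProxy("http://127.0.0.1:%d" % port)
result = client.send_notification("Fabrizio", "Hello!!!")
print(result["status"])  # True
```

&lt;p&gt;gRPC replaces this ad hoc setup with a typed contract and an efficient binary encoding, but the logical shape of the call is the same.&lt;/p&gt;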

&lt;p&gt;The last point is particularly important nowadays; I'm a strong opponent of the "silver bullet" approach, which often amounts to "if all you have is a hammer, everything looks like a nail". With a wide range of languages to choose from, you may discover that some components are better developed in JavaScript, others in Python and some others in Go. It's powerful! (and at the same time dangerous if abused) &lt;/p&gt;

&lt;h2&gt;
  
  
  Validate the architectural change
&lt;/h2&gt;

&lt;p&gt;It's a best practice to validate an architectural approach by creating (at least) a pilot project, a PoC if you prefer. At the same time, it's mandatory to clearly define the list of requirements to validate; in this case, they are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;It should be able to call services implemented in different Python versions (2.x and 3.x)&lt;/li&gt;
&lt;li&gt;It should be able to call services implemented in a different language, say JavaScript&lt;/li&gt;
&lt;li&gt;It should work in a container environment&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Normally it's better to keep the list quite short, validating specifically what we need. In this case the specific need, in plain English, is to check how gRPC works with different languages inside a container environment.&lt;/p&gt;

&lt;h3&gt;
  
  
  gRPC as service communication system
&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;gRPC is a modern, open source remote procedure call (RPC) framework that can run anywhere&lt;/em&gt;, that's what you can read in the &lt;a href="https://grpc.io/faq/"&gt;official site FAQ&lt;/a&gt;. It looks like exactly what we are looking for, so it's worth giving it a try.&lt;/p&gt;

&lt;p&gt;gRPC uses &lt;a href="https://developers.google.com/protocol-buffers/"&gt;Protocol Buffers&lt;/a&gt; as the mechanism to serialize data and define the service interfaces. Using a specific language to create the interface is quite a common approach; in RPC terms it's called an &lt;a href="https://en.wikipedia.org/wiki/Interface_description_language"&gt;IDL&lt;/a&gt;. Typically, the IDL is a custom description language, specifically tailored to design the interfaces used in service communications.&lt;br&gt;
Focusing on the project structure, if you use an IDL you need at least two things:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;One or more IDL sources, for the service interfaces&lt;/li&gt;
&lt;li&gt;A way to use (compile, or dynamically load) the IDL definitions in your code &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In simple words, the IDL is a contract shared between processes that need to communicate with each other, whether one-way or two-way. This is an important point for the project structure, because you need to decide how to keep the IDL sources shared among the projects using them.&lt;/p&gt;

&lt;h2&gt;
  
  
   Defining the interface
&lt;/h2&gt;

&lt;p&gt;Let's start with an example of the IDL interface we are going to use in the PoC.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight protobuf"&gt;&lt;code&gt;&lt;span class="na"&gt;syntax&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"proto3"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="s"&gt;"common.proto"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kn"&gt;package&lt;/span&gt; &lt;span class="nn"&gt;notificator&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;service&lt;/span&gt; &lt;span class="n"&gt;NotificatorService&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;rpc&lt;/span&gt; &lt;span class="n"&gt;SendNotification&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;NotificationPayload&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;returns&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Result&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kd"&gt;message&lt;/span&gt; &lt;span class="nc"&gt;NotificationPayload&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="na"&gt;destination&lt;/span&gt;  &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="kd"&gt;message&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;It could look scary at first glance, but in fact it's quite simple. The main point here is the service definition: what operations the service provides and how the data is structured. Translating the above IDL into plain English, we are defining a &lt;code&gt;NotificatorService&lt;/code&gt; exposing a single method called &lt;code&gt;SendNotification&lt;/code&gt;; that method expects a &lt;code&gt;NotificationPayload&lt;/code&gt; as input and responds with a &lt;code&gt;Result&lt;/code&gt; as output. &lt;code&gt;Result&lt;/code&gt; is defined in an external file, to test how IDL files can be organized by splitting the code.&lt;br&gt;
An important thing that immediately shows up is that there is extra work needed to create and maintain those files. This is a core aspect of gRPC: having a strict interface definition, a contract between services, is very important to keep control of the communication between services.&lt;/p&gt;
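&lt;p&gt;The article doesn't show &lt;code&gt;common.proto&lt;/code&gt; itself; since the server later responds with &lt;code&gt;Result(status=True)&lt;/code&gt;, a plausible definition might look like the following (an assumption of mine, not necessarily the repository's actual file).&lt;/p&gt;

```protobuf
// Hypothetical common.proto, consistent with the Result(status=True)
// usage later in the article; the real file may differ.
syntax = "proto3";

package notificator;

message Result {
    bool status = 1;
}
```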

&lt;p&gt;Lastly, IDL files can be loaded at runtime, or you can use the gRPC tools to statically generate code from them. There is no ideal solution; it mostly depends on the build and deploy infrastructure. In this project I used the latter approach.&lt;/p&gt;

&lt;h2&gt;
  
  
  Implementation
&lt;/h2&gt;

&lt;p&gt;It's time to start writing the code, but first it's mandatory to define a project structure. Since my preferred approach is to start as simply as I can, I created the project folders as shown below.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
├── client-2.x
├── protos
│   ├── common.proto
│   └── notification.proto
└── server-3.x
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Nothing special here: the two folders &lt;code&gt;client-2.x&lt;/code&gt; and &lt;code&gt;server-3.x&lt;/code&gt; contain the code of a hypothetical service and its consumer. I called them client and server to be clear about the roles, but keep in mind that in gRPC there is no role concept: it defines interfaces, and how services interact with each other isn't something it needs to know. The interesting folder is &lt;code&gt;protos&lt;/code&gt;; here I put the IDL sources with the interface definitions. The project consists of a service to send notifications (whether it's a push notification, an SMS or anything else). The service definition declares a method to send the notification, and the payload carries the message body and the destination address. Translated into Protocol Buffer IDL, this is equivalent to the IDL interface code in the previous paragraph.&lt;/p&gt;

&lt;p&gt;In Protocol Buffers, method parameters and return types always need to be defined as custom types; in other terms, you can't use primitive types like &lt;code&gt;string&lt;/code&gt; or &lt;code&gt;bool&lt;/code&gt; as they are, it's mandatory to wrap them in a custom type. &lt;br&gt;
In our case, &lt;code&gt;NotificationPayload&lt;/code&gt;'s definition is shown at the bottom, while &lt;code&gt;Result&lt;/code&gt; is imported from &lt;code&gt;common.proto&lt;/code&gt;. One caveat in proto file type definitions concerns the numbers assigned to each property (like &lt;code&gt;destination = 1&lt;/code&gt; or &lt;code&gt;message = 2&lt;/code&gt; in the sample above). Those numbers are related to how &lt;a href="https://developers.google.com/protocol-buffers/docs/encoding"&gt;Protocol Buffer encoding&lt;/a&gt; works. What's important to know is that they must be unique within the message definition and, most important, if they are changed the encoded data becomes incompatible with a client using the old numbering.&lt;/p&gt;
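&lt;p&gt;To see concretely why those numbers matter, here is a tiny stdlib-only sketch of how the wire key of a field is derived (assuming proto3 string fields, which use the length-delimited wire type 2, as described in the encoding documentation linked above).&lt;/p&gt;

```python
# Protocol Buffer field keys on the wire: each field is identified by a
# key derived from its number, never from its name. The key is the field
# number shifted left by three bits, combined with the wire type.
WIRE_TYPE_LEN = 2  # length-delimited: strings, bytes, nested messages


def field_key(field_number, wire_type=WIRE_TYPE_LEN):
    # field_number * 8 is the three-bit left shift; adding the wire
    # type fills the low three bits of the key.
    return field_number * 8 + wire_type


# 'destination = 1' and 'message = 2' from NotificationPayload:
print(hex(field_key(1)))  # 0xa
print(hex(field_key(2)))  # 0x12

# Renumbering 'message' to 3 changes its key to 0x1a: an old client
# still looking for key 0x12 would no longer find the field.
print(hex(field_key(3)))  # 0x1a
```

&lt;p&gt;The field name never appears on the wire; only the number does, which is exactly why renumbering breaks clients using the old numbering.&lt;/p&gt;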

&lt;p&gt;There are many other details about Protocol Buffer, they are well documented in the &lt;a href="https://developers.google.com/protocol-buffers/docs/proto3"&gt;official Protocol Buffer Documentation&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Installing dependencies
&lt;/h2&gt;

&lt;p&gt;Both projects, &lt;code&gt;client-2.x&lt;/code&gt; and &lt;code&gt;server-3.x&lt;/code&gt;, come with a &lt;code&gt;requirements.txt&lt;/code&gt; file. As the de facto standard, having this file makes it trivial to install all the project dependencies with &lt;code&gt;pip install -r requirements.txt&lt;/code&gt;.&lt;br&gt;
Looking inside the requirements file is interesting to see what the project needs; in particular, the two core packages are &lt;code&gt;grpcio&lt;/code&gt; and &lt;code&gt;grpcio-tools&lt;/code&gt;: the gRPC implementation and a tools package, the core packages needed to use gRPC.&lt;/p&gt;

&lt;h3&gt;
  
  
  Note about the Makefile(s)
&lt;/h3&gt;

&lt;p&gt;You'll notice some Makefiles in the project; that's not because I'm a nostalgic C/C++ developer :-). It's because Python lacks a standard way to define scripts, like Node.js does with &lt;code&gt;scripts&lt;/code&gt; in the &lt;code&gt;package.json&lt;/code&gt;. I find the &lt;code&gt;Makefile&lt;/code&gt; a good compromise, instead of creating custom shell scripts: the project dependencies can be installed with &lt;code&gt;make install&lt;/code&gt;, and simply typing &lt;code&gt;make&lt;/code&gt; lists all the available commands. Of course, &lt;code&gt;make&lt;/code&gt; must be present on the system; how to install it is out of scope and OS dependent, but there is a HUGE amount of documentation around about this.&lt;/p&gt;
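&lt;p&gt;As a sketch of the idea, a minimal Makefile in this spirit could look like the following (the target names and the self-listing trick are my assumptions, not necessarily the repository's actual files).&lt;/p&gt;

```makefile
# Hypothetical Makefile sketch: 'make' alone lists the targets,
# 'make install' and 'make build' run the actual commands.
.PHONY: help install build

help:            ## list the available commands
	@grep -E '^[a-z]+:' $(MAKEFILE_LIST)

install:         ## install the project dependencies
	pip install -r requirements.txt

build:           ## generate the gRPC code from the proto files
	python -m grpc_tools.protoc -I../protos --python_out=. --grpc_python_out=. ../protos/common.proto ../protos/notification.proto
```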

&lt;h2&gt;
  
  
  Calling a service
&lt;/h2&gt;

&lt;p&gt;All right so far, but how do we use the IDL to call a service via gRPC? As I wrote before, there are two ways to use the &lt;code&gt;proto&lt;/code&gt; files; in this project we generate the code from the IDL. We noticed before that besides the Python gRPC package there is another one called &lt;code&gt;grpc_tools&lt;/code&gt;. It's hard to guess, but it turns out to be a package providing tools for gRPC. One function it provides is code generation starting from the &lt;code&gt;proto&lt;/code&gt; files, and that's what we are going to use.&lt;br&gt;
Let's start with the &lt;code&gt;client-2.x&lt;/code&gt; project (it's exactly the same for &lt;code&gt;server-3.x&lt;/code&gt;): using the Makefile provided in the project, it's a matter of running &lt;code&gt;make build&lt;/code&gt;. Actually the Makefile runs the Python gRPC tools; looking inside one of the Makefiles provided in the client or the server, we can see how.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight shell"&gt;&lt;code&gt;python &lt;span class="nt"&gt;-m&lt;/span&gt; grpc_tools.protoc &lt;span class="nt"&gt;-I&lt;/span&gt;../protos &lt;span class="nt"&gt;--python_out&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nb"&gt;.&lt;/span&gt; &lt;span class="nt"&gt;--grpc_python_out&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nb"&gt;.&lt;/span&gt; ../protos/common.proto
python &lt;span class="nt"&gt;-m&lt;/span&gt; grpc_tools.protoc &lt;span class="nt"&gt;-I&lt;/span&gt;../protos &lt;span class="nt"&gt;--python_out&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nb"&gt;.&lt;/span&gt; &lt;span class="nt"&gt;--grpc_python_out&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nb"&gt;.&lt;/span&gt; ../protos/notification.proto
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Running the above commands will produce some new Python source files. These files are the Python translation of the service and the payloads defined in the &lt;code&gt;proto&lt;/code&gt; files. The thing to notice is that for each &lt;code&gt;proto&lt;/code&gt; file two files are created. By convention these files have the same name as the &lt;code&gt;proto&lt;/code&gt; plus a suffix: one is &lt;code&gt;_pb2.py&lt;/code&gt; and the other is &lt;code&gt;_pb2_grpc.py&lt;/code&gt;. Quite simply, the former is where the data structures are defined, like &lt;code&gt;NotificationPayload&lt;/code&gt;, while the latter is where the service stubs are.&lt;br&gt;
Let's start from the client: calling the &lt;code&gt;NotificatorService&lt;/code&gt; is as simple as the following code.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight python"&gt;&lt;code&gt;    &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;grpc&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;insecure_channel&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;'{0}:{1}'&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;GRPC_HOST&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;GRPC_PORT&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;stub&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;notification_pb2_grpc&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;NotificatorServiceStub&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;stub&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SendNotification&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                   &lt;span class="n"&gt;notification_pb2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;NotificationPayload&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;destination&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"Fabrizio"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"Hello!!!"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                   &lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;It's simple, isn't it? It's a matter of creating a gRPC channel, instantiating the stub and calling our &lt;code&gt;SendNotification&lt;/code&gt; on the stub as if it were defined somewhere in our project; if you are familiar with design patterns, it's a proxy. The &lt;code&gt;insecure_channel&lt;/code&gt; is there to set aside the overhead of security: gRPC addresses security seriously, but to keep the code readable I chose to bypass this part (anyway, it's well documented on the &lt;a href="https://grpc.io/docs/guides/auth.html"&gt;official site&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;One important note about the environment: I wrote that one requirement for the PoC is to test service communication between different Python versions. If you want to test the project without Docker (more information about it below), you need to use Python 2.7 for the client and Python 3.6 for the server, on the same machine. This can be done with &lt;code&gt;virtualenv&lt;/code&gt;; a quick introduction to it can be found &lt;a href="https://blog.dbrgn.ch/2012/9/18/virtualenv-quickstart/"&gt;here&lt;/a&gt;. Anyway, if you prefer the "let me see how it works as soon as possible" approach, read the "Running in Docker" paragraph below.&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating the service
&lt;/h2&gt;

&lt;p&gt;At this point we have almost everything: we defined the IDL and developed the client, but we are missing the main dish: the service! &lt;br&gt;
I left the service implementation until after the client on purpose: having already defined the IDL and the client, it should be clear what we need from it. The important point to focus on is that we need, somewhere in the code, the implementation of the service we want to make available through gRPC; below is our super-mega-cool &lt;code&gt;NotificatorService&lt;/code&gt;.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;NotificatorServiceServicer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;notification_pb2_grpc&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;NotificatorServiceServicer&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;SendNotification&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;  &lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;logging&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;debug&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"handling notification message '{request.message}' to {request.destination})  "&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;common_pb2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Result&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;status&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;It is immediately clear what we are implementing here: the interface defined in our IDL. The base class &lt;code&gt;notification_pb2_grpc.NotificatorServiceServicer&lt;/code&gt;, the payload and the result are the ones designed in the IDL.&lt;br&gt;
The implementation is trivial: we use &lt;code&gt;message&lt;/code&gt; and &lt;code&gt;destination&lt;/code&gt; coming from the request, which is a &lt;code&gt;NotificationPayload&lt;/code&gt;, to log a message, responding with a &lt;code&gt;Result&lt;/code&gt; wrapping a success status (&lt;code&gt;status=True&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;Defining the service is not enough to make it available to clients: we need a way to expose the service over the network, and four lines of code are all we need for that.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;server&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;grpc&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;server&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;futures&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ThreadPoolExecutor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;max_workers&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="n"&gt;notification_pb2_grpc&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;add_NotificatorServiceServicer_to_server&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;NotificatorServiceServicer&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;server&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;server&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;add_insecure_port&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"0.0.0.0:5001"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;server&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;start&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;In short, we create a gRPC server instance, bind our service to it, define the port on which to listen for requests and run the server. Under the hood a lot of stuff is happening, but for now let's content ourselves with this.&lt;/p&gt;

&lt;p&gt;At this point, running the server in a &lt;code&gt;virtualenv&lt;/code&gt; with Python 3.6 and the client in another one with Python 2.7, they should start calling each other. The full source code is available &lt;a href="https://github.com/guglielmino/grpc-python-poc"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  What about using other languages ?
&lt;/h2&gt;

&lt;p&gt;I didn't forget one of the most important points to check with the PoC: testing the interoperability with other languages. Now that we have gained a bit of confidence with gRPC and how it works, it's time to introduce a new client. This one uses JavaScript, working exactly the same way as the Python 2.x one. Of course, there are gRPC bindings for almost any language (C, C++, Java, C#, ...), but I chose JavaScript because nowadays it is one of the most widespread languages.&lt;br&gt;
In the previous project structure I lied: I omitted the JavaScript client. The real project structure is the one below.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

├── client-2.x
├── js-client     &amp;lt;&amp;lt;&amp;lt;=== You are here!!!
├── protos
│   ├── common.proto
│   └── notification.proto
└── server-3.x
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Obviously, the JavaScript client is intended to have the same behaviour as the Python one; if you are confident with the Node.js environment, you know that the first step is to install the dependencies (aka node modules).&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm intall
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;With all the modules in place, we need to generate the gRPC proxy code from the &lt;code&gt;proto&lt;/code&gt; files, as we did for the Python version. As usual in the Node.js environment, there is a script defined in &lt;code&gt;package.json&lt;/code&gt; for that.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm run build
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;That's a shortcut, but "under the hood" the command is quite similar to the one used for the Python client.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;grpc_tools_node_protoc --js_out=import_style=commonjs,binary:. --grpc_out=. --plugin=protoc-gen-grpc=node_modules/grpc-tools/bin/grpc_node_plugin -I ../protos/ common.proto &amp;amp;&amp;amp; grpc_tools_node_protoc --js_out=import_style=commonjs,binary:. --grpc_out=. --plugin=protoc-gen-grpc=node_modules/grpc-tools/bin/grpc_node_plugin -I ../protos/ notification.proto
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;In short, I used the &lt;code&gt;protoc&lt;/code&gt; (aka the Protocol Buffers IDL compiler) specific to Node.js. This command creates four files, in the same way as the &lt;code&gt;protoc&lt;/code&gt; invoked with Python above.&lt;/p&gt;

&lt;h2&gt;
  
  
  Running in Docker
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Containers
&lt;/h3&gt;

&lt;p&gt;If you followed all the article instructions, at this point you are able to run everything locally. But since one of my requirements was to test the project inside a container environment, the project contains Dockerfile(s) and a docker-compose definition. Again, the installation of Docker is out of scope (I feel like the inventor in the skydiver's hook joke (*) ).&lt;/p&gt;

&lt;h3&gt;
  
  
  Running locally with docker compose
&lt;/h3&gt;

&lt;p&gt;Assuming the Docker environment is configured on the machine, running the project is a matter of running &lt;code&gt;docker-compose up&lt;/code&gt; in the root folder. After a while the console will be flooded with messages from both the server and the client.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6a2uQYap--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/v81w0g2q3l3hluxoufn4.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6a2uQYap--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/v81w0g2q3l3hluxoufn4.gif" alt="running docker compose"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On each iteration, three messages are printed on the standard output.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight shell"&gt;&lt;code&gt;client_1  | DEBUG:root:Client: sending notification calling gRPC server
server_1  | DEBUG:root:handling notification message &lt;span class="s1"&gt;'Hello!!!'&lt;/span&gt; to Fabrizio&lt;span class="o"&gt;)&lt;/span&gt;  
client_1  | DEBUG:root:Client: notification sent
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;We have just scratched the tip of the iceberg: gRPC is quite complex and I have overlooked a lot of details. If at this point it is clear how gRPC can help in splitting architectures into components, I have achieved my main goal. The obvious suggestion is to dig deeper into the advanced topics, starting from the &lt;a href="https://grpc.io/"&gt;official site&lt;/a&gt;, and to try using it in some small projects.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;(*) The joke of the inventor of the skydiver's hook&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;An inventor went to the patent office saying: "I invented a hook to save the lives of skydivers and I want to patent it". &lt;/p&gt;

&lt;p&gt;&lt;em&gt;Employee&lt;/em&gt; said: "Well, tell me how it works"&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Inventor&lt;/em&gt;: "Simple, if the parachute doesn't open the skydiver can use the hook to save his life" &lt;/p&gt;

&lt;p&gt;&lt;em&gt;Employee&lt;/em&gt;: "Ok, fine but where the skydiver is supposed to hook?" &lt;/p&gt;

&lt;p&gt;&lt;em&gt;Inventor&lt;/em&gt;: "Hey, I can't just make up all the stuff by myself!"&lt;/p&gt;

</description>
      <category>python</category>
      <category>grpc</category>
      <category>node</category>
    </item>
  </channel>
</rss>
