<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: alabulei1</title>
    <description>The latest articles on DEV Community by alabulei1 (@alabulei1).</description>
    <link>https://dev.to/alabulei1</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F269983%2Fb02980dc-3e66-484d-9e47-d846404d415e.jpeg</url>
      <title>DEV Community: alabulei1</title>
      <link>https://dev.to/alabulei1</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/alabulei1"/>
    <language>en</language>
    <item>
      <title>Running JavaScript in WebAssembly with WasmEdge</title>
      <dc:creator>alabulei1</dc:creator>
      <pubDate>Thu, 23 Sep 2021 16:38:46 +0000</pubDate>
      <link>https://dev.to/alabulei1/running-javascript-in-webassembly-with-wasmedge-8ld</link>
      <guid>https://dev.to/alabulei1/running-javascript-in-webassembly-with-wasmedge-8ld</guid>
      <description>&lt;p&gt;&lt;a href="https://webassembly.org/" rel="noopener noreferrer"&gt;WebAssembly&lt;/a&gt; started as a “JavaScript alternative for browsers”. The idea is to run high-performance applications compiled from languages like C/C++ or Rust safely in browsers. In the browser, WebAssembly runs side by side with JavaScript.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiuwpx7ri4h7pljq78dj5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiuwpx7ri4h7pljq78dj5.png" alt="WebAssembly and JavaScript in the browser"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Figure 1. WebAssembly and JavaScript in the browser.&lt;/p&gt;

&lt;p&gt;As WebAssembly is increasingly used in the cloud, it is now a &lt;a href="https://github.com/WasmEdge/WasmEdge" rel="noopener noreferrer"&gt;universal runtime for cloud-native applications&lt;/a&gt;. Compared with Docker-like application containers, WebAssembly runtimes achieve higher performance with lower resource consumption. Common use cases for WebAssembly in the cloud include the following.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Runtime for &lt;a href="https://github.com/second-state/aws-lambda-wasm-runtime" rel="noopener noreferrer"&gt;serverless function-as-a-service&lt;/a&gt; (FaaS)&lt;/li&gt;
&lt;li&gt;Embedding &lt;a href="http://reactor.secondstate.info/en/docs/" rel="noopener noreferrer"&gt;user-defined functions into SaaS&lt;/a&gt; apps or databases&lt;/li&gt;
&lt;li&gt;Runtime for &lt;a href="https://github.com/second-state/dapr-wasm" rel="noopener noreferrer"&gt;sidecar applications&lt;/a&gt; in a service mesh&lt;/li&gt;
&lt;li&gt;Programmable plug-ins for web proxies&lt;/li&gt;
&lt;li&gt;Sandbox runtimes for edge devices including &lt;a href="https://www.secondstate.io/articles/second-state-joins-the-autoware-foundation/" rel="noopener noreferrer"&gt;software-defined vehicles&lt;/a&gt; and smart factories&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;However, in those cloud-native use cases, developers often want to use JavaScript to write business applications. That means we must now support &lt;a href="https://github.com/WasmEdge/WasmEdge/blob/master/docs/run_javascript.md" rel="noopener noreferrer"&gt;JavaScript in WebAssembly&lt;/a&gt;. Furthermore, we should support calling C/C++ or Rust functions from JavaScript in a WebAssembly runtime to take advantage of WebAssembly’s computational efficiency. The WasmEdge WebAssembly runtime allows you to do exactly that.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd53ekvc322ieagp6cw2h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd53ekvc322ieagp6cw2h.png" alt="WebAssembly and JavaScript in the cloud"&gt;&lt;/a&gt;&lt;br&gt;
Figure 2. WebAssembly and JavaScript in the cloud.&lt;/p&gt;
&lt;h2&gt;
  
  
  WasmEdge
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/WasmEdge/WasmEdge" rel="noopener noreferrer"&gt;WasmEdge&lt;/a&gt; is a leading cloud-native WebAssembly runtime &lt;a href="https://www.secondstate.io/articles/wasmedge-joins-cncf/" rel="noopener noreferrer"&gt;hosted by the CNCF&lt;/a&gt; (Cloud Native Computing Foundation) / Linux Foundation. It is the fastest WebAssembly runtime in the market today. WasmEdge supports all standard WebAssembly extensions as well as proprietary extensions for Tensorflow inference, KV store, and image processing, etc. Its compiler toolchain supports not only WebAssembly languages such as C/C++, Rust, Swift, Kotlin, and AssemblyScript but also &lt;a href="https://github.com/WasmEdge/WasmEdge/blob/master/docs/run_javascript.md" rel="noopener noreferrer"&gt;regular JavaScript&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;A WasmEdge application can be embedded into a &lt;a href="https://github.com/WasmEdge/WasmEdge/blob/master/docs/c_api_quick_start.md" rel="noopener noreferrer"&gt;C&lt;/a&gt; program, a &lt;a href="https://www.secondstate.io/articles/extend-golang-app-with-webassembly-rust/" rel="noopener noreferrer"&gt;Go&lt;/a&gt; program, a &lt;a href="https://github.com/WasmEdge/WasmEdge/tree/master/wasmedge-rs" rel="noopener noreferrer"&gt;Rust&lt;/a&gt; program, a &lt;a href="https://www.secondstate.io/articles/getting-started-with-rust-function/" rel="noopener noreferrer"&gt;JavaScript&lt;/a&gt; program, or the operating system’s &lt;a href="https://github.com/WasmEdge/WasmEdge/blob/master/docs/run.md" rel="noopener noreferrer"&gt;CLI&lt;/a&gt;. The runtime can be managed by Docker tools (e.g. &lt;a href="https://www.secondstate.io/articles/manage-webassembly-apps-in-wasmedge-using-docker-tools/" rel="noopener noreferrer"&gt;CRI-O&lt;/a&gt;), orchestration tools (e.g. K8s), serverless platforms (e.g. &lt;a href="https://www.secondstate.io/articles/vercel-wasmedge-webassembly-rust/" rel="noopener noreferrer"&gt;Vercel&lt;/a&gt;, &lt;a href="https://www.secondstate.io/articles/netlify-wasmedge-webassembly-rust-serverless/" rel="noopener noreferrer"&gt;Netlify&lt;/a&gt;, &lt;a href="https://www.cncf.io/blog/2021/08/25/webassembly-serverless-functions-in-aws-lambda/" rel="noopener noreferrer"&gt;AWS Lambda&lt;/a&gt;, &lt;a href="https://github.com/second-state/tencent-scf-wasm-runtime" rel="noopener noreferrer"&gt;Tencent SCF&lt;/a&gt;), and data streaming frameworks (e.g. &lt;a href="https://www.secondstate.io/articles/yomo-wasmedge-real-time-data-streams/" rel="noopener noreferrer"&gt;YoMo&lt;/a&gt; and Zenoh).&lt;/p&gt;

&lt;p&gt;Now, you can run JavaScript programs in WasmEdge-powered serverless functions, microservices, and AIoT applications! WasmEdge not only runs plain JavaScript programs but also allows developers to use Rust and C/C++ to create new JavaScript APIs within the safety sandbox of WebAssembly.&lt;/p&gt;
&lt;h2&gt;
  
  
  Building a JavaScript engine in WasmEdge
&lt;/h2&gt;

&lt;p&gt;First, let’s build a WebAssembly-based JavaScript interpreter program for WasmEdge. It is based on &lt;a href="https://bellard.org/quickjs/" rel="noopener noreferrer"&gt;QuickJS&lt;/a&gt; with WasmEdge extensions, such as &lt;a href="https://github.com/second-state/wasmedge_wasi_socket" rel="noopener noreferrer"&gt;network sockets&lt;/a&gt; and &lt;a href="https://www.secondstate.io/articles/wasi-tensorflow/" rel="noopener noreferrer"&gt;Tensorflow inference&lt;/a&gt;, incorporated into the interpreter as JavaScript APIs. You will need to &lt;a href="https://www.rust-lang.org/tools/install" rel="noopener noreferrer"&gt;install Rust&lt;/a&gt; to build the interpreter.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If you just want to use the interpreter to run JavaScript programs, you can skip this section. Make sure you have installed &lt;a href="https://www.rust-lang.org/tools/install" rel="noopener noreferrer"&gt;Rust &lt;/a&gt;and &lt;a href="https://github.com/WasmEdge/WasmEdge/blob/master/docs/install.md" rel="noopener noreferrer"&gt;WasmEdge&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Fork or clone &lt;a href="https://github.com/second-state/wasmedge-quickjs" rel="noopener noreferrer"&gt;the wasmedge-quickjs GitHub repository&lt;/a&gt; to get started.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ git clone https://github.com/second-state/wasmedge-quickjs
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Following the instructions from that repo, you will be able to build a JavaScript interpreter for WasmEdge.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ rustup target add wasm32-wasi
$ cargo build --target wasm32-wasi --release
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The WebAssembly-based JavaScript interpreter program is located in the build target directory. You can now try a simple “hello world” JavaScript program (&lt;a href="https://github.com/second-state/wasmedge-quickjs/blob/main/example_js/hello.js" rel="noopener noreferrer"&gt;example_js/hello.js&lt;/a&gt;), which prints out the command line arguments to the console.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;args = args.slice(1)
print("Hello", ...args)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run the &lt;code&gt;hello.js&lt;/code&gt; file in WasmEdge’s QuickJS runtime as follows. Note that the &lt;code&gt;--dir .:.&lt;/code&gt; flag gives &lt;code&gt;wasmedge&lt;/code&gt; permission to read the local directory in the file system for the &lt;code&gt;hello.js&lt;/code&gt; file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ wasmedge --dir .:. target/wasm32-wasi/release/quickjs-rs-wasi.wasm example_js/hello.js WasmEdge Runtime
Hello WasmEdge Runtime
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, let’s try a few more advanced JavaScript programs.&lt;/p&gt;

&lt;h2&gt;
  
  
  A JavaScript networking example
&lt;/h2&gt;

&lt;p&gt;The interpreter supports the WasmEdge networking socket extension so that your JavaScript can make HTTP connections to the Internet. Here is an example in JavaScript.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let r = GET("http://18.235.124.214/get?a=123",{"a":"b","c":[1,2,3]})
print(r.status)

let headers = r.headers
print(JSON.stringify(headers))let body = r.body;
let body_str = new Uint8Array(body)
print(String.fromCharCode.apply(null,body_str))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To run the JavaScript in the WasmEdge runtime, you can do this on the CLI.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ wasmedge --dir .:. target/wasm32-wasi/release/quickjs-rs-wasi.wasm example_js/http_demo.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should now see the HTTP GET result printed on the console.&lt;/p&gt;
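&lt;p&gt;The &lt;code&gt;body&lt;/code&gt; field in the response is a raw byte buffer rather than a string, which is why the example above decodes it with &lt;code&gt;Uint8Array&lt;/code&gt; and &lt;code&gt;String.fromCharCode&lt;/code&gt;. The decoding step can be sketched in standard JavaScript; the byte array below is a hypothetical stand-in for &lt;code&gt;r.body&lt;/code&gt;, since &lt;code&gt;GET&lt;/code&gt; itself is a WasmEdge extension and is not available outside the interpreter.&lt;/p&gt;

```javascript
// Hypothetical stand-in for the raw bytes in r.body; these bytes
// spell out the JSON text '{"a":"123"}'.
let body = [123, 34, 97, 34, 58, 34, 49, 50, 51, 34, 125];

// Decode the bytes into a string, then parse the JSON payload.
let body_str = String.fromCharCode.apply(null, new Uint8Array(body));
let parsed = JSON.parse(body_str);

console.log(body_str);  // {"a":"123"}
console.log(parsed.a);  // 123
```

&lt;p&gt;In the WasmEdge interpreter, the same two decoding lines work unchanged on the real &lt;code&gt;r.body&lt;/code&gt; value.&lt;/p&gt;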

&lt;h2&gt;
  
  
  A JavaScript Tensorflow inference example
&lt;/h2&gt;

&lt;p&gt;The interpreter supports the WasmEdge TensorFlow Lite inference extension so that your JavaScript can run an ImageNet model for image classification. Here is an example in JavaScript.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import {TensorflowLiteSession} from 'tensorflow_lite'
import {Image} from 'image'let img = new Image('./example_js/tensorflow_lite_demo/food.jpg')
let img_rgb = img.to_rgb().resize(192,192)
let rgb_pix = img_rgb.pixels()let session = new TensorflowLiteSession('./example_js/tensorflow_lite_demo/lite-model_aiy_vision_classifier_food_V1_1.tflite')
session.add_input('input',rgb_pix)
session.run()
let output = session.get_output('MobilenetV1/Predictions/Softmax');
let output_view = new Uint8Array(output)
let max = 0;
let max_idx = 0;
for (var i in output_view){
    let v = output_view[i]
    if(v&amp;gt;max){
        max = v;
        max_idx = i;
    }
}
print(max,max_idx)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To run the JavaScript in the WasmEdge runtime, do the following on the CLI to rebuild the QuickJS engine with Tensorflow support and then run the JavaScript program with the Tensorflow API.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ cargo build --target wasm32-wasi --release --features=tensorflow
... ...
$ wasmedge-tensorflow-lite --dir .:. target/wasm32-wasi/release/quickjs-rs-wasi.wasm example_js/tensorflow_lite_demo/main.js
label:
Hot dog
confidence:
0.8941176470588236
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The &lt;code&gt;--features=tensorflow&lt;/code&gt; compiler flag builds a version of the QuickJS engine with WasmEdge Tensorflow extensions.&lt;/li&gt;
&lt;li&gt;The &lt;code&gt;wasmedge-tensorflow-lite&lt;/code&gt; program is part of the WasmEdge package. It is the WasmEdge runtime with the Tensorflow extension built in.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You should now see the name of the food item recognized by the TensorFlow Lite ImageNet model.&lt;/p&gt;
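&lt;p&gt;The quantized TensorFlow Lite model emits one &lt;code&gt;uint8&lt;/code&gt; score per label, so the confidence printed above is simply the winning score divided by 255 (228/255 ≈ 0.894). The lookup from output tensor to label can be sketched in standard JavaScript; the &lt;code&gt;labels&lt;/code&gt; and &lt;code&gt;scores&lt;/code&gt; values here are hypothetical stand-ins for the model’s label file and output tensor.&lt;/p&gt;

```javascript
// Hypothetical stand-ins for the model's label file and output tensor.
let labels = ["Hamburger", "Hot dog", "Pizza"];
let scores = new Uint8Array([12, 228, 15]);

// Find the index of the highest score, as in the interpreter example above.
let max = 0;
let max_idx = 0;
scores.forEach((v, i) => {
  if (v > max) { max = v; max_idx = i; }
});

// A quantized uint8 score maps to a confidence in [0, 1] by dividing by 255.
let confidence = max / 255;
console.log(labels[max_idx], confidence);  // prints "Hot dog" and ~0.894
```

&lt;p&gt;The same index is also used to skip through the label file line by line, which is exactly what the &lt;code&gt;--features=tensorflow&lt;/code&gt; build does internally.&lt;/p&gt;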

&lt;h2&gt;
  
  
  Make it faster
&lt;/h2&gt;

&lt;p&gt;The above Tensorflow inference example takes 1–2 seconds to run. That is acceptable in web application scenarios but could be improved. Recall that WasmEdge is the fastest WebAssembly runtime today due to its ahead-of-time (AOT) compiler optimization. WasmEdge provides a &lt;code&gt;wasmedgec&lt;/code&gt; utility to compile the &lt;code&gt;wasm&lt;/code&gt; file into a native &lt;code&gt;so&lt;/code&gt; shared library. You can then use &lt;code&gt;wasmedge&lt;/code&gt; to run the &lt;code&gt;so&lt;/code&gt; file instead of the &lt;code&gt;wasm&lt;/code&gt; file for much faster performance.&lt;/p&gt;

&lt;p&gt;The following example uses the extended versions of &lt;code&gt;wasmedge&lt;/code&gt; and &lt;code&gt;wasmedgec&lt;/code&gt; that support the WasmEdge Tensorflow extension.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ wasmedgec-tensorflow target/wasm32-wasi/release/quickjs-rs-wasi.wasm quickjs-rs-wasi.so
$ wasmedge-tensorflow-lite --dir .:. quickjs-rs-wasi.so example_js/tensorflow_lite_demo/main.js
label:
Hot dog
confidence:
0.8941176470588236
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can see that the image classification task now completes in under 0.1s. That is at least a 10x improvement!&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The &lt;code&gt;so&lt;/code&gt; shared library is not portable across machines and OSes. You should run &lt;code&gt;wasmedgec&lt;/code&gt; and &lt;code&gt;wasmedgec-tensorflow&lt;/code&gt; on the same machine where you deploy and run the application.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  A note on QuickJS
&lt;/h2&gt;

&lt;p&gt;Now, the choice of QuickJS as our JavaScript engine might raise the question of performance. Isn’t QuickJS &lt;a href="https://bellard.org/quickjs/bench.html" rel="noopener noreferrer"&gt;a lot slower&lt;/a&gt; than v8 due to a lack of JIT support? Yes, but …&lt;/p&gt;

&lt;p&gt;First of all, QuickJS is a lot smaller than v8. In fact, it only takes 1/40 (or 2.5%) of the runtime resources v8 consumes. You can run a lot more QuickJS functions than v8 functions on a single physical machine.&lt;/p&gt;

&lt;p&gt;Second, for most business logic applications, raw performance is not critical. The application may still have computationally intensive tasks, such as AI inference on the fly. WasmEdge allows QuickJS applications to drop down to high-performance WebAssembly for these tasks, while it is not nearly as easy to add such extension modules to v8.&lt;/p&gt;

&lt;p&gt;Third, it is known that &lt;a href="https://www.theregister.com/2021/08/06/edge_super_duper_security_mode/" rel="noopener noreferrer"&gt;many JavaScript security issues arise from JIT&lt;/a&gt;. Maybe turning off JIT in the cloud-native environment is not such a bad idea!&lt;/p&gt;

&lt;h2&gt;
  
  
  What’s next?
&lt;/h2&gt;

&lt;p&gt;The examples demonstrate how to use the &lt;code&gt;quickjs-rs-wasi.wasm&lt;/code&gt; JavaScript engine in WasmEdge. Besides using the CLI, you could use &lt;a href="https://www.secondstate.io/articles/manage-webassembly-apps-in-wasmedge-using-docker-tools/" rel="noopener noreferrer"&gt;Docker / Kubernetes tools&lt;/a&gt; to start the WebAssembly application or to embed the application into your own applications or frameworks as we discussed earlier in this article.&lt;/p&gt;

&lt;p&gt;In the next two articles, I will focus on using JavaScript together with Rust to make the most out of both languages.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How to incorporate simple JavaScript snippets into a high-performance Rust app in WasmEdge.&lt;/li&gt;
&lt;li&gt;How to make a high-performance native function available as a JavaScript API in WasmEdge.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;JavaScript in cloud-native WebAssembly is still an emerging area in the next generation of cloud and edge computing infrastructure. We are just getting started! If you are interested, join us in the &lt;a href="https://github.com/WasmEdge/WasmEdge" rel="noopener noreferrer"&gt;WasmEdge&lt;/a&gt; project (or tell us what you want by raising feature request issues).&lt;/p&gt;

</description>
      <category>webassembly</category>
      <category>javascript</category>
      <category>rust</category>
    </item>
    <item>
      <title>Rust and WebAssembly Serverless functions in Vercel</title>
      <dc:creator>alabulei1</dc:creator>
      <pubDate>Mon, 23 Aug 2021 15:07:12 +0000</pubDate>
      <link>https://dev.to/alabulei1/rust-and-webassembly-serverless-functions-in-vercel-47e4</link>
      <guid>https://dev.to/alabulei1/rust-and-webassembly-serverless-functions-in-vercel-47e4</guid>
      <description>&lt;p&gt;&lt;a href="https://vercel.com/" rel="noopener noreferrer"&gt;Vercel&lt;/a&gt; is a leading platform for developing and hosting &lt;a href="https://jamstack.org/" rel="noopener noreferrer"&gt;Jamstack&lt;/a&gt; applications. Unlike traditional web apps, where the UI is dynamically generated at runtime from a server, a Jamstack application consists of a static UI (in HTML and JavaScript) and a set of serverless functions to support dynamic UI elements via JavaScript.&lt;/p&gt;

&lt;p&gt;There are many benefits to the Jamstack approach. But perhaps one of the most significant benefits is performance. Since the UI is no longer generated at runtime from a central server, there is much less load on the server and we can now deploy the UI via edge networks such as CDNs.&lt;/p&gt;

&lt;p&gt;However, the edge CDN only solves the problem of distributing the static UI files. The backend serverless functions could still be slow. In fact, popular serverless platforms have well-known performance issues, such as slow cold start, especially for interactive applications. That's where WebAssembly could help.&lt;/p&gt;

&lt;p&gt;With &lt;a href="https://github.com/WasmEdge/WasmEdge" rel="noopener noreferrer"&gt;WasmEdge&lt;/a&gt;, a cloud-native WebAssembly runtime &lt;a href="https://www.secondstate.io/articles/wasmedge-joins-cncf/" rel="noopener noreferrer"&gt;hosted by the CNCF&lt;/a&gt;, developers can write high-performance serverless functions deployed on the public cloud or on edge computing nodes. In this article, we will explore how to use WasmEdge functions, written in Rust, to power a Vercel application backend.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why WebAssembly in Vercel Serverless?
&lt;/h2&gt;

&lt;p&gt;The Vercel platform already has a very easy-to-use &lt;a href="https://vercel.com/docs/runtimes" rel="noopener noreferrer"&gt;serverless framework&lt;/a&gt; for deploying functions hosted in Vercel. As we discussed above, the reason to use WebAssembly, and WasmEdge, is to &lt;strong&gt;further improve performance&lt;/strong&gt;. High-performance functions written in C/C++, Rust, and Swift can be easily compiled into WebAssembly. Those WebAssembly functions are much faster than the JavaScript or Python code commonly used in serverless functions.&lt;/p&gt;

&lt;p&gt;However, if raw performance is the only goal, why not just compile those functions to machine-native executables (Native Client, or NaCl)? Vercel already runs these functions safely in application containers like Docker or Firecracker.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Our vision for the future is to run WebAssembly as an alternative lightweight runtime side by side with Docker and other containers in cloud-native infrastructure. WebAssembly offers much higher performance and consumes far fewer resources than Docker-like containers or Firecracker-like microVMs. But for now, Vercel only supports running WebAssembly inside other containers.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Running WebAssembly functions inside Docker-like containers offers advantages over running NaCl programs directly in Docker. &lt;/p&gt;

&lt;p&gt;For starters, WebAssembly provides fine-grained runtime &lt;strong&gt;isolation for individual functions&lt;/strong&gt;. A microservice could have multiple functions and support services running inside a Docker-like container. WebAssembly can make the microservice more secure and more stable. &lt;/p&gt;

&lt;p&gt;Second, the WebAssembly bytecode is &lt;strong&gt;portable&lt;/strong&gt;. Developers only need to build it once and do not need to worry about changes or updates to the underlying Vercel serverless container (OS and hardware). It also allows developers to reuse the same WebAssembly functions in other cloud environments.&lt;/p&gt;

&lt;p&gt;Third, WebAssembly apps are easy to deploy and manage. They have far fewer platform dependencies and complexities compared with NaCl dynamic libraries and executables. &lt;/p&gt;

&lt;p&gt;Finally, the &lt;a href="https://www.secondstate.io/articles/wasi-tensorflow/" rel="noopener noreferrer"&gt;WasmEdge Tensorflow API&lt;/a&gt; provides the &lt;strong&gt;most ergonomic way&lt;/strong&gt; to execute Tensorflow models in the Rust programming language. WasmEdge installs the correct combination of Tensorflow dependency libraries, and provides a unified API for developers.&lt;/p&gt;

&lt;p&gt;Enough with the concepts and explanations. Without further ado, let's jump into the example apps!&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisite
&lt;/h2&gt;

&lt;p&gt;Since our demo WebAssembly functions are written in Rust, you will need a &lt;a href="https://www.rust-lang.org/tools/install" rel="noopener noreferrer"&gt;Rust compiler&lt;/a&gt;. Make sure that you install the &lt;code&gt;wasm32-wasi&lt;/code&gt; compiler target as follows, in order to generate WebAssembly bytecode.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ rustup target add wasm32-wasi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The demo application front end is written in &lt;a href="https://nextjs.org/" rel="noopener noreferrer"&gt;Next.js&lt;/a&gt;, and deployed on Vercel. We will assume that you already have the basic knowledge of how to work with Vercel.&lt;/p&gt;

&lt;h2&gt;
  
  
  Example 1: Image processing
&lt;/h2&gt;

&lt;p&gt;Our first demo application allows users to upload an image and then invoke a serverless function to turn it into black and white. A &lt;a href="https://vercel-wasm-runtime.vercel.app/" rel="noopener noreferrer"&gt;live demo&lt;/a&gt; deployed on Vercel is available.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffupa9o4kmriyx1zeg5z8.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffupa9o4kmriyx1zeg5z8.gif" alt="vercel-webassembly"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Fork the &lt;a href="https://github.com/second-state/vercel-wasm-runtime" rel="noopener noreferrer"&gt;demo application’s GitHub repo&lt;/a&gt; to get started. To deploy the application on Vercel, just &lt;a href="https://vercel.com/docs/git#deploying-a-git-repository" rel="noopener noreferrer"&gt;import the Github repo&lt;/a&gt; from &lt;a href="https://vercel.com/docs/git/vercel-for-github" rel="noopener noreferrer"&gt;Vercel for Github&lt;/a&gt; web page.&lt;/p&gt;

&lt;p&gt;This repo is a standard Next.js application for the Vercel platform. The backend serverless function is in the &lt;a href="https://github.com/second-state/vercel-wasm-runtime/tree/main/api/functions/image-grayscale" rel="noopener noreferrer"&gt;&lt;code&gt;api/functions/image-grayscale&lt;/code&gt;&lt;/a&gt; folder. The &lt;a href="https://github.com/second-state/vercel-wasm-runtime/blob/main/api/functions/image-grayscale/src/main.rs" rel="noopener noreferrer"&gt;&lt;code&gt;src/main.rs&lt;/code&gt;&lt;/a&gt; file contains the Rust program’s source code. The Rust program reads image data from &lt;code&gt;STDIN&lt;/code&gt; and then writes the grayscale image to &lt;code&gt;STDOUT&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;use hex;
use std::io::{self, Read};
use image::{ImageOutputFormat, ImageFormat};

fn main() {
  let mut buf = Vec::new();
  io::stdin().read_to_end(&amp;amp;mut buf).unwrap();

  let image_format_detected: ImageFormat = image::guess_format(&amp;amp;buf).unwrap();
  let img = image::load_from_memory(&amp;amp;buf).unwrap();
  let filtered = img.grayscale();
  let mut buf = vec![];
  match image_format_detected {
    ImageFormat::Gif =&amp;gt; {
        filtered.write_to(&amp;amp;mut buf, ImageOutputFormat::Gif).unwrap();
    },
    _ =&amp;gt; {
        filtered.write_to(&amp;amp;mut buf, ImageOutputFormat::Png).unwrap();
    },
  };
  io::stdout().write_all(&amp;amp;buf).unwrap();
  io::stdout().flush().unwrap();
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can use Rust’s &lt;code&gt;cargo&lt;/code&gt; tool to build the Rust program into WebAssembly bytecode or native code.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ cd api/functions/image-grayscale/
$ cargo build --release --target wasm32-wasi 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy the build artifacts to the &lt;code&gt;api&lt;/code&gt; folder.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ cp target/wasm32-wasi/release/grayscale.wasm ../../
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Vercel runs &lt;a href="https://github.com/second-state/vercel-wasm-runtime/blob/main/api/pre.sh" rel="noopener noreferrer"&gt;&lt;code&gt;api/pre.sh&lt;/code&gt;&lt;/a&gt; upon setting up the serverless environment. It installs the WasmEdge runtime, and then compiles each WebAssembly bytecode program into a native &lt;code&gt;so&lt;/code&gt; library for faster execution. &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The &lt;a href="https://github.com/second-state/vercel-wasm-runtime/blob/main/api/hello.js" rel="noopener noreferrer"&gt;&lt;code&gt;api/hello.js&lt;/code&gt;&lt;/a&gt; file conforms Vercel serverless specification. It loads the WasmEdge runtime, starts the compiled WebAssembly program in WasmEdge, and passes the uploaded image data via &lt;code&gt;STDIN&lt;/code&gt;. Notice &lt;a href="https://github.com/second-state/vercel-wasm-runtime/blob/main/api/hello.js" rel="noopener noreferrer"&gt;&lt;code&gt;api/hello.js&lt;/code&gt;&lt;/a&gt; runs the compiled &lt;code&gt;grayscale.so&lt;/code&gt; file generated by &lt;a href="https://github.com/second-state/vercel-wasm-runtime/blob/main/api/pre.sh" rel="noopener noreferrer"&gt;&lt;code&gt;api/pre.sh&lt;/code&gt;&lt;/a&gt; for better performance.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const fs = require('fs');
const { spawn } = require('child_process');
const path = require('path');

module.exports = (req, res) =&amp;gt; {
  const wasmedge = spawn(
      path.join(__dirname, 'wasmedge'), 
      [path.join(__dirname, 'grayscale.so')]);

  let d = [];
  wasmedge.stdout.on('data', (data) =&amp;gt; {
    d.push(data);
  });

  wasmedge.on('close', (code) =&amp;gt; {
    let buf = Buffer.concat(d);

    res.setHeader('Content-Type', req.headers['image-type']);
    res.send(buf);
  });

  wasmedge.stdin.write(req.body);
  wasmedge.stdin.end('');
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's it. &lt;a href="https://vercel.com/docs/git#deploying-a-git-repository" rel="noopener noreferrer"&gt;Deploy the repo to Vercel&lt;/a&gt; and you now have a Vercel Jamstack app with a high-performance Rust and WebAssembly based serverless backend. &lt;/p&gt;
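&lt;p&gt;On the front end, the Next.js page only needs to POST the raw image bytes to the function endpoint and pass the original content type in the &lt;code&gt;image-type&lt;/code&gt; header that &lt;code&gt;api/hello.js&lt;/code&gt; echoes back as the response &lt;code&gt;Content-Type&lt;/code&gt;. A minimal sketch of building that request follows; the helper name and endpoint path are illustrative, not taken from the repo.&lt;/p&gt;

```javascript
// Build the fetch() options for posting raw image bytes to the serverless
// function. The image-type header is read by api/hello.js and echoed back
// as the response Content-Type. buildGrayscaleRequest is a hypothetical name.
function buildGrayscaleRequest(imageBytes, imageType) {
  return {
    method: "POST",
    headers: { "image-type": imageType },
    body: imageBytes,
  };
}

// PNG magic bytes as a tiny stand-in for a real uploaded image.
const req = buildGrayscaleRequest(new Uint8Array([0x89, 0x50, 0x4e, 0x47]), "image/png");
console.log(req.method, req.headers["image-type"]);  // POST image/png
```

&lt;p&gt;In the browser, the options object would be passed to something like &lt;code&gt;fetch('/api/hello', req)&lt;/code&gt; and the grayscale image read back from the response body.&lt;/p&gt;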

&lt;h2&gt;
  
  
  Example 2: AI inference
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://vercel-wasm-runtime-cozpr5z84-wangshishuo1.vercel.app/" rel="noopener noreferrer"&gt;second demo&lt;/a&gt; application allows users to upload an image and then invoke a serverless function to classify the main subject on the image.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F21gee7zfif3j7cxtvcme.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F21gee7zfif3j7cxtvcme.gif" alt="Vercel WebAssembly"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It is in &lt;a href="https://github.com/second-state/vercel-wasm-runtime" rel="noopener noreferrer"&gt;the same GitHub repo&lt;/a&gt; as the previous example but in the &lt;code&gt;tensorflow&lt;/code&gt; branch. Note: when you &lt;a href="https://vercel.com/docs/git#deploying-a-git-repository" rel="noopener noreferrer"&gt;import this GitHub repo&lt;/a&gt; on the Vercel website, it will create a &lt;a href="https://vercel.com/docs/platform/deployments#preview" rel="noopener noreferrer"&gt;preview URL&lt;/a&gt; for each branch. The &lt;code&gt;tensorflow&lt;/code&gt; branch would have its own deployment URL.&lt;/p&gt;

&lt;p&gt;The backend serverless function for image classification is in the &lt;a href="https://github.com/second-state/vercel-wasm-runtime/tree/tensorflow/api/functions/image-classification" rel="noopener noreferrer"&gt;&lt;code&gt;api/functions/image-classification&lt;/code&gt;&lt;/a&gt; folder in the &lt;code&gt;tensorflow&lt;/code&gt; branch. The &lt;a href="https://github.com/second-state/vercel-wasm-runtime/blob/tensorflow/api/functions/image-classification/src/main.rs" rel="noopener noreferrer"&gt;&lt;code&gt;src/main.rs&lt;/code&gt;&lt;/a&gt; file contains the Rust program’s source code. The Rust program reads image data from &lt;code&gt;STDIN&lt;/code&gt; and then writes its text output to &lt;code&gt;STDOUT&lt;/code&gt;. It utilizes the WasmEdge Tensorflow API to run the AI inference.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pub fn main() {
    // Step 1: Load the TFLite model
    let model_data: &amp;amp;[u8] = include_bytes!("models/mobilenet_v1_1.0_224/mobilenet_v1_1.0_224_quant.tflite");
    let labels = include_str!("models/mobilenet_v1_1.0_224/labels_mobilenet_quant_v1_224.txt");

    // Step 2: Read image from STDIN
    let mut buf = Vec::new();
    io::stdin().read_to_end(&amp;amp;mut buf).unwrap();

    // Step 3: Resize the input image for the tensorflow model
    let flat_img = wasmedge_tensorflow_interface::load_jpg_image_to_rgb8(&amp;amp;buf, 224, 224);

    // Step 4: AI inference
    let mut session = wasmedge_tensorflow_interface::Session::new(&amp;amp;model_data, wasmedge_tensorflow_interface::ModelType::TensorFlowLite);
    session.add_input("input", &amp;amp;flat_img, &amp;amp;[1, 224, 224, 3])
           .run();
    let res_vec: Vec&amp;lt;u8&amp;gt; = session.get_output("MobilenetV1/Predictions/Reshape_1");

    // Step 5: Find the food label that responds to the highest probability in res_vec
    // ... ...
    let mut label_lines = labels.lines();
    for _i in 0..max_index {
      label_lines.next();
    }

    // Step 6: Generate the output text
    let class_name = label_lines.next().unwrap().to_string();
    if max_value &amp;gt; 50 {
      println!("It {} a &amp;lt;a href='https://www.google.com/search?q={}'&amp;gt;{}&amp;lt;/a&amp;gt; in the picture", confidence.to_string(), class_name, class_name);
    } else {
      println!("There does not appear to be any food item in the picture.");
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
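&lt;p&gt;The elided Step 5 scans &lt;code&gt;res_vec&lt;/code&gt; for the entry with the highest probability score. A minimal sketch of that scan (the &lt;code&gt;argmax&lt;/code&gt; helper name and the toy vector are illustrative, not from the original source):&lt;/p&gt;

```rust
// Hypothetical sketch of the elided Step 5: find the index and value of the
// highest probability score in the model's quantized output vector.
// Each element of `res_vec` is a u8 score for one label.
fn argmax(res_vec: &[u8]) -> (usize, u8) {
    let mut max_index = 0;
    let mut max_value = 0u8;
    for (i, &v) in res_vec.iter().enumerate() {
        if v > max_value {
            max_value = v;
            max_index = i;
        }
    }
    (max_index, max_value)
}

fn main() {
    // Toy output vector: label 2 has the highest score.
    let res_vec: Vec<u8> = vec![3, 10, 250, 7];
    let (max_index, max_value) = argmax(&res_vec);
    println!("max_index={}, max_value={}", max_index, max_value);
}
```

&lt;p&gt;The &lt;code&gt;max_index&lt;/code&gt; is then used to skip ahead in the label file, and &lt;code&gt;max_value&lt;/code&gt; serves as the confidence threshold in Step 6.&lt;/p&gt;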



&lt;p&gt;You can use the &lt;code&gt;cargo&lt;/code&gt; tool to build the Rust program into WebAssembly bytecode or native code.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ cd api/functions/image-classification/
$ cargo build --release --target wasm32-wasi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy the build artifacts to the &lt;code&gt;api&lt;/code&gt; folder.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ cp target/wasm32-wasi/release/classify.wasm ../../
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Again, the &lt;a href="https://github.com/second-state/vercel-wasm-runtime/blob/tensorflow/api/pre.sh" rel="noopener noreferrer"&gt;&lt;code&gt;api/pre.sh&lt;/code&gt;&lt;/a&gt; script installs the WasmEdge runtime and its TensorFlow dependencies in this application. It also compiles the &lt;code&gt;classify.wasm&lt;/code&gt; bytecode program to the &lt;code&gt;classify.so&lt;/code&gt; native shared library at deployment time.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://github.com/second-state/vercel-wasm-runtime/blob/tensorflow/api/hello.js" rel="noopener noreferrer"&gt;&lt;code&gt;api/hello.js&lt;/code&gt;&lt;/a&gt; file conforms Vercel serverless specification. It loads the WasmEdge runtime, starts the compiled WebAssembly program in WasmEdge, and passes the uploaded image data via &lt;code&gt;STDIN&lt;/code&gt;. Notice &lt;a href="https://github.com/second-state/vercel-wasm-runtime/blob/tensorflow/api/hello.js" rel="noopener noreferrer"&gt;&lt;code&gt;api/hello.js&lt;/code&gt;&lt;/a&gt; runs the compiled &lt;code&gt;classify.so&lt;/code&gt; file generated by &lt;a href="https://github.com/second-state/vercel-wasm-runtime/blob/tensorflow/api/pre.sh" rel="noopener noreferrer"&gt;&lt;code&gt;api/pre.sh&lt;/code&gt;&lt;/a&gt; for better performance.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const fs = require('fs');
const { spawn } = require('child_process');
const path = require('path');

module.exports = (req, res) =&amp;gt; {
  const wasmedge = spawn(
    path.join(__dirname, 'wasmedge-tensorflow-lite'),
    [path.join(__dirname, 'classify.so')],
    {env: {'LD_LIBRARY_PATH': __dirname}}
  );

  let d = [];
  wasmedge.stdout.on('data', (data) =&amp;gt; {
    d.push(data);
  });

  wasmedge.on('close', (code) =&amp;gt; {
    res.setHeader('Content-Type', `text/plain`);
    res.send(d.join(''));
  });

  wasmedge.stdin.write(req.body);
  wasmedge.stdin.end('');
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can now &lt;a href="https://vercel.com/docs/git#deploying-a-git-repository" rel="noopener noreferrer"&gt;deploy your forked repo to Vercel&lt;/a&gt; and have a web app for subject classification.&lt;/p&gt;

&lt;h2&gt;
  
  
  What's next
&lt;/h2&gt;

&lt;p&gt;Running WasmEdge from Vercel’s current serverless container is an easy way to add high-performance functions to Vercel applications. If you have created an interesting Vercel function using WasmEdge, please &lt;a href="https://forms.gle/ozSNLz1MCJTNj9n18" rel="noopener noreferrer"&gt;let us know and we will send you some WasmEdge swag&lt;/a&gt;!&lt;/p&gt;

&lt;p&gt;Going forward, an even better approach is to use WasmEdge as the container itself. With no Docker and no Node.js needed to bootstrap WasmEdge, serverless functions could run with much higher efficiency. WasmEdge is &lt;a href="https://www.secondstate.io/articles/manage-webassembly-apps-in-wasmedge-using-docker-tools/" rel="noopener noreferrer"&gt;already compatible with Docker tools&lt;/a&gt;. If you are interested in joining WasmEdge and CNCF for this exciting work, &lt;a href="https://github.com/WasmEdge/WasmEdge#contact" rel="noopener noreferrer"&gt;let us know&lt;/a&gt;!&lt;/p&gt;

</description>
      <category>rust</category>
      <category>webassembly</category>
      <category>serverless</category>
      <category>jamstack</category>
    </item>
    <item>
      <title>😎 Manage WebAssembly Apps in WasmEdge Using Docker Tools</title>
      <dc:creator>alabulei1</dc:creator>
      <pubDate>Tue, 29 Jun 2021 11:55:31 +0000</pubDate>
      <link>https://dev.to/alabulei1/manage-webassembly-apps-in-wasmedge-using-docker-tools-fp1</link>
      <guid>https://dev.to/alabulei1/manage-webassembly-apps-in-wasmedge-using-docker-tools-fp1</guid>
      <description>&lt;p&gt;Developers can leverage Docker tools such as the DockerHub and CRI-O to deploy, manage, and run lightweight WebAssembly applications in &lt;a href="https://github.com/WasmEdge/WasmEdge"&gt;WasmEdge&lt;/a&gt;. WasmEdge, an advanced WebAssembly runtime &lt;a href="https://www.cncf.io/sandbox-projects/"&gt;hosted by the CNCF&lt;/a&gt; (Cloud Native Computing Foundation), is an execution sandbox for Edge Computing applications. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--FhV9dvUL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3fswz98zqk3e6t70cv8o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FhV9dvUL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3fswz98zqk3e6t70cv8o.png" alt="WasmEdge"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;While WebAssembly was initially invented as a runtime for browser applications, its lightweight and high-performance sandbox design has made it an appealing choice as a general-purpose application container. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If WASM+WASI existed in 2008, we wouldn't have needed to create Docker. — Solomon Hykes, co-founder of Docker&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Compared with Docker, &lt;a href="https://www.infoq.com/articles/arm-vs-x86-cloud-performance/"&gt;WebAssembly could be 100x faster at startup&lt;/a&gt;, have a much smaller memory and disk footprint, and have a better-defined safety sandbox. However, the trade-off is that WebAssembly requires its own language SDKs and compiler toolchains, making it a more constrained developer environment than Docker. WebAssembly is increasingly used in Edge Computing scenarios where it is difficult to deploy Docker or where application performance is vital.&lt;/p&gt;

&lt;p&gt;One of the great advantages of Docker is its rich ecosystem of tools. At WasmEdge, we would like to bring Docker-like tooling to our developers. To accomplish that, we created an alternative runner for CRI-O, called &lt;a href="https://github.com/second-state/runw"&gt;runw&lt;/a&gt;, to load and run WebAssembly bytecode programs as if they are Docker image files. &lt;/p&gt;

&lt;h2&gt;
  
  
  Install a WebAssembly runner in CRI-O
&lt;/h2&gt;

&lt;p&gt;In order to support WebAssembly in CRI-O, you just need to download the &lt;code&gt;runw&lt;/code&gt; binary release and install it into your CRI-O.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Since the &lt;code&gt;runw&lt;/code&gt; binary already includes WasmEdge, there is no need to install WasmEdge or any other WebAssembly VM separately. &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;First, make sure that you are on Ubuntu 20.04 with LLVM-10 installed. If you are on a different platform, please refer to &lt;a href="https://github.com/second-state/runw#build-from-source"&gt;the project documentation on how to build&lt;/a&gt; &lt;code&gt;runw&lt;/code&gt; for your OS.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt install -y llvm-10-dev liblld-10-dev
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Also make sure that you have &lt;a href="https://cri-o.io/"&gt;cri-o&lt;/a&gt;, &lt;a href="https://github.com/kubernetes-sigs/cri-tools"&gt;crictl&lt;/a&gt;, &lt;a href="https://github.com/containernetworking/plugins"&gt;containernetworking-plugins&lt;/a&gt;, and &lt;a href="https://github.com/containers/buildah"&gt;buildah&lt;/a&gt; or &lt;a href="https://github.com/docker/cli"&gt;docker&lt;/a&gt; installed. &lt;/p&gt;

&lt;p&gt;Next, download the &lt;code&gt;runw&lt;/code&gt; binary build.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;wget https://github.com/second-state/runw/releases/download/0.1.0/runw
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, you can install &lt;code&gt;runw&lt;/code&gt; into CRI-O as an alternative runtime for WebAssembly.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Get the wasm-pause utility
sudo crictl pull docker.io/beststeve/wasm-pause

# Install runw into cri-o
sudo cp -v runw /usr/lib/cri-o-runc/sbin/runw
sudo chmod +x /usr/lib/cri-o-runc/sbin/runw
sudo sed -i -e 's@default_runtime = "runc"@default_runtime = "runw"@' /etc/crio/crio.conf
sudo sed -i -e 's@pause_image = "k8s.gcr.io/pause:3.2"@pause_image = "docker.io/beststeve/wasm-pause"@' /etc/crio/crio.conf
sudo sed -i -e 's@pause_command = "/pause"@pause_command = "pause.wasm"@' /etc/crio/crio.conf
sudo tee -a /etc/crio/crio.conf.d/01-crio-runc.conf &amp;lt;&amp;lt;EOF
[crio.runtime.runtimes.runw]
runtime_path = "/usr/lib/cri-o-runc/sbin/runw"
runtime_type = "oci"
runtime_root = "/run/runw"
EOF
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Finally, restart &lt;code&gt;cri-o&lt;/code&gt; for the new WebAssembly runner to take effect.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo systemctl restart crio
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Build a Wasm application from Rust
&lt;/h2&gt;

&lt;p&gt;The sample Wasm application is written in Rust. To make it work, make sure that you have &lt;a href="https://www.rust-lang.org/tools/install"&gt;Rust&lt;/a&gt; and the &lt;a href="https://www.secondstate.io/articles/rustwasmc/"&gt;rustwasmc&lt;/a&gt; toolchain installed. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;You only need the Rust compiler and rustwasmc to build Rust source into a wasm bytecode file. If you already have a wasm bytecode program and just want to run it with CRI-O, you can skip this section. &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The application source code is a single &lt;code&gt;main.rs&lt;/code&gt; file. &lt;a href="https://github.com/second-state/wasm-learning/tree/master/ssvm/wasi"&gt;It is here&lt;/a&gt;. The application demonstrates how to access the file system and other operating system resources from WasmEdge using the standard Rust API.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;fn main() {
  println!("Random number: {}", get_random_i32());
  println!("Random bytes: {:?}", get_random_bytes());
  println!("{}", echo("This is from a main function"));
  print_env();
  create_file("/tmp.txt", "This is in a file");
  println!("File content is {}", read_file("/tmp.txt"));
  del_file("/tmp.txt");
}

pub fn get_random_i32() -&amp;gt; i32 {
  let x: i32 = random();
  return x;
}

pub fn get_random_bytes() -&amp;gt; Vec&amp;lt;u8&amp;gt; {
  let mut rng = thread_rng();
  let mut arr = [0u8; 128];
  rng.fill(&amp;amp;mut arr[..]);
  return arr.to_vec();
}

pub fn echo(content: &amp;amp;str) -&amp;gt; String {
  println!("Printed from wasi: {}", content);
  return content.to_string();
}

pub fn print_env() {
  println!("The env vars are as follows.");
  for (key, value) in env::vars() {
    println!("{}: {}", key, value);
  }

  println!("The args are as follows.");
  for argument in env::args() {
    println!("{}", argument);
  }
}

pub fn create_file(path: &amp;amp;str, content: &amp;amp;str) {
  let mut output = File::create(path).unwrap();
  output.write_all(content.as_bytes()).unwrap();
}

pub fn read_file(path: &amp;amp;str) -&amp;gt; String {
  let mut f = File::open(path).unwrap();
  let mut s = String::new();
  match f.read_to_string(&amp;amp;mut s) {
    Ok(_) =&amp;gt; s,
    Err(e) =&amp;gt; e.to_string(),
  }
}

pub fn del_file(path: &amp;amp;str) {
  fs::remove_file(path).expect("Unable to delete");
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can build the application into a wasm bytecode file as follows.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rustwasmc build
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The resulting wasm bytecode file is &lt;a href="https://github.com/second-state/wasm-learning/blob/master/ssvm/wasi/wasi_example_main.wasm"&gt;available here&lt;/a&gt;. &lt;/p&gt;

&lt;h2&gt;
  
  
  Build and publish a Docker Hub image for the Wasm app
&lt;/h2&gt;

&lt;p&gt;You can now publish the wasm bytecode file to Docker Hub as if it were a Docker image.&lt;/p&gt;

&lt;p&gt;First, create a Dockerfile in the &lt;code&gt;pkg/&lt;/code&gt; directory as follows.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM scratch
ADD wasi_example_main.wasm .
CMD ["wasi_example_main.wasm"]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create an image and publish it to Docker hub!&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo buildah bud -f Dockerfile -t wasm-wasi-example
sudo buildah push wasm-wasi-example docker://registry.example.com/repository:tag

# Example: the following command publishes the wasm image to the public Docker hub under user account "hydai"
sudo buildah push wasm-wasi-example docker://docker.io/hydai/wasm-wasi-example:latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, you can use Docker tools, such as &lt;code&gt;crictl&lt;/code&gt;, to pull the published wasm file as an image. Below is an example using the wasm image we published.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo crictl pull docker.io/hydai/wasm-wasi-example
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Start the Wasm app using CRI-O
&lt;/h2&gt;

&lt;p&gt;To start and run the wasm file, you will need to create two configuration files for CRI-O. Create a &lt;code&gt;container_wasi.json&lt;/code&gt; file as follows. It tells the CRI-O runtime where to pull the wasm file image from the Docker repository.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "metadata": {
    "name": "podsandbox1-wasm-wasi"
  },
  "image": {
    "image": "hydai/wasm-wasi-example:latest"
  },
  "args": [
    "wasi_example_main.wasm", "50000000"
  ],
  "working_dir": "/",
  "envs": [],
  "labels": {
    "tier": "backend"
  },
  "annotations": {
    "pod": "podsandbox1"
  },
  "log_path": "",
  "stdin": false,
  "stdin_once": false,
  "tty": false,
  "linux": {
    "resources": {
      "memory_limit_in_bytes": 209715200,
      "cpu_period": 10000,
      "cpu_quota": 20000,
      "cpu_shares": 512,
      "oom_score_adj": 30,
      "cpuset_cpus": "0",
      "cpuset_mems": "0"
    },
    "security_context": {
      "namespace_options": {
        "pid": 1
      },
      "readonly_rootfs": false,
      "capabilities": {
        "add_capabilities": [
          "sys_admin"
        ]
      }
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, create a &lt;code&gt;sandbox_config.json&lt;/code&gt; file as follows. It defines the sandbox environment to run the wasm application.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "metadata": {
    "name": "podsandbox12",
    "uid": "redhat-test-crio",
    "namespace": "redhat.test.crio",
    "attempt": 1
  },
  "hostname": "crictl_host",
  "log_directory": "",
  "dns_config": {
    "searches": [
      "8.8.8.8"
    ]
  },
  "port_mappings": [],
  "resources": {
    "cpu": {
      "limits": 3,
      "requests": 2
    },
    "memory": {
      "limits": 50000000,
      "requests": 2000000
    }
  },
  "labels": {
    "group": "test"
  },
  "annotations": {
    "owner": "hmeng",
    "security.alpha.kubernetes.io/seccomp/pod": "unconfined"
  },
  "linux": {
    "cgroup_parent": "pod_123-456.slice",
    "security_context": {
      "namespace_options": {
        "network": 0,
        "pid": 1,
        "ipc": 0
      },
      "selinux_options": {
        "user": "system_u",
        "role": "system_r",
        "type": "svirt_lxc_net_t",
        "level": "s0:c4,c5"
      }
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now you can create a CRI-O pod as follows.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Create the POD. Output will be different from example.
sudo crictl runp sandbox_config.json
7992e75df00cc1cf4bff8bff660718139e3ad973c7180baceb9c84d074b516a4

# Set a helper variable for later use.
POD_ID=7992e75df00cc1cf4bff8bff660718139e3ad973c7180baceb9c84d074b516a4
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;From the pod, you can create a container to run the wasm bytecode program in isolation.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Create the container instance. Output will be different from example.
sudo crictl create $POD_ID container_wasi.json sandbox_config.json
1d056e4a8a168f0c76af122d42c98510670255b16242e81f8e8bce8bd3a4476f
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Finally, start the container and see the output from the wasm application.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# List the container, the state should be `Created`
sudo crictl ps -a

CONTAINER           IMAGE                           CREATED              STATE               NAME                     ATTEMPT             POD ID
1d056e4a8a168       hydai/wasm-wasi-example:latest   About a minute ago   Created             podsandbox1-wasm-wasi   0                   7992e75df00cc

# Start the container
sudo crictl start 1d056e4a8a168f0c76af122d42c98510670255b16242e81f8e8bce8bd3a4476f
1d056e4a8a168f0c76af122d42c98510670255b16242e81f8e8bce8bd3a4476f

# Check the container status again. If the container has not finished its
# job, you will see the Running state. Because this example is very tiny,
# you may see Exited at this moment.
sudo crictl ps -a
CONTAINER           IMAGE                           CREATED              STATE               NAME                     ATTEMPT             POD ID
1d056e4a8a168       hydai/wasm-wasi-example:latest   About a minute ago   Running             podsandbox1-wasm-wasi   0                   7992e75df00cc

# When the container is finished. You can see the state becomes Exited.
sudo crictl ps -a
CONTAINER           IMAGE                           CREATED              STATE               NAME                     ATTEMPT             POD ID
1d056e4a8a168       hydai/wasm-wasi-example:latest   About a minute ago   Exited              podsandbox1-wasm-wasi   0                   7992e75df00cc

# Check the container's logs
sudo crictl logs 1d056e4a8a168f0c76af122d42c98510670255b16242e81f8e8bce8bd3a4476f

Test 1: Print Random Number
Random number: 960251471

Test 2: Print Random Bytes
Random bytes: [50, 222, 62, 128, 120, 26, 64, 42, 210, 137, 176, 90, 60, 24, 183, 56, 150, 35, 209, 211, 141, 146, 2, 61, 215, 167, 194, 1, 15, 44, 156, 27, 179, 23, 241, 138, 71, 32, 173, 159, 180, 21, 198, 197, 247, 80, 35, 75, 245, 31, 6, 246, 23, 54, 9, 192, 3, 103, 72, 186, 39, 182, 248, 80, 146, 70, 244, 28, 166, 197, 17, 42, 109, 245, 83, 35, 106, 130, 233, 143, 90, 78, 155, 29, 230, 34, 58, 49, 234, 230, 145, 119, 83, 44, 111, 57, 164, 82, 120, 183, 194, 201, 133, 106, 3, 73, 164, 155, 224, 218, 73, 31, 54, 28, 124, 2, 38, 253, 114, 222, 217, 202, 59, 138, 155, 71, 178, 113]

Test 3: Call an echo function
Printed from wasi: This is from a main function
This is from a main function

Test 4: Print Environment Variables
The env vars are as follows.
PATH: /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
TERM: xterm
HOSTNAME: crictl_host
PATH: /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
The args are as follows.
/var/lib/containers/storage/overlay/006e7cf16e82dc7052994232c436991f429109edea14a8437e74f601b5ee1e83/merged/wasi_example_main.wasm
50000000

Test 5: Create a file `/tmp.txt` with content `This is in a file`

Test 6: Read the content from the previous file
File content is This is in a file

Test 7: Delete the previous file
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  What's next
&lt;/h2&gt;

&lt;p&gt;In this article, we have seen how to start, run, and manage WasmEdge applications using Docker-like CRI-O tools.&lt;/p&gt;

&lt;p&gt;Our next step is to use Kubernetes to manage WasmEdge containers. To accomplish that, we will need to install a runner binary inside Kubernetes so that it can support regular Docker images and wasm bytecode images at the same time. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Join us in the WebAssembly revolution!&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;👉 Slack Channel: #wasmedge on &lt;a href="https://slack.cncf.io/"&gt;CNCF Slack channel&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;👉 Mailing list: Send an email to &lt;a href="https://groups.google.com/g/wasmedge/"&gt;WasmEdge@googlegroups.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;👉 Be a contributor: &lt;a href="https://github.com/WasmEdge/WasmEdge/tree/master/doc/wish_list.md"&gt;checkout our wish list&lt;/a&gt; to start contributing!&lt;/p&gt;

</description>
      <category>webassembly</category>
      <category>rust</category>
      <category>docker</category>
    </item>
    <item>
      <title>Saving Uniswap v1</title>
      <dc:creator>alabulei1</dc:creator>
      <pubDate>Fri, 30 Oct 2020 16:15:08 +0000</pubDate>
      <link>https://dev.to/alabulei1/saving-uniswap-v1-5c3g</link>
      <guid>https://dev.to/alabulei1/saving-uniswap-v1-5c3g</guid>
      <description>&lt;p&gt;&lt;em&gt;New Ethereum compatible projects like the &lt;a href="https://www.oasiseth.org/" rel="noopener noreferrer"&gt;Oasis Ethereum ParaTime&lt;/a&gt; are choosing Uniswap v1 over v2 for good reasons. Few understand this. 🤔&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In the world of software, a new release typically means improvements every user should upgrade to. 🎉 For Uniswap, however, v2 is simply different software than v1. Uniswap v2 is a complete rewrite with exciting new features, but the developers also made compromises in terms of cost, complexity, (de)centralization, and openness. 😿 As a result, for many users and developers, there are compelling reasons to stick with v1. I think the community should invest the effort to keep Uniswap v1 software maintained and viable. Now, let me explain.&lt;/p&gt;

&lt;h2&gt;
  
  
  A brief history
&lt;/h2&gt;

&lt;p&gt;The idea of Automated Market Making (AMM) is not new. Bancor, one of the most successful ICO projects (in terms of money raised), is built on the idea of AMM. However, Bancor never took off because it needlessly complicated the user experience by bringing a “smart token” into the system, which was great for the ICO but added little value to the trading experience itself. Furthermore, while Bancor’s smart contracts are open source, its user interface application was never fully open, making it difficult for outside developers to build or improve upon it.&lt;/p&gt;

&lt;p&gt;Then comes Uniswap. Created by a young, ambitious developer using a rising programming language (Vyper), it laser-focused on the trader’s user experience. Uniswap v1 has a very simple AMM formula that is easy to understand and use, with an intuitive UI. Uniswap v1 smart contracts are highly optimized for gas efficiency. Its entire code base is open source and easily reusable by developers in the community. I, as a developer and crypto user, loved Uniswap v1.&lt;/p&gt;
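&lt;p&gt;That “very simple AMM formula” is the constant-product rule x × y = k, with a 0.3% fee taken on the input side. A small sketch of the pricing math in integer arithmetic (the Rust function below is illustrative, mirroring the logic of v1’s &lt;code&gt;getInputPrice&lt;/code&gt;; it is not code from the Uniswap repository):&lt;/p&gt;

```rust
// Constant-product pricing with a 0.3% fee, as in Uniswap v1:
//   output = (input * 997 * output_reserve) / (input_reserve * 1000 + input * 997)
// Integer division rounds down, in the pool's favor.
fn get_input_price(input_amount: u128, input_reserve: u128, output_reserve: u128) -> u128 {
    let input_with_fee = input_amount * 997;
    let numerator = input_with_fee * output_reserve;
    let denominator = input_reserve * 1000 + input_with_fee;
    numerator / denominator
}

fn main() {
    // Selling 100 units into a 1000/1000 pool: the 0.3% fee plus price
    // impact push the output well below the naive 1:1 rate of 100.
    let out = get_input_price(100, 1000, 1000);
    println!("output = {}", out);
}
```

&lt;p&gt;The whole exchange logic fits in one short formula, which is a big part of why the v1 contracts are so cheap in gas.&lt;/p&gt;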

&lt;p&gt;Then comes Uniswap v2. It is an engineering achievement, designed to meet the expectations of its investors and liquidity providers, as is evident from the popularity of the UNI token. For users, it enables direct swapping between pairs of ERC-20 tokens as well as flash swaps in atomic transactions. But it also makes important compromises and sacrifices. In the rest of this article, I will make the case for continued v1 development by the community.&lt;/p&gt;

&lt;h2&gt;
  
  
  Centralization / Decentralization
&lt;/h2&gt;

&lt;p&gt;The first page of the Uniswap v2 &lt;a href="https://uniswap.org/whitepaper.pdf" rel="noopener noreferrer"&gt;whitepaper&lt;/a&gt; reveals that, unlike v1, a centralized private key can switch on a 0.05% protocol fee on every trade.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“there is a private key that has the ability to update a variable on the factory contract to turn on an on-chain 5-basis-point fee on trades.” — The Uniswap v2 whitepaper&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This fee is sent to an address the contract’s owner controls. The mere existence of an “owner” is uncomfortable in a project that champions decentralization.&lt;/p&gt;

&lt;p&gt;Furthermore, Uniswap v2 apparently allows a centralized party to mint and distribute governance tokens for the entire project. While I appreciate the UNI airdrop, it is a slippery slope toward centralization.&lt;/p&gt;

&lt;h2&gt;
  
  
  Gas cost
&lt;/h2&gt;

&lt;p&gt;One of the best features of Uniswap v1 is its gas efficiency. That is especially important for users as Ethereum gas fees reach new heights during the DeFi boom.&lt;/p&gt;

&lt;p&gt;However, Uniswap v2, with its complicated and inter-dependent smart contracts, costs much more in gas fees compared with v1. Uniswap v2 relies on smart contracts ranging from the &lt;a href="https://uniswap.org/docs/v2/smart-contracts/factory" rel="noopener noreferrer"&gt;Factory&lt;/a&gt;, Router (&lt;a href="https://uniswap.org/docs/v2/smart-contracts/router02/" rel="noopener noreferrer"&gt;V2&lt;/a&gt;), &lt;a href="https://uniswap.org/docs/v2/smart-contracts/pair/" rel="noopener noreferrer"&gt;Pair&lt;/a&gt; and &lt;a href="https://uniswap.org/docs/v2/smart-contracts/pair-erc-20/" rel="noopener noreferrer"&gt;Pair ERC20&lt;/a&gt; contracts, along with a &lt;a href="https://uniswap.org/docs/v2/smart-contracts/library/" rel="noopener noreferrer"&gt;Library&lt;/a&gt; contract for common utilities. Let’s take the &lt;a href="https://twitter.com/pingchentw/status/1314220168092872705" rel="noopener noreferrer"&gt;proxy contract&lt;/a&gt; as an example. The proxy contract on Uniswap v1 is 46 bytes; the same contract on Uniswap v2 is 4,349 bytes. The increased complexity translates to inflated gas fees for users.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“I noticed the fees are &lt;a href="https://www.reddit.com/r/ethereum/comments/h9qexi/uniswap_v2_now_has_more_liquidity_than_v1_go/fuyrb6i/?utm_source=reddit&amp;amp;utm_medium=web2x&amp;amp;context=3" rel="noopener noreferrer"&gt;a bit higher&lt;/a&gt; and it actually have me a notification to use v1 for my swap” - Comment from Reddit.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Now, to be fair, the added complexity does bring more features. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;ERC-20 to ERC-20 swap and quick swap are nice-to-have features.&lt;/li&gt;
&lt;li&gt;The pricing oracle is not relevant to me as an end-user. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As a user, I prefer the cheaper and simpler v1 when I just need to swap ERC-20 tokens against ETH.&lt;/p&gt;

&lt;h2&gt;
  
  
  i18n
&lt;/h2&gt;

&lt;p&gt;Many non-English crypto users are dismayed to find that Uniswap v2 does not provide i18n support for non-English languages. Uniswap v1 already had decent i18n support, and this feature was actually removed in v2. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Fun fact: the Uniswap v1 i18n interface for non-English languages was contributed by Maggie Wang from CyberMiles.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;As the crypto community grows more diverse, Uniswap’s regression to English-only does not bode well for its position as a worldwide DeFi leader. &lt;/p&gt;

&lt;h2&gt;
  
  
  Portability
&lt;/h2&gt;

&lt;p&gt;The Uniswap v2 user interface (UI) is a complex JavaScript application. It is designed to deploy on Ethereum mainnet and various testnets. It requires a major effort to deploy the Uniswap v2 UI on any other Ethereum-compatible blockchain, as extensive code-level changes must be made to modify hardcoded values such as the following.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The on-chain addresses and hash values of various important contracts, such as the Router.&lt;/li&gt;
&lt;li&gt;The web3 gateway provider’s URL structure.&lt;/li&gt;
&lt;li&gt;The chain ID of the host blockchain.&lt;/li&gt;
&lt;li&gt;Token list locations.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In contrast, the Uniswap v1 UI code is much simpler and cleaner. It is easy to run the Uniswap v1 UI on an alternative Ethereum blockchain, such as the Oasis Ethereum ParaTime chain.&lt;/p&gt;

&lt;h2&gt;
  
  
  User interface
&lt;/h2&gt;

&lt;p&gt;The Uniswap v2 UI was originally deployed with a toggle button where users could choose to use v1. The toggle never fully kept the user on v1 (liquidity-related functions would automatically operate partially via the v2 smart contracts), and it has since been removed. Put simply, the current “official” Uniswap UI is geared wholly and solely towards herding users to the v2 smart contracts and liquidity pools, leaving the original v1 smart contracts on the sidelines.&lt;/p&gt;

&lt;h2&gt;
  
  
  What’s next
&lt;/h2&gt;

&lt;p&gt;Most of Uniswap’s liquidity has moved to v2. According to data from the &lt;a href="https://v1.uniswap.info/home" rel="noopener noreferrer"&gt;official Uniswap V1 analytics tool&lt;/a&gt; in late October 2020, there was just $4 million in liquidity on Uniswap v1 and 742 transactions in the past 24 hours. That is a fraction of Uniswap v2’s volume: $2.9 billion in liquidity and 137,255 transactions in the same period.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fnu8qexg74ei061bcxtjc.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fnu8qexg74ei061bcxtjc.jpeg" alt="Uniswap V1 &amp;amp; V2"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Due to the strong network effects of exchanges, there is probably no turning back on Ethereum mainnet, although I will miss v1’s simplicity and low fees. &lt;/p&gt;

&lt;p&gt;However, on emerging Ethereum compatible blockchains, I believe many developers and users will still choose Uniswap v1 as the go-to DeFi exchange. The Uniswap v1 codebase should be revived and maintained by the community for this very purpose.&lt;/p&gt;

</description>
      <category>uniswap</category>
      <category>blockchain</category>
      <category>ethereum</category>
      <category>defi</category>
    </item>
    <item>
      <title>Face Detection in Node.js with Rust and WebAssembly</title>
      <dc:creator>alabulei1</dc:creator>
      <pubDate>Wed, 23 Sep 2020 10:14:54 +0000</pubDate>
      <link>https://dev.to/alabulei1/high-performance-and-safe-ai-as-a-service-in-node-js-43lg</link>
      <guid>https://dev.to/alabulei1/high-performance-and-safe-ai-as-a-service-in-node-js-43lg</guid>
      <description>&lt;p&gt;We introduced &lt;a href="https://dev.to/alabulei1/how-to-call-rust-funtion-in-node-js-p36"&gt;how to call Rust functions from Node.js&lt;/a&gt; in the last article. This article will introduce how to write a Node.js-based AI as a Service application.&lt;/p&gt;

&lt;p&gt;Today’s dominant programming language for AI is Python. Yet, the programming language for the web is JavaScript. To provide AI capabilities as a service on the web, we need to wrap AI algorithms in JavaScript, particularly Node.js.&lt;/p&gt;

&lt;p&gt;However, neither Python nor JavaScript by itself is suitable for computationally intensive AI applications. They are high-level (i.e., slow) languages with heavyweight runtimes. Their ease of use comes at the cost of low performance. Python gets around this by wrapping AI computation in native C/C++ modules. Node.js could do the same, but we have a better way — WebAssembly.&lt;/p&gt;

&lt;p&gt;WebAssembly VMs provide tight integration with Node.js and other JavaScript runtimes. They are highly performant, memory safe, secure by default, and portable across operating systems. Our approach combines these best features of WebAssembly with the raw speed of native code.&lt;/p&gt;

&lt;h2&gt;
  
  
  How it works
&lt;/h2&gt;

&lt;p&gt;The Node.js-based AI as a Service application consists of three parts.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The Node.js application provides web services and calls the WebAssembly function to perform computationally intensive tasks such as AI inference.&lt;/li&gt;
&lt;li&gt;The data preparation, post-processing, and integration with other systems are done by a WebAssembly function. Initially, we support Rust. The application developer must write this function.&lt;/li&gt;
&lt;li&gt;The actual execution of the AI model is done by native code to maximize performance. This part of the code is very short and is reviewed for security and safety. Application developers just call this native program from the WebAssembly function — much like how native functions are used in Python and Node.js today.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F8pfph8a7ok2cizov11u5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F8pfph8a7ok2cizov11u5.png" alt="AI as a service"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's look at an example now!&lt;/p&gt;

&lt;h2&gt;
  
  
  A face detection example
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://github.com/second-state/AI-as-a-Service/tree/master/nodejs/face_detection_service" rel="noopener noreferrer"&gt;face detection web service&lt;/a&gt; allows a user to upload a photo, and it displays the image with all detected faces marked with green boxes.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The Rust source code for executing the MTCNN face detection model is based on Cetra's excellent tutorial: &lt;a href="https://cetra3.github.io/blog/face-detection-with-tensorflow-rust/" rel="noopener noreferrer"&gt;Face Detection with Tensorflow Rust&lt;/a&gt;. We made changes to make the TensorFlow library work in WebAssembly.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fnoi1d7tdgag5yupg06rt.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fnoi1d7tdgag5yupg06rt.gif" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://github.com/second-state/AI-as-a-Service/blob/master/nodejs/face_detection_service/node/server.js" rel="noopener noreferrer"&gt;Node.js application&lt;/a&gt; handles the file uploading and the response.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/infer&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nf"&gt;function &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;image_file&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;files&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;image_file&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;result_filename&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;uuidv4&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;.png&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="c1"&gt;// Call the infer() function from WebAssembly (SSVM)&lt;/span&gt;
  &lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;infer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;detection_threshold&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;image_file&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="nx"&gt;fs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;writeFileSync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;public/&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="nx"&gt;result_filename&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;&amp;lt;img src="&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt;  &lt;span class="nx"&gt;result_filename&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;"/&amp;gt;&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you can see, the JavaScript app simply passes the image data and a parameter called &lt;code&gt;detection_threshold&lt;/code&gt;, which specifies the smallest face to be detected, to the &lt;code&gt;infer()&lt;/code&gt; function, and then saves the return value into an image file on the server. &lt;a href="https://github.com/second-state/AI-as-a-Service/blob/master/nodejs/face_detection_service/src/lib.rs" rel="noopener noreferrer"&gt;The &lt;code&gt;infer()&lt;/code&gt; function&lt;/a&gt; is written in Rust and compiled into WebAssembly so that it can be called from JavaScript.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;infer()&lt;/code&gt; function flattens the input image data into an array. It sets up a TensorFlow model and uses the flattened image data as input to the model. The TensorFlow model execution returns a set of numbers indicating the coordinates for the four corners of each face box. The &lt;code&gt;infer()&lt;/code&gt; function then draws a green box around each face, and then it saves the modified image into a PNG file on the webserver.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="nd"&gt;#[wasm_bindgen]&lt;/span&gt;
&lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="nf"&gt;infer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;detection_threshold&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;image_data&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;u8&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt; &lt;span class="k"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;Vec&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;u8&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;dt&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;detection_threshold&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="o"&gt;...&lt;/span&gt; &lt;span class="o"&gt;...&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;img&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;image&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;load_from_memory&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;image_data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="c1"&gt;// Run the tensorflow model using the face_detection_mtcnn native wrapper&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;cmd&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Command&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"face_detection_mtcnn"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="c1"&gt;// Pass in some arguments&lt;/span&gt;
    &lt;span class="n"&gt;cmd&lt;/span&gt;&lt;span class="nf"&gt;.arg&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="nf"&gt;.width&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.to_string&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
        &lt;span class="nf"&gt;.arg&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="nf"&gt;.height&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.to_string&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
        &lt;span class="nf"&gt;.arg&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dt&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="c1"&gt;// The image bytes data is passed in via STDIN&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;_x&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;_y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;rgb&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="nf"&gt;.pixels&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="n"&gt;cmd&lt;/span&gt;&lt;span class="nf"&gt;.stdin_u8&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;rgb&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nb"&gt;u8&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="nf"&gt;.stdin_u8&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;rgb&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nb"&gt;u8&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="nf"&gt;.stdin_u8&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;rgb&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nb"&gt;u8&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;out&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cmd&lt;/span&gt;&lt;span class="nf"&gt;.output&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="c1"&gt;// Draw boxes from the result JSON array&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;line&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Pixel&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from_slice&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;stdout_json&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Value&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;from_str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;str&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from_utf8&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;out&lt;/span&gt;&lt;span class="py"&gt;.stdout&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nf"&gt;.expect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"[]"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;stdout_vec&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;stdout_json&lt;/span&gt;&lt;span class="nf"&gt;.as_array&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="o"&gt;..&lt;/span&gt;&lt;span class="n"&gt;stdout_vec&lt;/span&gt;&lt;span class="nf"&gt;.len&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;xy&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;stdout_vec&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="nf"&gt;.as_array&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
        &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;x1&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;i32&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;xy&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="nf"&gt;.as_f64&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nb"&gt;i32&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;y1&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;i32&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;xy&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="nf"&gt;.as_f64&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nb"&gt;i32&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;x2&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;i32&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;xy&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="nf"&gt;.as_f64&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nb"&gt;i32&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;y2&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;i32&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;xy&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="nf"&gt;.as_f64&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nb"&gt;i32&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;rect&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Rect&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;at&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nf"&gt;.of_size&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="n"&gt;x2&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;x1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nb"&gt;u32&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;y2&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;y1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nb"&gt;u32&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="nf"&gt;draw_hollow_rect_mut&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;rect&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="n"&gt;line&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;   
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;buf&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Vec&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="c1"&gt;// Write the result image into STDOUT&lt;/span&gt;
    &lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="nf"&gt;.write_to&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;buf&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nn"&gt;image&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;ImageOutputFormat&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="n"&gt;Png&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nf"&gt;.expect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Unable to write"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;buf&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://github.com/second-state/AI-as-a-Service/blob/master/native_model_zoo/face_detection_mtcnn/src/main.rs" rel="noopener noreferrer"&gt;The &lt;code&gt;face_detection_mtcnn&lt;/code&gt; command&lt;/a&gt; runs the MTCNN TensorFlow model in native code.  It takes three arguments, image width, image height, and detection threshold. The actual image data prepared as flattened RGB values is passed in from the WebAssembly &lt;code&gt;infer()&lt;/code&gt; via &lt;code&gt;STDIN&lt;/code&gt;. The result from the model is encoded in JSON and returned via the &lt;code&gt;STDOUT&lt;/code&gt;.&lt;/p&gt;
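As the Rust code above shows, the JSON on &lt;code&gt;STDOUT&lt;/code&gt; is an array of &lt;code&gt;[x1, y1, x2, y2]&lt;/code&gt; arrays, one per detected face. A minimal JavaScript sketch of consuming that format (the sample JSON string below is a made-up illustration, not real model output):

```javascript
// Parse face boxes as emitted by the native command on STDOUT.
// Each entry is [x1, y1, x2, y2] in pixel coordinates (illustrative values).
const stdout = '[[34.0, 50.0, 120.0, 160.0], [200.0, 40.0, 280.0, 130.0]]';

const boxes = JSON.parse(stdout).map(([x1, y1, x2, y2]) => ({
  x: Math.round(x1),
  y: Math.round(y1),
  width: Math.round(x2 - x1),   // box width, as computed in the Rust code above
  height: Math.round(y2 - y1),  // box height
}));

console.log(boxes.length);    // 2 faces in this sample
console.log(boxes[0].width);  // 86
```

Passing data over `STDIN`/`STDOUT` with a simple JSON contract is what keeps the native program small and easy to audit.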

&lt;p&gt;Notice how we passed the model parameter &lt;code&gt;detection_threshold&lt;/code&gt; to the model tensor named &lt;code&gt;min_size&lt;/code&gt;, and then used the &lt;code&gt;input&lt;/code&gt; tensor to pass in the input image data. The &lt;code&gt;box&lt;/code&gt; tensor is used to retrieve the results from the model.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="nf"&gt;main&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;Result&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="nb"&gt;Box&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="k"&gt;dyn&lt;/span&gt; &lt;span class="n"&gt;Error&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Get the arguments passed in from WebAssembly&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Vec&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;String&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;env&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;args&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.collect&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;img_width&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;u64&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="py"&gt;.parse&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;u64&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;img_height&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;u64&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="py"&gt;.parse&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;u64&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;detection_threshold&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;f32&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="py"&gt;.parse&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;f32&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;buffer&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Vec&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;u8&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Vec&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;flattened&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Vec&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;f32&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Vec&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="c1"&gt;// The image bytes are read from STDIN&lt;/span&gt;
    &lt;span class="nn"&gt;io&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;stdin&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.read_to_end&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;buffer&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;num&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="n"&gt;buffer&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="n"&gt;flattened&lt;/span&gt;&lt;span class="nf"&gt;.push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;num&lt;/span&gt;&lt;span class="nf"&gt;.into&lt;/span&gt;&lt;span class="p"&gt;());&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="c1"&gt;// Load up the graph as a byte array and create a tensorflow graph.&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nd"&gt;include_bytes!&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"mtcnn.pb"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;graph&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Graph&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="n"&gt;graph&lt;/span&gt;&lt;span class="nf"&gt;.import_graph_def&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;*&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="nn"&gt;ImportGraphDefOptions&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;args&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;SessionRunArgs&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="c1"&gt;// The `input` tensor expects BGR pixel data from the input image&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Tensor&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;img_height&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;img_width&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;&lt;span class="nf"&gt;.with_values&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;flattened&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="nf"&gt;.add_feed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;graph&lt;/span&gt;&lt;span class="nf"&gt;.operation_by_name_required&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"input"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;input&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="c1"&gt;// The `min_size` tensor takes the detection_threshold argument&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;min_size&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Tensor&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="p"&gt;[])&lt;/span&gt;&lt;span class="nf"&gt;.with_values&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;detection_threshold&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="nf"&gt;.add_feed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;graph&lt;/span&gt;&lt;span class="nf"&gt;.operation_by_name_required&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"min_size"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;min_size&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="c1"&gt;// Default input params for the model&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;thresholds&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Tensor&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;&lt;span class="nf"&gt;.with_values&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;0.6f32&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;0.7f32&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;0.7f32&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="nf"&gt;.add_feed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;graph&lt;/span&gt;&lt;span class="nf"&gt;.operation_by_name_required&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"thresholds"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;thresholds&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;factor&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Tensor&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="p"&gt;[])&lt;/span&gt;&lt;span class="nf"&gt;.with_values&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;0.709f32&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="nf"&gt;.add_feed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;graph&lt;/span&gt;&lt;span class="nf"&gt;.operation_by_name_required&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"factor"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;factor&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="c1"&gt;// Request the following outputs after the session runs.&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;bbox&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="nf"&gt;.request_fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;graph&lt;/span&gt;&lt;span class="nf"&gt;.operation_by_name_required&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"box"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;session&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Session&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="nn"&gt;SessionOptions&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;graph&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="n"&gt;session&lt;/span&gt;&lt;span class="nf"&gt;.run&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="c1"&gt;// Get the bounding boxes&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;bbox_res&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Tensor&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;f32&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="nf"&gt;.fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;bbox&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;iter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;json_vec&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Vec&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;f32&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Vec&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;iter&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;bbox_res&lt;/span&gt;&lt;span class="nf"&gt;.len&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="n"&gt;json_vec&lt;/span&gt;&lt;span class="nf"&gt;.push&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;
            &lt;span class="n"&gt;bbox_res&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;iter&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="c1"&gt;// x1&lt;/span&gt;
            &lt;span class="n"&gt;bbox_res&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;iter&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;     &lt;span class="c1"&gt;// y1&lt;/span&gt;
            &lt;span class="n"&gt;bbox_res&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;iter&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="c1"&gt;// x2&lt;/span&gt;
            &lt;span class="n"&gt;bbox_res&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;iter&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="c1"&gt;// y2&lt;/span&gt;
        &lt;span class="p"&gt;]);&lt;/span&gt;
        &lt;span class="n"&gt;iter&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;json_obj&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nd"&gt;json!&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;json_vec&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="c1"&gt;// Return result JSON in STDOUT&lt;/span&gt;
    &lt;span class="nd"&gt;println!&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"{}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;json_obj&lt;/span&gt;&lt;span class="nf"&gt;.to_string&lt;/span&gt;&lt;span class="p"&gt;());&lt;/span&gt; 
    &lt;span class="nf"&gt;Ok&lt;/span&gt;&lt;span class="p"&gt;(())&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Our goal is to create native execution wrappers for common AI models so that developers can just use them as libraries.&lt;/p&gt;
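&lt;p&gt;To see the coordinate reordering from the listing above in isolation, here is a standalone sketch of the same regrouping step without the TensorFlow dependency (function and variable names are illustrative, not from the crate):&lt;/p&gt;

```rust
// Standalone sketch of the loop above: the model emits boxes as a flat
// [y1, x1, y2, x2, ...] f32 buffer, and we regroup each quadruple into
// [x1, y1, x2, y2] order. Names here are illustrative.
fn regroup_boxes(bbox_res: &[f32]) -> Vec<[f32; 4]> {
    bbox_res
        .chunks_exact(4)
        .map(|b| [b[1], b[0], b[3], b[2]]) // swap to x1, y1, x2, y2
        .collect()
}

fn main() {
    let raw = [10.0f32, 20.0, 110.0, 120.0]; // one box: y1, x1, y2, x2
    println!("{:?}", regroup_boxes(&raw)); // [[20.0, 10.0, 120.0, 110.0]]
}
```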

&lt;h2&gt;
  
  
  Deploy the face detection example
&lt;/h2&gt;

&lt;p&gt;As prerequisites, you will need to install Rust, Node.js, the &lt;a href="https://www.secondstate.io/ssvm/" rel="noopener noreferrer"&gt;Second State WebAssembly VM&lt;/a&gt;, and the &lt;a href="https://www.secondstate.io/articles/ssvmup/" rel="noopener noreferrer"&gt;ssvmup&lt;/a&gt; tool. &lt;a href="https://www.secondstate.io/articles/setup-rust-nodejs/" rel="noopener noreferrer"&gt;Check out the instructions&lt;/a&gt; for the steps, or simply use our Docker image. You will also need the TensorFlow library on your machine.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$ &lt;/span&gt;wget https://storage.googleapis.com/tensorflow/libtensorflow/libtensorflow-gpu-linux-x86_64-1.15.0.tar.gz
&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;sudo tar&lt;/span&gt; &lt;span class="nt"&gt;-C&lt;/span&gt; /usr/ &lt;span class="nt"&gt;-xzf&lt;/span&gt; libtensorflow-gpu-linux-x86_64-1.15.0.tar.gz
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To deploy the face detection example, we start from the native TensorFlow model driver. You can compile it from Rust source code in &lt;a href="https://github.com/second-state/AI-as-a-Service/tree/master/native_model_zoo/face_detection_mtcnn" rel="noopener noreferrer"&gt;this project&lt;/a&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# in the native_model_zoo/face_detection_mtcnn directory&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;cargo &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;--path&lt;/span&gt; &lt;span class="nb"&gt;.&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, go to the &lt;a href="https://github.com/second-state/AI-as-a-Service/tree/master/nodejs/face_detection_service" rel="noopener noreferrer"&gt;web application project&lt;/a&gt;. Run the ssvmup command to build the WebAssembly function from Rust. Recall that this WebAssembly function performs data preparation logic for the web application.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# in the nodejs/face_detection_service directory&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;ssvmup build
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With the WebAssembly function built, you can now start the Node.js application.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$ &lt;/span&gt;npm i express express-fileupload uuid

&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;cd &lt;/span&gt;node
&lt;span class="nv"&gt;$ &lt;/span&gt;node server.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The web service is now available at port 8080 of your computer. Try it with your own selfies or family and group photos!&lt;/p&gt;

&lt;h2&gt;
  
  
  The TensorFlow model zoo
&lt;/h2&gt;

&lt;p&gt;The native Rust crate &lt;a href="https://github.com/second-state/AI-as-a-Service/blob/master/native_model_zoo/face_detection_mtcnn/src/main.rs" rel="noopener noreferrer"&gt;&lt;code&gt;face_detection_mtcnn&lt;/code&gt;&lt;/a&gt; is a thin wrapper around the TensorFlow library. It loads the trained TensorFlow model, known as a frozen saved model, sets up inputs for the model, executes the model, and retrieves output values from the model.&lt;/p&gt;

&lt;p&gt;In fact, our wrapper only retrieves the box coordinates around detected faces. The model actually provides a confidence level for each detected face, as well as the positions of the eyes, mouth, and nose on each face. By changing the retrieved tensor names in the model, the wrapper could get this information and return it to the WASM function.&lt;/p&gt;

&lt;p&gt;If you would like to use a different model, it should be fairly easy to follow this example and create a wrapper for your own model. You just need to know the input and output tensor names and their data types.&lt;/p&gt;

&lt;p&gt;To achieve this goal, we created a project called the &lt;a href="https://github.com/second-state/AI-as-a-Service/tree/master/native_model_zoo" rel="noopener noreferrer"&gt;native model zoo&lt;/a&gt; to develop ready-to-use Rust wrappers for as many TensorFlow models as possible.&lt;/p&gt;

&lt;h2&gt;
  
  
  What's next
&lt;/h2&gt;

&lt;p&gt;In this article, we demonstrated how to implement a real-world AI as a Service use case in Node.js using Rust and WebAssembly. Our approach provides a framework for the entire community to contribute to a “model zoo”, which can serve as the AI library for more application developers.&lt;/p&gt;

&lt;p&gt;In the next tutorial, we will review another TensorFlow model for image classification and demonstrate how the wrapper can be extended to support a whole class of similar models.&lt;/p&gt;

</description>
      <category>rust</category>
      <category>webassembly</category>
      <category>tensorflow</category>
      <category>ai</category>
    </item>
    <item>
      <title>Raspberry Pi on steroids with Rust and WebAssembly</title>
      <dc:creator>alabulei1</dc:creator>
      <pubDate>Fri, 10 Jul 2020 11:22:17 +0000</pubDate>
      <link>https://dev.to/alabulei1/raspberry-pi-on-steroids-with-rust-and-webassembly-43i</link>
      <guid>https://dev.to/alabulei1/raspberry-pi-on-steroids-with-rust-and-webassembly-43i</guid>
      <description>&lt;p&gt;The Raspberry Pi is a very powerful computer in a tiny package. The cheapest option, the &lt;a href="https://www.raspberrypi.org/products/raspberry-pi-zero/"&gt;Raspberry Pi Zero&lt;/a&gt;, is capable of running a fully featured Linux distribution and driving a high definition display. It is the size of 3 coins (US Quarters) and costs $5. At $10, the &lt;a href="https://www.raspberrypi.org/products/raspberry-pi-zero-w/"&gt;Raspberry Pi Zero W&lt;/a&gt; comes with integrated WiFi and Bluetooth.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;BTW, we are giving away &lt;a href="https://www.secondstate.io/articles/raspberry-pi-for-free-20200709/"&gt;FREE Raspberry Pi kits&lt;/a&gt;! All you need to do is to follow &lt;a href="https://github.com/second-state/ssvm-nodejs-starter"&gt;our simple examples&lt;/a&gt; and create a Node.js app with Rust. It is easy as Pi! 🍕&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PaMAuPV0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/jo94357trrubrkpgjw85.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PaMAuPV0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/jo94357trrubrkpgjw85.jpg" alt="Raspberry Pi"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With a generic ARM CPU and easy networking, the Raspberry Pi can easily become a personal application server for you. For example, you can put a web application (e.g., a collaborative record-keeping app) on a Pi, bring it to a meeting, and make it accessible to everyone in the room. You do not even need the Internet. It is completely decentralized and censorship resistant.&lt;/p&gt;

&lt;p&gt;The personal server is especially useful for developers. You can have a separate environment to deploy and test your server-side applications without having to mess with your laptop. The personal dev server is like Docker on steroids.&lt;/p&gt;

&lt;p&gt;However, the $5 Pi is also, obviously, a resource-constrained server. It only has 512MB of RAM and a single CPU core. It could benefit greatly from a lightweight and high-performance application runtime. But at the same time, we still like the ease-of-use and developer productivity of “heavyweight” scripting languages such as JavaScript. We want the &lt;a href="https://www.secondstate.io/articles/why-webassembly-server/"&gt;best of both worlds&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Our solution is to deploy high-performance and resource-efficient &lt;a href="https://www.secondstate.io/articles/getting-started-with-rust-function/"&gt;Rust functions inside Node.js&lt;/a&gt; JavaScript apps. The &lt;a href="https://www.secondstate.io/ssvm/"&gt;Second State WebAssembly VM (SSVM)&lt;/a&gt; provides a light, efficient, secure, and portable runtime for Rust code. In this article, I will teach you how to set it up.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setup Raspberry Pi OS
&lt;/h2&gt;

&lt;p&gt;The easiest way to set up your Raspberry Pi device is to install Raspberry Pi OS from a MicroSD card. The Raspberry Pi OS is a Debian-based Linux distribution that is suitable for both desktop and server use. You can buy a blank MicroSD card and use the &lt;a href="https://www.raspberrypi.org/downloads/"&gt;Raspberry Pi Imager&lt;/a&gt; to load the NOOBS system on it. Or, you can buy one of those MicroSD cards with NOOBS pre-loaded.&lt;/p&gt;

&lt;p&gt;Put the MicroSD card into your Pi device’s card slot, connect an HDMI display, keyboard, mouse, and power up! Follow the on-screen instructions to install Raspberry Pi OS. From there, you can create a user account, connect to WiFi, turn on SSH, and open the command line terminal. In order to use the Pi device as a “headless” server, you could request a static IP address from your router. In the future, you can just power it on, and connect to it via SSH from your laptop — there is no need for display, keyboard, and mouse. Once you are set up, use the following command to find your Pi’s IP address on your local network.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ hostname -I
192.168.2.108 172.17.0.1
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;As with all new Linux installations, it is a good idea to update and upgrade to the latest packages. Run the command below and be patient. It could take an hour.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ `sudo apt update &amp;amp;&amp;amp; sudo apt upgrade`
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Next, run the following command to install essential developer libraries.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sudo apt install build-essential curl libboost-all-dev
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;h2&gt;
  
  
  Install Node.js and SSVM
&lt;/h2&gt;

&lt;p&gt;The following two commands install Node.js on your Pi.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ `curl ``-``sL https``:``//deb.nodesource.com/setup_10.x | sudo bash -`
`$ sudo apt install nodejs`
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;From here, you can use npm to install modules. Here we install the &lt;a href="https://www.secondstate.io/ssvm/"&gt;Second State VM (ssvm)&lt;/a&gt; to support high performance Rust functions in Node.js applications.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ npm install ssvm
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Next, let’s try to run a couple of demo applications.&lt;/p&gt;

&lt;h2&gt;
  
  
  Run a demo Node.js application
&lt;/h2&gt;

&lt;p&gt;Get the &lt;a href="https://github.com/second-state/wasm-learning/tree/master/nodejs/quadratic"&gt;demo application&lt;/a&gt; from the Internet, and unzip the compressed archive.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;`$ curl -O https://www.secondstate.io/download/quadratic.zip`
`$ unzip quadratic.zip`
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Next, run a test program to make sure that the Node.js JavaScript function can correctly call the Rust function through SSVM.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;`$ cd quadratic/node`
`$ node test.js`
`[0.5,-3.0]`
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Start the Node.js server application from the command line terminal.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;`$ npm install express // Required for the web app.`
`$ cd quadratic/node`
`$ node server.js`
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Point your browser to &lt;a href="http://localhost:8080/"&gt;http://localhost:8080/&lt;/a&gt;, or access it from another computer on your network. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qeHMKG9x--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/px2icgl2gkukzptxifgx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qeHMKG9x--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/px2icgl2gkukzptxifgx.png" alt="quadratic"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It is a web application that teaches math and solves quadratic equations. It could come in really handy for a small group in a classroom!&lt;/p&gt;
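&lt;p&gt;For reference, the Rust function behind this demo solves equations of the form ax&lt;sup&gt;2&lt;/sup&gt; + bx + c = 0. A minimal sketch of such a solver (illustrative code, not the repository's exact implementation):&lt;/p&gt;

```rust
// Hypothetical sketch of a quadratic solver like the one the demo wraps
// in WebAssembly; the actual function in the repo may differ.
fn solve(a: f64, b: f64, c: f64) -> Option<(f64, f64)> {
    let disc = b * b - 4.0 * a * c;
    if disc < 0.0 {
        return None; // no real roots
    }
    let sq = disc.sqrt();
    Some(((-b + sq) / (2.0 * a), (-b - sq) / (2.0 * a)))
}

fn main() {
    // 2x^2 + 5x - 3 = 0 has roots 0.5 and -3.0, matching the test output
    println!("{:?}", solve(2.0, 5.0, -3.0)); // Some((0.5, -3.0))
}
```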

&lt;h2&gt;
  
  
  Install developer tools
&lt;/h2&gt;

&lt;p&gt;You do not really need developer tools on a personal server. But the Raspberry Pi device is powerful enough to compile and build software. In fact, one of its common use cases is to teach programming. The Raspberry Pi OS comes pre-loaded with developer tools for Java, Python, and Scratch. Now, let’s install some serious tools on it! I always install Git on all my development environments.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sudo apt install git
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;The following command installs the Rust compiler toolchain on the Pi.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;`$ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh`
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Run the following command to set up the correct path without logging out and back in again.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ source $HOME/.cargo/env
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Next, you can clone our &lt;a href="https://github.com/second-state/wasm-learning/"&gt;Rust learning repository&lt;/a&gt;, and learn from examples.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ git clone https://github.com/second-state/wasm-learning.git
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Here is the &lt;a href="https://www.secondstate.io/articles/a-rusty-hello-world/"&gt;hello world example&lt;/a&gt;. Have fun!&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ cd wasm-learning/rust/hello
$ cargo build
   Compiling hello v0.1.0 (/home/pi/Dev/wasm-learning/rust/hello)
    Finished dev [unoptimized + debuginfo] target(s) in 4.35s
$ target/debug/hello
Hello, world!
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Check out the &lt;a href="https://www.rust-lang.org/learn"&gt;official Rust web site&lt;/a&gt; and the &lt;a href="https://rust-by-example-ext.com/"&gt;Rust by Example&lt;/a&gt; books for more learning resources!&lt;/p&gt;

&lt;h2&gt;
  
  
  Next steps
&lt;/h2&gt;

&lt;p&gt;Now you have everything you need on the Raspberry Pi device. The next steps are to learn more about creating high-performance, resource-efficient Node.js web applications on your $5 Raspberry Pi personal dev server.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.secondstate.io/articles/why-webassembly-server/"&gt;Why WebAssembly is the perfect runtime for server-side (or serverless) applications&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.secondstate.io/articles/getting-started-with-rust-function/"&gt;Getting started with Rust functions in Node.js&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.secondstate.io/articles/rust-functions-in-nodejs/"&gt;Passing values between JavaScript and Rust&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.secondstate.io/articles/wasi-access-system-resources/"&gt;Access operating system resources from WebAssembly&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.secondstate.io/articles/artificial-intelligence/"&gt;High performance image recognition in Node.js&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.secondstate.io/articles/machine-learning/"&gt;Machine Learning: K-means clustering and visualization&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Have fun, and let me know how you used your Raspberry Pi device!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Don’t forget to create and publish a Node.js app to receive your &lt;a href="https://www.secondstate.io/articles/raspberry-pi-for-free-20200709/"&gt;FREE Raspberry Pi&lt;/a&gt;. Can’t wait to see what you can do with Rust and WebAssembly!&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Stay in touch! Get &lt;a href="https://webassemblytoday.substack.com/?ref=dev.to"&gt;the email newsletter&lt;/a&gt; on Rust, WebAssembly, serverless, blockchain, and AI.&lt;/p&gt;

</description>
      <category>rust</category>
      <category>webassembly</category>
      <category>raspberrypi</category>
      <category>node</category>
    </item>
    <item>
      <title>How to call Rust functions from Node.js 🦀</title>
      <dc:creator>alabulei1</dc:creator>
      <pubDate>Tue, 09 Jun 2020 11:03:04 +0000</pubDate>
      <link>https://dev.to/alabulei1/how-to-call-rust-funtion-in-node-js-p36</link>
      <guid>https://dev.to/alabulei1/how-to-call-rust-funtion-in-node-js-p36</guid>
      <description>&lt;p&gt;Create JavaScript + Rust hybrid applications in Node.js. 🦄 &lt;/p&gt;

&lt;p&gt;The &lt;a href="https://www.secondstate.io/articles/rust-functions-in-nodejs/" rel="noopener noreferrer"&gt;Rust + Node.js hybrid apps&lt;/a&gt; combine Rust's performance, WebAssembly's security and portability, and JavaScript's ease-of-use. A typical Rust + Node.js hybrid app works like this.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The host application is a Node.js web application written in JavaScript. It makes WebAssembly function calls.&lt;/li&gt;
&lt;li&gt;The WebAssembly bytecode program is written in Rust. It runs inside &lt;a href="https://www.secondstate.io/articles/ssvm-performance/" rel="noopener noreferrer"&gt;the open source Second State WebAssembly VM (SSVM)&lt;/a&gt;, and is called from the Node.js web application.&lt;/li&gt;
&lt;/ul&gt;
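&lt;p&gt;On the Rust side, an exported function is just ordinary Rust. A minimal sketch of the kind of string function the JavaScript host calls (shown as plain Rust so it is self-contained; in the actual template the function carries a wasm-bindgen-style attribute, and the name here is assumed):&lt;/p&gt;

```rust
// The kind of function exposed from Rust to Node.js. In the template it
// would be annotated for wasm-bindgen; shown plain here for brevity.
pub fn say(s: &str) -> String {
    format!("hello {}", s)
}

fn main() {
    println!("{}", say("node")); // hello node
}
```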

&lt;h3&gt;
  
  
  Hello world with VS Codespaces
&lt;/h3&gt;

&lt;p&gt;We take this GitHub repo as an example.&lt;/p&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev.to%2Fassets%2Fgithub-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/second-state" rel="noopener noreferrer"&gt;
        second-state
      &lt;/a&gt; / &lt;a href="https://github.com/second-state/wasmedge-nodejs-starter" rel="noopener noreferrer"&gt;
        wasmedge-nodejs-starter
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      A template project to run Rust functions in Node.js through the Second State WebAssembly engine.
    &lt;/h3&gt;
  &lt;/div&gt;
&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;The Rust functions are in the &lt;code&gt;src&lt;/code&gt; directory. You can put high-performance workloads into Rust functions.&lt;/li&gt;
&lt;li&gt;The JavaScript functions are in the &lt;code&gt;node&lt;/code&gt; directory, and they can access the Rust functions.&lt;/li&gt;
&lt;li&gt;Use the &lt;code&gt;node node/app.js&lt;/code&gt; command to run the application in Node.js.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The secret to needing no local software is VS Codespaces. You code, build, and run directly inside the browser. &lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/j85cbNsciOs"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;VS Codespaces runs entirely in your browser and costs around $1 per workday. It is cheaper than a cup of coffee in the office. Alternatively, use locally installed VSCode and Docker, and launch the IDE with your remote git repository.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;1. First, open the &lt;a href="https://online.visualstudio.com/" rel="noopener noreferrer"&gt;VS Codespaces&lt;/a&gt; web site and log in with your Azure account. You can get a &lt;a href="https://azure.microsoft.com/en-us/free/" rel="noopener noreferrer"&gt;free Azure account&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;2. Next, create a new Codespace. Put your forked repository into the Git Repository field.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fv6tsljah1hugxxw7ekb5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fv6tsljah1hugxxw7ekb5.png" alt="Create a new Codespace"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3. Then open the &lt;code&gt;src/lib.rs&lt;/code&gt;, &lt;code&gt;node/app.js&lt;/code&gt;, and &lt;code&gt;Cargo.toml&lt;/code&gt; files and see how the Node.js express app calls the Rust function to say hello.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F0d4snd66ufg5e9i542ro.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F0d4snd66ufg5e9i542ro.png" alt="Code in Codespace"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;4. Click the Run button on the left panel, and then click Launch Program at the top, to build and run the application.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fiwrxmf951s6djc2pyxbt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fiwrxmf951s6djc2pyxbt.png" alt="Build and run"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Terminal window at the bottom shows the build progress. It builds the Rust program and then launches the Node.js app.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgyh93y76h3uygnqxstn8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgyh93y76h3uygnqxstn8.png" alt="Build"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Debug window shows the Node.js server running and waiting for web requests.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F52br72mrx2cw1qadfomc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F52br72mrx2cw1qadfomc.png" alt="Debug"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;5. Now, you have two choices. You could use the proxy link for &lt;code&gt;127.0.0.1:3000&lt;/code&gt; to access the running server in a browser.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fnd21rrbeldjp9yri69od.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fnd21rrbeldjp9yri69od.png" alt="Browser link"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Or, you could open another terminal window in the IDE via the &lt;code&gt;Terminal -&amp;gt; New Terminal&lt;/code&gt; menu.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fdk22juyemyve6l87q84n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fdk22juyemyve6l87q84n.png" alt="Open Terminal"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From the terminal window, you can test the local server.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ curl http://127.0.0.1:3000/?name=SSVM
hello SSVM
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  More exercises
&lt;/h3&gt;

&lt;p&gt;Now, you can copy and paste code from &lt;a href="https://github.com/second-state/wasm-learning/tree/master/nodejs/functions" rel="noopener noreferrer"&gt;this project&lt;/a&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;src/lib.rs&lt;/code&gt; --&amp;gt; Replace with &lt;a href="https://github.com/second-state/wasm-learning/blob/master/nodejs/functions/src/lib.rs" rel="noopener noreferrer"&gt;code here&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;Cargo.toml&lt;/code&gt; --&amp;gt; Replace with &lt;a href="https://github.com/second-state/wasm-learning/blob/master/nodejs/functions/Cargo.toml" rel="noopener noreferrer"&gt;code here&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;node/app.js&lt;/code&gt; --&amp;gt; Replace with &lt;a href="https://github.com/second-state/wasm-learning/blob/master/nodejs/functions/node/app.js" rel="noopener noreferrer"&gt;code here&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;
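As a hypothetical illustration of the kind of CPU-bound work worth moving into &lt;code&gt;src/lib.rs&lt;/code&gt; (this example is not taken from the linked project), consider an iterative Fibonacci function. Number crunching like this runs at near-native speed in the WebAssembly module while the Node.js side stays a thin wrapper.

```rust
// Hypothetical CPU-bound function of the sort you might export
// from src/lib.rs (with a #[wasm_bindgen] attribute in the
// template); shown here as plain Rust so it compiles standalone.
pub fn fib(n: u32) -> u64 {
    let (mut a, mut b) = (0u64, 1u64);
    for _ in 0..n {
        let t = a + b;
        a = b;
        b = t;
    }
    a // a holds the n-th Fibonacci number after n iterations
}

fn main() {
    println!("{}", fib(10)); // prints 55
}
```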

&lt;p&gt;Click on Run to see the build output in the Terminal window, and the application console output in the Debug window.&lt;/p&gt;

&lt;p&gt;Try to log into GitHub from the IDE, and use the IDE's GitHub integration features to commit the changes, push the changes back into your forked repository, and perhaps even send us a Pull Request from the IDE!&lt;/p&gt;

&lt;h3&gt;
  Resources
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://github.com/second-state/ssvm" rel="noopener noreferrer"&gt;The Second State VM (SSVM)&lt;/a&gt; is a high-performance WebAssembly virtual machine designed for server-side applications.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/second-state/ssvm-napi" rel="noopener noreferrer"&gt;The SSVM NPM addon&lt;/a&gt; provides access to the SSVM, and programs in it, through a Node.js host application.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/second-state/ssvmup" rel="noopener noreferrer"&gt;The SSVM ready tool, ssvmup&lt;/a&gt; is a toolchain for compiling Rust programs into WebAssembly, and then make them accessible from JavaScripts via the SSVM.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Stay in touch! Get the &lt;a href="https://webassemblytoday.substack.com/?ref=dev.to"&gt;email newsletter&lt;/a&gt; on Rust, WebAssembly, serverless, blockchain, and AI.&lt;/p&gt;

</description>
      <category>rust</category>
      <category>node</category>
      <category>webassembly</category>
      <category>webdev</category>
    </item>
    <item>
      <title>WebAssembly on the server side</title>
      <dc:creator>alabulei1</dc:creator>
      <pubDate>Thu, 05 Dec 2019 05:46:41 +0000</pubDate>
      <link>https://dev.to/alabulei1/webassembly-on-the-server-side-1o5p</link>
      <guid>https://dev.to/alabulei1/webassembly-on-the-server-side-1o5p</guid>
      <description>&lt;div class="ltag__stackexchange--container"&gt;
  &lt;div class="ltag__stackexchange--title-container"&gt;
    
      &lt;div class="ltag__stackexchange--title"&gt;
        &lt;h1&gt;
          &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pTF_nE4a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://practicaldev-herokuapp-com.freetls.fastly.net/assets/stackoverflow-logo-b42691ae545e4810b105ee957979a853a696085e67e43ee14c5699cf3e890fb4.svg" alt=""&gt;
            &lt;a href="https://stackoverflow.com/questions/59167805/webassembly-on-the-server-side" rel="noopener noreferrer"&gt;
               WebAssembly on the server side
            &lt;/a&gt;
        &lt;/h1&gt;
        &lt;div class="ltag__stackexchange--post-metadata"&gt;
          &lt;span&gt;Dec  4 '19&lt;/span&gt;
            &lt;span&gt;Comments: 1&lt;/span&gt;
            &lt;span&gt;Answers: 0&lt;/span&gt;
        &lt;/div&gt;
      &lt;/div&gt;
      &lt;a class="ltag__stackexchange--score-container" href="https://stackoverflow.com/questions/59167805/webassembly-on-the-server-side" rel="noopener noreferrer"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5MiFESHx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://practicaldev-herokuapp-com.freetls.fastly.net/assets/stackexchange-arrow-up-eff2e2849e67d156181d258e38802c0b57fa011f74164a7f97675ca3b6ab756b.svg" alt=""&gt;
        &lt;div class="ltag__stackexchange--score-number"&gt;
          0
        &lt;/div&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Rk_a5QFN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://practicaldev-herokuapp-com.freetls.fastly.net/assets/stackexchange-arrow-down-4349fac0dd932d284fab7e4dd9846f19a3710558efde0d2dfd05897f3eeb9aba.svg" alt=""&gt;
      &lt;/a&gt;
    
  &lt;/div&gt;
  &lt;div class="ltag__stackexchange--body"&gt;
    
&lt;p&gt;"If WASM+WASI existed in 2008, we wouldn't have needed to create Docker. That's how important it is. WebAssembly on the server is the future of computing." That is a quote from Solomon Hykes, the co-founder of Docker.&lt;/p&gt;
&lt;p&gt;And an article I saw recently reads &lt;a href="https://medium.com/wasm/webassembly-on-the-server-side-c584f874b4a3" rel="nofollow noreferrer"&gt;as Wasm gains popularity on the client-side, Wasm is also&lt;/a&gt;…&lt;/p&gt;
    
  &lt;/div&gt;
  &lt;div class="ltag__stackexchange--btn--container"&gt;
    
      &lt;a href="https://stackoverflow.com/questions/59167805/webassembly-on-the-server-side" rel="noopener noreferrer"&gt;Open Full Question&lt;/a&gt;
    
  &lt;/div&gt;
&lt;/div&gt;


</description>
      <category>webassembly</category>
      <category>microservices</category>
    </item>
  </channel>
</rss>
