When Brendan Eich created JavaScript at Netscape in 1995, I doubt he had any idea of what the language would grow into in the years to come. When Netscape partnered with Sun to take on their competitor Microsoft, Brendan Eich decided to surf the tidal wave of hype surrounding Java. He found this reason compelling enough to rename Mocha - the language he created to turn the web into a full-blown application platform - to JavaScript. He envisioned JavaScript being marketed as a companion language to Java, in the same way as Visual Basic was to C++. So the name was a straightforward marketing ploy to gain acceptance.
By the 2000s, when Douglas Crockford specified the JSON data format using a subset of JavaScript syntax, a critical mass of developers emerged who started viewing JavaScript as a serious language. However, some early design choices - automatic semicolon insertion (ASI), the event loop, the lack of classes, unusual prototypal inheritance, type coercion and so on - turned into ammunition for developers to laugh at the language and to ridicule those who were using it. This cycle still continues.
It was only a few years earlier, with "Web 2.0" applications such as Flickr and Gmail, that the world realized what a modern experience on the web could be like. It was also thanks to the still ongoing, healthy competition between browsers to offer users a better experience and better performance that JavaScript engines started becoming considerably better. Development teams behind major browsers worked hard to offer better support for JavaScript and to find ways to make it run faster. This triggered significant improvements in one JavaScript engine in particular: V8 (also known as Chrome V8, being the open-source JavaScript engine of The Chromium Project).
It was in 2009 that Ryan Dahl paid special attention to this V8 engine and used it to create Node.js. His focus, initially, was heavily on building event-driven HTTP servers, the main aim being to solve the C10k problem. Simply put, the event-driven architecture provides relatively better performance while consuming fewer resources at the same time. It achieves this by avoiding spawning additional threads and the overhead caused by thread context-switching; instead, it uses a single process that handles every event in a callback. This effort of Ryan Dahl's turned out to be crucial for the popularity that server-side JavaScript enjoys today.
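As a rough sketch of what that model looks like in practice (the port and response body here are arbitrary examples), a single Node process serves every incoming request from the event loop, with no thread spawned per connection:

```ts
import { createServer } from "http";

// One process, one event loop: every request is dispatched to this callback,
// instead of a dedicated thread being created per connection.
const server = createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("hello\n");
});

server.listen(8000, () => console.log("listening on :8000"));
```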
Node.js, since then, has proved to be a very successful software platform. People have found it useful for building web development tooling, building standalone web servers, and for a myriad of other use-cases. Node, however, was designed in 2009, when JavaScript was a much different language. Out of necessity, Node had to invent concepts which were later taken up by the standards organizations and added to the language differently. Having said that, there have also been a few design decisions that Node suffers from. These design mistakes compelled Ryan to step down from the Node.js project. He has, since then, been working on another runtime which aims to solve these issues: Deno. In this blog post, we will look at two of the major JavaScript runtimes that enable server-side JavaScript: Node.js and Deno. We will have a look at the problems with Node, and at how Deno aims to resolve them.
Design mistakes in Node
A lot of the discussion that is about to follow is inspired by a talk that Ryan Dahl delivered at JSConf. In the talk, he discusses the problems that Node has. This doesn't necessarily mean that all Node projects should be abandoned this very instant. It is important to note that Node is not going anywhere and that it is here to stay. It is just that Node has some inherent problems, rooted in the not-so-rich JavaScript that was available at the time of its design. On top of that, features and functionalities were added to Node over time that turned it into a huge monolith, thereby making things hard to change.
Event-emitters
Promises in early Node.js promised to do some work and then had separate callbacks that would be executed for success and failure, as well as for handling timeouts. Another way to think of those promises was as emitters that could emit only two events: success and error. At the time Node was designed, however, JavaScript itself did not have the concept of Promises or async/await. Node's counterpart to promises became the EventEmitter, which important APIs - namely sockets and HTTP - are based around. Async/await was introduced later, more as syntactic sugar on top of Promises. When implemented the right way, Promises are a great boon for the event-driven architecture.
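A minimal sketch of the two styles side by side (the file path is just an arbitrary example) - first the emitter/callback pattern that Node's APIs grew up with, then the same work wrapped in a Promise that async/await can consume:

```ts
import { EventEmitter } from "events";
import { readFile } from "fs";

// The classic Node style: success and error signalled as separate events.
const emitter = new EventEmitter();
emitter.on("done", (data: Buffer) => console.log("read", data.length, "bytes"));
emitter.on("error", (err: Error) => console.error("failed:", err.message));

readFile("/etc/hosts", (err, data) => {
  if (err) emitter.emit("error", err);
  else emitter.emit("done", data);
});

// The same work wrapped in a Promise, which async/await later made ergonomic.
const read = (path: string): Promise<Buffer> =>
  new Promise((resolve, reject) =>
    readFile(path, (err, data) => (err ? reject(err) : resolve(data)))
  );

read("/etc/hosts")
  .then((data) => console.log("read", data.length, "bytes"))
  .catch((err) => console.error("failed:", err.message));
```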
Node's EventEmitter-based implementation, though, has a problem known as 'back-pressure' - or rather, the lack of it. Take a TCP socket, for example. The socket emits "data" events when it receives incoming packets. These "data" callbacks fire in an unconstrained manner, flooding the process with events. Because Node keeps accepting new data events, the underlying TCP socket has no proper back-pressure: the remote sender has no idea the server is overloaded and continues to send data.
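A small sketch of the problem (the port and the work done per chunk are hypothetical): the "data" callbacks below fire as fast as packets arrive, whether or not the process can keep up, and only an explicit pause lets TCP flow control push back on the sender:

```ts
import { createServer } from "net";

const server = createServer((socket) => {
  socket.on("data", (chunk) => {
    // Imagine an expensive operation here: the events keep arriving regardless,
    // and the remote sender never learns that we are falling behind.
    process.stdout.write(`received ${chunk.length} bytes\n`);
  });

  // The stream API that arrived later lets the receiver push back explicitly:
  // socket.pause() stops 'data' events (and, via TCP flow control, slows the
  // sender); socket.resume() restarts them once the process has caught up.
});

server.listen(9000, () => console.log("listening on :9000"));
```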
Security
The V8 engine, by itself, is a very good security sandbox. However, Node failed to capitalize on this. In its earlier days, there was no way of telling what a package could do with the underlying file system unless someone actually looked into its code. The trust comes from community usage.
Build system
Build systems are very difficult and very important at the same time. Node uses GYP as its build system. GYP is intended to support large projects that need to be built on multiple platforms (e.g., Mac, Windows, Linux), and where it is important that the project can be built using the IDEs that are popular on each platform as if the project were a "native" one. If a Node module links to a C library, GYP is used to compile that C library and link it to Node. GYP was what Chrome used at the time Node was designed. Chrome eventually, for various reasons, abandoned GYP for GN, which left Node as the sole GYP user.
Node modules
When npm version 1 was released by Isaac Schlueter, it soon became the de facto standard. It solved some problems like 'dependency hell'. Before npm, a 'dependency hell' usually occurred if one tried to install two versions of a package within the same folder, which broke the app. Thanks to npm, dependencies were now stored within the node_modules folder. But an unintended side-effect of this was that now every project had a node_modules directory in it, which increased disk-space consumption. In addition, it added overhead to the module resolution algorithm: Node first has to look in the local folders, then in the project's node_modules, and failing that, in the global node_modules. More complexity was added by the fact that module specifiers carried no file extensions, so the module loader has to query the file system at multiple locations, trying to guess what the user intended.
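To make that lookup cost concrete, here is a rough sketch of the guessing the loader has to do for a single extension-less require (the file names are hypothetical):

```ts
// For a bare specifier, Node walks a chain of node_modules directories from
// the local folder up towards the root before falling back to the "global"
// locations; module.paths shows that search list for this file.
console.log(module.paths);

// For a relative specifier with no extension, the loader has to probe the
// file system and guess what the user meant:
//   ./utils.js? ./utils.json? ./utils.node? ./utils/index.js? ...
const utils = require("./utils"); // which file this hits depends on what exists on disk
console.log(typeof utils);
```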
Having said all this, it is important to mention that there are no inherent breaking faults in Node. Node.js is a time-tested and proven runtime. It recently completed ten years of its existence. The awesome community has been instrumental in the humongous success that Node enjoys today. npm, today, is one of the biggest package repositories ever. But as a developer who cannot unsee the bugs that he himself introduced into the system, Ryan couldn't help but move on to a different endeavor. The above reasons motivated him to work on Deno: a secure runtime for JavaScript and TypeScript.
Deno
The name Deno is actually derived as an anagram of Node. It is best described by its website:
Deno is a simple, modern and secure runtime for JavaScript and TypeScript that uses V8 and is built in Rust.
There are a lot of things to pay attention to in this simple description. Let's go over them one by one:
Security
Security is one of the biggest USPs of Deno. Deno aims to mimic the browser: just like in any browser, the JavaScript running in it has no access to the underlying file system, etc. by default. In the same way, Deno provides a secure sandbox for JavaScript to run in. By default, the JavaScript running within the runtime has no permissions; the user has to explicitly grant each and every individual permission that their app requires.
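As an illustrative sketch (the file name and script are hypothetical), even a trivial file read fails until the corresponding permission flag is passed on the command line:

```ts
// read_config.ts - without --allow-read this throws a PermissionDenied error.
const text = await Deno.readTextFile("./config.json");
console.log(text);

// Run (denied):   deno run read_config.ts
// Run (granted):  deno run --allow-read=. read_config.ts
```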
Module system
At the moment, there is no package.json in Deno, nor is there any intention to introduce anything like it anytime soon. Imports will always be via relative or absolute URLs only. At the time of this writing, Deno does not support any npm packages. During the early stages of its design, it was made clear that there were no plans to support Node modules due to the complexities involved. Some discussions about this have been making the rounds, but they have not arrived at any conclusion yet.
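A minimal sketch of what that looks like (the module URL and pinned std version are just examples): the dependency is fetched from the URL and cached locally on first run, with no package.json or npm install step:

```ts
// colors.ts - the dependency is referenced by URL, not by package name.
import { red } from "https://deno.land/std@0.50.0/fmt/colors.ts";

console.log(red("imported straight from a URL"));
```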
TypeScript Support
Deno's standard modules are all written in TypeScript, and the TypeScript compiler is compiled directly into Deno. Initially, this pushed the startup time to almost a minute, but the problem was quickly addressed thanks to V8 snapshots, which greatly brought down startup times and let the compiler get to work on scripts very quickly. TypeScript is treated as a first-class language: users can import TypeScript code (with the .ts extension) directly.
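For instance (the file names here are hypothetical), a local TypeScript module can be imported with its full .ts extension and run without any tsconfig or build step:

```ts
// greet.ts
export function greet(name: string): string {
  return `Hello, ${name}!`;
}
```

```ts
// main.ts - note the explicit .ts extension in the import specifier.
import { greet } from "./greet.ts";
console.log(greet("Deno"));
// Run with: deno run main.ts
```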
Rust
In its early days, Deno was prototyped in Go. Now, however, for various reasons, Deno has been converted into a solid Rust project. Unlike Node, Deno is not a huge monolith, but rather a collection of Rust crates. This was done to facilitate opt-in functionality for users who may not want the entire Deno executable packaged into one, but would rather be happy with only a selection of its modules. This allows users to build their own executables.
Limitations
It should be noted that Deno is not a fork of Node. While Node is over a decade old, Deno has been in development only for the past two years. At the time of this writing, Deno v1.0.0 had been released only a few days earlier, on the 13th of May, 2020. Deno may not be suitable for many use-cases today as it still has some limitations:
- at this moment, Deno is not compatible with Node (npm) packages
- accessing native systems beyond what Deno provides is difficult; its plugin / extension system is still very nascent
- the TypeScript compiler may prove to be a bottleneck in some cases; plans are in place to port it to Rust
- the HTTP server performance is just on par with that of Node (25k requests served by Deno vs 34k requests served by Node for a hello-world application; see the sketch below)
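For reference, the kind of hello-world server used in such comparisons looks roughly like this in Deno (the pinned std version and port are just examples):

```ts
import { serve } from "https://deno.land/std@0.50.0/http/server.ts";

// A minimal hello-world HTTP server of the sort used in the benchmark above.
const server = serve({ port: 8000 });
console.log("listening on :8000");

for await (const req of server) {
  req.respond({ body: "Hello World\n" });
}
```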
Final Thoughts
The history of JavaScript has been long and full of bumps. Today, it is one of the most popular and fastest-growing languages, and the community is as active as ever. Node.js, V8 and other projects have brought JavaScript to places it was never designed for. With Deno, another important chapter is being written in the history of JavaScript. As of now, in my opinion, Deno cannot be looked at as a replacement for Node. It can definitely be considered an alternative to Node, but even for that, we may have to wait for some future releases of Deno to mature. Having said that, this is a great time to be alive as a JavaScript developer. With the ecosystem thriving, a JavaScript developer today can work at any vertical of the system, be it front-end, back-end, database, etc. With the release of Deno, we can safely bet on runtimes enabling JavaScript to run on servers for many years to come.
This blog first appeared on: https://blog.pratikms.com