DEV Community


Deno: The next step in Node.js

Siddharth
Coding for a hobby
・4 min read

Deno, introduced by Ryan Dahl, the creator of Node, at JSConf EU 2018, has been growing into a major alternative to Node.js. Deno is similar to Node.js – you write your scripts in JavaScript and run them – but Deno gets more powerful the more you use it. It has first-class TypeScript support, simplifies modules, is more secure, bridges the gap between browsers and Node, and much more.

Node

Released in 2009, Node took over really quickly. Even though there was initially some skepticism about Node, support from the community was unrivalled.

Today, Node is one of the most popular tools used for backend development.

Enter Deno

Fun fact: Deno is just node reversed. no + de = node, de + no = deno.

Even though Node was great, there were many design mistakes in it. You can check out the talk by Ryan Dahl to learn more, but here are a few:

  • Node didn't stick with promises. Node had added them way back in 2009, but removed them almost a year later in 2010.
  • Node wasn't secure enough. Any Node program has access to system calls, HTTP requests, and the filesystem. Your linter shouldn't have complete access to your computer and network.
  • more...

Essentially, Node was focused on IO. Modules were an afterthought. To fix all this, Ryan introduced Deno.

Deno is secure by design

Suppose you want to run a lint script. If you were using node, you would just do this:

~$ node linter.js

But in Deno, you do this:

~$ deno run --allow-read linter.js

There are a couple of things to note here. First is the run subcommand. Deno has a bunch of other tools, which we'll get to later.

Next thing to note is the flag --allow-read. It, along with a bunch of other flags, is part of Deno's security system. By default, when a script is run using deno run, it can't access the filesystem, the network, or the environment – nothing more than the console.

Now, more security is great, but nobody wants to type in a bunch of --allow flags every time they need to run stuff. Fortunately, Deno provides an install command which can "install" scripts. Installing here means creating a thin wrapper script in a platform-specific directory (~/.deno/bin on macOS and Linux; not sure about Windows).

~$ deno install --allow-read linter.js
✅ Successfully installed linter
/Users/APPLE/.deno/bin/linter
~$ linter
linter running!

The file at .deno/bin/linter is very simple:

#!/bin/sh
# generated by deno install
exec deno run --allow-read 'file:///Users/APPLE/Sites/Projects/deno-test/linter.js' "$@"

No package managers here

Deno uses ES module import syntax, which means that imports must be full URLs or absolute/relative paths to files. And unlike Node.js, there's no node_modules (thank goodness!), and Deno doesn't look anywhere special for modules.

// These work
+ import {lint} from './linter.js';
+ import {lint} from '/absolute/path/to/linter.js';
+ import {WebSocket} from "https://deno.land/std@0.103.0/ws/mod.ts";

// But these won't:
- import {lint} from './linter'; // Note the extension is missing
- import {WebSocket} from "ws"; // ws who?

You don't have to relearn (most of) JavaScript

Deno tries to use web platform APIs (like fetch) instead of inventing new ones. These APIs generally follow the specifications and should match the implementations in Chrome and Firefox. Deno even uses web standards in its own APIs – for example, Deno's HTTP API uses the standard Request and Response objects. Deno's even got a window object.

Node.js goes the other way, replacing stuff with its own APIs, usually using callbacks, making us reach for modules. Deno gets to take advantage of all the evolution of JavaScript instead of having to build it all again. Also, it's easier to port stuff to the web if you use Deno (and vice versa).

TypeScript is a first class citizen here

Deno has built-in support for TypeScript! This isn't an external module or anything – no extra flags, not even a tsconfig.json. There's even interoperability: import JS in TS, import TS in JS.

Simpler distribution

Unlike Node, Deno is just a single binary. This makes installation and deployment a breeze. Deno can even compile programs to binaries, which is absolutely awesome! It can even cross compile!

A simple demo

Here's a simple cat implementation in Deno:

// mycat.ts
import { expandGlob } from "https://deno.land/std@0.102.0/fs/expand_glob.ts";

// no need to remove the path to deno, etc.
const files = Deno.args;

files.forEach(async file => {
    for await (const fileExpansion of expandGlob(file)) {
        const contents = await Deno.readTextFile(fileExpansion.path);
        console.log(contents);
    }
});

This script takes filenames as arguments and prints their contents to the console.

~$ deno run --allow-read mycat.ts cat.ts
// cat.ts
import { expandGlob } from "https://deno.land/std@0.102.0/fs/expand_glob.ts";

// no need to remove the path to deno, etc.
const files = Deno.args;
...

Note that you don't need to install or configure anything - Deno handles that for you.

Now, we can install the script:

~$ deno install --allow-read mycat.ts
✅ Successfully installed mycat
/Users/APPLE/.deno/bin/mycat
~$

Summary

Deno is still new. It has a thriving community and a bunch of libraries (many Node libraries have been ported to Deno), but it's not yet as popular or as well supported as Node. Still, Deno's ease of use and simplicity make it useful for writing everyday scripts, and its URL-based system of sharing modules makes distributing programs as easy as putting them on a GitHub repo or a personal site.

Discussion (26)

Ryan Dsouza

Deno looks quite interesting, but I really don't like the way you need to import packages from URLs. I still think there should be a package management system, as it makes handling and updating packages easier.

Siddharth Author • Edited

URLs are more flexible, and it may be a good idea in the long run. But package management systems make stuff easier to use.

Deno has a section in its docs on Managing modules. One of the conventions is to place all imports in a single deps.ts file. Functionality is then exported out of deps.ts for use by local modules. This makes it easier to manage dependency versions and stuff

You don't have to worry about network problems though – once you run it, Deno caches all the modules required.

Bunyamin Shabanov

Agree on this a little, because sometimes I just want to download all packages and go code somewhere without the internet. Or a situation may happen where a dev does not have access to the internet, but while having all the packages he can still do some work. With URLs it becomes impossible until the caching mechanism is created (which is almost node_modules). Plus, you'll need to always specify the needed version directly, because the package may change a lot (like React Router does) and your app won't work after that.
And also, as far as I understand, we'll need to specify the version in all URLs in all files where it is imported, not in one config.

Siddharth Author

Caching is not like node_modules, as far as I understand. Deno caches in a directory somewhere under ~/. So, once a package is cached, it may not have to be cached again, even for a separate program.

Ivan Jeremic • Edited

You misunderstand how URL imports work – you need to read more about it... nothing will break, because versions are kept in the URL. And also: no internet, no problem. If it is cached/downloaded, you can work just like with npm. If there is no internet, it behaves the same as npm – if your internet is dead, you can't do npm install. What do you think npm has under the hood? Correct, there are a bunch of web URLs that get requested. I have no idea why people always assume you can't work without internet when using URL imports. All you can't do is cache new packages – same with npm: you can't npm install without internet, but you can work with what's in node_modules, or in Deno's case, what's in the cache.

Heiker

I still think there should be a package management system as it makes handling and updating packages easier.

Funny that you mention this. A few months ago I did a silly experiment: I managed to use npm packages in Deno using Vite. It is totally possible to use npm to manage some packages of your Deno app.

Mohamed Dahir • Edited

You might be interested in Deno's import maps which could be generated by a package manager like Trex.

The only caveat is that the import map file has to be specified explicitly, as opposed to Node's package.json, which gets picked up automatically. That is, unless you use something like Denon.

Ivan Jeremic

This is how the web works. Try to learn more about it.

Ryan Dsouza • Edited

Maybe you haven't worked a lot with the web. Python, Ruby, Rust and Node all work with the same philosophy.

Ivan Jeremic

What I mean is, this is the ECMA standard.

RangerCoder99

ECMA standards... just because someone dictates a worse way to work doesn't make it law.

Ivan Jeremic

I mean, they dictate how JS works; Deno just implements it correctly.

Siddharth Author

I think that if people are dissatisfied with URLs, people will come up with new, better ways to import modules.

This has happened before, for example, in Vim. There was no standard way to load plugins other than by cloning them via git, but third-party plugin managers were created to make it simpler. This even led to Vim introducing its own plugin system.

Gustavo Santana

Siddharth makes a fantastic point. It is not like Deno does not bring along a way to manage your packages/dependencies. With that in mind, just because a method of doing something in one place works well, does not make it the superior method in another environment. It makes total sense for JS to have packages be imported via urls. Functionality can always be added on top to make it work like you are used to after the fact. People tend to think that a method of doing something is superior or the best just because they are used to it and not necessarily because it IS the better method.

Ryan Dsouza

My only concern is dependency management with URL-imported packages. How would one go about changing the version of a package if it has been imported in 10 different places? Also, the resolution of conflicts that package managers like yarn and npm perform when one package depends on several others is quite handy. Having URLs is fine as long as there's a central way of managing the dependencies.

import lodash from 'lodash';

vs

import lodash from 'some-long-url/lodash';

I personally find the former easier to read and write. Also for autocomplete and intellisense, the dependencies need to be downloaded anyway.

Siddharth Author

My only concern is dependency management with URL-imported packages. How would one go about changing the version of a package if it has been imported in 10 different places.

As I said here, Deno has a convention of putting all imports in a deps.ts file. This makes it easier to change versions, as there is only a single import.

Also solving resolution conflicts that package managers like yarn and npm perform when one package depends on several is quite handy.

Deno has no "magical" module resolution. Instead, imported modules are specified as files (including extensions) or fully qualified URL imports. This makes it harder to mess up.

I personally find the former easier to read and write.

If you use the deps.ts convention, you could have this:

deps.ts

import * as _ from 'some-long-url/lodash';

// If you want it all
export {_};

// If you need only some
export const {uniq, isEqual} = _;

main.ts or wherever you use it

import {uniq, isEqual} from './deps.ts';

const foo = isEqual(...);

Also for autocomplete and intellisense, the dependencies need to be downloaded anyway.

Deno caches modules once they are required once, so intellisense can work at that time.

Ryan Dsouza

Yup, deps.ts is a much better way :)

Deno has no "magical" module resolution. Instead, imported modules are specified as files (including extensions) or fully qualified URL imports. This makes it harder to mess up.

Great to know! Will try this out with deps.ts. Thanks!

Siddharth Author

BTW there is a deno registry

Marcel Christianis

Used Deno in production and recently ported everything back to Node.

Tbh Deno is great and I love it, and I would like to use it a lot more. The only downside at the moment is the lack of libraries available. The deal breaker for me is that there isn't any mature database ORM library for Deno right now. Aside from that, Deno is great and pleasant to work with.

Putting my Deno project in the archive and looking forward to getting back to it in the future.

RangerCoder99

Deno was hyped as the Node killer when it released 1.0. Quickly after, no one talked about it any more, once they figured out that it would be too much work to convert Node apps to Deno. Without that hype, people don't create packages for it, which really is key to Node's success. It will take years to catch on again, and likely the lead dev, who has a history of exiting projects, will leave before it does...

Ayushks11560 • Edited

Deno should make its own package manager with restricted control over packages, or they have to implement a system where, if I import a package from "react" or any package name, it under the hood converts it into the URL or imports it from the local cache.
I think if this is implemented we can save a lot of hard disk space, as it is a first-party implementation, unlike pnpm.

Ozgur Murat

Nice, thank you.
The code example has an error though. I think it needs an "async" before file, and ".path" after fileExpansion as:

// mycat.ts
import { expandGlob } from "deno.land/std@0.102.0/fs/expand_gl...";

// no need to remove the path to deno, etc.
const files = Deno.args;

files.forEach(async file => {
    for await (const fileExpansion of expandGlob(file)) {
        const contents = await Deno.readTextFile(fileExpansion.path);
        console.log(contents);
    }
});

Siddharth Author

Nice catch on the .path!

The async is not needed, as Deno supports top-level await. It could be added though.

Web Dev Ken

Would be cool if web browsers could make use of Deno in a way that lets us directly reference .ts script files in HTML and actually run them.

Gustavo Santana

I am glad that TS support is encouraged but not mandatory. In a few months I might get into Deno as well.

Adam Tang

Security to the max:

clone or fork and host the lib

import it into your code

any bad changes won't affect your code.