- JS-PyTorch is a Deep Learning JavaScript library built from scratch to closely follow PyTorch's syntax.
- Feel free to try out a Web Demo!
1. TL;DR
- In this article, we will cover simple instructions and use-cases for JS-PyTorch.
Note: The project's Documentation contains details on all available operations and layers.
2. Running it Yourself
Install & Import
To start off, you can install the package locally by running `npm install js-pytorch` in the terminal.
Then, on your JavaScript file, import the package with:
```javascript
const torch = require("js-pytorch");
```
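Since the package ships both CJS and ESM builds (see the Building for distribution section below), an ES-module import along these lines should also work; the exact export shape is an assumption on my part, so check the Documentation:

```javascript
// Sketch only: a namespace import of the ESM build.
// The export shape is assumed, not confirmed; consult the docs.
import * as torch from "js-pytorch";
```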
Create Tensors
To use all of these cool deep learning Tensor Operations, we need to instantiate some Tensors:
```javascript
// Instantiate Tensors:
let x = torch.randn([8, 4, 5]);
let w = torch.randn([8, 5, 4], true /* requires_grad */);
let b = torch.tensor([0.2, 0.5, 0.1, 0.0], true /* requires_grad */);
```
The syntax is the same as PyTorch's:
- `torch.tensor(Array)` receives an array and turns it into a Tensor.
- `torch.randn(shape)` creates a Tensor filled with normally-distributed random numbers, with the provided shape.
- The `requires_grad` argument is set to `true` if we want to optimize this parameter (by tracking its gradients).
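As a quick sanity check, you can read a Tensor's raw values back through its `.data` property (the training loop later in this article uses it to print the loss). A minimal sketch, assuming nested-array input mirrors PyTorch's behavior:

```javascript
// Minimal sketch: build a 2x2 Tensor from a nested array.
// (Nested-array input is assumed to work as in PyTorch.)
let m = torch.tensor([[1, 2], [3, 4]]);

// Read the raw values back with .data:
console.log(m.data); // [[1, 2], [3, 4]]
```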
Tensor Operations
Now, let's run some operations on these Tensors:
```javascript
// Make calculations:
let out = torch.matmul(x, w);
out = torch.add(out, b);
```
- `torch.matmul(x, w)` performs matrix multiplication between `x` and `w` (just like in PyTorch).
- `torch.add(out, b)` adds both Tensors.

Note: As `w` has `requires_grad` set to `true`, its child `out` will also have its gradients tracked.
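For reference, the shapes work out like this: multiplying `x` of shape [8, 4, 5] by `w` of shape [8, 5, 4] batch-wise yields `out` with shape [8, 4, 4], and adding `b` (4 elements) broadcasts it over the last dimension. A quick sketch to check this, assuming Tensors expose a `.shape` property (an assumption on my part; check the Documentation):

```javascript
// Sketch: verify the shapes produced by the operations above.
// (.shape as a Tensor property is assumed; consult the docs.)
console.log(x.shape);   // [8, 4, 5]
console.log(w.shape);   // [8, 5, 4]
console.log(out.shape); // [8, 4, 4] - b is broadcast over the last dim
```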
Getting Gradients
```javascript
// Compute gradients on whole graph:
out.backward();

// Get gradients from specific Tensors:
console.log(w.grad);
console.log(b.grad);
```
- Calling `out.backward()` calculates the gradients of every Tensor that led to it (its parents), relative to `out`.
- In practice, we will call `.backward()` on the `loss` Tensor, to get the gradients necessary to reduce it.
- To access a Tensor's gradients, simply call `Tensor.grad`.
3. Full Example (Neural Network)
In this example, we implement a full Neural Network with three Linear layers and ReLU activations. The syntax for the `nn.Module` class is identical to PyTorch's.
```javascript
const torch = require("js-pytorch");
const nn = torch.nn;
const optim = torch.optim;

// Implement Module class:
class NeuralNet extends nn.Module {
  constructor(in_size, hidden_size, out_size) {
    super();
    // Instantiate Neural Network's Layers:
    this.w1 = new nn.Linear(in_size, hidden_size);
    this.relu1 = new nn.ReLU();
    this.w2 = new nn.Linear(hidden_size, hidden_size);
    this.relu2 = new nn.ReLU();
    this.w3 = new nn.Linear(hidden_size, out_size);
  }

  forward(x) {
    let z;
    z = this.w1.forward(x);
    z = this.relu1.forward(z);
    z = this.w2.forward(z);
    z = this.relu2.forward(z);
    z = this.w3.forward(z);
    return z;
  }
}

// Instantiate Model:
let in_size = 16;
let hidden_size = 32;
let out_size = 10;
let batch_size = 16;

let model = new NeuralNet(in_size, hidden_size, out_size);

// Define loss function and optimizer:
let loss_func = new nn.CrossEntropyLoss();
let optimizer = new optim.Adam(model.parameters(), 3e-3);

// Instantiate input and output:
let x = torch.randn([batch_size, in_size]);
let y = torch.randint(0, out_size, [batch_size]);
let loss;

// Training Loop:
for (let i = 0; i < 256; i++) {
  let z = model.forward(x);

  // Get loss:
  loss = loss_func.forward(z, y);

  // Backpropagate the loss using torch.tensor's backward() method:
  loss.backward();

  // Update the weights:
  optimizer.step();

  // Reset the gradients to zero after each training step:
  optimizer.zero_grad();

  // Print current loss:
  console.log(`Iter: ${i} - Loss: ${loss.data}`);
}
```
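Once the loop converges, inference is just another forward pass, and no gradient tracking is needed on the input. A minimal sketch, reusing the trained model on a fresh batch (the random `x_test` below stands in for real test data):

```javascript
// Sketch: inference after training. A fresh random batch stands in
// for real test data here.
let x_test = torch.randn([batch_size, in_size]);
let logits = model.forward(x_test);

// Raw class scores for each of the 16 inputs, 10 per row;
// the highest score in each row is the predicted class:
console.log(logits.data);
```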
4. Building for distribution & DevTools
- To build for distribution, run `npm run build`. CJS and ESM modules and `index.d.ts` will be output in the `dist/` folder.
- To check the code with ESLint at any time, run `npm run lint`.
- To improve code formatting with Prettier, run `npm run prettier`.
5. Conclusion
- Hope you enjoyed the package!
Top comments (30)
It is much easier to deploy JavaScript than Python these days (unless you want to manage your own servers). JavaScript is superior to Python when it comes to a project's isolation, environment and management (it is straightforward, while Python is a hot mess). Almost all developers know some JavaScript, therefore I raise you the opposite question:
Why not?
Well, that is just like, your opinion man (Haha).
Languages are tools, JS is more approachable than Python and has many other advantages over it (In my opinion). I did code in Python (using Venvs mind you) professionally for a while, JS is still much simpler to handle. I can't see why we should not use JS for ML.
Edit: Also, what does Jupyter Notebook have to do with anything? As I said, languages are nothing but tools, and some tools are good for some tasks. You wouldn't want to fry an egg using a sledgehammer, right? If a language offers advantages (such as ease of access, popularity, etc.) for a given task over a different language, it is worth considering. I advise you not to be so attached to the languages you code with; there is no reason to.
That’s very close to what I’d say: JavaScript can be used more easily for some applications, and while Python will likely remain the most used ML language, there’s a benefit to being able to write PyTorch-like code in native JS. Especially for simpler web demos and educational content.
There are a couple of reasons why I started the project.
Indeed!
I have not had the time to evaluate your library, but I applaud your efforts in implementing it. Well done!
Thank you for the support! If you try it out and have any questions or tips, let me know!
Great stuff, I've been keen to add some deep learning to our code stack and would prefer not to litter the source code with multiple languages. This looks to be a great start. I am worried that it will take a lot to maintain though.
Thanks for the feedback! I'm currently working really hard to make it as easy to use and easy to maintain as possible, with Developer Tools, solid documentation, unit tests and all of that. Hope it'll help you with your needs :D
Yeah, I meant hard to maintain for you too... Extra modules, keeping up to date with something else etc.
That's true! But having good testing and tools can help with that too.
However, that's a good thing to keep in mind; you're right.
Really cool library! I’d love to see an integration with GPU.js
Hey, I'm not really an expert on this: is GPU.js similar to WebGPU?
From what I understand about WebGPU, it provides an interface and browser support for GPU usage. GPU.js offers a back-end library to run matrix multiplications on GPU.
Got it. Thanks for the reply!
Thanks for the feedback! Working on it.
Cool! Are you open to pull requests at all on your GitHub? I’d love to get involved
Absolutely, any help is welcome :D
Deep Learning is very much about Python. Period. No need to bring everything to JS; it's just inappropriate, and JS doesn't have any real advantages over Python here.
As I had replied to another comment, I'm not trying to replace PyTorch or bring Deep Learning to JavaScript. I'm simply creating a tool for DL learners to be able to use.
Sometimes it's easier to have direct access to the browser to run demos or small web-based AI applications. Or, if you don't know Python, this library could be a good place for a JS programmer to start experimenting with Deep Learning as well.
So I do agree with you about Python, I just don't think that these things are mutually exclusive :)
To further ruin our lives and computers with js
just a legit joke
Wow, I got more likes on my comment than on my post 🙃
Any application that can be written in JavaScript, will eventually be written in JavaScript.
Atwood's Law
Cheers to you brother 👏🏼🥃
Thank you! Guess I’ve become a part of the problem lol
thanks man
Would like to give this a try sometime, thanks for the well written overview. Is it truly native JS? No bindings at all?
Glad you like it!
The source code is actually in TypeScript, but the deployed package is native JS, ready-to-use!