In my previous post on gRPC for web apps I explored at a high level why one might use Protocol Buffers and gRPC. Now I'll start to go into the details of how this is actually implemented.
There's a web-grpc repository on GitHub that shows a full example implementation of what I describe in this post.
There are various libraries for protobuf and gRPC on the web, and they have significant differences. I will use the ts-proto library: it generates idiomatic TypeScript classes at compile time from .proto message definitions, giving a good development experience. You get working autocomplete in the IDE, compile-time type checking, and reading and writing message fields works like with any other TS object. It also supports both the protobuf and JSON wire formats for data. It does not implement gRPC out of the box, but provides support for gRPC-Web proxy connections. For my use case, where I encode gRPC as HTTP+JSON requests, there is unfortunately no direct support currently.
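To make that concrete, here is a small sketch of what using the generated code looks like. The Note message, its fields, and the import path are assumptions for illustration based on the notes.proto file in the example repository, not its actual contents:

import { Note } from './proto/notes'; // generated by ts-proto (assumed output path)

// Hypothetical fields; plain object literals are type-checked at compile time.
const note: Note = { id: '1', text: 'Hello protobuf' };

// Binary protobuf wire format.
const bytes: Uint8Array = Note.encode(note).finish();
const decoded: Note = Note.decode(bytes);

// JSON wire format.
const json = Note.toJSON(note);
const restored: Note = Note.fromJSON(json);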
In order to integrate the protobuf build into the Yarn build pipeline, we'll need multiple parts, and unfortunately many of them are quite non-obvious to set up:
- protoc compiler: reads .proto files and uses plugins to write implementations for them in various languages
- protoc plugin for TypeScript: included in the ts-proto library, creates TypeScript class definitions and encoding/decoding methods for them
- Protocol Buffers common types, distributed as a large collection of .proto files
- Compilation script to call protoc with the right parameters, paths, etc.
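To keep the rest of the post concrete, this is roughly how those pieces are laid out in the example repository (paths inferred from the build output shown later in this post):

web-grpc/
├── proto/                  # shared .proto files (notes.proto, ping.proto)
└── web-client/
    ├── bin/build-proto.js  # the compilation script built in this post
    ├── package.json
    └── src/proto/          # generated TypeScript output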
protoc compiler
The protoc compiler can be installed by various means. I am going to use the grpc-tools NPM package - a method not included in the official documentation - since that allows building the packages with Yarn without installing additional software. The downside is that this limits the platforms where the build can be done: ARM processors are not supported, Linux builds must be done on glibc-based distros, and there are likely other similar limitations.
Protocol Buffer common types
Common types extend what can be described in protocol buffer messages. Some of the types, such as types for dates and times, are distributed with the compiler. The vast majority of the types are, however, distributed separately. Most importantly, we'll need the google.api.http type that is used to describe the gRPC to HTTP+JSON mapping.
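For illustration, a hypothetical service using that annotation could look like the following; the service and message names here are made up for the example, the real definitions live in the repository's .proto files:

syntax = "proto3";

import "google/api/annotations.proto";

message PingRequest {}
message PingResponse {
  string message = 1;
}

service PingService {
  rpc Ping(PingRequest) returns (PingResponse) {
    // Expose this RPC as GET /v1/ping over HTTP+JSON.
    option (google.api.http) = {
      get: "/v1/ping"
    };
  }
}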
There are again many ways to get the common types. They can be copied into your codebase next to your own .proto files. They can be linked as a submodule in your Git repository. I prefer not to pull this kind of dependency into my codebase, however, so I'm using the google-proto-files NPM package.
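Before writing the compilation script, all three packages can be added as development dependencies with Yarn; a minimal install, leaving version selection to Yarn, would be:

$ yarn add --dev grpc-tools ts-proto google-proto-files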
Compilation script
It's time to put the compilation script together. The script will need to locate the protoc binary, the TypeScript plugin, the common type files and your own .proto files on disk, and to call the protoc binary with the correct parameters.
When using Yarn with the PnP mode of installing packages, we need to take special considerations into account. Usually Yarn in PnP mode installs packages as .zip files instead of populating a massive node_modules directory. However, running a binary from inside a .zip file is non-trivial, and even if we managed that, the protoc binary would not know how to load a plugin or .proto files from inside a .zip file. To support such cases, Yarn allows installing packages in unplugged mode, where the contents of individual packages are extracted as ordinary files on disk.
Some packages, such as grpc-tools, declare unplugged mode by default, but not all packages do. Either they haven't been written with PnP in mind, or their main use case might not require unplugging. To unplug these, we can add a dependenciesMeta section to our package.json.
"dependenciesMeta": {
"google-proto-files@4.2.0": {
"unplugged": true
},
"ts-proto@2.4.0": {
"unplugged": true
}
}
Note that the packages in dependenciesMeta are specified with their exact version number. This number needs to match whichever version is actually installed, so I would highly recommend editing the devDependencies section so that these packages are listed with their exact version number, too. For example, the dependency should be "ts-proto": "2.4.0", instead of the default format "ts-proto": "^2.4.0", - note the caret in the latter one.
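With the versions used in this post, the relevant part of devDependencies would then look roughly like this (grpc-tools is shown for completeness; it is not listed in dependenciesMeta, so pinning it is optional):

"devDependencies": {
  "google-proto-files": "4.2.0",
  "grpc-tools": "1.12.4",
  "ts-proto": "2.4.0"
}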
With the dependencies installed and unplugged, we can create a Node program to locate the various pieces and run the protoc program with the correct parameters. The full version of this script is build-proto.js (GitHub).
In this script we can import the pnp
module to locate any dependencies.
import pnp from '../.pnp.cjs';
pnp.setup();
const selfName = process.argv[1];
const protocPath = pnp.resolveRequest('grpc-tools/bin/protoc', selfName);
const tsExtension = pnp.resolveRequest('ts-proto/protoc-gen-ts_proto', selfName);
const protoLibsPath = pnp.resolveToUnqualified('google-proto-files', selfName);
These functions are further described in the PnP API documentation. Importantly here, resolveRequest locates a specific file inside a package, whereas resolveToUnqualified can locate directories, like the root directory of a package here. We could also use import pnp from 'pnpapi' as described in the documentation, but then this script could only be run through yarn node bin/build-proto.js. On the flip side, when importing the .pnp.cjs directly, we need to be explicit about where the build script is located relative to the package root directory.
Before invoking the compiler, we still need to locate the .proto files to compile. Here, the code is written with the assumption that the files are all located in a single directory and the directory name is passed as a command line argument to the script.
import path from 'node:path';
import { readdir } from 'node:fs/promises';
const protoDir = process.argv[2];
const protoFiles = (await readdir(protoDir))
  .filter((x) => x.endsWith('.proto'))
  .map((x) => path.join(protoDir, x).replaceAll('\\', '/'));
As an improvement, it would be useful to change this so that it recurses into any subdirectories and locates any .proto files in those, too.
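A minimal sketch of that, assuming Node.js 20 or newer where readdir supports the recursive option:

// Collect .proto files from protoDir and all of its subdirectories.
const protoFiles = (await readdir(protoDir, { recursive: true }))
  .filter((x) => x.endsWith('.proto'))
  .map((x) => path.join(protoDir, x).replaceAll('\\', '/'));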
Finally, the script invokes protoc with parameters derived from these.
import { spawn } from 'node:child_process';

const outputDir = 'src/proto';
const args = [
  `--plugin=${tsExtension}`,
  `--ts_proto_out=${outputDir}`,
  '--ts_proto_opt=esModuleInterop=true',
  `--proto_path=${protoDir}`,
  `--proto_path=${protoLibsPath}`,
  ...protoFiles,
];
const protoc = spawn(protocPath, args);
protoc.stdout.on('data', (data) => {
  console.log(`stdout: ${data}`);
});
protoc.stderr.on('data', (data) => {
  console.error(`stderr: ${data}`);
});
protoc.on('close', (code) => {
  console.log(`child process exited with code ${code}`);
});
Now that we can run the compilation manually, the next step is to run it automatically whenever the package is installed. For that we can use the yarn-plugin-after-install plugin for Yarn (https://github.com/mhassan1/yarn-plugin-after-install), which can be installed by calling yarn plugin import:
$ yarn plugin import https://raw.githubusercontent.com/mhassan1/yarn-plugin-after-install/v0.6.0/bundles/@yarnpkg/plugin-after-install.js
When it's installed, we can add an afterInstall clause to the .yarnrc.yml configuration file.
afterInstall: yarn node bin/build-proto.js ../proto
Now, whenever we run yarn to install dependencies, the .proto files will get built automatically, too.
$ yarn
➤ YN0000: · Yarn 4.5.3
➤ YN0000: ┌ Resolution step
➤ YN0000: └ Completed
➤ YN0000: ┌ Fetch step
➤ YN0000: └ Completed in 0s 259ms
➤ YN0000: ┌ Link step
➤ YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
➤ YN0000: └ Completed
Running `afterInstall` hook...
Working directory: /workspaces/web-grpc/web-client
Script directory: /workspaces/web-grpc/web-client/bin
Script name /workspaces/web-grpc/web-client/bin/build-proto.js
/workspaces/web-grpc/web-client/.yarn/unplugged/google-proto-files-npm-4.2.0-28512554de/node_modules/google-proto-files/
Running protoc [
'/workspaces/web-grpc/web-client/.yarn/unplugged/grpc-tools-npm-1.12.4-956df6794d/node_modules/grpc-tools/bin/protoc',
'--plugin=/workspaces/web-grpc/web-client/.yarn/unplugged/ts-proto-npm-2.4.0-c5c2c1ec55/node_modules/ts-proto/protoc-gen-ts_proto',
'--ts_proto_out=src/proto',
'--ts_proto_opt=esModuleInterop=true',
'--proto_path=../proto',
'--proto_path=/workspaces/web-grpc/web-client/.yarn/unplugged/google-proto-files-npm-4.2.0-28512554de/node_modules/google-proto-files/',
'../proto/notes.proto',
'../proto/ping.proto'
]
child process exited with code 0
➤ YN0000: · Done with warnings in 1s 101ms
This setup gets us a long way toward an excellent build pipeline. Without installing extra tools on your dev system or on a build machine, it gives us TypeScript definitions for our protobuf messages. There are still improvements to be had. Automated rebuilds as the .proto files are modified would be a big one. Expanding the range of supported platforms to cover Windows and ARM would be big, too.
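As a rough illustration of the first improvement, here is a minimal watch-mode sketch. It assumes the compilation above is wrapped in a hypothetical runProtoc() function, and note that fs.watch behaves somewhat differently across platforms:

import { watch } from 'node:fs';

// Recompile whenever a .proto file in the source directory changes.
// runProtoc() is assumed to wrap the spawn() call shown earlier.
watch(protoDir, (eventType, filename) => {
  if (filename?.endsWith('.proto')) {
    console.log(`${filename} changed, recompiling...`);
    runProtoc();
  }
});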
In upcoming posts I'll look into building the same .proto files for a .NET server and connecting the web client and the server together. Also, remember to check out the web-grpc repository on GitHub for a full implementation of the techniques described in this post.