I had an idea to use the Bun runtime (https://bun.sh) for running Angular server-side rendering. I began by creating a simple 'Hello World' example from scratch. Afterward, I attempted to run an Express server by incorporating the Express engine through the ng add @nguniversal/express-engine command.
Let's begin by creating a basic standalone app and then adding the Express engine:
$ ng new bun-universal --minimal --style scss --routing false --standalone
$ cd bun-universal
$ yarn ng add @nguniversal/express-engine
Next, let's execute the command provided by schematics to build both the browser and server bundles:
$ yarn build:ssr
Afterward, we'll attempt to start the server using Bun:
$ bun dist/bun-universal/server/main.js
The output should display Node Express server listening on http://localhost:4000. When we navigate to localhost:4000 in the browser, we would expect to see the server-side rendered app with the content that Angular generates when running ng new. However, what we actually observe is the following:
I invested a considerable amount of time in attempting to comprehend the exact nature of the failure. To begin, I created an additional file named server.js in the root directory. This file contains a basic standalone Express server, which I used to determine whether it would provide a valid response:
// server.js
import path from 'node:path';
import express from 'express';
const app = express();
const distFolder = path.join(process.cwd(), 'dist/bun-universal/browser');
app.get('*.*', express.static(distFolder));
app.get('/', (req, res) => {
res.sendFile(path.join(distFolder, 'index.html'));
});
app.listen(4200, () => {
console.log(`Node Express server listening on http://localhost:4200`);
});
The code above functions as a plain static file server, responding to specific URLs with the requested files. I then began importing the Angular dependencies required for server-side rendering. During this process, I observed that zone.js/node disrupts the expected behavior in certain scenarios. Let's include the following line at the top of the server.js file:
import 'zone.js/node';
When we run the server again using bun server.js, it becomes apparent that the server never responds. This indicates that certain patches loaded by zone.js are causing disruptions. These patches seem to affect either native classes exposed within the Bun runtime or built-in modules such as fs. Notably, Express relies internally on setImmediate and the fs module (in the send package, which is used for streaming files). Interestingly, if we disable the node_timers patch, the server becomes functional:
globalThis.__Zone_disable_node_timers = true;
require('zone.js/node');
import path from 'node:path';
import express from 'express';
By disabling the node_timers patch, the zone.js library becomes unaware of any scheduled setTimeout or setInterval tasks. Consequently, Angular won't wait for all of these scheduled tasks to be invoked, leading to an early return of the serialized HTML. Another issue that I've encountered involves a patch related to promises. In this case, Bun doesn't recognize ZoneAwarePromise as a Promise and fails to wait for promise resolution. The following example illustrates this issue:
app.get('/', async (req, res) => {
await new Promise(resolve => setTimeout(resolve, 2000));
console.log('after await');
res.sendFile(path.join(distFolder, 'index.html'));
});
Using this code, the server would once again return the default Bun page displaying the message that fetch did not return a response object. However, you would still observe the console.log after a 2-second delay. If we disable the promise patch using the following code:
globalThis.__Zone_disable_ZoneAwarePromise = true;
We would then observe the content of the index.html after a 2-second delay.
With all of the aforementioned points in mind, I have come to the conclusion that I need to bootstrap the server with zone.js disabled, because the patches applied by zone.js are not compatible with the Bun runtime.
Bridging the Gap
Let's begin by editing the server.ts file. We can remove the maxAge option from express.static since caching is not required at the moment. Additionally, let's remove the import of zone.js/node from the top of the file.
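For reference, the relevant part of server.ts could end up looking roughly like this (a sketch, not the full file; the surrounding code is the one generated by the schematic, and variable names such as server and distFolder follow that template):
// server.ts (excerpt)
// Note: the `import 'zone.js/node';` line that used to be at the top is removed entirely.

// ...inside the Express app factory generated by the schematic:
// Serve static files without the `maxAge: '1y'` option, since caching isn't needed right now.
server.get('*.*', express.static(distFolder));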
After following these steps, build the app and run it again; we'll encounter the NG0908 exception. This exception indicates that Angular requires zone.js, as the NgZone constructor relies on it:
if (typeof Zone == 'undefined') {
throw new RuntimeError(908 /* RuntimeErrorCode.MISSING_ZONEJS */, ngDevMode && `In this configuration Angular requires Zone.js`);
}
Zone.assertZonePatched();
To address the issue above, we need to replace the injected NgZone implementation with NoopNgZone. Let's edit the app.config.server.ts file in the src/app directory:
import {
mergeApplicationConfig,
ApplicationConfig,
NgZone,
ɵNoopNgZone,
} from '@angular/core';
import { provideServerRendering } from '@angular/platform-server';
import { appConfig } from './app.config';
const serverConfig: ApplicationConfig = {
providers: [
provideServerRendering(),
{ provide: NgZone, useClass: ɵNoopNgZone },
],
};
export const config = mergeApplicationConfig(appConfig, serverConfig);
Build the app again, run the server, and use curl. You should observe the following rendered HTML:
The setup described above only works for apps that don't use any asynchronous APIs during rendering. However, this isn't a realistic scenario, as HTTP requests are often made during server-side rendering, typically to fetch data once and save it to the transfer state.
Angular waits until all tasks scheduled during rendering are completed, and then it serializes the HTML to return it to the client:
await applicationRef.isStable.pipe(first((isStable: boolean) => isStable)).toPromise();
Let's add provideHttpClient() to the app.config.ts file (a sketch of that change is shown after the component below) and incorporate the following code into our AppComponent:
export class AppComponent {
title = 'bun-universal';
constructor() {
inject(HttpClient)
.get('https://jsonplaceholder.typicode.com/todos/1')
.subscribe(() => {
this.title = 'bun-universal-v2';
});
}
}
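For completeness, the app.config.ts change might look like this (a minimal sketch; your appConfig may already contain other providers):
// app.config.ts
import { ApplicationConfig } from '@angular/core';
import { provideHttpClient } from '@angular/common/http';

export const appConfig: ApplicationConfig = {
  // Registers HttpClient so it can be injected in components and services.
  providers: [provideHttpClient()],
};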
If we build the app, run the server again, and perform the curl request, we'll observe that the title remains bun-universal:
<h1>Welcome to bun-universal!</h1>
This is because the application reports stability immediately: NoopNgZone doesn't track any micro- or macrotasks, so nothing delays the isStable emission.
We need to implement a mechanism to wait until all HTTP requests scheduled during the render are completed. Angular already includes a class called ɵInitialRenderPendingTasks, which encapsulates the behavior subject hasPendingTasks. The value of hasPendingTasks switches to false whenever all pending tasks are completed. This class is utilized by the HttpInterceptorHandler in the @angular/common/http package: Angular increments the count of pending tasks when an HTTP request is initiated and removes the task once the request completes.
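Conceptually, this is the same add/remove pattern we'll apply manually later in this article. Here is a simplified sketch (not Angular's actual interceptor code, just an illustration of the idea, assuming the add()/remove() API of ɵInitialRenderPendingTasks):
import { inject, Injectable, ɵInitialRenderPendingTasks } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Observable, finalize } from 'rxjs';

@Injectable({ providedIn: 'root' })
export class TrackedHttp {
  private readonly _http = inject(HttpClient);
  private readonly _pendingTasks = inject(ɵInitialRenderPendingTasks);

  get<T>(url: string): Observable<T> {
    // Register a pending task before the request is fired...
    const taskId = this._pendingTasks.add();
    // ...and remove it once the request completes or errors out,
    // so hasPendingTasks can flip back to false.
    return this._http
      .get<T>(url)
      .pipe(finalize(() => this._pendingTasks.remove(taskId)));
  }
}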
We also need to implement a custom isStable behavior in the ApplicationRef, as the default implementation relies on ngZone.isStable. Furthermore, it's essential to wait until the app is bootstrapped before subscribing to hasPendingTasks. This precaution is necessary because hasPendingTasks might emit false prematurely, before any HTTP request is scheduled.
Next, we will create the AppBootstrapped class. Please note that I'll place everything in the app.config.server.ts file:
@Injectable({ providedIn: 'root' })
export class AppBootstrapped extends BehaviorSubject<boolean> {
constructor() {
super(false);
}
}
The above subject will emit true when the app is bootstrapped, i.e., when ApplicationRef.bootstrap is called and invokes the listeners resolved from the APP_BOOTSTRAP_LISTENER injection token.
And here is the custom class that extends the original ApplicationRef and offers a customized isStable implementation:
@Injectable()
export class NoopNgZoneApplicationRef extends ApplicationRef {
override isStable: Observable<boolean>;
constructor() {
super();
const pendingTasks = inject(ɵInitialRenderPendingTasks);
this.isStable = inject(AppBootstrapped).pipe(
filter(appBootstrapped => appBootstrapped),
mergeMap(() => pendingTasks.hasPendingTasks),
map(hasPendingTasks => !hasPendingTasks)
);
}
}
We inject the AppBootstrapped class, wait for the app to be bootstrapped, and then switch the subscription to hasPendingTasks, negating the value to align it with isStable. The final content of app.config.server.ts:
import {
mergeApplicationConfig,
ApplicationConfig,
NgZone,
ɵNoopNgZone,
ApplicationRef,
Injectable,
inject,
ɵInitialRenderPendingTasks,
APP_BOOTSTRAP_LISTENER,
} from '@angular/core';
import { provideServerRendering } from '@angular/platform-server';
import { BehaviorSubject, Observable, filter, map, mergeMap } from 'rxjs';
import { appConfig } from './app.config';
@Injectable({ providedIn: 'root' })
export class AppBootstrapped extends BehaviorSubject<boolean> {
constructor() {
super(false);
}
}
@Injectable()
export class NoopNgZoneApplicationRef extends ApplicationRef {
override isStable: Observable<boolean>;
constructor() {
super();
const pendingTasks = inject(ɵInitialRenderPendingTasks);
this.isStable = inject(AppBootstrapped).pipe(
filter(appBootstrapped => appBootstrapped),
mergeMap(() => pendingTasks.hasPendingTasks),
map(hasPendingTasks => !hasPendingTasks)
);
}
}
const serverConfig: ApplicationConfig = {
providers: [
provideServerRendering(),
{ provide: NgZone, useClass: ɵNoopNgZone },
{ provide: ApplicationRef, useClass: NoopNgZoneApplicationRef },
{
provide: APP_BOOTSTRAP_LISTENER,
multi: true,
useFactory: () => {
const appBootstrapped = inject(AppBootstrapped);
return () => appBootstrapped.next(true);
},
},
],
};
export const config = mergeApplicationConfig(appConfig, serverConfig);
The final step is to manually run change detection when the title is altered, as there's no automatic trigger for the ApplicationRef.tick method. It's worth noting that manually triggering change detection is also necessary within OnPush components, so the same code can be shared between the browser and the server:
export class AppComponent {
title = 'bun-universal';
constructor() {
const ref = inject(ChangeDetectorRef);
inject(HttpClient)
.get('https://jsonplaceholder.typicode.com/todos/1')
.subscribe(() => {
this.title = 'bun-universal-v2';
ref.detectChanges();
});
}
}
Now, after building and running the server, perform the curl request once more:
We can notice the <h1>Welcome to bun-universal-v2</h1> element.
Tracking Timers
There's currently no ability to track timers scheduled with setTimeout, and there's also no practical reason to permit timers to be scheduled when the code is running on the server side. The primary rationale behind this restriction is that any timer could potentially delay the response, since in the conventional approach Angular awaits app stability (await appRef.isStable) before responding. While it's possible to wrap all timers with an isPlatformBrowser check (see the sketch below), there's often limited control over the code where these timers are scheduled.
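As a quick illustration of such a guard (a minimal sketch with a placeholder component and an arbitrary delay):
import { isPlatformBrowser } from '@angular/common';
import { Component, inject, PLATFORM_ID } from '@angular/core';

@Component({ selector: 'app-some', standalone: true, template: '' })
export class SomeComponent {
  constructor() {
    // Only schedule the timer in the browser; on the server it would just delay the response.
    if (isPlatformBrowser(inject(PLATFORM_ID))) {
      setTimeout(() => {
        // Browser-only work, e.g. deferring a non-critical UI update.
      }, 1000);
    }
  }
}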
Consider a scenario where someone subscribes to a stream of values and pipes the stream with debounceTime, which internally uses asyncScheduler by default. Each time the stream emits a value, the operator re-schedules its internal timer.
Code running on the server typically isn't concerned with whether Angular code executes synchronously or asynchronously. To be candid, timers and animation frames are commonly utilized in the browser to enhance UI performance and prevent potential frame drops during rendering. However, frame drops aren't a concern on the server side, so there's no reason to schedule timers there. In cases where the code is executed on the server side, the identity function from RxJS can serve as a fallback:
export class AppComponent {
constructor() {
const isBrowser = isPlatformBrowser(inject(PLATFORM_ID));
source$
.pipe(isBrowser ? debounceTime(1000) : identity, takeUntilDestroyed())
.subscribe(() => {
// ...
});
}
}
However, as I mentioned previously, even though we have no need to schedule timers explicitly in our own code, they may still be scheduled by third-party libraries used on the server side.
Adding and Removing Tasks
Let's consider the following example: we lazy-load a library that generates a random nonce (lazy-loading is required for Node-only libraries), hand over control to a function that performs asynchronous operations internally, and then write the result to the transfer state:
export class AppComponent implements AfterViewInit {
title = 'bun-universal';
private readonly _transferState = inject(TransferState);
private readonly _isServer = isPlatformServer(inject(PLATFORM_ID));
async ngAfterViewInit(): Promise<void> {
if (this._isServer) {
const { cryptoRandomStringAsync } = await import('crypto-random-string');
const nonce = await cryptoRandomStringAsync({
length: 20,
type: 'base64',
});
this._transferState.set(CSP_NONCE_KEY, nonce);
}
}
}
If we run the server and perform a curl command, we'll notice that the state is not serialized. On the server side, a dynamic import() is roughly equivalent to Promise.resolve().then(() => require(...)). The import microtask is scheduled before the appRef.isStable.toPromise() microtask. Similarly, cryptoRandomStringAsync is also scheduled before the isStable microtask. However, the isStable microtask gets resolved earlier because cryptoRandomStringAsync schedules other microtasks, and any newly scheduled microtasks are added to the end of the microtask queue. This is why the HTML is serialized before the value is set on the transfer state.
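To make the ordering easier to see, here is a tiny standalone sketch (plain promises, no Angular), where a shallow chain standing in for isStable wins the race against a deeper chain standing in for the dynamic import and cryptoRandomStringAsync:
// Stand-in for `await appRef.isStable.toPromise()` — a shallow chain of microtasks.
Promise.resolve().then(() => console.log('isStable resolved — the HTML is serialized here'));

// Stand-in for the dynamic import() followed by cryptoRandomStringAsync — a deeper chain.
Promise.resolve()
  .then(() => ({ cryptoRandomStringAsync: async () => 'fake-nonce' })) // "import('crypto-random-string')"
  .then(({ cryptoRandomStringAsync }) => cryptoRandomStringAsync())
  .then(nonce => console.log(`nonce "${nonce}" is ready, but the response has already been sent`));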
Since we're already familiar with the ɵInitialRenderPendingTasks class, we can benefit from its functionality by notifying Angular that there are still pending tasks that must be awaited before the HTML is serialized:
export class AppComponent implements AfterViewInit {
title = 'bun-universal';
private readonly _transferState = inject(TransferState);
private readonly _isServer = isPlatformServer(inject(PLATFORM_ID));
private readonly _pendingTasks = inject(ɵInitialRenderPendingTasks);
async ngAfterViewInit(): Promise<void> {
if (this._isServer) {
const taskId = this._pendingTasks.add();
const { cryptoRandomStringAsync } = await import('crypto-random-string');
const nonce = await cryptoRandomStringAsync({
length: 20,
type: 'base64',
});
this._transferState.set(CSP_NONCE_KEY, nonce);
this._pendingTasks.remove(taskId);
}
}
}
Benchmarking Node and Bun
Bun has a benchmark for a 'hello world' server-side rendered React app, stating that it's 2 times faster than Deno and 3 times faster than Node.
I've opted to create Docker images and run them locally in containers. Subsequently, I will utilize the autocannon tool to conduct load testing on the root endpoint.
Let's begin by creating a .dockerignore file to prevent unnecessary folders from being copied into the container during the build:
.angular
dist
node_modules
.DS_Store
Now, let's proceed to add the Dockerfile:
FROM node:18-alpine AS build
WORKDIR /tmp
COPY . .
RUN yarn --pure-lockfile && yarn build:ssr
# FROM oven/bun
FROM node:18-alpine
WORKDIR /usr/src/app
COPY --from=build /tmp/dist ./dist
EXPOSE 4200
# CMD ["bun", "dist/bun-universal/server/main.js"]
CMD ["node", "dist/bun-universal/server/main.js"]
Observe the commented FROM and CMD commands. The instructions for building the Node and Bun images are quite similar, with only minor differences. I'll build two images by alternating these commands: when building for Node, I'll comment out FROM oven/bun and its CMD; conversely, when building for Bun, I'll comment out FROM node:18-alpine and its CMD:
$ docker build -t node-universal .
$ # Now comment `FROM` and `CMD` for Node and uncomment for Bun
$ docker build -t bun-universal .
Now let's run the node-universal container and use autocannon:
$ docker run -dp 4200:4200 -e 'PORT=4200' node-universal
$ autocannon -c 100 -d 10 http://localhost:4200
$ docker stop containerId
Let's do the same with Bun:
$ docker run -dp 4200:4200 -e 'PORT=4200' bun-universal
$ autocannon -c 100 -d 10 http://localhost:4200
$ docker stop containerId
So it's 3k requests for Node and 5k requests for Bun. Please note that these results may actually differ across operating systems and hardware. I conducted these tests on a Mac. On another computer with Ubuntu installed, I observed 11k requests for Node and 14k requests for Bun. These results might be somewhat unstable even for a simple 'hello world' app, and I can't be certain if Bun will be significantly faster in real-life examples.
The code can be found at https://github.com/arturovt/bun-angular-universal.