We shipped @hazeljs/pubsub to make Google Cloud Pub/Sub feel native inside HazelJS applications.
If your team already runs on GCP, Pub/Sub usually becomes the backbone for async communication. But most app code ends up repeating the same integration work: wiring clients, parsing payloads, handling ack/nack, and registering handlers with slightly different conventions in every service.
This package turns that into a consistent HazelJS module + decorator experience.
TL;DR
- Use `@hazeljs/pubsub` to publish/consume Pub/Sub messages in HazelJS with less boilerplate.
- You get a DI-friendly publisher service, decorator-based consumers, and explicit acknowledgement control.
- It helps teams standardize event-driven code across multiple services.
Why we built it
We built this package because we kept seeing the same pain points in event-driven Node.js services:
- Too much repeated plumbing: every service re-implements Pub/Sub initialization and subscription handling.
- Inconsistent handler behavior: some handlers auto-ack, some forget to nack, some swallow errors, and reliability suffers.
- Leaky transport concerns in business code: product logic gets mixed with low-level broker/client setup.
- Harder onboarding: new developers must learn each service's custom Pub/Sub pattern instead of one framework pattern.
HazelJS already gives a clean, declarative style for modules, providers, and decorators. Pub/Sub should follow the same principle.
Purpose of @hazeljs/pubsub
The purpose is simple: make Pub/Sub integration predictable, testable, and framework-native.
@hazeljs/pubsub gives you:
- a single module entrypoint (`PubSubModule.forRoot` / `forRootAsync`)
- one publisher service (`PubSubPublisherService`)
- declarative consumers (`@PubSubConsumer` + `@PubSubSubscribe`)
- clear acknowledgement behavior (`ackOnSuccess`, `nackOnError`, plus manual `ack()` / `nack()`)
So instead of each service inventing its own event-consumer framework, your team uses one shared pattern.
What problems it solves
1) Boilerplate client setup
Without a package abstraction, every service manually creates and passes Pub/Sub clients.
With `PubSubModule`, setup is centralized and DI-ready.
2) Message handling drift
Ack/nack logic is usually spread across handlers and easy to get wrong.
With `@PubSubConsumer` + `@PubSubSubscribe`, defaults are explicit and overrideable.
3) Payload parsing repetition
Teams repeatedly decode and parse message payloads.
With package defaults, JSON parsing and handler payload typing are built in.
4) Uneven production behavior
Operationally, small differences in handler semantics lead to retries, duplicates, or dropped work.
Standardized patterns reduce those surprises.
What’s in the box
PubSubModule
Configure once with `forRoot()` or `forRootAsync()` and use it everywhere via DI.
PubSubPublisherService
Publish events from your controllers/services:
- `publish(topic, data, options?)` for string/buffer/object payloads
- `publishJson(topic, data, options?)` for JSON-first workflows
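The relationship between the two methods can be sketched with a small stand-in publisher. This class is illustrative only, not the package's actual implementation; only the method shapes mirror the API listed above, and messages are captured in memory instead of being sent to Google Cloud Pub/Sub:

```typescript
// Illustrative stand-in for PubSubPublisherService: publish() normalizes
// string/Buffer/object payloads to a Buffer, while publishJson() always
// serializes to JSON first.
type PublishOptions = { attributes?: Record<string, string>; orderingKey?: string };

class SketchPublisher {
  // Captured instead of sent, so the behavior is easy to inspect.
  published: { topic: string; data: Buffer; options?: PublishOptions }[] = [];

  async publish(topic: string, data: string | Buffer | object, options?: PublishOptions): Promise<void> {
    const buf = Buffer.isBuffer(data)
      ? data
      : Buffer.from(typeof data === 'string' ? data : JSON.stringify(data));
    this.published.push({ topic, data: buf, options });
  }

  async publishJson(topic: string, data: unknown, options?: PublishOptions): Promise<void> {
    await this.publish(topic, JSON.stringify(data), options);
  }
}
```

The point of the split is that `publish` accepts whatever payload shape you already have, while `publishJson` guarantees a JSON-encoded body regardless of input type.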
Decorator-based consumers
Define consumers declaratively:
- `@PubSubConsumer({ ...defaults })` at class level
- `@PubSubSubscribe({ ... })` at method level
Acknowledgement controls
Use defaults (`ackOnSuccess`, `nackOnError`) or control each message explicitly:
- return `'ack' | 'nack'` from the handler
- call `payload.ack()` / `payload.nack()` directly
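The return-value contract can be sketched as follows. The handler contract here is assumed from the list above, and `OrderEvent` is a hypothetical payload type:

```typescript
// Returning 'ack' | 'nack' lets a handler override the class-level
// defaults per message; 'nack' signals redelivery (or dead-lettering,
// if configured at the platform level).
type AckDecision = 'ack' | 'nack';

interface OrderEvent {
  id: string;
  total: number;
}

async function handleOrder(event: OrderEvent): Promise<AckDecision> {
  if (event.total < 0) {
    // Reject malformed events so the platform can retry or dead-letter them.
    return 'nack';
  }
  // ...create the invoice, then confirm receipt.
  return 'ack';
}
```

The explicit return keeps the acknowledgement decision next to the business logic instead of buried in framework configuration.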
Optional subscription auto-create
Enable `autoCreateSubscription` to have missing subscriptions created at bootstrap when a `topic` is provided.
Practical example: Order workflow fan-out
A common SaaS pattern:
- The API creates an order.
- The API publishes an `order.created` event.
- Multiple consumers react independently:
  - billing creates an invoice
  - notification sends a confirmation email
  - analytics tracks a conversion event
This decouples services while keeping each handler focused.
Producer
```typescript
@Service()
export class OrderService {
  constructor(private readonly publisher: PubSubPublisherService) {}

  async createOrder(order: { id: string; userId: string; total: number }) {
    // Persist order first...
    await this.publisher.publishJson('orders-topic', order, {
      attributes: {
        event: 'order.created',
        source: 'order-service',
      },
      orderingKey: order.id,
    });
  }
}
```
Consumer
```typescript
@PubSubConsumer({ ackOnSuccess: true, nackOnError: true, parseJson: true })
@Service()
export class BillingConsumer {
  @PubSubSubscribe({
    subscription: 'billing-orders-subscription',
    topic: 'orders-topic',
    autoCreateSubscription: true,
  })
  async handleOrder(
    payload: PubSubSubscriptionHandlerPayload<{ id: string; userId: string; total: number }>
  ) {
    // Idempotency check recommended here.
    // Create the invoice, emit internal metrics, etc.
  }
}
```
Quick start
Install:
```bash
npm install @hazeljs/pubsub
```
Register module:
```typescript
@HazelModule({
  imports: [
    PubSubModule.forRoot({
      projectId: process.env.GCP_PROJECT_ID,
    }),
  ],
})
export class AppModule {}
```
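`forRootAsync` is also available for cases where configuration must be resolved at runtime. If it follows the common async-module convention of a factory plus injected dependencies (an assumption; `useFactory`, `inject`, and `ConfigService` below are hypothetical, so verify the exact option names against the package docs), registration might look like:

```typescript
@HazelModule({
  imports: [
    PubSubModule.forRootAsync({
      // Hypothetical factory-style options; check the docs for exact names.
      useFactory: (config: ConfigService) => ({
        projectId: config.get('GCP_PROJECT_ID'),
      }),
      inject: [ConfigService],
    }),
  ],
})
export class AppModule {}
```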
Pub/Sub vs Queue vs Kafka
- Use `@hazeljs/pubsub` when you're on GCP and want managed Pub/Sub semantics.
- Use `@hazeljs/queue` for Redis/BullMQ background job processing.
- Use `@hazeljs/kafka` when Kafka is your streaming/event backbone.
Production notes
- Keep handlers idempotent (at-least-once delivery can reprocess messages).
- Attach correlation IDs in message attributes for tracing/debugging.
- Monitor nack/error rates for early schema/runtime regressions.
- Plan retry + dead-letter strategy at the platform level.
- Keep handlers fast and offload heavy work when needed.
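The idempotency point can be sketched with a minimal in-process guard. This is illustrative only; in production the seen-set would live in Redis or a database so it survives restarts and is shared across service instances:

```typescript
// Tracks processed Pub/Sub message IDs so that at-least-once redelivery
// does not repeat side effects such as double-billing an order.
class IdempotencyGuard {
  private seen = new Set<string>();

  // Returns true the first time an ID is claimed, false on redelivery.
  claim(messageId: string): boolean {
    if (this.seen.has(messageId)) return false;
    this.seen.add(messageId);
    return true;
  }
}
```

A handler would call `claim()` with the Pub/Sub message ID before doing any work, and simply ack duplicates without reprocessing them.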
Backlinks and resources
- PubSub docs: PubSub Docs
- NPM: @hazeljs/pubsub
- GitHub monorepo: HazelJS Github
- Landing docs root: HazelJS Docs
If you’re building event-driven systems on GCP with HazelJS, try it and share feedback via GitHub issues or Discord.