In my previous article, we explored how NgRx Signal Store simplifies state management in Angular by removing traditional boilerplate. We built a task tracker that used withMethods() to handle all state changes and async operations.
While that approach works well for small features, it has a hidden cost: tight coupling between what happens and how it happens. When a component calls store.createTask(), it assumes the store will handle validation, API calls, state updates, and error handling in a specific way. As the app grows, this coupling makes maintenance and cross-team collaboration difficult.
This is where the Events Plugin for NgRx Signal Store comes into play: a modern take on Flux architecture that decouples events from their handlers, making state flows more predictable and scalable. In this post, we'll develop an event-based Signal Store using the NgRx Events Plugin. You'll find the complete implementation in the repository linked at the end.
The Problem with Direct Method Calls
Let's look at a method from our original task store:
async createTask(task: Omit<Task, 'id' | 'createdAt'>) {
  patchState(store, { isLoading: true });
  try {
    const newTask = await firstValueFrom(service.createTask(task));
    const currentTasks = store.taskEntities();
    updateTasks([...currentTasks, newTask]);
    patchState(store, { isLoading: false });
    return newTask;
  } catch (error) {
    patchState(store, { isLoading: false });
    throw error;
  }
}
What's happening here?
- The method handles loading state
- Makes an API call
- Updates task entities
- Manages error handling
All in one place. This seems convenient, but consider these scenarios:
- What if multiple actions need to trigger the same side effect?
- What if creating a task should also trigger analytics?
- What if we want to replay events for debugging?
💡 With direct method calls, we'd need to modify the method itself or duplicate logic. Events solve this by decoupling the "what" from the "how".
Why Event-Driven Architecture?
Events introduce a layer of indirection that transforms our tightly coupled method into a loosely coupled system. Instead of commanding "create this task", we emit "task creation was requested". This shift enables multiple parts of our application to react independently to the same event.
But events alone aren't enough. We need a pattern that defines how they flow through our application. That's where Flux comes in.
Understanding Flux Architecture
Flux is an application architecture pattern that enforces unidirectional data flow. Unlike traditional MVC patterns where data can flow in multiple directions, Flux makes application behavior predictable and easier to trace.
The core principle: Actions trigger changes → Updates happen to state → Views reflect new state → Views can trigger new actions. This creates a circular but unidirectional flow.
Events flow in cycles through our application:
- Event: User interactions or system events trigger the flow
- Dispatcher: Routes events to the appropriate handlers
- Store: Reducers update state, effects handle side effects like API calls
- View: Components react to state changes
- Back to Event: View interactions create new events, completing the cycle
💡 This separation means that components don’t need to know about API details, effects can be modified without touching the components, multiple effects can react to the same event, and all state changes remain fully traceable through events.
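The cycle above can be sketched in a few lines of framework-free TypeScript. `Dispatcher`, `TaskEvent`, and the handlers below are illustrative names, not NgRx APIs; the point is simply that multiple handlers can react independently to one dispatched event.

```typescript
// Framework-free sketch of the Flux cycle described above.
// All names here are illustrative inventions, not NgRx APIs.
type TaskEvent = { type: string; payload?: unknown };
type Handler = (event: TaskEvent) => void;

class Dispatcher {
  private handlers: Handler[] = [];

  register(handler: Handler): void {
    this.handlers.push(handler);
  }

  dispatch(event: TaskEvent): void {
    // Every registered handler sees every event: a reducer, an effect,
    // and a logger can all react independently to the same dispatch.
    this.handlers.forEach(h => h(event));
  }
}

const dispatcher = new Dispatcher();
const log: string[] = [];

// A "reducer" and an "analytics effect" react to the same event.
dispatcher.register(e => log.push(`reducer handled ${e.type}`));
dispatcher.register(e => log.push(`analytics tracked ${e.type}`));

dispatcher.dispatch({ type: 'taskCreated', payload: { title: 'Write docs' } });
console.log(log);
// → ['reducer handled taskCreated', 'analytics tracked taskCreated']
```

Adding analytics here means registering one more handler, without touching the code that dispatches the event. That is exactly the decoupling the Events Plugin gives us, with type safety on top.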
How the Events Plugin Implements Flux
The Events Plugin brings Flux to NgRx Signal Store with modern TypeScript and Signals. Events replace traditional actions through type-safe event creators with eventGroup(), while reducers handle state updates via pure functions using withReducer() and on(). Side effects are managed through observable-based effects with withEffects(), and Signals replace manual subscriptions by providing automatic change detection. Throughout the entire flow, TypeScript inference ensures full type safety from event creation to state updates.
Setting Up the Events Plugin
Now that we understand the architecture, let's implement it. The Events Plugin has shipped with NgRx Signals since version 19.2 (and is also available in the latest v20.0.1):
npm add @ngrx/signals@latest
With the plugin installed, we can define the events that will flow through our task tracker application.
Defining Task Events
The first step in event-driven architecture is defining all the events that can occur in our system. Think of events as a complete chronicle of everything that happens in our application: both user actions and system responses.
We use eventGroup() to organize related events under a common source, which helps us understand where events originate and makes debugging easier. For our task tracker, we'll create two event groups:
// src/app/stores/task-store/task.events.ts
import { type } from '@ngrx/signals';
import { eventGroup } from '@ngrx/signals/events';
import { Task, TaskStatus } from '../../interfaces/task';

export const taskPageEvents = eventGroup({
  source: 'Task Page',
  events: {
    opened: type<void>(),
    taskCreated: type<Omit<Task, 'id' | 'createdAt'>>(),
    taskDeleted: type<string>(),
    taskStatusChanged: type<{ id: string; status: TaskStatus }>(),
    pageChanged: type<number>(),
  },
});

export const taskApiEvents = eventGroup({
  source: 'Task API',
  events: {
    tasksLoadedSuccess: type<Task[]>(),
    tasksLoadedFailure: type<string>(),
    taskCreatedSuccess: type<Task>(),
    taskCreatedFailure: type<string>(),
    taskDeletedSuccess: type<string>(),
    taskDeletedFailure: type<string>(),
    taskStatusChangedSuccess: type<{ id: string; status: TaskStatus }>(),
    taskStatusChangedFailure: type<{
      id: string;
      previousStatus: TaskStatus;
      error: string;
    }>(),
  },
});
Understanding the Event Structure
The separation into two groups is intentional and follows the Flux pattern:
Page Events (taskPageEvents) represent user intentions — the "what the user wants to do":
- `opened` - User navigated to the task board
- `taskCreated` - User submitted a new task (we don't have the ID yet)
- `taskDeleted` - User clicked delete on a task
- `taskStatusChanged` - User moved a task to a different column
These events describe what happened from the user's perspective, without any knowledge of how the system will handle them.
API Events (taskApiEvents) represent system outcomes — the "what actually happened":
- `tasksLoadedSuccess` / `tasksLoadedFailure` - The result of fetching tasks
- `taskCreatedSuccess` / `taskCreatedFailure` - The result of creating a task

Each operation has both success and failure events to handle all possible outcomes.
This separation creates a clear request-response pattern: page events trigger effects, which call APIs, which emit API events, which trigger reducers. The data flow becomes a traceable sequence of events rather than hidden method calls.
Notice also that failure events include enough context to handle errors properly — for example, taskStatusChangedFailure includes the previousStatus so we can roll back optimistic updates.
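To make the event objects concrete, here's a hypothetical mini version of an event creator factory. The real eventGroup() generates these creators for you; `makeEventCreator` and the exact `type` string format below are this sketch's assumptions, not the plugin's actual implementation.

```typescript
// Illustrative sketch of what an event creator produces: a plain
// object with a type string and a typed payload. The real plugin's
// type string format may differ from the one assumed here.
function makeEventCreator<P>(source: string, name: string) {
  const type = `[${source}] ${name}`;
  // Attach the type string to the creator so handlers can match on it.
  return Object.assign((payload: P) => ({ type, payload }), { type });
}

const taskDeleted = makeEventCreator<string>('Task Page', 'taskDeleted');

const event = taskDeleted('42');
console.log(event);
// → { type: '[Task Page] taskDeleted', payload: '42' }
```

Because each event is just a serializable object, it can be logged, replayed, or inspected in DevTools, which is what makes the traceability claims above possible.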
Implementing Reducers
Reducers in NgRx Signals Events are declarative configurations that define how state should update in response to events. Using the on() function, we map event types to handlers that return state update instructions (like setAllEntities(), addEntity(), or partial state objects), which compose into the store.
💡 Reducers define what state changes should happen when events occur. Effects - which we will cover next - handle side effects like API calls and dispatch result events that our reducers process.
The Reducer Pattern
We use withReducer() wrapped in a signalStoreFeature() to create a composable reducer feature:
// src/app/stores/task-store/task.reducer.ts
import { on, withReducer } from '@ngrx/signals/events';
import {
  setAllEntities,
  addEntity,
  removeEntity,
  updateEntity,
} from '@ngrx/signals/entities';
import { taskPageEvents, taskApiEvents } from './task.events';
import { signalStoreFeature } from '@ngrx/signals';
import { Task, TaskStatus } from '../../interfaces/task';

export function withTaskReducer() {
  return signalStoreFeature(
    withReducer(
      // Handle loading states
      on(taskPageEvents.opened, () => ({ isLoading: true })),

      // Handle successful task loading
      on(taskApiEvents.tasksLoadedSuccess, (event: { payload: Task[] }) => {
        return [
          setAllEntities(event.payload, { collection: 'task' }),
          { isLoading: false },
        ];
      }),

      // Handle failed task loading
      on(taskApiEvents.tasksLoadedFailure, (event: { payload: string }) => ({
        isLoading: false,
        error: event.payload,
      })),

      // Handle successful task creation
      on(taskApiEvents.taskCreatedSuccess, (event: { payload: Task }) =>
        addEntity(event.payload, { collection: 'task' })
      ),

      // Handle successful task deletion
      on(taskApiEvents.taskDeletedSuccess, (event: { payload: string }) =>
        removeEntity(event.payload, { collection: 'task' })
      ),

      // Handle optimistic status update
      on(
        taskPageEvents.taskStatusChanged,
        (event: { payload: { id: string; status: TaskStatus } }) =>
          updateEntity(
            { id: event.payload.id, changes: { status: event.payload.status } },
            { collection: 'task' }
          )
      ),

      // Handle status update failure (revert)
      on(
        taskApiEvents.taskStatusChangedFailure,
        (event: {
          payload: { id: string; previousStatus: TaskStatus; error: string };
        }) =>
          updateEntity(
            {
              id: event.payload.id,
              changes: { status: event.payload.previousStatus },
            },
            { collection: 'task' }
          )
      )
    )
  );
}
Understanding Reducer Mechanics
Each on() handler maps an event type to state update instructions. The handler receives a single event parameter with a payload property containing the event data, and returns:
- Partial state update: `{ isLoading: true }`, which merges into the current state
- Array of updates: `[setAllEntities(...), { isLoading: false }]`, which applies multiple updates
- Entity operation: `addEntity(...)`, `updateEntity(...)`, `removeEntity(...)`, which modifies an entity collection
Why explicit typing? Type annotations like (event: { payload: Task[] }) ensure type safety and improve IDE support. While the Events Plugin provides type inference, explicit annotations prevent payload access errors and enable better autocompletion during development.
The Power of Declarative Updates
One of the most powerful patterns enabled by event-driven reducers is optimistic updates with automatic rollbacks. Consider how we handle task status changes:
// When user changes status, update immediately (optimistic)
on(
  taskPageEvents.taskStatusChanged,
  (event: { payload: { id: string; status: TaskStatus } }) =>
    updateEntity(
      { id: event.payload.id, changes: { status: event.payload.status } },
      { collection: 'task' }
    )
)

// If API call fails, revert to previous status (rollback)
on(
  taskApiEvents.taskStatusChangedFailure,
  (event: { payload: { id: string; previousStatus: TaskStatus; error: string } }) =>
    updateEntity(
      { id: event.payload.id, changes: { status: event.payload.previousStatus } },
      { collection: 'task' }
    )
)
This pattern provides immediate UI feedback while maintaining data consistency. The user sees the change instantly, and if the server rejects it, the state automatically reverts—all through declarative event handlers with no manual rollback logic.
Implementing Effects
While reducers handle declarative state updates, effects manage side effects: API calls, external services, logging, and any interaction with the outside world. Effects are RxJS observables that listen for specific events, perform asynchronous operations, and dispatch result events back to the store.
The key benefit of effects is separation of concerns: reducers remain declarative and focused on state shape, while effects encapsulate all asynchronous and external operations.
The Effect Pattern
Effects use the Events service to listen for events via events.on(), perform side effects (like HTTP requests), handle success/error cases, and dispatch result events. Like reducers, effects are wrapped in signalStoreFeature() using withEffects() to create a composable effect feature:
// src/app/stores/task-store/task.effects.ts
import { inject } from '@angular/core';
import { Events, withEffects } from '@ngrx/signals/events';
import { signalStoreFeature } from '@ngrx/signals';
import { exhaustMap, tap, catchError, concatMap } from 'rxjs/operators';
import { of } from 'rxjs';
import { TaskService } from '../../services/task.service';
import { taskPageEvents, taskApiEvents } from './task.events';
import { Task } from '../../interfaces/task';

export function withTaskEffects() {
  return signalStoreFeature(
    withEffects(
      (
        store: Record<string, unknown>,
        events = inject(Events),
        taskService = inject(TaskService)
      ) => ({
        // Load tasks when page opens
        loadTasks$: events.on(taskPageEvents.opened).pipe(
          exhaustMap(() =>
            taskService.getTasks(1, 10).pipe(
              tap(response => {
                console.log('[Task Store] Response from getTasks:', response);
              }),
              catchError((error: { message: string }) =>
                of(taskApiEvents.tasksLoadedFailure(error.message))
              ),
              concatMap((response: { tasks: Task[] } | { type: string }) => {
                if ('type' in response) {
                  // Already an event (error)
                  return of(response);
                }
                // Dispatch success with tasks array
                console.log(
                  '[Task Store] Dispatching tasksLoadedSuccess with:',
                  response.tasks
                );
                return of(taskApiEvents.tasksLoadedSuccess(response.tasks));
              })
            )
          )
        ),

        // Create task
        createTask$: events.on(taskPageEvents.taskCreated).pipe(
          exhaustMap(event =>
            taskService.createTask(event.payload).pipe(
              catchError((error: { message: string }) =>
                of(taskApiEvents.taskCreatedFailure(error.message))
              ),
              concatMap((task: Task | { type: string }) =>
                'type' in task
                  ? of(task)
                  : of(taskApiEvents.taskCreatedSuccess(task))
              )
            )
          )
        ),

        // Delete task
        deleteTask$: events.on(taskPageEvents.taskDeleted).pipe(
          exhaustMap(event =>
            taskService.deleteTask(event.payload).pipe(
              catchError((error: { message: string }) =>
                of(taskApiEvents.taskDeletedFailure(error.message))
              ),
              concatMap(() =>
                of(taskApiEvents.taskDeletedSuccess(event.payload))
              )
            )
          )
        ),

        // Change task status with optimistic update rollback
        changeTaskStatus$: events.on(taskPageEvents.taskStatusChanged).pipe(
          exhaustMap(event => {
            // Access store entities with defensive typing
            const taskEntitiesFn = store['taskEntities'] as
              | (() => Task[])
              | undefined;
            const taskEntities = taskEntitiesFn?.() || [];
            const task = taskEntities.find(
              (t: Task) => t.id === event.payload.id
            );
            const previousStatus = task?.status;

            return taskService
              .updateTaskStatus(event.payload.id, event.payload.status)
              .pipe(
                catchError((error: { message: string }) =>
                  of(
                    taskApiEvents.taskStatusChangedFailure({
                      id: event.payload.id,
                      previousStatus: previousStatus!,
                      error: error.message,
                    })
                  )
                ),
                concatMap(() =>
                  of(taskApiEvents.taskStatusChangedSuccess(event.payload))
                )
              );
          })
        ),
      })
    )
  );
}
Understanding Effect Mechanics
Let's break down the pattern using the loadTasks$ effect as an example:
- Listening for events: `events.on(taskPageEvents.opened)` creates an observable that emits whenever the `opened` event is dispatched
- Handling concurrency: `exhaustMap` ignores new events while the current request is in progress, preventing duplicate API calls if the user rapidly triggers the same action
- Calling an async service method: `taskService.getTasks(1, 10)` performs the HTTP request and returns an observable
- Error handling: `catchError` transforms errors into failure events, ensuring the observable stream continues and errors are handled gracefully
- Transforming responses: `concatMap` checks whether we already have an event object (from `catchError`) or need to wrap the successful response in a success event
- Dispatching result events: the effect returns observables that emit event objects (created by event creators like `taskApiEvents.tasksLoadedSuccess()`), which are automatically dispatched back to the store to trigger reducers
Creating Reusable Features
One of the powerful aspects of this architecture is extracting reusable features. For instance, here's a logging feature that could work with any store:
// src/app/stores/shared/with-event-logging.ts
import { inject } from '@angular/core';
import { Events, withEffects } from '@ngrx/signals/events';
import { signalStoreFeature } from '@ngrx/signals';
import { tap } from 'rxjs/operators';

type EventGroup = Record<string, unknown>;

interface EventWithPayload {
  type: string;
  payload?: unknown;
}

/**
 * Creates a store feature that logs all events from specified event groups.
 * Useful for debugging and monitoring event flow.
 */
export function withEventLogging(eventGroups: EventGroup[]) {
  return signalStoreFeature(
    withEffects((store: Record<string, unknown>, events = inject(Events)) => {
      // Collect all events from all groups
      const allEvents = eventGroups.flatMap(group =>
        Object.values(group)
      ) as unknown[];

      return {
        // eslint-disable-next-line @typescript-eslint/no-explicit-any
        logAllEvents$: events.on(...(allEvents as [any, ...any[]])).pipe(
          tap((event: EventWithPayload) => {
            const isError = event.type.includes('Failure');
            if (isError) {
              console.error(`[Store Event] ${event.type}:`, event.payload);
            } else {
              console.log(`[Store Event] ${event.type}`, event.payload);
            }
          })
        ),
      };
    })
  );
}
This feature can be added to any store to get automatic event logging, making debugging much easier.
Composing the Store
Now our store becomes a clean composition of features:
// src/app/stores/task-store/task.store.ts
import { signalStore, withState, withComputed, type } from '@ngrx/signals';
import { withEntities } from '@ngrx/signals/entities';
import { computed, inject, Signal } from '@angular/core';
import { Task, TaskStatus } from '../../interfaces/task';
import { TASK_BOARD_INITIAL_STATE } from './task-store.config';
import { withTaskReducer } from './task.reducer';
import { withTaskEffects } from './task.effects';
import { taskPageEvents, taskApiEvents } from './task.events';
import { withEventLogging } from '../shared/with-event-logging';

interface TaskStoreState {
  taskEntities: Signal<Task[]>;
}

export const TaskStore = signalStore(
  { providedIn: 'root' },
  // IMPORTANT: Entities must come before State
  withEntities({ entity: type<Task>(), collection: 'task' }),
  // State
  withState(() => inject(TASK_BOARD_INITIAL_STATE)),
  // Event-driven reducers
  withTaskReducer(),
  // Event-driven effects
  withTaskEffects(),
  // Event logging (for debugging)
  withEventLogging([taskPageEvents, taskApiEvents]),
  // Computed views
  withComputed((store: TaskStoreState) => ({
    tasksTodo: computed(() =>
      store.taskEntities().filter((t: Task) => t.status === TaskStatus.TODO)
    ),
    tasksInProgress: computed(() =>
      store
        .taskEntities()
        .filter((t: Task) => t.status === TaskStatus.IN_PROGRESS)
    ),
    tasksDone: computed(() =>
      store.taskEntities().filter((t: Task) => t.status === TaskStatus.DONE)
    ),
  }))
);
⚠️ Critical ordering note: `withEntities()` must come before `withState()`. If we put state first, the entity collection won't be properly initialized.
Configuration Setup
For the initial state, we use an injection token pattern for better testability:
// src/app/stores/task-store/task-store.config.ts
import { InjectionToken } from '@angular/core';
import { TaskBoardState } from '../../interfaces/task';

export const TASK_BOARD_INITIAL_STATE = new InjectionToken<TaskBoardState>(
  'taskBoardInitialState',
  {
    providedIn: 'root',
    factory: () => ({
      isLoading: false,
      pageSize: 10,
      pageCount: 1,
      currentPage: 1,
    }),
  }
);
This makes it easy to override the initial state in tests.
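For example, a test could provide its own value for the token. The spec below is a sketch rather than code from the repository; it assumes the board state exposes a `currentPage` signal, which follows from the `TaskBoardState` shape defined in the factory above.

```typescript
// Hypothetical spec: overriding the initial state token in a test.
import { TestBed } from '@angular/core/testing';
import { TaskStore } from './task.store';
import { TASK_BOARD_INITIAL_STATE } from './task-store.config';

describe('TaskStore with custom initial state', () => {
  it('starts on the page provided by the test', () => {
    TestBed.configureTestingModule({
      providers: [
        TaskStore,
        {
          // The test's value wins over the token's default factory.
          provide: TASK_BOARD_INITIAL_STATE,
          useValue: {
            isLoading: false,
            pageSize: 5, // smaller page size for the test
            pageCount: 1,
            currentPage: 2,
          },
        },
      ],
    });

    const store = TestBed.inject(TaskStore);
    // withState() exposes each state property as a signal.
    expect(store.currentPage()).toBe(2);
  });
});
```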
Dispatching Events from Components
Now our components can simply dispatch events instead of calling store methods:
// src/app/pages/board/board.component.ts
// ..imports
export class BoardComponent {
  readonly store = inject(TaskStore);
  readonly dispatch = injectDispatch(taskPageEvents);
  readonly TaskStatus = TaskStatus;

  constructor() {
    // Dispatch the 'opened' event when component initializes
    // This triggers the effect to load tasks
    this.dispatch.opened();
  }

  createTask() {
    if (this.taskForm.invalid) return;

    const newTask = {
      title: this.taskForm.get('title')?.value,
      description: this.taskForm.get('description')?.value,
      status: TaskStatus.TODO,
    };

    // Dispatch event: task.effects.ts handles the API call
    // task.reducer.ts updates the store state on success
    this.dispatch.taskCreated(newTask);
    this.taskForm.reset();
  }

  deleteTask(taskId: string) {
    if (confirm('Are you sure you want to delete this task?')) {
      // Dispatch event: handled by effects and reducer
      this.dispatch.taskDeleted(taskId);
    }
  }

  moveTo(taskId: string, targetStatus: TaskStatus) {
    // Dispatch event: optimistic update by reducer
    // Effects handle API call and revert on failure
    this.dispatch.taskStatusChanged({ id: taskId, status: targetStatus });
  }
}
Using injectDispatch() generates type-safe methods for each event in the group. The component doesn't know or care about API calls, error handling, or state mutations — it just describes what the user did in the UI.
Testing Event-Driven Flows
One of the greatest benefits of event-driven architecture is testability. Because concerns are separated (reducers handle state, effects handle side effects, components dispatch events), we can test each piece in isolation without complex setup or mocking.
Testing Philosophy
The event-driven pattern gives us three distinct testing levels:
- Reducer tests: Pure function tests — pass in events, assert on state changes
- Effect tests: Observable tests — mock services, dispatch events, assert on service calls
- Integration tests: Full store tests — dispatch events, wait for effects, assert on final state
Let's explore each level with our actual working test suite using Vitest.
Testing Reducers
Our reducers are event handler configurations that define how state should be updated in response to events. Therefore, reducer tests should verify that handlers are correctly structured with proper type signatures and return valid state update instructions. Note that actual state mutations will be tested at the store integration level.
describe('Task Reducer', () => {
  describe('Page Events', () => {
    it('should set isLoading to true on page opened', () => {
      const result = on(taskPageEvents.opened, () => ({ isLoading: true }));
      expect(result).toBeDefined();
    });

    it('should update task status optimistically on taskStatusChanged', () => {
      const result = on(
        taskPageEvents.taskStatusChanged,
        (evt: { payload: { id: string; status: TaskStatus } }) =>
          updateEntity(
            { id: evt.payload.id, changes: { status: evt.payload.status } },
            { collection: 'task' }
          )
      );
      expect(result).toBeDefined();
    });
  });

  describe('API Events - Success', () => {
    it('should set all entities and loading to false on tasksLoadedSuccess', () => {
      const result = on(
        taskApiEvents.tasksLoadedSuccess,
        (evt: { payload: Task[] }) => [
          setAllEntities(evt.payload, { collection: 'task' }),
          { isLoading: false },
        ]
      );
      expect(result).toBeDefined();
    });

    it('should add entity on taskCreatedSuccess', () => {
      const result = on(
        taskApiEvents.taskCreatedSuccess,
        (evt: { payload: Task }) =>
          addEntity(evt.payload, { collection: 'task' })
      );
      expect(result).toBeDefined();
    });

    it('should remove entity on taskDeletedSuccess', () => {
      const result = on(
        taskApiEvents.taskDeletedSuccess,
        (evt: { payload: string }) =>
          removeEntity(evt.payload, { collection: 'task' })
      );
      expect(result).toBeDefined();
    });
  });

  // .. more reducer tests
});
Testing Effects
Effect tests should verify that effects respond to events by calling the appropriate service methods. We create a minimal test store, dispatch events, and verify the effect triggers the expected service calls.
describe('loadTasks$ effect', () => {
  it('should call getTasks service method when opened event is dispatched', async () => {
    const mockTasks: Task[] = [
      {
        id: '1',
        title: 'Test Task',
        status: TaskStatus.TODO,
        createdAt: new Date().toISOString(),
      },
    ];
    mockTaskService.getTasks.mockReturnValue(
      of({ tasks: mockTasks, totalPages: 1 })
    );

    const TestStore = signalStore(
      withState({ isLoading: false }),
      withEntities({ entity: type<Task>(), collection: 'task' }),
      withTaskEffects()
    );

    TestBed.configureTestingModule({
      providers: [TestStore],
    });

    const store = TestBed.inject(TestStore);
    const dispatch = TestBed.runInInjectionContext(() =>
      injectDispatch(taskPageEvents)
    );

    dispatch.opened();
    await new Promise(resolve => setTimeout(resolve, 100));

    expect(mockTaskService.getTasks).toHaveBeenCalledWith(1, 10);
  });
});
Key patterns:
- Create minimal test stores: we compose a minimal `signalStore` with only the dependencies needed for the effect being tested (`withState`, `withEntities`, `withTaskEffects`)
- Mock service dependencies: Vitest's `vi.fn()` is used to mock `TaskService` methods, with RxJS `of()` for success cases or `throwError()` for error scenarios
- Dispatch events to trigger effects: we use `injectDispatch(taskPageEvents)` to obtain the dispatch function, then dispatch events and wait for async effects with `await new Promise(resolve => setTimeout(resolve, 100))`
- Verify service calls: assert that the correct service method was called with the expected parameters for both success and error paths, validating that the effect correctly responds to dispatched events
Integration Testing
Integration tests verify the complete event-driven flow by dispatching actual events through injectDispatch and observing that effects trigger service calls. These tests use the full TaskStore composition with all features (entities, state, reducers, effects, computed signals) and mocked services to validate the entire system works together:
// ..imports
describe('Integration: Event Flow', () => {
  let store: InstanceType<typeof TaskStore>;
  let dispatch: ReturnType<typeof injectDispatch<typeof taskPageEvents>>;
  let mockTaskService: {
    getTasks: ReturnType<typeof vi.fn>;
    createTask: ReturnType<typeof vi.fn>;
    deleteTask: ReturnType<typeof vi.fn>;
    updateTaskStatus: ReturnType<typeof vi.fn>;
  };

  beforeEach(() => {
    mockTaskService = {
      getTasks: vi.fn(),
      createTask: vi.fn(),
      deleteTask: vi.fn(),
      updateTaskStatus: vi.fn(),
    };

    TestBed.configureTestingModule({
      providers: [
        TaskStore,
        { provide: TaskService, useValue: mockTaskService },
      ],
    });

    store = TestBed.inject(TaskStore);
    dispatch = TestBed.runInInjectionContext(() =>
      injectDispatch(taskPageEvents)
    );
  });

  describe('Load Tasks Flow', () => {
    it('should dispatch opened event and load tasks through complete flow', async () => {
      // Arrange: Mock service response
      const mockTasks: Task[] = [
        {
          id: '1',
          title: 'Test Task',
          status: TaskStatus.TODO,
          createdAt: new Date().toISOString(),
        },
        {
          id: '2',
          title: 'In Progress Task',
          status: TaskStatus.IN_PROGRESS,
          createdAt: new Date().toISOString(),
        },
      ];
      mockTaskService.getTasks.mockReturnValue(
        of({ tasks: mockTasks, totalPages: 1 })
      );

      // Initial state verification
      expect(store.taskEntities()).toEqual([]);
      expect(store.isLoading()).toBe(false);

      // Act: Dispatch page opened event
      dispatch.opened();

      // Wait for async effects to complete
      await new Promise(resolve => setTimeout(resolve, 100));

      // Assert: Verify complete flow
      expect(mockTaskService.getTasks).toHaveBeenCalledWith(1, 10);
      // Note: State updates happen through reducers responding to API success events
    });

    it('should handle service errors when loading tasks', async () => {
      // Arrange: Mock service error
      mockTaskService.getTasks.mockReturnValue(
        throwError(() => ({ message: 'Network error' }))
      );

      // Act: Dispatch page opened event
      dispatch.opened();

      // Wait for async effects
      await new Promise(resolve => setTimeout(resolve, 100));

      // Assert: Store should remain in safe state
      expect(store.taskEntities()).toEqual([]);
    });
  });
});
Key patterns:
- Full store composition: we inject the complete `TaskStore` with `TestBed` to test all composed features (entities, state, reducers, effects, computed signals)
- Dispatch actual events: we use `injectDispatch(taskPageEvents)` to dispatch events like `dispatch.opened()`, `dispatch.taskCreated()`, etc.
- Mock service dependencies and verify calls: `vi.fn()` is used to mock `TaskService` methods, control responses with `of()` or `throwError()`, and verify services were called when events are dispatched
- Handle async and assert state: we use `await new Promise(resolve => setTimeout(resolve, 100))` to allow effects to complete, then verify service calls and state through signals like `store.taskEntities()` and `store.isLoading()`
Why This Approach Works
The event-driven architecture makes testing more structured because:
- Clear boundaries: Reducers (state updates), effects (side effects), and components (UI/dispatch) have distinct, testable responsibilities
- Explicit state changes: All state updates go through events, making the data flow transparent and traceable
- Mockable dependencies: Services are injected and easily replaced with mocks in tests
- Predictable behavior: Same events with same initial state will always produce the same results
💡 NgRx Signals Events uses a configuration-based approach where reducers and effects are declarative instructions rather than directly executable functions. Unit tests verify correct configuration and type safety, while integration tests validate the complete event flow and actual state changes.
When to Use Event-Driven Architecture
Choosing between event-driven and method-based approaches depends on your application's complexity, team expertise, and long-term maintainability needs. Consider these factors when making your decision.
Use event-driven when:
- Building medium to large applications with complex state interactions
- Multiple features need to react to the same user action or data change
- You require centralized logging, monitoring, or error handling across state updates
- Working in a team where clear separation between UI, business logic, and side effects improves collaboration
Stick with method-based when:
- Building simple features with straightforward CRUD operations
- Rapid prototyping where architectural overhead isn't justified yet
- Application scope is small and unlikely to grow
- Team is new to NgRx Signals
💡 You don't need to choose one approach for your entire application. NgRx Signals lets us start with method-based stores for simple features and migrate to event-driven architecture as complexity grows. Many successful applications use both patterns, applying event-driven architecture where it provides the most value and keeping simpler features method-based. The key is understanding the trade-offs and choosing the approach that best fits each specific use case.
Resources
- Code for this article: View on GitHub (v2.0.0)
- NgRx Signal Store Events documentation
- Previous article: Using NgRx Signal Store (method-based approach)
What's your experience with state management? Have you encountered similar implementation challenges?
Let me know in the comments.
