Introduction
In this article, we will explore the creation of a credit card payment application using NestJS and RabbitMQ to handle billing generation with a Payment Service Provider (PSP). Additionally, we will incorporate Docker and Docker Compose to streamline container management.
Initially, we will create an endpoint simulating the billing creation process, performing a database insert and an HTTP request. We will observe that the response time for this endpoint is unacceptably high, around 1100 ms. Next, we will introduce a message broker service to asynchronously process the HTTP request for billing generation, resulting in a significant reduction in the endpoint's response time.
To conclude, we will address the separation of the consumer from the API in a monorepo, enabling both to scale independently. I want to highlight that I developed this project using the Clean Architecture, but you have the flexibility to choose the architecture that best suits your needs.
Why should I use asynchronous processing and RabbitMQ?
Asynchronous Processing for Improved Performance:
As noted in the introduction, the initial response time for the billing creation process is unacceptably high, around 1100 ms. Introducing a message broker allows you to shift to asynchronous processing, significantly reducing the endpoint's response time. RabbitMQ excels at handling asynchronous communication, enabling your application to scale more efficiently.
Enhanced Scalability and Responsiveness:
By leveraging RabbitMQ, you can decouple the billing generation process from the API endpoint. This decoupling improves scalability, as the API and the billing generation service can scale independently. This flexibility ensures that your application remains responsive, even under increased load.
Reliable Message Delivery:
RabbitMQ provides reliable message delivery mechanisms, ensuring that messages are successfully delivered even in the event of system failures or network issues. This reliability is crucial in financial applications like credit card payment processing, where data integrity and consistency are paramount.
In summary, utilizing RabbitMQ in conjunction with NestJS for a credit card payment application brings tangible benefits such as improved performance, scalability, reliability, and flexibility, making it a well-rounded choice for handling asynchronous communication and optimizing your application's overall architecture.
1. Start a new NestJS app
$ nest new nestjs-rabbitmq-example
See these changes in the commit.
After installation, I removed app.controller.ts and app.service.ts.
See these changes in the commit.
2. Include entity, controller, and services to simulate the creation of a charge using a payment service provider such as Pagarme
$ nest g module credit-card
You can see the classes created in this commit.
Now we have an endpoint that simulates the creation of a credit card charge: it pretends to insert a record into the database (100 ms) and then make an HTTP request to Pagarme (1000 ms).
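To make this concrete, here is a rough sketch of what that synchronous flow might look like. The real code is in the commit; the names and fields below are assumptions based on the classes referenced later in the article.

// a rough sketch of the synchronous flow (the actual code is in the commit)
import { Injectable } from '@nestjs/common';
import {
  CreateChargeInputProps,
  CreateChargeOutputProps,
  ICreateCharge,
} from 'src/credit-card/domain/contracts/psp-service.interface';

@Injectable()
export class Pagarme implements ICreateCharge {
  async createCharge(
    props: CreateChargeInputProps,
  ): Promise<CreateChargeOutputProps> {
    // simulate the latency of an HTTP request to Pagarme (~1000 ms)
    await new Promise((resolve) => setTimeout(resolve, 1000));
    return { ...props, pspId: 'fake-psp-id', status: 'PAID' };
  }
}

export class CreateChargeUseCase {
  constructor(private readonly pspService: ICreateCharge) {}

  async execute(props: CreateChargeInputProps): Promise<CreateChargeOutputProps> {
    // simulate the database insert (~100 ms)
    await new Promise((resolve) => setTimeout(resolve, 100));
    // wait synchronously for the PSP (~1000 ms) -- together this is the
    // roughly 1100 ms the client has to wait for
    return this.pspService.createCharge(props);
  }
}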
In this scenario, the client is not required to linger on the HTTP request, awaiting a response from Pagarme; we can efficiently handle this process asynchronously. To achieve this, we will leverage RabbitMQ to queue the creation requests for charges.
3. Add the libraries needed to implement RabbitMQ
$ yarn add @nestjs/microservices amqplib amqp-connection-manager
See these changes in the commit.
4. Dockerize the application
Dockerfile
FROM node:16
WORKDIR /usr/src/app
COPY package*.json ./
RUN yarn
COPY . .
RUN yarn build
EXPOSE 3000
CMD [ "yarn", "start:prod" ]
docker-compose.yml
version: '3.7'
services:
  credit-card-api:
    container_name: credit-card-api
    restart: on-failure
    build:
      context: .
    volumes:
      - .:/usr/src/app
    ports:
      - 3000:3000
    command: yarn start:dev
    depends_on:
      - rabbitmq
  rabbitmq:
    image: rabbitmq:3.9-management
    container_name: rabbitmq
    restart: always
    hostname: rabbitmq
    ports:
      - 5672:5672
      - 15672:15672
    volumes:
      - rabbitmq_data:/var/lib/rabbitmq
volumes:
  rabbitmq_data:
See these changes in the commit.
5. Connect the application to RabbitMQ
Update main.ts to call app.connectMicroservice() and app.startAllMicroservices().
I'm not going to delve into the RabbitMQ settings here; note only that the amqp://rabbitmq:5672 URL uses the rabbitmq service name from docker-compose.yml as its hostname.
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import { RmqOptions, Transport } from '@nestjs/microservices';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);

  app.connectMicroservice<RmqOptions>({
    transport: Transport.RMQ,
    options: {
      urls: [`amqp://rabbitmq:5672`],
      queue: 'create_charge_psp',
      prefetchCount: 1,
      persistent: true,
      noAck: false,
      queueOptions: {
        durable: true,
      },
      socketOptions: {
        heartbeatIntervalInSeconds: 60,
        reconnectTimeInSeconds: 5,
      },
    },
  });

  await app.startAllMicroservices();
  await app.listen(3000);
}
bootstrap();
See these changes in the commit.
6. Start the application
$ docker-compose up --build
Access the RabbitMQ management panel at http://localhost:15672 with the default credentials (login: guest, password: guest).
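At this point you can already hit the endpoint and observe the ~1100 ms response time for yourself. The route and payload below are assumptions (check CreateChargeController in the commit for the real ones):
$ curl -X POST http://localhost:3000/credit-card -H 'Content-Type: application/json' -d '{"value": 1000, "cardToken": "tok_123"}'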
7. Create publisher and consumer
create-charge.publisher.ts
import { Inject } from '@nestjs/common';
import { ClientProxy } from '@nestjs/microservices';
import { catchError, firstValueFrom, throwError } from 'rxjs';
import { CreateChargeInputProps } from 'src/credit-card/domain/contracts/psp-service.interface';

export class CreateChargePublisher {
  constructor(
    @Inject('create_charge_publisher')
    private readonly clientProxy: ClientProxy,
  ) {}

  async publish(data: CreateChargeInputProps): Promise<void> {
    await firstValueFrom(
      this.clientProxy.emit('CREATE_CHARGE_PSP', data).pipe(
        catchError((exception: Error) => {
          return throwError(() => new Error(exception.message));
        }),
      ),
    );
  }
}
create-charge-on-psp.event-handler.ts
import { Controller, Inject } from '@nestjs/common';
import { Ctx, EventPattern, Payload, RmqContext } from '@nestjs/microservices';
import {
  CreateChargeInputProps,
  ICreateCharge,
} from 'src/credit-card/domain/contracts/psp-service.interface';
import { Pagarme } from 'src/credit-card/infra/psp/pagarme/pagarme.service';

@Controller()
export class CreateChargeOnPSPEventHandler {
  constructor(
    @Inject(Pagarme)
    private readonly pspService: ICreateCharge,
  ) {}

  @EventPattern('CREATE_CHARGE_PSP')
  async handle(
    @Payload() payload: CreateChargeInputProps,
    @Ctx() context: RmqContext,
  ): Promise<void> {
    console.log(payload);
    const channel = context.getChannelRef();
    const originalMsg = context.getMessage();

    try {
      await this.pspService.createCharge(payload);
    } catch (error) {
      console.log(error);
    }

    // with noAck: false we must acknowledge manually; here the message is
    // acked even when the PSP call fails, so it will not be redelivered
    channel.ack(originalMsg);
  }
}
credit-card.module.ts
import { Module } from '@nestjs/common';
import { Pagarme } from './infra/psp/pagarme/pagarme.service';
import { CreateChargeUseCase } from './app/use-cases/create-charge.use-case';
import { CreateChargeController } from './presentation/controllers/create-charge.controller';
import { CreateChargeOnPSPEventHandler } from './presentation/event-handler/create-charge-on-psp.event-handler';
import { CreateChargePublisher } from './infra/rmq/publisher/create-charge.publisher';
import {
  ClientProxy,
  ClientProxyFactory,
  Transport,
} from '@nestjs/microservices';
import { ICreateCharge } from './domain/contracts/psp-service.interface';

@Module({
  providers: [
    Pagarme,
    {
      provide: CreateChargeUseCase,
      useFactory: (pspService: ICreateCharge) => {
        return new CreateChargeUseCase(pspService);
      },
      inject: [Pagarme],
    },
    CreateChargePublisher,
    {
      provide: 'create_charge_publisher',
      useFactory: (): ClientProxy => {
        return ClientProxyFactory.create({
          transport: Transport.RMQ,
          options: {
            urls: [`amqp://rabbitmq:5672`],
            queue: 'create_charge_psp',
            prefetchCount: 1,
            persistent: true,
            noAck: true,
            queueOptions: {
              durable: true,
            },
            socketOptions: {
              heartbeatIntervalInSeconds: 60,
              reconnectTimeInSeconds: 5,
            },
          },
        });
      },
    },
  ],
  controllers: [CreateChargeController, CreateChargeOnPSPEventHandler],
})
export class CreditCardModule {}
See these changes in the commit.
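The contracts in psp-service.interface.ts are referenced by several of the snippets above but never shown. For reference, here is a rough sketch of what they might contain; the exact field names are assumptions.

// psp-service.interface.ts -- sketch only, the fields are placeholders
export interface CreateChargeInputProps {
  value: number; // assumed: amount to charge
  cardToken: string; // assumed: tokenized card reference
}

export interface CreateChargeOutputProps extends CreateChargeInputProps {
  pspId: string; // id returned by the PSP
  status: string; // e.g. 'PENDING' | 'PAID'
}

export interface ICreateCharge {
  createCharge(props: CreateChargeInputProps): Promise<CreateChargeOutputProps>;
}

Similarly, the IPublisherCreateCharge interface imported by the use case in the next step can simply expose publish(data: CreateChargeInputProps): Promise<void>, which CreateChargePublisher implements.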
8. Change the use case to send the charge data to the RabbitMQ queue instead of calling Pagarme directly
create-charge.use-case.ts
import {
  CreateChargeInputProps,
  CreateChargeOutputProps,
} from 'src/credit-card/domain/contracts/psp-service.interface';
import { CreditCardChargeEntity } from 'src/credit-card/domain/entities/credit-card-charge.entity';
import { ICreateChargeUseCase } from 'src/credit-card/domain/use-cases/create-charge.use-case';
import { IPublisherCreateCharge } from 'src/credit-card/infra/rmq/publisher/create-charge.publisher';

export class CreateChargeUseCase implements ICreateChargeUseCase {
  constructor(private readonly publisher: IPublisherCreateCharge) {}

  async execute(
    props: CreateChargeInputProps,
  ): Promise<Omit<CreateChargeOutputProps, 'pspId' | 'value'>> {
    const entity = CreditCardChargeEntity.newCharge(props);
    console.log(entity);

    await new Promise((resolve) => setTimeout(resolve, 100)); // simulate database access
    await this.publisher.publish(props);

    return { ...props, status: 'PENDING' };
  }
}
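Since the use case now depends on the publisher rather than on Pagarme, the CreateChargeUseCase provider in credit-card.module.ts also has to change. The commit contains the real diff; it presumably looks something like this:

{
  provide: CreateChargeUseCase,
  useFactory: (publisher: CreateChargePublisher) => {
    return new CreateChargeUseCase(publisher);
  },
  inject: [CreateChargePublisher],
},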
See these changes in the commit.
Now our endpoint for creating a credit card charge has an acceptable response time.
Many teams stop at this point; I've seen products in production following this approach. It isn't inherently wrong, but it prevents us from scaling the consumer and the API independently, and heavy processing on the consumer side can degrade API response times. To address this, we will separate startAllMicroservices() from listen() by moving to a monorepo structure and creating a dedicated app for the consumer, which improves scalability and performance.
9. Switch from standard mode to monorepo mode
Let's change the project structure to monorepo.
$ nest generate app credit-card-consumer
See these changes in the commit.
We need to update the scripts that build and start the application, create a Dockerfile for the consumer, and adjust docker-compose.yml so the consumer runs with three replicas, as sketched below.
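The commit has the exact changes; the sketch below shows one way this could look. File names, script names, and the replica mechanism are assumptions (deploy.replicas is honored by Docker Compose V2; with older versions you can run docker-compose up --scale credit-card-consumer=3 instead).

Dockerfile.consumer
FROM node:16
WORKDIR /usr/src/app
COPY package*.json ./
RUN yarn
COPY . .
RUN yarn build
CMD [ "node", "dist/apps/credit-card-consumer/main" ]

docker-compose.yml (additions)
  credit-card-consumer:
    build:
      context: .
      dockerfile: Dockerfile.consumer
    restart: on-failure
    deploy:
      replicas: 3
    command: yarn start:dev credit-card-consumer
    volumes:
      - .:/usr/src/app
    depends_on:
      - rabbitmq

The package.json scripts might gain entries such as "start:consumer": "nest start credit-card-consumer" and "start:consumer:prod": "node dist/apps/credit-card-consumer/main".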
See these changes in the commit.
10. Start the application again
$ docker-compose up --build
Now, running the
$ docker ps
command, we can see 3 instances of the consumer and 1 instance of the API running.
11. Decouple the consumer from the API
Currently, the consumer remains tightly coupled to the API, as we've only created the new credit-card-consumer app without implementing any modifications. The next step involves decoupling these components by transferring the responsibility of initiating the consumer to the credit-card-consumer application.
credit-card-consumer/src/main.ts
import { NestFactory } from '@nestjs/core';
import { CreditCardConsumerModule } from './credit-card-consumer.module';
import { RmqOptions, Transport } from '@nestjs/microservices';

async function bootstrap() {
  const app = await NestFactory.create(CreditCardConsumerModule);

  app.connectMicroservice<RmqOptions>({
    transport: Transport.RMQ,
    options: {
      urls: [`amqp://rabbitmq:5672`],
      queue: 'create_charge_psp',
      prefetchCount: 1,
      persistent: true,
      noAck: false,
      queueOptions: {
        durable: true,
      },
      socketOptions: {
        heartbeatIntervalInSeconds: 60,
        reconnectTimeInSeconds: 5,
      },
    },
  });

  await app.startAllMicroservices();
}
bootstrap();
Note that here we do not need to invoke the app.listen() method.
nestjs-rabbitmq-example/src/main.ts
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(3000);
}
bootstrap();
Note that here we only keep the app.listen() method.
Moreover, I removed the services and controllers generated by the
$ nest generate app credit-card-consumer
command.
You can see these changes in the commit.
To finalize the decoupling, we need to move the event handler to credit-card-consumer, since it is responsible for consuming and processing messages from the queue. A challenge arises, however, because the event handler depends on the Pagarme class inside the nestjs-rabbitmq-example structure. To resolve this, let's move everything shared between the API and the consumer into the 'libs' folder, using the nest generate library command. Importantly, the consumer must never import any class from the API.
I moved all the shared code to the credit-card lib; essentially, only the presentation layer remains in the API and in the consumer.
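A minimal sketch of what the consumer module might look like after this move, assuming the default @app/credit-card path alias that nest generate library creates (adjust the imports to whatever the lib actually exports):

// apps/credit-card-consumer/src/credit-card-consumer.module.ts -- sketch
import { Module } from '@nestjs/common';
// shared infra/domain code now lives in the library
import { Pagarme } from '@app/credit-card';
// the event handler (presentation layer) stays in the consumer app
import { CreateChargeOnPSPEventHandler } from './create-charge-on-psp.event-handler';

@Module({
  providers: [Pagarme],
  controllers: [CreateChargeOnPSPEventHandler],
})
export class CreditCardConsumerModule {}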
Now our API is completely decoupled from the consumer, allowing us to scale each component independently.
See these changes in the commit.
12. Limit CPU and memory for containers
Limiting each container's CPU and RAM usage gives us a controlled environment and prevents a single service from starving the others.
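One way to do this in docker-compose, assuming Docker Compose V2 (which honors the deploy section outside of swarm mode); the values themselves are arbitrary examples:

  credit-card-consumer:
    deploy:
      replicas: 3
      resources:
        limits:
          cpus: '0.50'
          memory: 256M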
See these changes in the commit.
13. Conclusion
In conclusion, this article outlined the journey of creating a credit card payment application, emphasizing the efficient integration of technologies such as NestJS, RabbitMQ, Docker, Docker Compose, and the use of a monorepo. When addressing the initial challenge of inadequate response time in our endpoint, we implemented a message broker service to enable asynchronous processing of HTTP requests, resulting in significant improvements in system efficiency and responsiveness.
Additionally, we explored separating the consumer from the API within a monorepo, providing flexibility and individual scalability for both parts of the system. It is worth noting that, while I opted for the Clean Architecture, the choice of architecture and implementation within a monorepo remains at the developer's discretion.
This approach not only enhances application performance but also provides a solid foundation for future adaptations and customizations as needs evolve. By seamlessly integrating these technologies within a monorepo environment, we hope this article serves as a valuable guide for those seeking to optimize efficiency and scalability in their payment applications.
repo: nestjs-rabitmq-example