In this blog, I'll walk you through the architecture and implementation of a scalable and robust application using various AWS services. This project demonstrates how you can integrate multiple services to create a resilient solution. Working app link (it may be down because of cost).
Architecture Overview
Key AWS Services Used
- Amazon S3: To host the React application built with Vite.
- Amazon CloudFront: For content delivery and caching to enhance performance.
- Amazon API Gateway: To manage API requests and route them to the appropriate backend services.
- Amazon Route 53: For domain name system (DNS) web services and routing user requests.
- Elastic Load Balancer (ELB): To distribute incoming application traffic across multiple EC2 instances.
- Amazon EC2: To run the backend servers.
- Amazon SQS: For message queuing and handling asynchronous requests.
- Amazon DynamoDB: To store execution results and manage state.
- AWS Cloud9: For development, debugging, and monitoring logs and SQS (we will be scaling worker nodes based on SQS queue length).
- AWS Certificate Manager: For managing SSL/TLS certificates to secure traffic.
Detailed Architecture Breakdown
Front-end
- Hosting the React Application
Amazon S3: The React application built with Vite is hosted on an S3 bucket, which serves the static files (a deployment sketch follows this list).
Amazon CloudFront: CloudFront is configured to deliver the application content with low latency and high transfer speeds.
- Routing and Domain Management
Amazon Route 53: Route 53 handles DNS management, routing user requests to the appropriate resources.
AWS Certificate Manager: Manages SSL/TLS certificates to ensure secure communication between the user and the application.
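The blog doesn't include the deploy step itself, so here is a minimal sketch of how the Vite build output could be pushed to the S3 bucket and the CloudFront cache invalidated afterwards. The bucket name, distribution ID, and region below are assumptions for illustration, not values from this project; in practice aws s3 sync dist/ s3://your-bucket does the upload part in one command.

// deploy.js: minimal sketch; BUCKET and DISTRIBUTION_ID are placeholders, not project values.
const fs = require("fs");
const path = require("path");
const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");
const { CloudFrontClient, CreateInvalidationCommand } = require("@aws-sdk/client-cloudfront");

const s3 = new S3Client({ region: "eu-north-1" });
const cloudfront = new CloudFrontClient({ region: "eu-north-1" });
const BUCKET = "my-playground-frontend";   // assumption: your S3 bucket name
const DISTRIBUTION_ID = "E123EXAMPLE";     // assumption: your CloudFront distribution ID

const contentTypes = { ".html": "text/html", ".js": "application/javascript", ".css": "text/css", ".svg": "image/svg+xml" };

// Recursively collect every file produced by "vite build" (the dist/ folder).
function listFiles(dir) {
  return fs.readdirSync(dir, { withFileTypes: true }).flatMap((entry) => {
    const full = path.join(dir, entry.name);
    return entry.isDirectory() ? listFiles(full) : [full];
  });
}

async function deploy() {
  const distDir = path.resolve("dist");
  for (const file of listFiles(distDir)) {
    const key = path.relative(distDir, file).split(path.sep).join("/");
    await s3.send(new PutObjectCommand({
      Bucket: BUCKET,
      Key: key,
      Body: fs.readFileSync(file),
      ContentType: contentTypes[path.extname(file)] || "application/octet-stream",
    }));
    console.log("uploaded", key);
  }
  // Invalidate cached copies so CloudFront starts serving the new build.
  await cloudfront.send(new CreateInvalidationCommand({
    DistributionId: DISTRIBUTION_ID,
    InvalidationBatch: {
      CallerReference: Date.now().toString(),
      Paths: { Quantity: 1, Items: ["/*"] },
    },
  }));
}

deploy().catch(console.error);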
Back-end
- API Gateway and Load Balancer
Amazon API Gateway: API Gateway serves as the entry point for all API requests. It routes incoming requests to the backend services.
Elastic Load Balancer: The load balancer distributes incoming API requests across two EC2 instances in a round-robin fashion, ensuring even load distribution.
- Primary Backend servers
Amazon EC2: Two EC2 instances act as the backend servers. They process incoming requests and interact with other AWS services.
Amazon SQS: The backend servers place incoming requests into an SQS queue for asynchronous processing.
Worker Nodes: Worker nodes monitor the SQS queue, processing messages as they arrive. The number of worker nodes scales up or down based on the queue length, managed by an Auto Scaling group.
Amazon DynamoDB: Execution results are stored in DynamoDB. The backend servers periodically check the database to determine if a task has been completed.
- Monitoring and Scaling
Amazon CloudWatch: CloudWatch monitors the application and triggers scaling actions based on SQS metrics. It ensures that the application scales in and out based on demand.
Auto Scaling Group: Manages the worker nodes, launching or terminating instances based on the queue length and CloudWatch metrics.
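Since scaling the worker nodes on queue length is central to this design, here is a minimal sketch of one way to wire it up: a target-tracking scaling policy on the worker Auto Scaling group, driven by the SQS backlog metric. The group name, queue name, and target value are assumptions for illustration, not values taken from this project.

// scale-workers.js: minimal sketch; the group and queue names are hypothetical.
const { AutoScalingClient, PutScalingPolicyCommand } = require("@aws-sdk/client-auto-scaling");

const autoscaling = new AutoScalingClient({ region: "eu-north-1" });

async function configureScaling() {
  // Target-tracking policy: scale the group so the visible queue backlog stays near the target.
  await autoscaling.send(new PutScalingPolicyCommand({
    AutoScalingGroupName: "worker-nodes-asg",          // assumption
    PolicyName: "scale-on-queue-length",
    PolicyType: "TargetTrackingScaling",
    TargetTrackingConfiguration: {
      TargetValue: 5,
      CustomizedMetricSpecification: {
        Namespace: "AWS/SQS",
        MetricName: "ApproximateNumberOfMessagesVisible",
        Dimensions: [{ Name: "QueueName", Value: "code-queue.fifo" }], // assumption
        Statistic: "Average",
      },
    },
  }));
  console.log("Scaling policy attached");
}

configureScaling().catch(console.error);

With this in place, CloudWatch tracks the ApproximateNumberOfMessagesVisible metric and the Auto Scaling group launches or terminates worker instances to keep the backlog near the target.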
Implementation Details
Frontend: React Application
The frontend of the application is a simple React app built with Vite. It allows users to submit JavaScript code, which is then processed by the backend. The frontend interacts with the backend via API Gateway.
import { useRef, useState } from "react";
import axios from "axios";
import Editor, { Monaco } from "@monaco-editor/react";
import { editor as monacoEditor } from "monaco-editor";
import { Button } from "./components/ui/button";
import "./App.css";
function App() {
const editorRef = useRef<monacoEditor.IStandaloneCodeEditor | null>(null);
const [output, setOutput] = useState("");
function handleEditorDidMount(
editor: monacoEditor.IStandaloneCodeEditor,
monaco: Monaco
) {
editorRef.current = editor;
}
async function showValue() {
if (editorRef.current) {
const code = editorRef.current.getValue();
try {
const response = await axios.post(
"https://api..primarybacked.com/submit-code",
{ code }
);
const { executionId } = response.data;
const checkStatus = async () => {
const statusResponse = await axios.get(
`https://api..primarybacked.com/check-status/${executionId}`
);
const { status, result } = statusResponse.data;
if (status === "Executed") {
setOutput(result);
} else {
setTimeout(checkStatus, 5000); // Polling every 5 seconds
}
};
checkStatus();
} catch (error) {
console.error("Error submitting code:", error);
setOutput("Error submitting code");
}
}
}
return (
<div className="flex h-screen w-full flex-col bg-gray-950 text-gray-50">
<header className="flex items-center justify-between border-b border-gray-800 px-4 py-4 sm:px-6">
<div className="flex items-center gap-4">
<span className="text-xl font-semibold">Code Playground</span>
</div>
<Button onClick={showValue}>Run</Button>
</header>
<div className="flex-1 overflow-hidden">
<div className="grid h-full grid-cols-1 gap-6 p-4 sm:grid-cols-[1fr_400px] sm:p-6">
<div className="flex h-full flex-col gap-6 overflow-hidden rounded-lg border border-gray-800 bg-gray-900">
<div className="flex-1 overflow-auto p-4">
<Editor
defaultLanguage="javascript"
theme="vs-dark"
defaultValue="console.log('Hello, world!');"
onMount={handleEditorDidMount}
options={{}}
/>
</div>
</div>
<div className="flex h-full flex-col gap-6 overflow-hidden rounded-lg border border-gray-800 bg-gray-900">
<div className="flex-1 overflow-auto p-4">
<pre className="whitespace-pre-wrap break-words font-mono text-sm">
{output}
</pre>
</div>
</div>
</div>
</div>
</div>
);
}
export default App;
Primary Backend: Processing Requests
The primary backend is built using EC2 instances, SQS, and DynamoDB. It records each submission in DynamoDB, forwards it to the SQS queue, and returns the execution output to the user once it is available.
const express = require('express');
const { DynamoDBClient, PutItemCommand, GetItemCommand } = require("@aws-sdk/client-dynamodb");
const { v4: uuidv4 } = require('uuid');
const { SQSClient, SendMessageCommand ,ReceiveMessageCommand} = require("@aws-sdk/client-sqs");
const cors = require('cors');
const app = express();
app.use(express.json());
require('dotenv').config();
app.use(cors({
origin: '*',
methods: ['GET', 'POST', 'PUT', 'DELETE', 'OPTIONS'],
}))
const config = {
region: "eu-north-1",
credentials: {
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
}
};
const DBclient = new DynamoDBClient(config);
const sqsClient = new SQSClient(config);
app.post('/submit-code', async (req, res) => {
console.log(req.body)
const executionId = uuidv4();
const { code } = req.body;
const input = {
TableName: 'code',
Item: {
executionId: { S: executionId },
code: { S: code },
status: { S: 'pending' },
result: { S: '' },
},
};
try {
await DBclient.send(new PutItemCommand(input));
const sqsParams = {
QueueUrl: process.env.QUEUE_URL,
MessageBody: JSON.stringify({ executionId }),
MessageDeduplicationId: executionId,
MessageGroupId: "CodeSubmissionGroup",
};
await sqsClient.send(new SendMessageCommand(sqsParams));
res.json({ executionId, message: "Code submitted successfully" });
} catch (error) {
console.error("Error submitting code:", error);
res.status(500).json({ error: "Failed to submit code" });
}
});
app.get('/check-status/:executionId', async (req, res) => {
const { executionId } = req.params;
// Peek at the queue (VisibilityTimeout 0 keeps messages visible to the workers)
// to see whether this submission is still waiting to be processed.
const params = {
QueueUrl: process.env.QUEUE_URL,
MaxNumberOfMessages: 10,
WaitTimeSeconds: 0,
VisibilityTimeout: 0,
};
try {
const data = await sqsClient.send(new ReceiveMessageCommand(params));
let found = false;
if (data.Messages) {
for (const message of data.Messages) {
const body = JSON.parse(message.Body);
if (body.executionId === executionId) {
found = true;
break;
}
}
}
if (found) {
res.json({ status: 'pending', result: '' });
} else {
const dbParams = {
TableName: 'code',
Key: {
executionId: { S: executionId },
},
};
const dbData = await DBclient.send(new GetItemCommand(dbParams));
if (dbData.Item) {
const status = dbData.Item.status.S;
const result = dbData.Item.result.S;
res.json({ status, result });
} else {
res.status(404).json({ error: "Execution ID not found" });
}
}
} catch (error) {
console.error("Error checking status:", error);
res.status(500).json({ error: "Failed to check status" });
}
});
app.listen(3000, () => {
console.log('Primary backend listening on port 3000!');
});
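The backend above assumes a DynamoDB table named code with executionId as its partition key, and, since it sends MessageGroupId and MessageDeduplicationId, a FIFO queue. Here is a minimal one-time setup sketch; the queue name and billing mode are assumptions, not values from this project.

// setup.js: one-time setup sketch (run once, e.g. from Cloud9).
const { DynamoDBClient, CreateTableCommand } = require("@aws-sdk/client-dynamodb");
const { SQSClient, CreateQueueCommand } = require("@aws-sdk/client-sqs");

const config = { region: "eu-north-1" };
const db = new DynamoDBClient(config);
const sqs = new SQSClient(config);

async function setup() {
  // Table keyed by executionId, matching the PutItem/GetItem calls above.
  await db.send(new CreateTableCommand({
    TableName: "code",
    AttributeDefinitions: [{ AttributeName: "executionId", AttributeType: "S" }],
    KeySchema: [{ AttributeName: "executionId", KeyType: "HASH" }],
    BillingMode: "PAY_PER_REQUEST", // assumption: on-demand capacity
  }));
  // FIFO queue, because the producer sets MessageGroupId / MessageDeduplicationId.
  const { QueueUrl } = await sqs.send(new CreateQueueCommand({
    QueueName: "code-queue.fifo", // assumption: any name, but it must end in .fifo
    Attributes: { FifoQueue: "true" },
  }));
  console.log("Set QUEUE_URL in .env to:", QueueUrl);
}

setup().catch(console.error);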
Worker node: Processing code
The worker nodes run on EC2 instances. They pull messages from the SQS queue, execute the submitted code, and store the results in DynamoDB.
const express = require('express');
const safeEval = require('safe-eval');
const { DynamoDBClient, UpdateItemCommand, GetItemCommand } = require("@aws-sdk/client-dynamodb");
const { SQSClient, ReceiveMessageCommand, DeleteMessageCommand } = require("@aws-sdk/client-sqs");
require('dotenv').config();
const app = express();
app.use(express.json());
const config = {
region: "eu-north-1",
credentials: {
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
}
};
const DBclient = new DynamoDBClient(config);
const sqsClient = new SQSClient(config);
const processMessage = async (message) => {
try {
const { executionId } = JSON.parse(message.Body);
const { Item } = await DBclient.send(new GetItemCommand({
TableName: 'code',
Key: {
executionId: { S: executionId },
},
}));
if (!Item) {
console.log("Item not found for executionId:", executionId);
return;
}
const { code } = Item;
let outputString = '';
try {
const output = await safeEval(code.S);
outputString = output !== null && output !== undefined ? output.toString() : 'Code did not return any output';
} catch (error) {
outputString = "this code cant be executed";
}
const input = {
TableName: 'code',
Key: {
executionId: { S: executionId },
},
UpdateExpression: 'SET #status = :status, #result = :result',
ExpressionAttributeNames: {
'#status': 'status',
'#result': 'result'
},
ExpressionAttributeValues: {
':status': { S: 'Executed' },
':result': { S: outputString },
},
};
await DBclient.send(new UpdateItemCommand(input));
await sqsClient.send(new DeleteMessageCommand({
QueueUrl: process.env.QUEUE_URL,
ReceiptHandle: message.ReceiptHandle,
}));
} catch (error) {
console.error("Error processing message:", error);
}
};
const processMessages = async () => {
while (true) {
try {
const { Messages } = await sqsClient.send(new ReceiveMessageCommand({
QueueUrl: process.env.QUEUE_URL,
WaitTimeSeconds: 10,
}));
if (Messages && Messages.length > 0) {
for (const message of Messages) {
await processMessage(message);
}
}
} catch (error) {
console.error("Error receiving messages:", error);
}
await new Promise(resolve => setTimeout(resolve, 10000));
}
};
processMessages();
const port = 4000;
app.listen(port, () => {
console.log(`Worker node listening on port ${port}!`);
});
Here is the complete architecture diagram
This project shows how different AWS services can be combined to build an app that handles many users and works efficiently. By combining S3, CloudFront, API Gateway, Route 53, ELB, EC2, SQS, DynamoDB, and Cloud9, the application can process tasks asynchronously and adjust to how many people are using it. This setup makes sure the app is always available, reliable, and fast.
See you next time 👋