Hello there.
My name is Alan Terriaga, and I am a software engineer who likes to write about new tech and DevOps and, most of all, to share and learn new ideas.
This is my first post here on Dev. I used to write on my own website, but over time I found that a platform like Medium or Dev.to would serve my needs better. So far I have chosen Dev.to, and I am liking what I am seeing.
A while ago, I wrote about how to create a MERN stack with the frameworks we are familiar with: MongoDB, ExpressJS, ReactJS and NodeJS. Although a project created from that post would still work, the code would not serve today's needs. And as we know, we need to adapt and evolve with the current frameworks to improve our skills and work.
Therefore, I have decided to make my first article about how to create a MERN stack using current frameworks and, beyond that, how to adapt a MERN stack to a cloud database and cloud development. The solution is a simple Students CRUD application divided into two projects: one for the server side and the other for the client.
For this first part of the article, we will cover the server-side of the project and aspects like:
- The creation of Mongo database on Azure CosmosDB
- The server-side NodeJS server
- The creation of the API services.
- Webpack & Typescript ES6 configuration.
As I describe the project below, I will point out new techniques we can use to improve and facilitate our work. Details that we didn't have some years or even months ago.
Requirements for this article:
- Mongo, NodeJS and Typescript basic knowledge.
- Azure free subscription.
MERN SERVER-SIDE.
1 - MongoDB on Azure CosmosDB.
As we know, MongoDB is a document database, which means it stores data in JSON-like documents. Azure CosmosDB is a fully managed NoSQL database service that guarantees 99.999 percent availability and offers open-source APIs for MongoDB and Cassandra. Companies like Coca-Cola and Symantec are using Cosmos DB solutions. You can learn more about Azure Cosmos DB at https://azure.microsoft.com/en-au/services/cosmos-db/#featured.
*If you are not interested in Azure CosmosDB or prefer AWS DynamoDB, that is OK. It won't impact the development of the APIs.
On the Azure Portal, go to Azure Cosmos DB:
- Add database
- Select your Azure Subscription.
- Select or Create a Resource Group.
- API: Azure Cosmos DB for MongoDB API
- Location: At your convenience.
- Account Type: Non-Production (for Azure free subscription)
- Availability Zones: Disabled.
- Review and create: It will take some minutes.
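If you prefer the command line, the same account can be provisioned with the Azure CLI. This is only a sketch: the resource group and account names (`mern-demo-rg`, `mern-demo-account`) and the location are hypothetical placeholders, so adjust them to your own subscription:

```shell
# Hypothetical names -- replace with your own resource group, account name and region.
az login
az group create --name mern-demo-rg --location australiaeast

# Create a Cosmos DB account exposing the MongoDB API
az cosmosdb create \
  --name mern-demo-account \
  --resource-group mern-demo-rg \
  --kind MongoDB \
  --default-consistency-level Session

# Retrieve the connection string used later in the .env file
az cosmosdb keys list \
  --name mern-demo-account \
  --resource-group mern-demo-rg \
  --type connection-strings
```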
2 - Server Project.
The project consists of one server configuration, two routes, one service and one model class. Most of the project was developed in TypeScript ES6 instead of standard JavaScript, even the server configuration. So, for the bundle translation, Webpack and Babel are used.
Frameworks
- NodeJs
- Nodemon
- Webpack 5
- Babel
- ExpressJS
- Typescript
- Mongoose
- Ts-node
Project structure
Node Server
The NodeJS server of the project is configured in src/index.ts.
import express from 'express';
import bodyParser from 'body-parser';
import dotenv from 'dotenv';
import mongoose from 'mongoose';
import router from './routes/index.routes';

//==================================================================================

// Setting the application
dotenv.config();

// Connect to the Azure Cosmos DB
mongoose.Promise = global.Promise;
mongoose.connect(
    `${process.env.DATABASE_URI}`,
    { useNewUrlParser: true }
)
.then(() => {
    return console.log(`DB Connected`);
})
.catch(error => {
    console.log("Error connecting to database: ", error);
    return process.exit(1);
});

const db = mongoose.connection;
db.on('error', console.error.bind(console, 'connection error:'));
db.once('open', () => console.log('LOG: database connected'));

const app = express();
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

// CORS config
app.use((req, res, next) => {
    res.header("Access-Control-Allow-Origin", "*"); // update to match the domain you will make the request from
    res.header("Access-Control-Allow-Credentials", "true");
    res.header("Access-Control-Allow-Headers", "*");
    res.header("Access-Control-Allow-Methods", "POST, PUT, GET, OPTIONS");
    next();
});

app.use('/', router);

const PORT = process.env.PORT || 3000;

// ExpressJS listen method to run the project on http://localhost:3000
app.listen(PORT, () => console.log(`App is running in ${process.env.NODE_ENV} mode on port ${PORT}`));
Understanding the code.
First of all, the project loads the environment variables. For that, dotenv is used: an npm library that allows you to define environment variables instead of hard-coding sensitive data inside the server file. After the installation, you need to create a .env file in your project's root, like the example below.
NODE_ENV=development
PORT=3000
DATABASE_URI=<mongoURI>
By the way, if you are using Azure like me, you can find the database URI on the Azure console > Your db > Settings > Connection String.
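Under the hood, all dotenv does at startup is read that file and load each KEY=VALUE pair into process.env. A minimal stdlib-only sketch of that parsing step (illustration only; the real library also handles quoting, comments and other edge cases):

```typescript
// Sketch of what dotenv does under the hood: parse KEY=VALUE lines.
// Illustration only -- use the real dotenv library in the project.
function parseEnv(content: string): Record<string, string> {
  const vars: Record<string, string> = {};
  for (const line of content.split(/\r?\n/)) {
    // Capture "KEY = VALUE", trimming surrounding whitespace
    const match = line.match(/^\s*([\w.-]+)\s*=\s*(.*?)\s*$/);
    if (match) {
      vars[match[1]] = match[2];
    }
  }
  return vars;
}

const parsed = parseEnv('NODE_ENV=development\nPORT=3000');
console.log(parsed.PORT); // prints 3000
```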
Now, with our environment variables set, we can connect to Azure using Mongoose.
The other piece of the MERN stack is [ExpressJS](https://expressjs.com), a flexible Node.js web application framework that provides quick and easy API creation. It is through ExpressJS that the client project will access the API services. But before that, we need to configure Express on our server.
const app = express();
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

// CORS config
app.use((req, res, next) => {
    res.header("Access-Control-Allow-Origin", "*"); // update to match the domain you will make the request from
    res.header("Access-Control-Allow-Credentials", "true");
    res.header("Access-Control-Allow-Headers", "*");
    res.header("Access-Control-Allow-Methods", "POST, PUT, GET, OPTIONS");
    next();
});

app.use('/', router);
The code above sets the project to accept and parse only JSON in the requests. It also sets the CORS configuration: you have to declare which headers and methods are allowed to access the API from another domain. Lastly, I set Express to route the path "/" to the router class, which is described further in this post.
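To see exactly which headers the CORS middleware adds, it can be exercised in isolation with a fake response object. This is an illustrative sketch with mocked req/res/next; in the real project Express supplies these objects:

```typescript
// Sketch: run the CORS middleware against a minimal fake response object
// to inspect the headers it sets. Illustration only.
type Next = () => void;

interface FakeRes {
  headers: Record<string, string>;
  header(name: string, value: string): void;
}

const corsMiddleware = (req: unknown, res: FakeRes, next: Next) => {
  res.header('Access-Control-Allow-Origin', '*');
  res.header('Access-Control-Allow-Credentials', 'true');
  res.header('Access-Control-Allow-Headers', '*');
  res.header('Access-Control-Allow-Methods', 'POST, PUT, GET, OPTIONS');
  next();
};

// Fake response that records every header it receives
const res: FakeRes = {
  headers: {},
  header(name, value) {
    this.headers[name] = value;
  },
};

let called = false;
corsMiddleware({}, res, () => { called = true; });
console.log(res.headers['Access-Control-Allow-Origin']); // prints *
```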
As you noticed, I am using TypeScript to set up the server. For this to work, we need to set up Webpack, ts-loader and Babel to transpile the TypeScript into CommonJS in the bundle file.
/webpack.config.cjs
const path = require('path');
const webpackNodeExternals = require('webpack-node-externals');

const isProduction = process.env.NODE_ENV === 'production';
const devtool = isProduction ? false : 'inline-source-map';

const serverConfig = {
    target: 'node',
    entry: './src/index.ts', // Server NodeJS file
    output: {
        path: path.join(__dirname, 'dist'), // Specify bundle location directory
        filename: 'bundle.js',
    },
    externals: [webpackNodeExternals()], // Does not include code from node_modules in the server-side bundle
    devtool, // Inline source maps only outside production
    resolve: {
        extensions: ['.js', '.jsx', '.json', '.ts', '.tsx'], // Specify extensions used in the project
    },
    module: {
        rules: [
            {
                test: /\.(tsx|ts)$/,
                use: 'ts-loader', // With ts-loader, tsx and ts files are translated into the bundle
                exclude: /node_modules/
            }
        ]
    }
};

module.exports = [serverConfig];
/babel.config.json
{
    "presets": [
        ["@babel/env"]
    ]
}
As the last part of the ES6 configuration, I set in the tsconfig.json file the bundle output location, the module to be used in the parsing and, most importantly, the module resolution, which in our case is Node.
/tsconfig.json
{
    "compilerOptions": {
        "baseUrl": "./src",
        "outDir": "./dist/",
        "noImplicitAny": false,
        "module": "CommonJs",
        "target": "ESNext",
        "moduleResolution": "node",
        "allowJs": true,
        "strict": true,
        "allowSyntheticDefaultImports": true,
        "sourceMap": true,
        "esModuleInterop": true,
        "typeRoots": [
            "node_modules/@types"
        ],
        "lib": [
            "ESNext",
            "DOM"
        ]
    },
    "include": [
        "./src"
    ]
}
Now that we have set up our server, connected to CosmosDB and configured the project to transpile ES6 to CommonJS, we can create our APIs.
Defining routes.
Through the Express Router, I create a standard route to test the project and another to perform the CRUD operations.
src/routes/index.routes.ts
import { Router } from 'express';
import studentRouter from './student.routes';

const router = Router();

router.get('/', (req, res) => {
    res.status(200).send('MERN Azure running - Server');
});

router.use('/student', studentRouter);

export default router;
src/routes/student.routes.ts
import { Request, Response, Router } from 'express';
import { StudentInterface, StudentSearchRequestDto } from '../model/student.model';
import {
    getStudents,
    insertStudent,
    updateStudent,
    deleteStudent
} from '../service/student.service';
import moment from 'moment';

const studentRouter = Router();

/**
 * POST: Get Students list
 */
studentRouter.post('/list', (req: Request<StudentSearchRequestDto>, res: Response) => {
    getStudents(req, res);
});

/**
 * POST: Insert Student
 */
studentRouter.post('/', (req: Request<StudentInterface>, res: Response) => {
    insertStudent(req, res);
});

/**
 * PUT: Update Student
 */
studentRouter.put('/', (req: Request<StudentInterface>, res: Response) => {
    if (req.body && req.body.dateOfBirth) {
        const dateMomentObject = moment(req.body.dateOfBirth, "DD/MM/YYYY");
        req.body.dateOfBirth = dateMomentObject.toISOString();
    }
    updateStudent(req, res);
});

/**
 * POST: Inactivate Student (soft delete)
 */
studentRouter.post('/inactive', (req: Request, res: Response) => {
    deleteStudent(req, res);
});

export default studentRouter;
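The PUT route above uses moment to turn a DD/MM/YYYY string into an ISO string before persisting it. The same conversion can be sketched with the standard Date API only (note: moment parses in local time, while this sketch pins the date to UTC, so the results can differ across timezones):

```typescript
// Sketch: convert a DD/MM/YYYY date string to an ISO-8601 string,
// mirroring what the moment call in the PUT route does.
// Date.UTC is used so the result does not depend on the server's timezone.
function dateOfBirthToIso(ddmmyyyy: string): string {
  const [day, month, year] = ddmmyyyy.split('/').map(Number);
  return new Date(Date.UTC(year, month - 1, day)).toISOString();
}

console.log(dateOfBirthToIso('31/12/2000')); // prints 2000-12-31T00:00:00.000Z
```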
Understanding the code.
Using TypeScript in the project, we get the benefit of setting generics on the request classes. This makes the project stronger at compile-time checks and also makes the code easier to read.
One detail to notice in the code is that I am using the POST method instead of GET for the student search API. That is because I am expecting multiple parameters for the filter.
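For illustration, a request body for the POST /student/list search, following the StudentSearchRequestDto shape, could look like this (the values are hypothetical):

```json
{
  "name": "Alan",
  "skills": ["typescript", "mongodb"]
}
```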
Mapping Mongo collections and model classes.
For the CRUD operations to work correctly, we need to map the Mongo collections we are going to work on, and the interfaces used as generic types on the API request classes.
src/model/student.model.ts
import mongoose, { Schema, Document } from 'mongoose';

export interface StudentSearchRequestDto {
    name: string,
    skills: string[];
}

export interface StudentInterface extends Document {
    _id: string,
    firstName: string,
    lastName: string,
    dateOfBirth: Date,
    country: string,
    skills: string[],
    inactive: boolean
}

const StudentSchema: Schema = new Schema(
    {
        _id: { type: String, unique: true },
        firstName: { type: String, required: false },
        lastName: { type: String, required: false },
        dateOfBirth: { type: Date, required: false },
        country: { type: String, required: false },
        skills: { type: [String], required: false },
        inactive: { type: Boolean, default: false }
    },
    {
        collection: 'student' // Without this attribute the collection won't be retrieved
    }
);

// model name, schema, ?collection name
const Student = mongoose.model<StudentInterface>('student', StudentSchema);

export default Student;
Understanding the code.
Some details to point out here: the attribute _id is set as a String type because, by default, MongoDB creates it as an ObjectID, which can cause parsing problems between the database and the application. I recommend saving _id as a String while still generating it in UUIDv4 format. The second point is that, thanks to TypeScript, we can set the interface used by the API responses as the generic type on the schema.
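As a side note, since Node 14.17 a UUIDv4 string can also be generated with the built-in crypto module, without an extra dependency (the project itself uses the uuid package's v4 function):

```typescript
import { randomUUID } from 'crypto';

// Sketch: generate a UUIDv4 string for _id with Node's built-in crypto module.
// The project uses the uuid package's v4 function, which produces the same format.
const id: string = randomUUID();
console.log(id); // e.g. '3b241101-e2bb-4255-8caf-4136c566a962'
```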
Setting services.
Now it is time to set up the services that perform the CRUD operations. Since the code of this class is long, I will break it into pieces.
Search operation
import Student, { StudentInterface, StudentSearchRequestDto } from '../model/student.model';
import { Request, Response } from 'express';
import { FilterQuery } from 'mongoose';
import mongoose from 'mongoose';
import _ from 'lodash';
import { v4 as uuidv4 } from 'uuid';

/**
 * Search Students by name or skills
 *
 * @param req
 * @param res
 */
async function getStudents(req: Request<StudentSearchRequestDto>, res: Response<Array<StudentInterface>>) {
    const query = Student.find();
    const filterQueryArray: Array<FilterQuery<StudentInterface>> = new Array<FilterQuery<StudentInterface>>();

    filterQueryArray.push({ inactive: { $ne: true } });

    if (req.body.name) {
        filterQueryArray.push({ firstName: { $regex: req.body.name } });
        filterQueryArray.push({ lastName: { $regex: req.body.name } });
    }
    if (!_.isEmpty(req.body.skills)) {
        filterQueryArray.push({ skills: { $all: req.body.skills } });
    }
    if (!_.isEmpty(filterQueryArray)) {
        query.or(filterQueryArray);
    }

    await query
        .sort({ firstName: 1, lastName: 1 })
        .exec()
        .then(students => {
            console.log('**** SUCCESS');
            return res.send(students);
        })
        .catch(err => {
            console.log(err);
        });
}
Understanding the code.
The Mongoose Model find() method returns a Query object. So, in case you need to perform a more robust query on your search, you can create the Query object and then attach an array of FilterQuery with all the conditions you want, just like I am doing in the code above.
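To make the filter semantics concrete, here is the same name/skills filtering expressed over a plain in-memory array. This is only an illustrative sketch with hypothetical data; note that it combines the conditions with AND, while the service attaches them to the query through or(), so Mongo evaluates them as alternatives:

```typescript
// Sketch: the student search filter over an in-memory array, mirroring
// the $ne / $regex / $all conditions the service builds for Mongo.
// The interface and sample data are illustrative only.
interface Student {
  firstName: string;
  lastName: string;
  skills: string[];
  inactive: boolean;
}

function searchStudents(students: Student[], name?: string, skills: string[] = []): Student[] {
  return students.filter(s => {
    if (s.inactive) return false;                                       // inactive: { $ne: true }
    const nameOk = !name ||
      s.firstName.includes(name) || s.lastName.includes(name);          // $regex on either name field
    const skillsOk = skills.every(skill => s.skills.includes(skill));   // skills: { $all: [...] }
    return nameOk && skillsOk;
  });
}

const data: Student[] = [
  { firstName: 'Ada', lastName: 'Lovelace', skills: ['math'], inactive: false },
  { firstName: 'Alan', lastName: 'Turing', skills: ['math', 'crypto'], inactive: false },
  { firstName: 'Old', lastName: 'Record', skills: [], inactive: true },
];

console.log(searchStudents(data, 'Alan').length); // prints 1
```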
For the query sorting function to work, you will need to create an index on your Mongo collection. For that, we need to go back to our Azure database and create the index. At the moment, a compound index can only be created via a shell command. You can open a shell window on the AzureCosmosDB portal > Data Explorer > Open Mongo Shell.
The code below creates a compound index based on the attributes I want to order my list.
db.student.createIndex({"firstName": 1, "lastName": 1})
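The sort({ firstName: 1, lastName: 1 }) call this index supports orders ascending by first name and then by last name. Expressed as a plain comparator (a sketch with hypothetical data, not Mongo's actual collation):

```typescript
// Sketch: the equivalent of sort({ firstName: 1, lastName: 1 }) as a
// plain comparator -- ascending by firstName, then lastName as tiebreaker.
interface StudentName {
  firstName: string;
  lastName: string;
}

function byName(a: StudentName, b: StudentName): number {
  return a.firstName.localeCompare(b.firstName) ||
         a.lastName.localeCompare(b.lastName);
}

const list: StudentName[] = [
  { firstName: 'Grace', lastName: 'Hopper' },
  { firstName: 'Alan', lastName: 'Turing' },
  { firstName: 'Alan', lastName: 'Kay' },
];

console.log(list.sort(byName).map(s => s.lastName)); // [ 'Kay', 'Turing', 'Hopper' ]
```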
Insert and Update operations
/**
 * Insert new Student
 *
 * @param req
 * @param res
 */
async function insertStudent(req: Request<StudentInterface>, res: Response) {
    // req.body._id = new mongoose.Types.ObjectId();
    req.body._id = uuidv4();
    console.log(`_ID: ${req.body._id}`);

    await Student
        .create({
            _id: req.body._id,
            firstName: req.body.firstName,
            lastName: req.body.lastName,
            dateOfBirth: req.body.dateOfBirth,
            country: req.body.country,
            skills: req.body.skills
        })
        .then(student => {
            return res.status(200).send();
        })
        .catch(err => {
            console.log(err);
        });
}
/**
 * Update Student data
 *
 * @param req
 * @param res
 */
async function updateStudent(req: Request<StudentInterface>, res: Response) {
    await Student
        .updateOne(
            { _id: { $in: req.body._id } },
            {
                firstName: req.body.firstName,
                lastName: req.body.lastName,
                dateOfBirth: req.body.dateOfBirth,
                country: req.body.country,
                skills: req.body.skills
            },
            { upsert: false }
        )
        .then(student => {
            return res.status(200).send();
        })
        .catch(err => console.log(err));
}
/**
 * Delete (inactivate) Student data
 *
 * @param req
 * @param res
 */
async function deleteStudent(req: Request<any>, res: Response) {
    if (!req.body || !req.body.ids) {
        res.status(400).send();
        return;
    }

    await Student
        .updateMany(
            { _id: { $in: req.body.ids } },
            { inactive: true },
            { upsert: false }
        )
        .then(student => {
            return res.status(200).send();
        })
        .catch(err => console.log(err));
}
export {
    getStudents,
    insertStudent,
    updateStudent,
    deleteStudent
};
No mystery here; just to point out that I am using the Mongoose model objects directly in the services for the rest of our operations.
Testing the server.
To summarize, we have created the Node server, configured the project to transpile the ES6 code into the bundle, mapped the Mongo collections and created the services for our CRUD operations.
Now let's start the server in the local environment and test the APIs.
As you can see, our APIs are now ready to be consumed by the front-end. And with that, we finish the first part of this 2021 MERN article. Stay tuned for the next posts; I still have many topics to cover, like the front-end project, test cases and deployment on Azure Services.
If you have stayed with me until the end, thank you very much. And check out the project on GitHub: mern-azure-server
Please feel free to comment with suggestions or tips.
See ya.
Alan Terriaga.