Written by Diogo Souza
Everybody’s talking about OAuth 2.0.
Regardless of the size of the company you work for or the number of services and APIs you’re running in the background, there’s a good chance you need OAuth2 (if you’re not already using it).
Given the huge amount of information out there, and the many tools and frameworks available for different languages and platforms, it can get really hard to understand the protocol and apply it correctly to your projects. And it’s important to do that.
When it comes to JavaScript (and Node.js more specifically), the choice also depends on factors such as which server you’re using and whether it already provides OAuth2 support. It’s also important to consider the maturity of the project, its docs, and its community.
With that in mind, node-oauth2-server comes to the rescue. It is a framework-agnostic module for implementing an OAuth2 server in Node.js. It is open source, simple, and easy to integrate with your Node apps (even if they’ve already been running for a while).
Within its docs, you can find the official Model Specification that describes how your JS code must override the default OAuth2 functions to provide your customized auth experience.
const model = {
  // We support returning promises.
  getAccessToken: function() {
    return Promise.resolve('works!');
  },

  // Or, calling a Node-style callback.
  getAuthorizationCode: function(done) {
    done(null, 'works!');
  },

  // Or, using generators.
  getClient: function*() {
    yield somethingAsync();
    return 'works!';
  },

  // Or, async/await (using Babel).
  getUser: async function() {
    await somethingAsync();
    return 'works!';
  }
};

const OAuth2Server = require('oauth2-server');
let oauth = new OAuth2Server({model: model});
With the OAuth2Server object in hand, you can override the default OAuth2 provider of your Express server and easily provide your own auth experience.
Please refer to the official docs for more info on how the framework works behind the scenes.
In this article, we’ll explore a bit of this framework by developing our own implementation of its model and testing it through a real API, so you can see the project in action blocking and allowing access to a specific endpoint.
We’ll also integrate it with a Postgres database to make the example more robust and realistic.
For the sake of simplicity, our example will explore the password grant type of OAuth2.
Based on this example, you can move on and adapt the implementation to the other types.
Setup
First, let’s install the prerequisites. Make sure you have Postgres installed for your OS.
After you’ve installed it successfully, create a new database called “logrocket_oauth2” and run the following SQL to create our user and access token tables:
CREATE TABLE public.users
(
    id serial,
    username text,
    user_password text,
    PRIMARY KEY (id)
)
WITH (
    OIDS = FALSE
);

ALTER TABLE public.users
    OWNER to postgres;

CREATE TABLE public.access_tokens
(
    id serial,
    access_token text,
    user_id integer,
    PRIMARY KEY (id)
)
WITH (
    OIDS = FALSE
);

ALTER TABLE public.access_tokens
    OWNER to postgres;
We’ve simplified the tables as much as possible, so columns for creation and update timestamps aren’t covered here.
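If you do want them later, it’s a one-line change per table (just a sketch; not required for this example):

-- Optional audit columns (sketch)
ALTER TABLE public.users ADD COLUMN created_at timestamptz DEFAULT now();
ALTER TABLE public.access_tokens ADD COLUMN created_at timestamptz DEFAULT now();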
Next, create a new folder named logrocket-oauth2-example in the directory of your choosing, and run the npm init command to initialize it with your package.json file.
Then, run the following command to install the dependencies we’ll need:
npm install bluebird body-parser express pg node-oauth2-server crypto
Note that they cover the Postgres integration with Node, the Express server, the node-oauth2-server dependency itself, and crypto (to provide some features for password hashing; crypto is also built into recent Node versions, so installing it from npm is optional).
You can also run the commands with Yarn, if you prefer. In that case, please follow the instructions stated here.
Finally, make sure to reproduce the following folder structure:
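In plain text, based on the files we’ll create throughout this article, it looks like this:

logrocket-oauth2-example/
├── auth/
│   ├── authenticator.js
│   ├── routes.js
│   └── tokenService.js
├── db/
│   ├── pgWrapper.js
│   ├── tokenDB.js
│   └── userDB.js
├── test/
│   ├── testAPIRoutes.js
│   └── testAPIService.js
├── index.js
└── package.json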
Database layer
Now, let’s move on to the database setup. After you’ve created the database and tables successfully, we’ll need a Postgres wrapper to encapsulate the queries we’re going to make in the db.
Inside the db folder, insert the following code into the pgWrapper.js file:
module.exports = {
  query: query,
};

const Pool = require("pg").Pool;

// Create the pool once and reuse it for every query
// (adjust the connection settings to match your database).
const pool = new Pool({
  user: "postgres",
  host: "localhost",
  database: "logrocket_oauth2",
  password: "postgres",
  port: 5432,
});

function query(queryString, cbFunc) {
  pool.query(queryString, (error, results) => {
    cbFunc(setResponse(error, results));
  });
}

// Always hand callers the same { error, results } shape.
function setResponse(error, results) {
  return {
    error: error,
    results: results ? results : null,
  };
}
The most important part of this code is the query() function. Instead of passing the Postgres connection pool object around everywhere, we centralize it in this file and export only this function to the outside world.
It’s pretty simple: it’s made of a pg Pool instance (make sure to change the connection properties to match your database) and a callback function that will always receive a JSON object composed of error and results properties. We’ll pass results through as-is for simplicity.
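For reference, here’s a minimal sketch of how the wrapper is meant to be consumed (the SELECT NOW() query is just a placeholder):

// Minimal usage sketch of db/pgWrapper.js (the query is a placeholder).
const pgPool = require("./db/pgWrapper");

pgPool.query("SELECT NOW()", ({ error, results }) => {
  if (error) {
    return console.error(error);
  }

  // results is the pg result object; rows holds the returned records.
  console.log(results.rows[0]);
});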
Next, we’re going to need two repositories to handle the database operations for users and tokens. The first one is the userDB.js file:
let pgPool;

module.exports = (injectedPgPool) => {
  pgPool = injectedPgPool;

  return {
    register: register,
    getUser: getUser,
    isValidUser: isValidUser,
  };
};

const crypto = require("crypto");

function register(username, password, cbFunc) {
  // Store a SHA-256 hash of the password rather than plain text.
  const shaPass = crypto.createHash("sha256").update(password).digest("hex");

  const query = `INSERT INTO users (username, user_password) VALUES ('${username}', '${shaPass}')`;

  pgPool.query(query, cbFunc);
}

function getUser(username, password, cbFunc) {
  const shaPass = crypto.createHash("sha256").update(password).digest("hex");

  const getUserQuery = `SELECT * FROM users WHERE username = '${username}' AND user_password = '${shaPass}'`;

  pgPool.query(getUserQuery, (response) => {
    cbFunc(
      false,
      response.results && response.results.rowCount === 1
        ? response.results.rows[0]
        : null
    );
  });
}

function isValidUser(username, cbFunc) {
  const query = `SELECT * FROM users WHERE username = '${username}'`;

  // "Valid" here means the username is not taken yet.
  const checkUsrcbFunc = (response) => {
    const isValidUser = response.results
      ? !(response.results.rowCount > 0)
      : null;

    cbFunc(response.error, isValidUser);
  };

  pgPool.query(query, checkUsrcbFunc);
}
Our user repository boils down to three operations: registering, fetching, and validating a user.
Note that at the beginning of the file we’re injecting the pgPool we created before. For this code to work, we still need to pass that parameter to the module’s constructor in the index.js file.
Each function makes use of our previously created query function: the query string goes in as the first argument, and the callback that receives the error/results composition (the result of our execution) goes in as the second.
Plus, we’re interpolating the params via template literals (${}) to simplify the concatenation. However, to guard against SQL injection, you can also use parameterized queries by passing the values as an array in the second (optional) argument of pg’s query function.
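As a sketch of that safer approach (the extra values parameter is an assumption; the wrapper above doesn’t accept it):

// Sketch: a parameterized variant. The extra "values" parameter is an
// assumption; the wrapper shown earlier does not forward it.

// in db/pgWrapper.js
function query(queryString, values, cbFunc) {
  pool.query(queryString, values, (error, results) => {
    cbFunc({ error: error, results: results ? results : null });
  });
}

// in db/userDB.js, register() could then skip string interpolation entirely:
function register(username, password, cbFunc) {
  const shaPass = crypto.createHash("sha256").update(password).digest("hex");

  pgPool.query(
    "INSERT INTO users (username, user_password) VALUES ($1, $2)",
    [username, shaPass],
    cbFunc
  );
}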
Finally, the pg package returns the values in the results object, but there isn’t a length property on it; this differs from other database drivers, like MySQL’s. To check whether any rows came back, we need to access the rowCount property.
Note that we’re passing around a lot of callback functions rather than relying on return values, which keeps the whole architecture asynchronous. Feel free to adapt this to your own style.
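If you’d rather work with promises, one possible adaptation (a sketch, not part of the article’s code) is to wrap the callbacks with Node’s built-in util.promisify:

// Promise-based adaptation (sketch); getUser follows the (err, result)
// callback convention, so util.promisify can wrap it directly.
const { promisify } = require("util");

const pgPool = require("./db/pgWrapper");
const userDB = require("./db/userDB")(pgPool);

const getUserAsync = promisify(userDB.getUser);

getUserAsync("someUser", "somePassword")
  .then((user) => console.log(user || "No matching user"))
  .catch((err) => console.error(err));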
Now, let’s go to the tokenDB.js implementation:
let pgPool;

module.exports = (injectedPgPool) => {
  pgPool = injectedPgPool;

  return {
    saveAccessToken: saveAccessToken,
    getUserIDFromBearerToken: getUserIDFromBearerToken,
  };
};

function saveAccessToken(accessToken, userID, cbFunc) {
  const getUserQuery = `INSERT INTO access_tokens (access_token, user_id) VALUES ('${accessToken}', ${userID});`;

  pgPool.query(getUserQuery, (response) => {
    cbFunc(response.error);
  });
}

function getUserIDFromBearerToken(bearerToken, cbFunc) {
  const getUserIDQuery = `SELECT * FROM access_tokens WHERE access_token = '${bearerToken}';`;

  pgPool.query(getUserIDQuery, (response) => {
    const userID =
      response.results && response.results.rowCount == 1
        ? response.results.rows[0].user_id
        : null;

    cbFunc(userID);
  });
}
Very similar to our previous JS file, we’re injecting the pg pool in the constructor and calling the respective queries.
Pay special attention to the getUserIDFromBearerToken function. Following the default node-oauth2-server model contract, we need to provide a function that evaluates whether the given bearer token is actually valid; here, valid means the token exists in the database.
This works hand in hand with the earlier isValidUser from userDB.js, which checks for username duplicates when inserting a new user.
OAuth2 service and routes
Now that we have the database layer ready to be called, let’s implement the services and routes we need.
We’ll start with the tokenService.js file:
let userDB;
let tokenDB;

module.exports = (injectedUserDB, injectedTokenDB) => {
  userDB = injectedUserDB;
  tokenDB = injectedTokenDB;

  return {
    getClient: getClient,
    saveAccessToken: saveAccessToken,
    getUser: getUser,
    grantTypeAllowed: grantTypeAllowed,
    getAccessToken: getAccessToken,
  };
};

function getClient(clientID, clientSecret, cbFunc) {
  const client = {
    clientID,
    clientSecret,
    grants: null,
    redirectUris: null,
  };

  cbFunc(false, client);
}

function grantTypeAllowed(clientID, grantType, cbFunc) {
  cbFunc(false, true);
}

function getUser(username, password, cbFunc) {
  userDB.getUser(username, password, cbFunc);
}

function saveAccessToken(accessToken, clientID, expires, user, cbFunc) {
  tokenDB.saveAccessToken(accessToken, user.id, cbFunc);
}

function getAccessToken(bearerToken, cbFunc) {
  tokenDB.getUserIDFromBearerToken(bearerToken, (userID) => {
    const accessToken = {
      user: {
        id: userID,
      },
      expires: null,
    };

    cbFunc(userID === null, userID === null ? null : accessToken);
  });
}
It looks more complex than it actually is: all of these functions are simply our overridden versions of the Model Specification contract we saw earlier. For each of its default actions, we need to provide our own implementation that calls our database repositories, whether that’s saving a new user or access token, retrieving them, or getting the client application.
Note that in the grantTypeAllowed function, we’re really just calling the callback passed as the third argument (it’s provided by the node-oauth2-server framework). This is where we validate whether the given client ID actually has access to this grant type (set to password only).
You can add as many validations as you wish here, or even integrate it with other private validation APIs you or your company may have, as illustrated below.
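For instance, a slightly stricter version might look like this (the allowed-clients map is hypothetical; the version above accepts every client):

// Hypothetical stricter check: only clients we know about may use
// the password grant. The client IDs below are placeholder values.
const allowedGrants = {
  "my-trusted-app": ["password"],
};

function grantTypeAllowed(clientID, grantType, cbFunc) {
  const grants = allowedGrants[clientID] || [];

  // First argument is the error flag, second is whether the grant is allowed.
  cbFunc(false, grants.includes(grantType));
}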
Now, on to the authenticator.js file:
let userDB;

module.exports = (injectedUserDB) => {
  userDB = injectedUserDB;

  return {
    registerUser: registerUser,
    login: login,
  };
};

function registerUser(req, res) {
  userDB.isValidUser(req.body.username, (error, isValidUser) => {
    if (error || !isValidUser) {
      const message = error
        ? "Something went wrong!"
        : "This user already exists!";

      sendResponse(res, message, error);
      return;
    }

    userDB.register(req.body.username, req.body.password, (response) => {
      sendResponse(
        res,
        response.error === undefined ? "Success!!" : "Something went wrong!",
        response.error
      );
    });
  });
}

function login(query, res) {}

function sendResponse(res, message, error) {
  res.status(error !== undefined ? 400 : 200).json({
    message: message,
    error: error,
  });
}
Here we have the two main authentication methods: one for user registration and the other for user login.
Whenever an attempt to register a user is made, we first need to make sure the username is valid (i.e., not a duplicate) and only then register it.
We’ve already seen the validation and saving functions, so here each step is just a single call.
The login function, in turn, doesn’t need any implementation, since it falls through to the framework’s default flow.
Finally, we check whether each request resulted in an error or a success so we can set the proper HTTP response code.
Finally, we need to set up our Express routes:
module.exports = (router, app, authenticator) => {
  router.post("/register", authenticator.registerUser);
  router.post("/login", app.oauth.grant(), authenticator.login);

  return router;
};
Simple, isn’t it? The only difference is that we’re calling the grant() function of the app.oauth object to make sure the user is logged in properly.
To verify that the implementation is fully working, we’ll also need a test endpoint. It’s created like any other endpoint, but it’s protected: only authorized users may access it, by sending a valid bearer token.
Add the following content to our testAPIService.js:
module.exports = {
  helloWorld: helloWorld,
};

function helloWorld(req, res) {
  res.send("Hello World OAuth2!");
}
And this to the testAPIRoutes.js:
module.exports = (router, app, testAPIService) => {
  router.post("/hello", app.oauth.authorise(), testAPIService.helloWorld);

  return router;
};
Last but not least, we need to set up the index.js mappings:
// Database imports
const pgPool = require("./db/pgWrapper");
const tokenDB = require("./db/tokenDB")(pgPool);
const userDB = require("./db/userDB")(pgPool);

// OAuth imports
const oAuthService = require("./auth/tokenService")(userDB, tokenDB);
const oAuth2Server = require("node-oauth2-server");

// Express
const express = require("express");
const app = express();

app.oauth = oAuth2Server({
  model: oAuthService,
  grants: ["password"],
  debug: true,
});

const testAPIService = require("./test/testAPIService.js");
const testAPIRoutes = require("./test/testAPIRoutes.js")(
  express.Router(),
  app,
  testAPIService
);

// Auth and routes
const authenticator = require("./auth/authenticator")(userDB);
const routes = require("./auth/routes")(
  express.Router(),
  app,
  authenticator
);

const bodyParser = require("body-parser");

app.use(bodyParser.urlencoded({ extended: true }));
app.use(app.oauth.errorHandler());
app.use("/auth", routes);
app.use("/test", testAPIRoutes);

const port = 3000;

app.listen(port, () => {
  console.log(`listening on port ${port}`);
});
Here, we’re basically importing all the required modules, as well as injecting the corresponding ones into each other.
Pay special attention to the Express settings. Notice that we’re attaching our own oauth implementation to the Express app, as well as defining the grant type and the model service.
Then, the routes for the authenticator and the tests are assigned to the Express Router so Express knows how to dispatch each incoming request.
Let’s test it now. To exercise the endpoints, we’ll use Postman because it’s simple and practical, but feel free to pick a tool of your choice.
Then, start the server by running:
node index.js
First, we need to create a new user. To do so, perform a POST request to http://localhost:3000/auth/register with username and password body params (encoded as x-www-form-urlencoded):
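If you’d rather use code than Postman here, a quick sketch with Node’s built-in fetch (Node 18+; the credential values are placeholders) looks like this:

// Register a test user (sketch; username/password are placeholder values).
fetch("http://localhost:3000/auth/register", {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: new URLSearchParams({ username: "diogo", password: "1234" }),
})
  .then((res) => res.json())
  .then(console.log); // { message: 'Success!!' } when everything went fine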
Go ahead and check that the user was successfully created in your database.
With a valid user in hand, you can now log in. For this, send another POST request to http://localhost:3000/auth/login, this time with the password grant’s body params (grant_type, username, password, and the client credentials):
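As a sketch (assuming the framework accepts the client credentials in the request body; since our getClient() accepts any client, the client_id and client_secret values below are arbitrary placeholders):

// Log in with the password grant (sketch; all credential values are placeholders).
fetch("http://localhost:3000/auth/login", {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: new URLSearchParams({
    grant_type: "password",
    client_id: "any-client",
    client_secret: "any-secret",
    username: "diogo",
    password: "1234",
  }),
})
  .then((res) => res.json())
  .then(console.log); // the response contains the access_token on success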
Note that if you change the credentials to invalid ones, you’ll get this message: OAuth2Error: User credentials are invalid.
Now, with OAuth2 implemented and working, we come to our most important test.
Let’s validate our secure endpoint. Postman provides a special feature to test this: the Authorization tab.
By selecting the Authorization tab, you get access to some interesting test features, starting with the type of authorization flow your API uses: in our case, OAuth 2.0.
You’ll also be able to choose exactly where Postman should place the authorization data: in the request header or in the body. Select the header option.
Additionally, you have two options for supplying the access token. You can paste the token text directly into the available text area, or click the Get New Access Token button, which opens a modal with a few more fields asking for the access token URL endpoint, the TTL, the grant type, and so on.
There, you can also preview the request. After you click the button, the values you entered are automatically translated into the header and body configuration of the current request, so you don’t have to manually change each header every time you need to run a new request.
Click the Send button, and Hello World OAuth2! will appear as the result.
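Outside Postman, the same call is just a POST with an Authorization header (sketch; replace the placeholder with the token returned by /auth/login):

// Call the protected endpoint with a bearer token (sketch).
fetch("http://localhost:3000/test/hello", {
  method: "POST",
  headers: { Authorization: "Bearer <ACCESS_TOKEN>" }, // placeholder token
})
  .then((res) => res.text())
  .then(console.log); // Hello World OAuth2!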
Conclusion
You can find the full source code for this example here.
This framework is just one of the options available out there. You can go to the OAuth.net project and check out the latest recommendations for Node.js and your preferred language as well.
Of course, there’s a lot to see.
OAuth2 is a huge protocol that deserves more time and attention when reading and applying its specifications. However, this simple introduction will allow you to understand how the framework works along with Express and Postgres.
You can also swap out the server and the database to suit your needs. Just make sure to use the same contract we’ve established so far.
Regarding your studies, don’t lock yourself into this framework specifically. There are many others, depending on the frontend frameworks you’re using (React, Angular, and Vue, for example, have other good libraries to help with this).
Good luck!