As a frontend developer, one of the main challenges I have encountered in my work is finding the best way to coordinate and collaborate with the backend development team when working on a feature. This challenge becomes even more complex when collaborating with external companies or clients.
Requests often come in the form of user stories, and one of the fundamental principles that makes them well-formulated is the INVEST principle, an acronym for Independent, Negotiable, Valuable, Estimable, Small, Testable. Independent means that each user story can be worked on on its own, avoiding dependency issues and cascading delays.
To achieve this, I have learned that taking some organizational actions with the team is crucial. When developing a feature that involves both frontend and backend, a few key steps can simplify synchronization and reduce complications:
- Clearly define API endpoints: First and foremost, it is crucial to clearly define and share the names of the API endpoints with all team members.
- Collaboratively design object structures: Another important practice is to collaboratively design the structure of the objects exchanged between frontend and backend (see the sketch after this list). This creates a common guide for development and ensures that everyone knows what information they need to work effectively.
- Communicate any changes promptly: During development, changes or updates to the frontend interface or backend business logic may occur. It is essential to promptly communicate such changes to everyone involved in the collaboration. This way, we can make the necessary adjustments in a timely manner and reduce obstacles that may arise due to discrepancies between frontend and backend.
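One concrete way to pin down that shared contract is to write it out as type definitions both sides can refer to. Here is a minimal sketch in TypeScript, assuming the Category and Post shapes used later in this article (the file name is just a suggestion):
post.model.ts
// Shared contract for the objects exchanged between frontend and backend.
// These shapes mirror the db.json used later in this article.
export interface Category {
  id: number;
  name: string;
}

export interface Post {
  id: number;
  // References Category.id
  categoryId: number;
  title: string;
  content: string;
}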
Who can help us?
JSON-Server creates a fake REST API with a minimal amount of configuration: it provides a simple way to mock RESTful endpoints, lets us define the data schema in a plain JSON file, and that file can serve as a reference for everyone involved in the project.
Hands-on with JSON-Server
Now let's take a closer look at how this tool works and how it can help in our projects.
First, let's proceed with installing the modules:
npm i -D json-server concurrently
JSON-Server uses .json files to simulate the behaviour of a real database. So, let's create a db.json
in the root folder of our project, with a data structure that we have agreed upon and shared with the team or backend developer.
db.json
{
"categories":[
{
"id":1,
"name":"tech"
},
{
"id":2,
"name":"music"
}
],
"posts":[
{
"id":1,
"categoryId":1,
"title": "How to get started with Docker",
"content": "content of the article about Docker"
},
{
"id":2,
"categoryId":2,
"title": "New Iron Maiden's album is out",
"content": "content of the article about the new Iron Maiden's album"
}
]
}
To keep everything clear and easy to maintain, let's create a json-server.json
file in the root folder of our project. This file allows us to define a default configuration. We will include the watch:true
option to ensure that the server updates when we modify the db.json
file, and delay:1000
to simulate the latency of a real server, and the port where our server will run.
json-server.json
{
"delay": 1000,
"middlewares": [],
"watch": true,
"port": 3000
}
Let's configure our scripts in the package.json
file to launch JSON-Server. To make the process easier, we will use Concurrently, an NPM package that allows us to run multiple commands simultaneously.
package.json
{
"name": "My Amazing Project",
"version": "0.0.1",
"scripts":{
"start": "<your-serve-command>",
"mock-api": "json-server db.json",
"dev": "concurrently \"npm run mock-api\" \"npm run start\""
}
}
By running npm run dev
, we will start both our app and the mock server, with the latter listening at http://localhost:3000.
As mentioned earlier, we need to be able to handle our code development with simplicity and avoid time-consuming tasks that distract us from our main objectives.
At this point, all that's left is to save our URL in a configuration variable that we can use within our projects.
mock-server.config.ts
// Importing a .json file requires "resolveJsonModule": true in tsconfig.json.
import * as jsonServerConfig from './json-server.json';

export const mockServerConfig = {
  apiUrl: `http://localhost:${jsonServerConfig.port}`
};
With JSON-Server, you can use all available HTTP methods:
GET /posts
GET /posts/1
POST /posts
PUT /posts/1
PATCH /posts/1
DELETE /posts/1
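Beyond these basic routes, json-server (0.x) also lets us filter a collection by any of its fields through query parameters, which pairs nicely with the categoryId relationship in our db.json. For example (an illustration, not part of the original route list):
GET /posts?categoryId=1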
Let's implement the service we will use to manage our posts.
blog.service.ts
import { mockServerConfig } from './mock-server.config';

export class BlogService {
  private headers = { 'Content-Type': 'application/json' };

  // Defaults to the mock server URL; a different URL can be injected, e.g. in production.
  constructor(private apiUrl: string = mockServerConfig.apiUrl) {}

  private async handleResponse(res: Response) {
    if (!res.ok) {
      throw new Error(`Error: ${await res.text()}`);
    }
    return res.json();
  }

  async getPosts() {
    const url = `${this.apiUrl}/posts`;
    const res = await fetch(url);
    return this.handleResponse(res);
  }

  async createPost() {
    const url = `${this.apiUrl}/posts`;
    const body = {
      categoryId: 1,
      title: 'json-server is cool',
      content: 'content of the article about json server'
    };
    const res = await fetch(url, {
      method: 'POST',
      body: JSON.stringify(body),
      headers: this.headers
    });
    return this.handleResponse(res);
  }
}
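The route list above also includes PUT, PATCH and DELETE, so the same service could be extended to update and remove posts. A rough sketch of two extra methods that could live inside the same class (the method names and the partial-update signature are my own choices, not part of the original service):
// Additional methods inside BlogService, next to getPosts() and createPost().
async updatePost(id: number, changes: { title?: string; content?: string; categoryId?: number }) {
  // PATCH applies a partial update; PUT would replace the whole post.
  const res = await fetch(`${this.apiUrl}/posts/${id}`, {
    method: 'PATCH',
    body: JSON.stringify(changes),
    headers: this.headers
  });
  return this.handleResponse(res);
}

async deletePost(id: number) {
  const res = await fetch(`${this.apiUrl}/posts/${id}`, { method: 'DELETE' });
  return this.handleResponse(res);
}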
We can use mockServerConfig
as the default option in our development environment, while relying on environment variables for the production environment. This approach eliminates the need to manually change configurations during development or deployment, ensuring a smoother workflow.
import { mockServerConfig } from './mock-server.config'
import { BlogService } from './blog.service'

const apiUrl = process.env.PRODUCTION
  ? process.env.API_URL
  : mockServerConfig.apiUrl

const blogService = new BlogService(apiUrl)
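As a quick usage example, we can exercise both methods against the mock server (the demo function is purely illustrative):
async function demo() {
  const posts = await blogService.getPosts();
  console.log(posts); // the two posts defined in db.json

  await blogService.createPost(); // adds a third post to db.json
}

demo();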
The createPost()
method updates our db.json
file. If we check it, we will find a new object inside the posts array:
{
...
"posts":[
{
"id":1,
...
},
{
"id":2,
...
},
{
"id":3,
"categoryId":1,
"title": "json-server is cool",
"content": "content of the article about json server"
}
]
}
Conclusions
In addition to allowing us to easily manage our endpoints and data structures, JSON-Server offers a range of interesting features. These include the ability to customize URLs with minimal configuration and create custom middleware. I have used JSON-Server-Auth to quickly add an authentication and authorization system to some of my projects.
Furthermore, it can be useful for multiple purposes. For example, you could create a dedicated file to be used as a data mock during the execution of End-to-End tests, eliminating the need to create a separate database version specifically for this purpose.
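For instance, a dedicated fixture file could be served on a separate port just for the test run. A minimal sketch, assuming a hypothetical e2e-db.json fixture and script name (port 3001 simply avoids clashing with the dev mock server on 3000):
package.json (excerpt)
{
  "scripts": {
    "mock-api:e2e": "json-server e2e-db.json --port 3001"
  }
}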
It is crucial to keep in mind that the decision to adopt these features should be based on the unique requirements of your project and the dynamics of your team.
Top comments (5)
Hi Pierdomenico,
I have made great use of JSON-Server in the past in both personal and professional projects. It is extremely capable, as your excellent article documents, and can simultaneously replace an application server and web server as part of a development/test environment. It can expose a live API to a mock data set and supply static content/files on demand, which bypasses potential CORS blockage.
However, I have raised an issue with the developers to suggest an improvement. When used as an application server to provide remote interaction with a JSON dataset (its primary purpose), the original data file is modified. Having a dynamic dataset is extremely useful but I think in some cases it would be preferable if it could be configured to only update the data in memory. This would provide a consistent initial state that would be very useful for automated testing.
To your knowledge is there already a configuration to support this behaviour or would I have to resort to developing middleware?
Regards, Tracy
Hi Tracy,
Thank you for the valuable feedback on the article.
While exploring the documentation and articles related to JSON-Server, unfortunately, I did not find any references to a configuration that allows setting an initial state. The solution I have adopted in my projects is to create a db-model.json file where I establish the structure and initial data. Afterwards, the file is copied and renamed to db.json using a custom script or middleware. A configuration that keeps the updates in memory could be a really interesting feature and save us a lot of time, avoiding manual solutions.
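To give an idea, the copy step can be as small as a tiny Node script run before starting the mock server (a rough sketch; the file name reset-db.ts is just an example):
reset-db.ts
// Overwrite the working db.json with the pristine model before starting json-server.
import { copyFileSync } from 'fs';

copyFileSync('db-model.json', 'db.json');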
Have you ever thought about creating a middleware and sharing it with the community, or implementing the functionality and submitting a PR?
Regards.
After forking the repo and investigating the source code, I have tried and tested a simple and effective work-around. Tested with 0.17.3, if the data file (db.json) is made read-only the application loads the data but does not modify it. Changes are applied but to a temporary file that is overwritten on each restart. The temporary file is retained when the server exits, which can be used to confirm the results of the transactions performed.
It can be an excellent compromise if you need this feature in a short time for your projects. I think that, to improve the DX, it could be useful to implement a more intuitive solution, maybe with a configuration option.
In the meantime, it would be interesting if you shared this trick, perhaps in an article 😎
I have looked briefly into the repo but have given significant thought to creating a PR. But you are right, I think that is the way to get this feature considered by the maintainer.
Tracy