How do you log REST API on the backend, especially the Body?
If the data is in path params or the querystring, it's probably not a real problem: you can log it as JSON, and the size usually isn't big. (Querystrings do have their own issues with serialization format and URL length, though.)
But if the data is in the body, it may be a large JSON string, or not even a string at all. How do you deal with that without bloating the server logs?
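For illustration, a minimal sketch of one way to avoid the bloat (the helper name and the 1024-char cap are arbitrary, not an established answer): serialize the body defensively and truncate before logging.

```typescript
// Sketch: cap the serialized body at a fixed length so a huge JSON
// payload cannot bloat the logs.
const MAX_LOGGED_BODY = 1024 // assumed limit; tune per environment

function previewBody(body: unknown): string {
  let json: string | undefined
  try {
    json = JSON.stringify(body)
  } catch {
    // circular structures, BigInt, etc. are not JSON-serializable
    return `[unserializable ${typeof body}]`
  }
  if (json === undefined) return String(body) // e.g. undefined, functions
  return json.length > MAX_LOGGED_BODY
    ? json.slice(0, MAX_LOGGED_BODY) + `… (${json.length} chars total)`
    : json
}

console.log(previewBody({ ok: true })) // {"ok":true}
console.log(previewBody('x'.repeat(2000)).length) // truncated
```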
Edit: it seems that an answer is Node.js's inspector.console. It will log to Chrome DevTools if you run node --inspect server.js or ts-node-dev --inspect -- server.ts.
Making it work in Fastify requires not only pino-pretty, but also pino-inspector, which is a very simple module.
Additional setup is this; otherwise, it won't log req.body.
```typescript
import { Serialize } from 'any-serialize'

const ser = new Serialize()

f.addHook('preHandler', function (req, _, done) {
  if (req.body && typeof req.body === 'object') {
    req.log.debug(
      {
        body: filterObjValue(
          req.body,
          /**
           * This will keep only primitives, nulls, plain objects, Date, and RegExp.
           * ArrayBuffer in file uploads will be removed.
           */
          (v) => ser.hash(v) === ser.hash(ser.clone(v))
        ),
      },
      'parsed body'
    )
  }
  done()
})

function filterObjValue(obj: any, fn: (v: any) => boolean) {
  if (obj && typeof obj === 'object') {
    if (Array.isArray(obj)) {
      return obj.filter((a) => fn(a)).map((a) => filterObjValue(a, fn))
    } else {
      return Object.entries(obj)
        .filter(([, v]) => fn(v))
        .map(([k, v]) => [k, filterObjValue(v, fn)])
        .reduce((prev, [k, v]) => ({ ...prev, [k]: v }), {})
    }
  }
  return obj
}
```
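If you'd rather avoid the any-serialize dependency, here is a rough, dependency-free predicate of my own (not the author's code) that serves a similar purpose. Note it is not equivalent: unlike the hash check above, it also drops Date and RegExp, since neither survives a JSON round-trip intact.

```typescript
// Sketch: keep only values that are safe to JSON-serialize as-is.
// Drops functions, symbols, binary buffers, and class instances
// (including Date and RegExp, which the hash trick above keeps).
function isJsonSafe(v: unknown): boolean {
  if (v === null || ['string', 'number', 'boolean'].includes(typeof v)) return true
  if (typeof v !== 'object') return false // function, symbol, bigint, undefined
  if (ArrayBuffer.isView(v) || v instanceof ArrayBuffer) return false // file uploads
  return Array.isArray(v) || Object.getPrototypeOf(v) === Object.prototype
}

console.log(isJsonSafe('hello')) // true
console.log(isJsonSafe({ a: 1 })) // true
console.log(isJsonSafe(Buffer.from('abc'))) // false
console.log(isJsonSafe(new Date())) // false
```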
Not sure how to make it work in Express or with the Winston module, though.
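For Express, one plausible approach (my sketch, assuming the standard (req, res, next) middleware signature and that express.json() has already parsed the body) is a small middleware that logs the body before passing control on:

```typescript
// Sketch of an Express-style body-logging middleware. The logger here is
// console.debug; a winston logger's .debug() could be swapped in instead.
type Req = { method: string; url: string; body?: unknown }

function logBody(req: Req, _res: unknown, next: () => void): void {
  if (req.body && typeof req.body === 'object') {
    try {
      console.debug(`${req.method} ${req.url} body: ${JSON.stringify(req.body)}`)
    } catch {
      // e.g. circular references
      console.debug(`${req.method} ${req.url} body: [unserializable]`)
    }
  }
  next()
}

// With real Express this would be: app.use(express.json()); app.use(logBody)
// Here we just invoke it directly to show the contract:
let called = false
logBody({ method: 'POST', url: '/notes', body: { text: 'hi' } }, {}, () => { called = true })
console.log(called) // true
```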
Top comments (6)
It depends on what you are trying to achieve.
Large companies go with an ELK stack (Elasticsearch + Logstash + Kibana) or other similar combinations.
You could go in the direction of a SaaS service to have your logging done. Something like New Relic can help.
If you have a web server in front of your REST API (and you really SHOULD have one), you can also do some logging there.
You have many options. It all depends on what your requirements are (performance, retention, availability, etc.).
I deployed my Docker container on Google Cloud Run. The rest is fully managed by Google.
The GCR dashboard looks like this (screenshot omitted): there seems to be no error message shown, even when the status is 500.
However, I am also asking about logging in development settings, e.g. localhost.
On localhost, I usually don't log stuff, I debug. By that I mean using the debugger to view variable contents, bodies, etc. inside my code. Eventually, I console.log() (or whatever your language's console method is) things if I feel debugging is not necessary. But that's MY workflow. If it's for development stuff, I wouldn't worry much about bloating my local server, as I could easily delete the logs.
Maybe I misunderstood what you're trying to do. :)
I don't know if you're already using it, but I'll leave this here as a reference for other people: Google Cloud Run has a logging feature: cloud.google.com/run/docs/logging.
I use fastify, which uses pino, which automatically logs to the console; but this can easily be a bad idea. I haven't tried the --inspect flag yet. Also, of course, I don't know how to make pino work well with GCR. As far as I read the guide, it still seems to require a decision of "what to log".
When I develop in Node.js, I log requests/responses locally with the debug and morgan/winston packages. They require little setup and work well for me.
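For the morgan route specifically, a hedged sketch: morgan supports registering a custom token via morgan.token(name, fn), which could expose the parsed body in the format string. The token function itself (the name bodyToken and the 200-char cap are my own choices) is dependency-free, so it's shown standalone:

```typescript
// Sketch of a morgan-style custom token for the request body. With the real
// morgan package you'd register it as morgan.token('body', bodyToken) and
// use a format string like ':method :url :body'.
function bodyToken(req: { body?: unknown }): string {
  if (req.body === undefined) return '-'
  try {
    const json: string | undefined = JSON.stringify(req.body)
    // cap at 200 chars (arbitrary) to keep log lines readable
    return json === undefined ? '-' : json.slice(0, 200)
  } catch {
    return '[unserializable]'
  }
}

console.log(bodyToken({ body: { q: 'search' } })) // {"q":"search"}
console.log(bodyToken({})) // -
```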
Can you explain more? I do know of the Morgan and Winston modules, though.
There is also pino-debug, but I don't see it as helpful. Too much non-useful information.
Have you tried something like this? The error log size will destroy console output in VSCode. I believe it won't destroy chrome://inspect, though. (I edited the main post to show how I made it work.)