DEV Community

Santiago Zarate

Posted on • Originally published at foursixnine.io on

Table to json with jq and awk

The problem

Say you have a table that looks like this:

AGGREGATE_NEEDED 1
ARCH x86_64
BASE_TEST_ISSUES NUMBER
BUILD :NUMBER:PACKAGE
DISTRI DISTRIBUTION
FLAVOR Server-DVD-Incidents-Install
INCIDENT_ID 99999

It’s just that it contains about 78 or more entries. Of course, for a very skilled engineer, or a person with a lot of tricks up their sleeve, this might be a trivial task in vim or something like that; I guess that with a couple of search-and-replaces here and there you’d get somewhere. But I’m not that skilled, other than at eating.

The Solution

So I took my key/value table, saved it to a file, and after googling a bit, now I’m a little more versed in awk :D:

cat FILE.txt | \
    awk 'BEGIN { print "{" }
        { printf "\"%s\":\"%s\",", $1, $2 }
        END { print "\"MANUALLY_GENERATED_ISO_POST\":1 }" }' \
    | jq . > x86_64-ready.json

I guess this could have been done in an easier and prettier way, but it fits my need and might save you too at some point. Just make sure you have jq installed, OK?
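As a quick sanity check, here is the same awk program run on a two-line sample (the file name and the trailing MANUALLY_GENERATED_ISO_POST key are the ones from the post; jq is left out here since it only validates and pretty-prints):

```shell
# Build a tiny two-line sample of the key/value table
printf 'ARCH x86_64\nDISTRI DISTRIBUTION\n' > FILE.txt

# Same awk program as above: open the object, emit one "key":"value"
# pair per input line, then close the object with the extra marker key
awk 'BEGIN { print "{" }
     { printf "\"%s\":\"%s\",", $1, $2 }
     END { print "\"MANUALLY_GENERATED_ISO_POST\":1 }" }' FILE.txt
```

This prints `{` on its own line followed by `"ARCH":"x86_64","DISTRI":"DISTRIBUTION","MANUALLY_GENERATED_ISO_POST":1 }`, which is valid JSON; the final key exists so the last real pair’s trailing comma doesn’t break the object.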

Latest comments (4)

oguz-ismail

Why not just

jq -Rn '[inputs/" "|{(.[0]):.[1]}]|add' file

???
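For readers new to jq, that one-liner does the whole job in jq alone. Roughly: `-R` reads raw text, `-n` starts with no input, `inputs` pulls in each line as a string, `/" "` splits it on the space, `{(.[0]):.[1]}` builds a one-pair object, and `add` merges all the pairs into a single object. A small sketch:

```shell
# Two sample lines piped straight into oguz-ismail's filter
printf 'ARCH x86_64\nDISTRI DISTRIBUTION\n' \
  | jq -Rn '[inputs/" "|{(.[0]):.[1]}]|add'
```

That yields the pretty-printed object `{ "ARCH": "x86_64", "DISTRI": "DISTRIBUTION" }`, with no awk involved.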

Santiago Zarate • Edited

@oguz-ismail that's even better! \o/ :) Will keep it in mind next time I need to do this! Thanks!

Jeremy Forsythe • Edited

I find scripting languages to be much easier to write this stuff in and easier to fix later.

const fs = require('fs');

const data = fs.readFileSync('./FILE.txt').toString();
const lines = data.split('\n');

// Fold each "KEY VALUE" line into a single accumulator object
const json = lines.reduce((acc, l) => {
  const [key, value] = l.split(' ');
  if (key) acc[key] = value; // skip blank lines
  return acc;
}, {});

fs.writeFileSync('./file.json', JSON.stringify(json, undefined, 2));

You can also just use the Node REPL for one-off conversions. I use this pattern for CSV to JSON all the time, which would be a nightmare using awk.

Santiago Zarate

I guess something similar could be done in Perl, Ruby, or any other language that can read files and represent JSON objects. The thing is, I end up using awk a lot, for many things. But I'll keep your idea in mind and maybe update with an easier-to-follow $script version, if I ever have to do that again :)