So your colleague comes to you asking for help transposing some columns of a spreadsheet, because you are the programmer and therefore “you should be the best at everything that is done with a computer”, and:
- you hate Excel
- you think the abuse of Excel is the root cause of 70% of company diseases
- you have to get back to your own tasks
- you prefer to use an ugly script that will do the job
In this particular case you can export the Excel file in CSV format and quickly transform it with a Node.js script.
Example
Let’s see an example.
Suppose you have a CSV file with this structure:
ID;NAME;PROP1;VAL1;PROP2;VAL2;PROP3;VAL3;PROP4;VAL4;PROP5;VAL5;PROP6;VAL6;PROP7;VAL7;PROP8;VAL8
1;Bitcoin;Consensus;PoW;Dominance;80%;Price;44k;;;Symbol;BTC;;;;;;
2;Ethereum;;;Price;3k;Consensus;PoW/PoS;Symbol;ETH;;;Dominance;20%;;;;
The ID and NAME fields are in fixed positions, while the PROPX/VALX pairs can appear in different columns of the CSV file.
The output should be:
ID;NAME;SYMBOL;CONSENSUS;DOMINANCE
1;Bitcoin;BTC;PoW;80%
2;Ethereum;ETH;PoW/PoS;20%
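In other words, each row just needs a pivot: every PROPX/VALX pair becomes a named column. As a pure function that looks roughly like this (the name pivotRow is mine, purely illustrative):

```javascript
// Pivot one parsed row: { ID, NAME, PROP1, VAL1, ... } -> { ID, NAME, SYMBOL, ... }
function pivotRow(row) {
  const res = { ID: row.ID, NAME: row.NAME };
  for (let i = 1; i <= 8; i++) {
    const prop = (row[`PROP${i}`] || "").trim();
    // skip empty PROPX slots, keep the value exactly as it was typed
    if (prop) res[prop.toUpperCase()] = (row[`VAL${i}`] || "").trim();
  }
  return res;
}

// First sample row from the CSV above
console.log(pivotRow({
  ID: "1", NAME: "Bitcoin",
  PROP1: "Consensus", VAL1: "PoW",
  PROP2: "Dominance", VAL2: "80%",
  PROP5: "Symbol", VAL5: "BTC",
}));
// → { ID: '1', NAME: 'Bitcoin', CONSENSUS: 'PoW', DOMINANCE: '80%', SYMBOL: 'BTC' }
```

The full script below does the same thing, with a CSV reader in front and a CSV writer behind.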
Write an ugly script
You don’t have to write the best code ever, just the code that does the job.
I think this is very important because programmers (I’m one of them 🙋🏻‍♂️) very often tend to over-abstract or build useless features.
Don’t get me wrong: if you are a super clean coder or a script maximalist, feel free to review and refactor it until you’re happy with it, but I think it’s good training to write just the minimal script that’s right for the task.
Script configuration
npm init
npm install csv-parser csv-writer
index.js file
const fs = require("fs")
const csv = require('csv-parser')
const createCsvWriter = require('csv-writer').createObjectCsvWriter
const csvWriter = createCsvWriter({
  // note: the out/ directory must already exist, csv-writer won't create it
  path: "out/result.csv",
  fieldDelimiter: ";",
  header: [
    { id: "ID", title: "ID" },
    { id: "NAME", title: "NAME" },
    { id: "SYMBOL", title: "SYMBOL" },
    { id: "CONSENSUS", title: "CONSENSUS" },
    { id: "DOMINANCE", title: "DOMINANCE" },
  ],
})
// read the whole csv into an array of row objects
// (no error handling: good enough for a throwaway script)
const readFile = () => {
  return new Promise((resolve) => {
    const results = [];
    fs.createReadStream("./data/data.csv")
      .pipe(csv({ separator: ";" }))
      .on("data", (data) => results.push(data))
      .on("end", () => resolve(results));
  });
};
(async () => {
  // read csv file
  const data = await readFile()
  // create a map with PROPX as key and VALX as value;
  // values are kept as-is (uppercasing them would turn "PoW" into "POW"),
  // and extra keys like PRICE are simply ignored by the writer's header
  const mapped = data.map((d) => {
    const res = { ID: d.ID, NAME: d.NAME };
    res[(d.PROP1 || "").toUpperCase()] = (d.VAL1 || "").trim();
    res[(d.PROP2 || "").toUpperCase()] = (d.VAL2 || "").trim();
    res[(d.PROP3 || "").toUpperCase()] = (d.VAL3 || "").trim();
    res[(d.PROP4 || "").toUpperCase()] = (d.VAL4 || "").trim();
    res[(d.PROP5 || "").toUpperCase()] = (d.VAL5 || "").trim();
    res[(d.PROP6 || "").toUpperCase()] = (d.VAL6 || "").trim();
    res[(d.PROP7 || "").toUpperCase()] = (d.VAL7 || "").trim();
    res[(d.PROP8 || "").toUpperCase()] = (d.VAL8 || "").trim();
    return res
  });
  // write the result to the output csv
  await csvWriter.writeRecords(mapped)
})()
Conclusion
OK, this example is really simple compared to my colleague’s version, which had more columns and required some data sanitization and checks for distinct values, but the overall point here is how useful throwaway scripts can be in day-to-day working life.