DEV Community
Michiel Hendriks

This is not about the loss of jobs (or job types).

A lot of automation was created with the purpose of making work easier for people, so that work would be less stressful. In the cases where it succeeded, instead of letting people do the same amount of work with less stress and better quality, the bean counters concluded that they could do more work instead. This resulted in more stress and generally lower quality.

With the advent of more processing power and cheaper storage it became possible to store more and more data about people's actions. Now that there is stored and measurable data, people suddenly think they can extract information from it in a magical way, and predict the present and future. This kind of pseudoscience is not new. Things like phrenology, polygraphs, ... are all based on data misuse, and have been used in really bad ways. The same thing has happened, and is still happening, with data retrieved with newer technology. AI and machine learning are adding kerosene to this tire fire. If an AI/ML system has no transparency in the way it comes to a conclusion, you don't know whether it works correctly. This will lead to people being discriminated against based on bad models.

Technology should improve quality of life, and the larger the audience, the better. But it should not reduce quality of life for anybody.