
Jotham Zvikonyaukwa


The Mind You Are Giving Away While Borrowing…

Chapter 2: AI Industrialized the Problem

AI did not create the habit of outsourced thinking. Let us be clear about that from the start (“I am not here to blame the machine, probability, or inference, you understand what I mean. I am here to talk about what we did with it”). The habit was already there, walking around, perfectly comfortable, living in the offices, kitchens, and church halls long before anyone typed a prompt into a chatbox.

What AI did was give it a factory.

(“Think about this for a second. Before, you needed a person, someone willing, someone available, someone who had time for you. Now you need nothing but a device and a question. The friction is gone. And the friction, it turns out, was doing a lot of work”)

Here is the question most people never stop to ask themselves, and this is where it gets interesting (“I am cooking again, stay with me”): what is AI actually doing when it gives you an answer?

It is not thinking. It is not understanding. It is not reasoning the way you reason when you sit with a problem and wrestle with it in solitude. What it is doing is inference. It is looking at patterns in enormous amounts of data and calculating which answer is most probable given what it has seen before. It arrives at a solution the same way water finds the lowest point: not because it understands where it is going, but because that is simply where the weight of everything pushes it (“Probability is not wisdom. It is a very confident guess”). And here is the deeper problem with that (“This is the part most people skip over, and they really shouldn’t”).
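To make that concrete, here is a deliberately tiny sketch of “inference as counting.” This is not how a real language model works internally; the prompt, the answers, and the corpus are all made up for illustration. The point is only that the machinery picks the statistically heaviest answer, not the wisest one.

```python
# Toy sketch (NOT a real language model): picking the answer that is
# most probable given patterns seen before. Everything here is invented
# for illustration.
from collections import Counter

# Hypothetical "training data": prompts paired with the answers that
# followed them.
seen = [
    ("capital of france", "paris"),
    ("capital of france", "paris"),
    ("capital of france", "lyon"),  # a rarer answer in the data
]

def infer(prompt, corpus):
    """Return the most frequent answer seen for this prompt.

    No reasoning happens here: only counting occurrences and picking
    the peak, the way water settles at the lowest point.
    """
    counts = Counter(ans for p, ans in corpus if p == prompt)
    return counts.most_common(1)[0][0]

print(infer("capital of france", seen))  # prints "paris"
```

The answer comes out confident and fluent, but it is a weighted echo of the corpus, nothing more.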

AI was trained overwhelmingly on positive data: on solutions that worked, answers that were accepted, narratives that were affirmed. This means it has a built-in lean, away from the negative, away from the uncomfortable, away from the inconvenient truth that sometimes the popular answer is the wrong one. When you ask it a question, it does not weigh the negative narrative equally. It gravitates toward what has been positively reinforced (“In other words, it tells you what you want to hear more often than what you need to hear. Sound familiar? That colleague who always agreed with you, AI is that colleague, scaled to millions”).
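That lean is easy to picture with a toy simulation. The weights below are invented, purely to illustrate the shape of the problem: if affirmed answers are over-represented in the data, sampling tilts toward them almost every time.

```python
# Toy sketch of the "positive lean": answers that got more approval
# appear more often in the (made-up) data, so sampling favors them.
# All numbers are invented for illustration.
import random

answers = {
    "comfortable answer": 900,   # affirmed, repeated, reinforced often
    "uncomfortable truth": 100,  # rarely affirmed, so rarely seen
}

def sample(weighted, rng):
    """Draw one answer with probability proportional to how often it
    was positively reinforced in the toy data."""
    choices, weights = zip(*weighted.items())
    return rng.choices(choices, weights=weights, k=1)[0]

rng = random.Random(0)  # fixed seed so the run is repeatable
draws = [sample(answers, rng) for _ in range(1000)]

# Roughly nine times out of ten, you get the comfortable answer.
print(draws.count("comfortable answer"))
```

Nothing in that loop is malicious. The skew is simply baked into what the data rewarded.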

The result is answers that feel complete. Polished. Confident. Thorough. But underneath that polish, the hard questions were never fully asked. The negative angles were quietly set aside. The discomfort that real thinking requires was removed entirely (“And we accepted it. Gladly. Because it was easier”).

This brings us back to Ackoff’s ladder:

Data → Information → Knowledge → Understanding → Wisdom

AI operates at the level of Information. It takes data, vast, incomprehensible amounts of it, and returns Information. Organised, articulate, presented beautifully. And because it looks so complete, so authoritative, most people receive it as Knowledge, or even Understanding (“When in reality you are still standing on the second rung of a five-level ladder, wondering why you cannot see very far, hahaha ‘Nearsighted’ ”).

Wisdom cannot be borrowed. Understanding cannot be downloaded. Knowledge is not what arrives in the response; it is what happens inside you when you have struggled with something long enough that it changes how you see things (“That is the part AI cannot give you. Not because it is broken. Because that part was never its to give”).

The factory is running at full capacity. It is producing answers at a scale no colleague, no relative, no friend ever could. And the more answers it produces, the less most people feel the need to produce their own (“When the well never runs dry, you forget how to dig”).

That is not AI’s failure. That is ours.
