**The Technological Black Box Problem: The Opacity That Threatens Democracy**, in three parts:
- What happens when the systems that decide who goes to jail or who receives credit cannot explain why they made that decision?
- In these 4,000 words at IAExplore, we will break down why technological opacity is one of the greatest threats to justice and democracy. We will analyze the conflict between accuracy and explainability, the risks of delegating critical decisions to unauditable systems, and the technical solutions emerging under the umbrella of Explainable AI (XAI).
- Unlike traditional software, where every line of code obeys a logical rule written by a human, deep learning works by discovering patterns in huge volumes of data. Neural networks form trillions of weighted connections. The result is a model that "knows" but cannot "say" how it knows. This complexity is what gives it its remarkable accuracy, and also what erects the barrier of opacity.
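The contrast above can be sketched in a few lines. This is a deliberately toy illustration, not a real credit system: the function names, features, and weight values are all hypothetical, and the "model" is reduced to a single weighted sum standing in for millions of layered weights.

```python
def credit_rule(income, debt):
    """Traditional software: the logic is an explicit, auditable rule.
    A human wrote it, and a human can point at it to explain any decision."""
    return income > 30_000 and debt / income < 0.4

# A "trained" model reduced to its essence: a weighted sum of features.
# In a real deep network these weights number in the millions and are stacked
# in layers; no single weight maps to a human-readable reason.
WEIGHTS = [0.00003, -2.1, 0.7]   # hypothetical learned values
BIAS = -0.9

def credit_model(income, debt_ratio, years_employed):
    """Learned system: the decision emerges from opaque weights."""
    features = [income, debt_ratio, years_employed]
    score = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return score > 0   # the model "knows", but cannot "say" why

print(credit_rule(45_000, 12_000))    # explainable: each condition is readable
print(credit_model(45_000, 0.27, 3))  # an answer, with no explanation attached
```

Both functions return a yes/no decision, but only the first one carries its own justification; for the second, you would need XAI techniques to reconstruct a reason after the fact.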