## TL;DR
- Most cyber incidents ultimately trace back to human error
- Technical defenses like EDR and firewalls alone have clear limits
- Japan is launching a Supply Chain Security Evaluation Standard in October 2026
- Building a "safety culture" is now essential for organizational security
- As AI spreads, human judgment matters more than ever — not less
## Background: Why Are We Suddenly Talking About "People"?
In 2026, companies worldwide are pouring money into technical security
measures — EDR, zero trust architecture, VPN hardening, you name it.
But here's what a KnowBe4 Japan seminar in March 2026 made crystal clear:
The root cause of most cyber incidents is, ultimately, human error.
And not the "oops, I clicked the wrong button" kind of error.
We're talking about the accumulation of rational decisions: choosing
convenience over compliance, or bending security guidelines without
realizing it because getting the job done felt more important. These
small, reasonable choices stack up into gaps that attackers can exploit
just as readily as zero-day vulnerabilities.
## What's Changing: Japan's Supply Chain Security Standard (October 2026)
Starting October 2026, Japan will fully enforce a Supply Chain
Security Evaluation Standard covering everyone from IT system vendors
to raw material suppliers.
The goal? Cyber resilience — built on the foundations of NIST's
Cybersecurity Framework (CSF) 2.0.
This is a shift from "can we prevent attacks?" to
"can we survive and recover when they happen?"
Previously, supply chain security was a mess of asymmetric power dynamics.
Large buyers would monitor their suppliers unilaterally, while suppliers
were forced to respond to different requirements from every single client —
exhausting and inefficient. The new unified standard aims to fix this,
leveling up security across entire supply chains.
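To make the shift from per-client questionnaires to one shared standard concrete, here is a minimal sketch of a supplier self-assessment keyed to NIST CSF 2.0's six functions (Govern, Identify, Protect, Detect, Respond, Recover). The class name, the company, and the 0–3 maturity scale are my own illustration, not anything defined by the actual evaluation standard.

```python
from dataclasses import dataclass, field

# NIST CSF 2.0 defines six functions; Govern is the one new in 2.0.
CSF_FUNCTIONS = ["Govern", "Identify", "Protect", "Detect", "Respond", "Recover"]

@dataclass
class SupplierAssessment:
    """One shared self-assessment a supplier fills out once, instead of
    answering a different questionnaire for every client. The 0-3
    maturity scale is a hypothetical choice for illustration."""
    supplier: str
    scores: dict = field(default_factory=dict)  # function name -> 0-3 maturity

    def weakest_functions(self, threshold: int = 2) -> list:
        """Functions scored below the threshold: the gaps a buyer and
        supplier would prioritize remediating together."""
        return [f for f in CSF_FUNCTIONS if self.scores.get(f, 0) < threshold]

# Example: strong on prevention, weak on response and recovery, which is
# exactly the gap a resilience-oriented standard is meant to surface.
assessment = SupplierAssessment(
    supplier="Example Precision Parts Co.",
    scores={"Govern": 2, "Identify": 3, "Protect": 3,
            "Detect": 2, "Respond": 1, "Recover": 1},
)
print(assessment.weakest_functions())  # ['Respond', 'Recover']
```

One assessment, shared with every buyer, replaces N slightly different spreadsheets; and because the scores cover Respond and Recover, not just Protect, it measures resilience rather than prevention alone.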
## The Paradigm Shift: From "Tools" to "Culture"
| Era | Focus |
|---|---|
| Early 2020s | Antivirus, Firewalls, EDR |
| 2024 – Present | Zero Trust, Compliance, Incident Response Drills |
| 2026 onwards | Organizational Safety Culture & Supply Chain Resilience |
The old mindset was simple: buy the right tool, stay protected.
But today's attacks exploit legitimate entry points — VPN vulnerabilities,
vendor accounts, insider access — and weaponize the organizational
pressure to prioritize efficiency over security.
Patch the technical holes all you want.
If there's a human gap, attackers will find it.
## The Four Components of Safety Culture (James Reason's Model)
British psychologist James Reason's Safety Culture Model — originally
developed for high-risk industries like aviation and nuclear power — is
now making waves in cybersecurity.
It breaks down into four elements:
| Element | What It Means |
|---|---|
| Reporting Culture | People feel safe reporting mistakes and anomalies |
| Just Culture | Evaluation focuses on learning, not blame |
| Flexible Culture | Org structure allows context-based decision-making |
| Learning Culture | Failures are systematically turned into improvements |
The most critical, and the most overlooked, is Just Culture.
According to KnowBe4 Japan's research, nearly half (49%) of Japanese
companies subject employees to disciplinary action even for
unintentional mistakes.
The result? People hide errors. Risk goes underground. Resilience drops.
Unlike a factory accident, a cyber incident's root cause often lies in
organizational decision-making — not individual negligence. Asking
"what structure caused this failure?" instead of
"who made the mistake?" is what builds real organizational strength.
## A Hardware Engineer's Perspective
As a semiconductor engineer, I can tell you this supply chain security
discussion is anything but abstract to me.
Manufacturing control systems are now cloud-connected. Factory equipment
is tied to external vendor accounts. VPN-based remote work has become
the norm for production efficiency. And the more people involved, the
higher the chance that someone will prioritize convenience over security.
Three things I find particularly concerning on the ground:
- Lifecycle vs. patch cycle mismatch: factory equipment runs for 5–10 years, but new software vulnerabilities surface every month (see the sketch after this list)
- OT/IT boundary collapse: manufacturing networks that used to be air-gapped are now connected to IT systems, multiplying the attack surface
- Lack of an incident reporting culture: manufacturing has a deep-rooted belief that "problems shouldn't happen," which keeps small anomalies from being reported
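To put a rough number on that first mismatch, here is a back-of-the-envelope sketch. The parameters (one relevant advisory per month, a single annual patch window) are illustrative assumptions, not measured data.

```python
# Back-of-the-envelope: how long do known vulnerabilities sit unpatched
# on long-lived factory equipment? All parameters are illustrative.
SERVICE_LIFE_YEARS = 10
DISCLOSURES_PER_YEAR = 12   # roughly one relevant advisory per month
PATCH_WINDOWS_PER_YEAR = 1  # OT lines often patch only at the annual shutdown

# Average wait between a disclosure and the next maintenance window
# is half the interval between windows.
avg_exposure_months = 12 / PATCH_WINDOWS_PER_YEAR / 2
total_disclosures = SERVICE_LIFE_YEARS * DISCLOSURES_PER_YEAR

print(f"~{total_disclosures} advisories over the equipment's service life")
print(f"each waiting ~{avg_exposure_months:.0f} months on average to be patched")
```

Even with these generous assumptions, known issues sit exposed for months at a stretch, which is exactly why resilience matters more than prevention alone on the factory floor.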
And as AI automation expands, one thing remains constant:
humans decide what to delegate to AI.
Ethical judgment and critical decision-making will always stay with us.
## Takeaways & Next Actions
Bottom line: EDR alone isn't enough.
After 2026, organizational culture IS your competitive advantage in security.
Here's what I'd recommend acting on now:
- Build a blame-free incident reporting culture (Just Culture first)
- Audit your entire supply chain against the new evaluation standard (October 2026 deadline)
- Replace passive e-learning with simulation-based training (a minimal measurement sketch follows this list)
- Create psychological safety so small concerns get raised early
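On the training point, here is a minimal sketch of how simulation results could be tracked in a Just Culture-compatible way: aggregate metrics per campaign, never per-person scoreboards. The campaign names and numbers are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class PhishingCampaign:
    """Aggregate-only results: we measure the organization,
    not individuals, consistent with a Just Culture approach."""
    name: str
    emails_sent: int
    clicks: int
    reports_to_security: int  # the number we actually want to grow

    @property
    def click_rate(self) -> float:
        return self.clicks / self.emails_sent

    @property
    def report_rate(self) -> float:
        return self.reports_to_security / self.emails_sent

q1 = PhishingCampaign("2026-Q1 invoice lure", emails_sent=400,
                      clicks=52, reports_to_security=31)
q2 = PhishingCampaign("2026-Q2 vendor-portal lure", emails_sent=400,
                      clicks=38, reports_to_security=96)

# Success is not just fewer clicks; it is more people reporting early.
for c in (q1, q2):
    print(f"{c.name}: click {c.click_rate:.0%}, report {c.report_rate:.0%}")
```

Note the design choice: the metric to grow is the report rate, not just the click rate to shrink. That is the Reporting Culture element made measurable.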
It's time to stop treating security as a cost center and start treating
it as a demonstration of organizational capability.
## Sources
- ITmedia Enterprise — "The Human Vulnerability: 4 Overlooked Security Blind Spots" (March 18, 2026)
- KnowBe4 Japan Seminar Materials
- NIST Cybersecurity Framework 2.0