I’ve been on both sides of audits and inspections enough times to know the difference between a system that exists to pass an audit and one that actually protects patients. To be fair, the two can look similar in a 45‑minute walkthrough. In practice, the difference comes down to the evidence and how it ties together, not a checklist ticked in a hurry.
Below I draw on years of notified‑body assessments and internal audits under MDR 2017/745 and ISO 13485. I’m writing from life in a mid‑size manufacturer where the next surveillance audit is rarely more than a quarter away and the CAPA queue keeps us awake on bad nights.
What I mean by “theatre” and “culture”
- Quality theatre: processes exist on paper, documents are current, people can recite the procedure, but records lack follow‑through. CAPAs close without verification. Training records are a series of signature blocks. Management review is a slide deck presentation with no linked actions. Good for short audits; fragile under scrutiny.
- Quality culture: decisions are data‑driven and traceable. Findings instantly become quality events, timelines include verification steps, and corrective actions are measurable. Staff escalate issues without fear because the process works and demonstrably improves the product or process.
Some degree of documentation is, of course, necessary. But inspectors don’t want theatre; they want proof that your system works day to day.
What inspectors actually look for (concrete signs)
Inspectors are pragmatic. They ask fewer hypotheticals and look for joined‑up evidence. From what I see repeatedly, the following items carry disproportionate weight:
- Traceability across artefacts
- Can the auditor follow a complaint through the non‑conformance, risk assessment, CAPA and update to the Technical File (Annex II) or Design History File?
- In practice this means documents, change records and verification results are linked and timestamped.
- CAPA effectiveness, not just closure
- Is there objective evidence the root cause was addressed and recurrence is unlikely? Trend data, verification testing or supplier corrective action acknowledgements are what they expect.
- Meaningful change control
- Auditors want to see impact analysis for a change (design, supplier, software). A single checkbox “no impact” is a red flag. Look for linked risk update and test evidence.
- Living risk management
- Risk files should be updated when a failure occurs, a complaint is received, or a change is implemented. If the risk file is static, inspectors will ask why it hasn’t been maintained per ISO 14971 and Annex I (General Safety and Performance Requirements).
- Supplier oversight and incoming quality
- Evidence that supplier issues led to concrete supplier action: audits, non‑conformance reports, and change agreements. A neat contract is theatre if you can’t show ongoing monitoring.
- Training with assessment
- Records that show someone was trained and demonstrated competence. A signed attendance sheet alone is theatre.
- Management review that triggers action
- Review minutes that point to follow‑up items with owners, timelines and measurable indicators. If these are absent, it looks like theatre — a report that sits on a shelf.
- Complaint handling and PSUR/PMCF linkage
- For higher‑risk devices, inspectors expect to see that complaints feed into periodic safety update reports and PMCF activities as part of continuous vigilance.
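The traceability expectation above can be sketched as a mechanical check: start from a complaint, follow the links, and confirm you reach a verified CAPA. This is an illustrative sketch only; the record types, IDs and link structure here are hypothetical stand‑ins for whatever your QMS actually stores:

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    rid: str
    kind: str                                  # e.g. "complaint", "nc", "capa", "tech_file_update"
    links: list = field(default_factory=list)  # IDs of downstream records
    verified: bool = False

def trace(records, start_id, target_kind):
    """Follow links breadth-first from start_id; return the first record of target_kind, or None."""
    by_id = {r.rid: r for r in records}
    queue, seen = [start_id], set()
    while queue:
        rid = queue.pop(0)
        if rid in seen or rid not in by_id:
            continue
        seen.add(rid)
        rec = by_id[rid]
        if rec.kind == target_kind and rid != start_id:
            return rec
        queue.extend(rec.links)
    return None

# Hypothetical chain: complaint -> non-conformance -> CAPA (verified) -> technical file update
records = [
    Record("C-101", "complaint", links=["NC-7"]),
    Record("NC-7", "nc", links=["CAPA-12"]),
    Record("CAPA-12", "capa", links=["TF-3"], verified=True),
    Record("TF-3", "tech_file_update"),
]

capa = trace(records, "C-101", "capa")
gap = capa is None or not capa.verified
print("traceability gap" if gap else "complaint traces to a verified CAPA")
```

The point isn’t the code; it’s that a linked, timestamped record set makes this walk trivially answerable, while a folder of standalone PDFs makes it an afternoon of archaeology.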
Examples (short, real patterns I’ve seen)
- A supplier change for an injection‑moulded component was approved because the supplier’s certificate was still valid. During audit, the notified body asked for incoming inspection records that showed dimensional conformity across production lots. The supplier provided a one‑off certificate; no batch data existed. That’s theatre — control only on paper.
- A series of software bug fixes was marked “closed” because the code had been merged into the main branch. The CAPA lacked regression test evidence and failed to reference the clinical impact assessment required under MDR. The auditor asked for verification; we had to re‑open the CAPA and perform formal verification testing.
Practical steps to move from theatre to culture
You don’t need a fancy eQMS to start, but you do need connected workflows and traceable decisions.
- Link findings to actions automatically
- Where possible, make findings instantly become quality events that flow into your CAPA and change control process. This reduces manual handoffs and the chance of “lost” actions.
- Require objective evidence for CAPA closure
- Define what “verified” looks like for each CAPA: test results, trend analysis, supplier corrective action evidence, or updated clinical data.
- Use impact mapping for every change
- Even minor changes need an impact analysis: what documents, validations, and training must update? A living traceability matrix helps here; if you don’t have one, use at least a standard change‑impact template.
- Make management review actionable
- Don’t present only charts. Assign owners, set measurable targets, and record follow‑ups. In audits I’ve seen, an action with an owner and due date is treated far better than a general statement of intent.
- Tie risk management to real evidence
- Reference ISO 14971 where appropriate and ensure your risk file is the single source of truth that reflects actual incidents, complaints and field performance.
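The “define what verified looks like” step above can be enforced mechanically rather than left to reviewer memory. A minimal sketch, assuming each CAPA carries a set of required evidence types defined at creation; the type names and record shape are invented for illustration:

```python
# Hypothetical closure gate: a CAPA may only move to "closed" when every
# evidence type named at creation time has an attached record.
REQUIRED_EVIDENCE = {"test_results", "trend_analysis"}  # defined per CAPA when it is opened

def can_close(capa: dict) -> tuple[bool, set]:
    """Return (closable?, missing evidence types) for a CAPA record."""
    attached = {e["type"] for e in capa.get("evidence", [])}
    missing = REQUIRED_EVIDENCE - attached
    return (not missing, missing)

capa = {
    "id": "CAPA-12",
    "evidence": [{"type": "test_results", "ref": "VER-2024-031"}],
}
ok, missing = can_close(capa)
print(ok, sorted(missing))  # False ['trend_analysis']
```

Whether this lives in an eQMS workflow rule or a checklist, the design choice is the same: closure is blocked by missing evidence, not waved through by a signature.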
Tools matter, but process matters more
To be honest, a good eQMS will reduce administrative burden: connected workflow, reviewability, traceability and automated CAPAs are genuine time‑savers. However, tools alone don’t create culture. You still need leadership that values corrective action over appearance, and line managers who follow through.
I’ve worked in systems where findings instantly became quality events and others where a PDF folder was the only record. The former scales; the latter collapses under external scrutiny.
Final thought
Inspectors don’t want theatre. They want evidence that you saw a problem, investigated it, fixed it and verified the fix. That sequence, documented, measurable and traceable, is culture in action.
What’s one evidence gap your team repeatedly struggles to show to auditors, and how have you tried to close it?