TL;DR
Two and a half decades of pre-production testing taught me: Good documentation isn't about perfect grammar—it's about saving someone's ass at 3 AM when a system fails. Here's what actually matters.
How I Got Here
2001: Fresh developer, first job in industrial automation.
My role: Test engineer for Daimler automotive systems.
My assumption: Testing means finding bugs.
Reality: Testing means recreating entire systems before they go to production—and documenting everything that could possibly go wrong.
25 years later: Still doing it.
The Moment Documentation Became Personal
2003, 2:37 AM, Monday morning.
Phone rings. Manufacturing line down.
Production manager on the line:
"The conveyor belt controller is throwing error code E847. Your documentation says check sensor #4. Which sensor is sensor #4? We have 23 sensors."
I had written that documentation.
I had no idea which sensor #4 was.
We lost 4 hours of production.
Cost: €47,000.
Lesson: Documentation written for "someone else" is useless when you become that "someone else" at 3 AM.
What 25 Years of Pre-Production Testing Taught Me
Lesson 1: Systems Are Rebuilt Before Production
Why?
Because you can't test a €2M production line in the factory where it will run.
The process:
1. Build the actual system
2. Ship it to the customer site
3. Meanwhile: Rebuild it in a test environment
4. Test with realistic scenarios
5. Document everything
6. Train technicians
7. Generate test data
8. Validate edge cases
The problem:
Steps 5-8 take longer than steps 1-4.
Why?
Because building physical systems scales.
Creating content doesn't.
Lesson 2: Documentation That's Never Used Is Worthless
True story:
2007 project: Automotive paint shop.
127 robots, 43 sensors, 18 PLCs.
Documentation produced:
- 847-page technical manual
- 234-page troubleshooting guide
- 156 flowcharts
- 89 wiring diagrams
Documentation used in first year:
- 3 troubleshooting flowcharts
- 1 wiring diagram
- 12 pages from the technical manual
Rest: Never opened.
Why?
Because we documented what we thought people needed, not what they actually needed.
Lesson 3: Edge Cases Are Where Value Lives
Normal operation:
- System works
- Data flows
- Nobody needs documentation
Edge cases:
- Sensor fails
- Communication timeout
- Unexpected value
- System recovery needed
This is when documentation matters.
This is what's hardest to document.
This is what gets forgotten.
Lesson 4: Different People Need Different Details
Same component, five different audiences:
1. Design Engineer needs:
- Protocol specifications
- Bit-level encoding
- Timing diagrams
- Integration requirements
2. Test Engineer (me) needs:
- Valid value ranges
- Edge cases
- Error scenarios
- Expected behavior
3. Field Technician needs:
- "What does this measure?"
- "What's normal?"
- "How do I fix it?"
- "When do I call engineering?"
4. Operator needs:
- "What's this?"
- "Should I worry?"
- "What do I do?"
5. Maintenance needs:
- Replacement procedures
- Calibration steps
- Failure patterns
- Part numbers
Standard approach: Write one document, hope it helps everyone.
Result: Helps nobody.
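One way out: keep a single machine-readable record per component and render it per audience. A minimal sketch in Python; the field names and the example sensor are invented for illustration, not from a real project:

```python
# Minimal sketch: one source record, several audience views.
# Field names and the example sensor are illustrative.

SENSOR = {
    "name": "BrakeTempSensor_FL",
    "measures": "brake disc temperature, front left",
    "range_c": (-40, 85),
    "normal_c": (20, 60),
    "fix_hint": "Check connector X12; call engineering if it stays above 80 °C.",
    "part_no": "BT-4711",
}

def render(audience: str, s: dict) -> str:
    """Render the same record for different readers."""
    lo, hi = s["range_c"]
    n_lo, n_hi = s["normal_c"]
    if audience == "test_engineer":
        return f"{s['name']}: valid {lo}..{hi} °C, normal {n_lo}..{n_hi} °C"
    if audience == "field_technician":
        return (f"{s['name']} measures {s['measures']}. "
                f"Normal: {n_lo}-{n_hi} °C. {s['fix_hint']}")
    if audience == "maintenance":
        return f"{s['name']}: part {s['part_no']}, operating range {lo}..{hi} °C"
    raise ValueError(f"unknown audience: {audience}")

for a in ("test_engineer", "field_technician", "maintenance"):
    print(render(a, SENSOR))
```

Five audiences, one record, zero copy-paste.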
Lesson 5: Test Data Is Documentation
2011 insight:
I spent months generating test data for a vehicle dynamics system.
What I created:
- Normal driving scenarios
- Edge cases (emergency braking, sharp turns)
- Sensor failure scenarios
- Communication timeouts
- Environmental extremes
Later realization:
This test data was better documentation than the actual documentation.
Why?
Because it showed:
- What values are realistic
- What patterns are normal
- What failures look like
- How system responds
Test data is executable documentation.
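To make that concrete, here's a minimal sketch in Python. The sensor, its ranges, and the failure modes are invented; the point is that every generated sample is labeled with what it demonstrates:

```python
# Minimal sketch of "test data is documentation": every sample
# carries a label saying what it shows. Values are illustrative.
import random

def brake_temp_samples(n: int = 5):
    """Yield (label, value) pairs for a hypothetical brake temperature sensor."""
    for _ in range(n):
        yield ("normal_driving", round(random.uniform(20, 60), 1))
    yield ("heavy_braking", round(random.uniform(80, 120), 1))  # edge case
    yield ("sensor_failure_stuck", -40.0)                       # stuck at range floor
    yield ("communication_timeout", None)                       # no value delivered

for label, value in brake_temp_samples():
    print(f"{label:25s} -> {value}")
```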
Lesson 6: Documentation Rots Instantly
The cycle:
Day 1: Write documentation
Day 2: System update
Day 3: Documentation is wrong
Day 4: Someone follows old docs
Day 5: System breaks
Day 6: "Why isn't this documented?"
Traditional solution: Version control.
Real solution: Auto-regenerate documentation on every system change.
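A minimal sketch of that idea in Python, assuming the system description lives in one machine-readable file; file names and the generator stub are placeholders:

```python
# Minimal sketch of "regenerate on change": hash the machine-readable
# system description, rebuild the docs whenever the hash moves.
# File names and the generator stub are placeholders.
import hashlib
import pathlib

CONFIG = pathlib.Path("system_config.json")  # exported system description
STAMP = pathlib.Path("docs/.source_hash")    # hash the docs were built from

def regenerate_docs() -> None:
    print("system changed -> regenerating documentation ...")
    # the real documentation generator would run here

def build_if_stale() -> None:
    current = hashlib.sha256(CONFIG.read_bytes()).hexdigest()
    if STAMP.exists() and STAMP.read_text() == current:
        print("docs already match the running system")
        return
    regenerate_docs()
    STAMP.parent.mkdir(parents=True, exist_ok=True)
    STAMP.write_text(current)

if __name__ == "__main__":
    build_if_stale()
```

Hook something like this into the deployment step, and "Day 3: Documentation is wrong" at least stops happening silently.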
Lesson 7: Copy-Paste Creates Dangerous Inconsistencies
2014 incident:
Copied documentation from Sensor A to Sensor B.
Changed the sensor name.
Forgot to change the measurement range.
Result:
- Documentation said -40°C to +85°C
- Actual sensor: -20°C to +60°C
- Test generated values outside real range
- False positives in testing
- Component failures in production
Cost: €89,000 in emergency replacements.
Lesson: Copy-paste is evil in technical documentation. The antidote, sketched below: a single source of truth.
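A minimal sketch in Python; sensor names and ranges are illustrative, not the 2014 project's values:

```python
# Minimal sketch of the copy-paste fix: the measurement range lives in
# exactly one place, and every document is rendered from it.
# Sensor names and ranges are illustrative.

SENSORS = {
    "Sensor_A": {"range_c": (-40, 85)},
    "Sensor_B": {"range_c": (-20, 60)},  # different range; can't be forgotten,
                                         # because no text is ever copied
}

def datasheet_line(name: str) -> str:
    lo, hi = SENSORS[name]["range_c"]
    return f"{name}: operating range {lo} °C to {hi} °C"

for name in SENSORS:
    print(datasheet_line(name))
```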
Lesson 8: The Best Documentation Is Never Written
2016 project: Building automation system.
Innovation: Generated documentation directly from PLC program.
Process:
PLC Code → Parser → Documentation Generator → Human Review → Published Docs
Result:
- 100% consistent with actual system
- Updates automatically
- No copy-paste errors
- Faster than manual writing
Limitation: Only worked for our specific PLC platform.
But: Proved the concept works.
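For flavor, a deliberately naive sketch of the parser stage in Python. The Structured Text snippet is invented, and a real parser would work from the platform's export format, not a regex:

```python
# Minimal sketch of the PLC -> docs pipeline: pull variable names, types,
# and inline comments out of a Structured Text VAR block and emit a table.
# The ST snippet is invented; the parsing is deliberately naive.
import re

ST_SOURCE = """
VAR
    ConveyorSpeed : REAL;      (* belt speed in m/s *)
    SensorCount   : INT;       (* number of connected sensors *)
    E_Stop        : BOOL;      (* emergency stop input *)
END_VAR
"""

PATTERN = re.compile(r"(\w+)\s*:\s*(\w+)\s*;\s*\(\*\s*(.*?)\s*\*\)")

print("| Variable | Type | Description |")
print("|---|---|---|")
for name, typ, comment in PATTERN.findall(ST_SOURCE):
    print(f"| {name} | {typ} | {comment} |")
```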
Patterns I've Seen Across Industries
Automotive (my main domain)
Characteristics:
- Thousands of CAN signals
- Safety-critical components
- Strict regulations
- Global teams
Documentation challenges:
- Multi-language requirements
- Certification compliance
- Update frequency
- Consistency across models
Manufacturing
Characteristics:
- Mixed protocols (CAN, OPC UA, Profinet)
- Legacy systems integration
- 24/7 operation
- Minimal downtime tolerance
Documentation challenges:
- "We can't stop the line to update docs"
- Tribal knowledge
- High technician turnover
- Emergency troubleshooting focus
Building Automation
Characteristics:
- Long system lifetime (20+ years)
- Multiple vendors
- Gradual upgrades
- Distributed systems
Documentation challenges:
- Documentation older than current staff
- Lost installation records
- Retrofit integration
- Multi-building consistency
Energy Sector
Characteristics:
- Safety-critical
- Regulatory oversight
- Remote locations
- Environmental extremes
Documentation challenges:
- Audit trails required
- Certification dependencies
- Failure analysis needs
- Expert knowledge retention
What Actually Makes Good Documentation
After 25 years, here's what matters:
1. Findable
Bad:
"Check the troubleshooting section on page 247"
Good:
"Search 'E847 error' → immediate answer" (see the lookup sketch after this list)
2. Contextual
Bad:
"Sensor operating range: -40°C to +85°C"
Good:
"This sensor monitors brake disc temperature. Normal: 20-60°C during regular driving. Above 80°C indicates excessive braking or sensor failure."
3. Actionable
Bad:
"System may experience communication timeout"
Good:
"If you see 'Timeout Error E321':
- Check cable connection at Point A
- Verify power supply shows green LED
- If problem persists, call [number]"
4. Layered
Quick answer: What to do right now
Detail level 1: Why it happened
Detail level 2: Technical background
Detail level 3: Deep dive for engineers
5. Validated
Not: "Probably works like this"
But: "Verified with system version 2.4.1"
6. Current
Not: "As of 2019..."
But: Auto-updated from current system
7. Tested
Critical test: Give it to someone who doesn't know the system.
Can they solve a problem using only the documentation?
If no: Documentation failed.
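To tie "findable" and "actionable" together, a minimal sketch of an error-code lookup in Python; codes, steps, and the phone number are illustrative:

```python
# Minimal sketch: an error code maps straight to the steps a technician
# should take. Codes, steps, and the phone number are illustrative.

TROUBLESHOOTING = {
    "E321": [
        "Check cable connection at Point A",
        "Verify power supply shows green LED",
        "If the problem persists, call +49-XXX-XXXXXX",
    ],
    "E847": [
        "Check sensor #4 (cabinet 2, top rail, labelled S4)",
        "If the error stays, restart the conveyor controller",
    ],
}

def lookup(code: str) -> None:
    steps = TROUBLESHOOTING.get(code.upper())
    if steps is None:
        print(f"{code}: unknown code, escalate to engineering")
        return
    print(f"{code}:")
    for i, step in enumerate(steps, 1):
        print(f"  {i}. {step}")

lookup("E847")
```

That lookup would have saved me 4 hours and €47,000 in 2003.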
Why I'm Building What I'm Building
The realization:
After 25 years of:
- Writing documentation nobody reads
- Generating test data manually
- Recreating systems for testing
- Training technicians
- Debugging at 3 AM
I noticed:
The information already exists in the system.
We're just terrible at extracting and formatting it.
AI can do this.
But only if it:
- Understands industrial protocols
- Generates contextual explanations
- Creates actionable content
- Validates technical accuracy
- Learns from feedback
This is what I'm building.
Not to replace technical writers.
But to let them focus on:
- Edge cases
- Strategic content
- Complex troubleshooting
- Customer-specific customization
Instead of:
- Copy-pasting specs
- Updating parameter tables
- Reformatting for different audiences
- Tracking version changes
The Questions I Still Don't Have Answers For
1. How Much Automation Is Too Much?
Some documentation needs human judgment:
- Safety-critical systems
- Regulatory compliance
- Liability considerations
Where's the line?
2. How Do You Measure Documentation Quality?
Metrics I've tried:
- Pages written (useless)
- Time to write (misleading)
- Reader feedback (too late)
- Incident reduction (too many variables)
What actually works?
3. How Do You Keep Tribal Knowledge?
Best documentation comes from experience.
But:
- Experts retire
- Companies restructure
- Knowledge gets lost
How do you capture this in AI systems?
4. What About Liability?
If AI-generated documentation causes:
- Equipment damage
- Safety incident
- Production loss
Who's responsible?
How do you handle this?
What I Need From Experienced Testing Engineers
If you've done industrial testing, I need your input:
Questions:
1. What documentation do you actually use?
- Not what you're supposed to use
- What you actually open when there's a problem
2. What documentation failures have you seen?
- Worst incidents
- Common patterns
- What causes the most pain
3. How do you generate test data?
- Manual scenarios?
- Recorded real data?
- Synthetic generation?
- What works, what doesn't?
4. What would make AI-generated docs trustworthy?
- What validation would you need?
- What would make you nervous?
- Deal-breakers?
5. Am I missing something critical?
- Blind spots from my experience?
- Industry-specific issues?
- Technical challenges I'm underestimating?
Beta Program Update
Current beta testers: 3
Industries represented:
- Automotive testing
- Manufacturing automation
- Building automation
Early feedback:
- "Faster than manual documentation" ✅
- "Validation catches most errors" ✅
- "Still needs human review" ✅ (expected)
- "Missing [specific feature]" 🚧 (working on it)
Still looking for testers from:
- Energy sector
- Water treatment
- Food processing
- Pharmaceutical manufacturing
Why these industries?
Different regulatory requirements, different challenges, different needs.
Want to test?
Send me a DM.
Next Article
Part 4: "Beta Results: What Worked, What Failed, What Surprised Me"
Real results from:
- 3 beta customers
- 500+ components documented
- 47 edge cases discovered
- Lessons learned
Following week: Public launch plan
Have you done industrial testing? What documentation challenges have you faced? What am I missing? 👇