My AI could see the player. But it just stood there. That’s when I realized detection and decision are two different systems.
This post is part of my daily learning journey in game development.
I’m sharing what I learn each day — the basics, the confusion, and the real progress — from the perspective of a beginner.
On Day 78 of my game development journey, I explored AI Perception and Behavior Trees in Unreal Engine.
What I Tried
I created an AI character and added AI Perception with the Sight sense.
The AI successfully detected the player.
Later I added more senses like:
- Hearing
- Damage
Now the AI could pick up several different kinds of stimuli.
But it still didn’t react until I connected everything to a Behavior Tree.
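This was the core of my confusion, so here's a tiny standalone C++ sketch of the idea (not real Unreal API, just a model): a perception component fires an event when it senses an actor, the same way Unreal's AI Perception broadcasts `OnTargetPerceptionUpdated`. If nothing is bound to that event, detection succeeds but no behavior runs.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// Illustrative model only (not Unreal's real UAIPerceptionComponent):
// a perception component that fires an event when an actor is sensed.
struct PerceptionComponent {
    // Callback slot, like binding Unreal's OnTargetPerceptionUpdated delegate.
    std::function<void(const std::string&)> OnTargetPerceptionUpdated;

    void Sense(const std::string& actor) {
        // Detection itself always "works"...
        if (OnTargetPerceptionUpdated)
            OnTargetPerceptionUpdated(actor); // ...but nothing reacts unless something is bound
    }
};
```

So the AI "seeing" the player and the AI "doing something about it" are genuinely separate steps; the second one only happens once you wire the event into your decision logic.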
What Confused Me
Why does the AI detect the player but not react?
Where does perception information go?
Why do we need Blackboard variables?
Why does the Behavior Tree depend on perception updates?
It felt like the system was missing a link.
What Finally Clicked
Unreal separates AI into clear responsibilities.
AI Perception → gathers sensory information
Blackboard → stores that information
Behavior Tree → decides what action to perform
So the flow looks like this:
AI Perception → AI Controller → Blackboard → Behavior Tree
When the AI senses something, the detected actor is stored in the Blackboard.
The Behavior Tree reads that value and decides the next action.
Examples:
- Move toward the player
- Investigate a noise
- Attack a target
Perception senses the world.
Behavior Trees control behavior.
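To make that separation concrete, here's a minimal standalone C++ sketch of the whole flow. The `Blackboard` here is just a key/value map standing in for Unreal's `UBlackboardComponent`, and the names (`TargetActor`, `DecideAction`) are my own illustrative choices, not Unreal API:

```cpp
#include <cassert>
#include <map>
#include <optional>
#include <string>

// Stand-in for Unreal's Blackboard: a key/value store of AI knowledge.
struct Blackboard {
    std::map<std::string, std::string> keys;
    void Set(const std::string& k, const std::string& v) { keys[k] = v; }
    std::optional<std::string> Get(const std::string& k) const {
        auto it = keys.find(k);
        if (it == keys.end()) return std::nullopt;
        return it->second;
    }
};

// Perception layer: only reports what was sensed, never decides anything.
// (In Unreal, the AI Controller does this write in its perception handler.)
void OnPerceptionUpdated(Blackboard& bb, const std::string& sensedActor) {
    bb.Set("TargetActor", sensedActor);
}

// Behavior-tree-like decision: read the Blackboard, pick the next action.
std::string DecideAction(const Blackboard& bb) {
    if (auto target = bb.Get("TargetActor"))
        return "MoveTo:" + *target; // e.g. a Move To task toward the player
    return "Patrol";                // fallback when nothing has been sensed
}
```

Notice that `DecideAction` never talks to perception directly; everything it knows comes from the Blackboard. That's exactly the missing link I was looking for.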
Practical Fix
- Add an AI Perception Component to your AI
- Configure senses like Sight, Hearing, or Damage
- Store detected actors in a Blackboard key
- Use Behavior Tree decorators to check Blackboard values
- Trigger tasks like Move To or Attack
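The decorator step can also be sketched in plain C++ (again an illustrative model, not Unreal's real decorator classes): a decorator is just a condition on a Blackboard key that gates whether a task runs.

```cpp
#include <cassert>
#include <map>
#include <string>

// Stand-in for a Blackboard: key/value AI knowledge.
using Blackboard = std::map<std::string, std::string>;

// Decorator sketch, like a "Blackboard Based Condition" in Unreal:
// pass only if the key exists and holds a value.
bool BlackboardKeyIsSet(const Blackboard& bb, const std::string& key) {
    auto it = bb.find(key);
    return it != bb.end() && !it->second.empty();
}

// One branch of the tree: the decorator gates which task runs this tick.
std::string TickBranch(const Blackboard& bb) {
    if (BlackboardKeyIsSet(bb, "TargetActor"))
        return "Task:MoveTo"; // decorator passed -> run the task
    return "Task:Wait";       // decorator failed -> fall through
}
```

In the real Behavior Tree editor you get this for free by dragging a Blackboard-based decorator onto a node, but seeing it as a plain condition helped me stop treating decorators as magic.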
Debug Tip
Use Unreal's Gameplay Debugger during play.
Press ' (apostrophe) to toggle it.
This shows:
- Sight radius
- Perceived actors
- Blackboard values
It’s very helpful when debugging AI perception.
One Lesson for Beginners
- Perception only detects stimuli
- Blackboard stores AI knowledge
- Behavior Trees control decisions
- Sensing and logic must stay separate
- Debug tools help visualize AI behavior
Understanding this separation makes AI systems easier to scale and maintain in Unreal Engine.
Slow progress — but I’m building a strong foundation.
If you’re also learning game development, what was the first thing that confused you when you started?
See you in the next post 🎮🚀