I want to tell you about the moment this project stopped being a concept and became something I couldn't stop thinking about.
I typed:
"build camera rig"
And Unreal Engine — the actual editor, the actual viewport — built it. A SpringArm, a Camera, wired together, attached to the correct root. Clean. No errors. No Blueprint editor opened. No node graph touched. I didn't drag a single component.
I just typed three words into a terminal.
The Problem I Was Actually Trying to Solve
I'm making a game called NONRESOLVED. The core of it — the thing that makes it different from anything else — is an AI that has memory, identity, and evolves over time. Not a chatbot you talk to. An entity that exists inside the world, that remembers what it said last week, that changes based on what happens to it.
What I Built
The system has a simple shape, but it took a while to get the architecture right.
Prompt → Python Server → JSON → C++ Plugin → Unreal Engine
Here's how it works in practice:
- I type a prompt — plain English, no special syntax
- A FastAPI server (Python) interprets the intent and returns structured JSON
- A custom C++ Unreal plugin (AIBridge) receives that JSON over HTTP
- The plugin parses it and executes real Unreal API calls — FKismetEditorUtilities, USimpleConstructionScript, the actual Blueprint machinery
- The editor updates. Live.
No Python running inside Unreal. No Blueprints being manually touched. C++ all the way down once the intent is resolved, which was a constraint I set early and refused to break. Python is the interpreter. C++ is the executor. The engine doesn't know or care that the instruction came from a human typing casually in a terminal.
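The interpreter/executor split can be modeled in miniature. This is a hypothetical sketch of the contract, not the project's actual code: the server emits `{"action": ...}` JSON, and the executor side (in the real system, the C++ AIBridge plugin) dispatches on the action name. Handler names and the path are illustrative.

```python
# Miniature model of the pipeline contract (illustrative only).
# A Python registry stands in for the C++ AIBridge plugin here.
import json

HANDLERS = {}

def action(name):
    """Register a handler for a JSON action name."""
    def register(fn):
        HANDLERS[name] = fn
        return fn
    return register

@action("create_blueprint")
def create_blueprint(payload):
    # The real handler calls Unreal's asset APIs; this just echoes intent.
    return f"created /Game/AI/{payload['name']}"

def execute(raw_json: str) -> str:
    """What the executor does: parse JSON, look up the action, run it."""
    payload = json.loads(raw_json)
    handler = HANDLERS.get(payload["action"])
    if handler is None:
        raise ValueError(f"unknown action: {payload['action']}")
    return handler(payload)
```

The registry shape is why structured JSON matters: the executor never interprets language, it only routes well-formed actions.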
The First Thing That Worked
The first milestone was just being able to create a Blueprint. Sounds trivial. It took a few days.
{
"action": "create_blueprint",
"name": "BP_AICharacter_1"
}
That's it. Type "create blueprint called BP_AICharacter_1" and the file appears at /Game/AI/BP_AICharacter_1. The asset is real, persistent, compilable.
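The prompt-to-JSON step for that action can be sketched in a few lines. This is a simplified stand-in for the real intent parser, just to show the shape of the mapping:

```python
# Sketch: turn "create blueprint called <name>" into the JSON above.
# The real parser handles far more phrasings; this shows the idea only.
import re

def parse_create_blueprint(prompt: str):
    m = re.match(r"create blueprint (?:called |named )?(\S+)", prompt, re.IGNORECASE)
    if not m:
        return None  # not this intent; try other parsers
    return {"action": "create_blueprint", "name": m.group(1)}
```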
Once that worked, I felt something shift. Not excitement exactly — more like recognition. This could actually be a thing.
The camera rig came next. And that's when I had the moment I described at the top. "build camera rig" expanding into a multi-step execution:
clean existing cameras
→ remove duplicate SpringArms
→ create SpringArm
→ create Camera
→ attach Camera to SpringArm
→ compile Blueprint
All from three words.
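The expansion above can be sketched as a compound-intent table: one short prompt fans out into an ordered list of primitive actions. The action names here are assumptions drawn from the steps listed, not the system's real identifiers:

```python
# Sketch of compound-intent expansion (action names are illustrative).
def expand(intent: str, blueprint: str):
    if intent == "build_camera_rig":
        return [
            {"action": "clean_cameras", "blueprint": blueprint},
            {"action": "remove_duplicate_spring_arms", "blueprint": blueprint},
            {"action": "create_spring_arm", "blueprint": blueprint},
            {"action": "create_camera", "blueprint": blueprint},
            {"action": "attach", "child": "Camera", "parent": "SpringArm"},
            {"action": "compile_blueprint", "blueprint": blueprint},
        ]
    raise ValueError(f"no expansion for {intent}")
```

Order matters: cleanup precedes creation, attachment precedes compilation.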
Milestone 2: Mesh and the Player Actor
This is where I am now, and it's the most satisfying thing I've shipped on this project.
I added five new actions:
add_mesh — adds a SkeletalMeshComponent or StaticMeshComponent to any Blueprint, cleaning first so you never end up with duplicates.
set_mesh — assigns a specific asset path to an existing mesh component. Point it at a skeletal mesh asset and it wires it up.
add_movement — adds and configures CharacterMovementComponent with values I actually want: 600 max walk speed, 1.75 gravity, bOrientRotationToMovement = true. Not Unreal defaults. My defaults.
set_transform — sets relative location and rotation on any scene component. The standard mannequin mesh offset (Z: -90, Yaw: -90) can now be applied with a single line.
But the one that made me stop and stare was this:
setup_player_actor
"setup player character BP_PlayerCharacter"
This single prompt now builds the entire thing:
BP_PlayerCharacter [parent: ACharacter]
├── CapsuleComponent
├── Mesh (offset Z:-90, Yaw:-90)
├── SpringArm (boom 400, eye height 88, follows controller)
│ └── Camera
└── CharacterMovementComponent
MaxWalkSpeed: 600
JumpZVelocity: 600
GravityScale: 1.75
AirControl: 0.35
From one prompt. One compound action. The AI doesn't just respond to intent — it expands intent into a sequence of precise operations that produce a correct, compilable, playable result.
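Presumably that compound action resolves to an ordered payload something like this. The values come from the tree above; every action name and field name here is an assumption for illustration:

```python
# Hypothetical expansion of setup_player_actor into primitive actions,
# using the component values stated above. Names are illustrative.
def setup_player_actor(name: str):
    return [
        {"action": "create_blueprint", "name": name, "parent": "ACharacter"},
        {"action": "add_mesh", "blueprint": name, "kind": "SkeletalMesh"},
        {"action": "set_transform", "component": "Mesh",
         "location": {"z": -90}, "rotation": {"yaw": -90}},
        {"action": "add_spring_arm", "length": 400, "socket_z": 88,
         "use_pawn_control_rotation": True},
        {"action": "add_camera", "attach_to": "SpringArm"},
        {"action": "add_movement", "max_walk_speed": 600,
         "jump_z_velocity": 600, "gravity_scale": 1.75, "air_control": 0.35},
        {"action": "compile_blueprint", "blueprint": name},
    ]
```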
What This Actually Feels Like to Work In
Let me be honest about the texture of this workflow, because it's not all magic.
What's genuinely different:
The feedback loop is shorter than anything I've experienced in solo game dev. I'm not context-switching between "thinking about the game" and "doing Unreal things." The doing and the thinking are the same action. I describe what I want and I see it, and then I describe what I want next.
The system also forces you to think in operations, not in clicks. When you're dragging components around in a Blueprint editor, you're thinking visually and spatially. When you're prompting, you're thinking about what the system should do. It's a higher-order abstraction and — surprisingly — it makes bad decisions easier to catch before you make them.
What's still hard:
The intent parser has edges. "Set the mesh to the mannequin" works. "Use the default character mesh" is ambiguous and needs a real asset path. I'm still hand-tuning the keyword extraction for transforms and component targeting.
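One way that hand-tuning can look for transforms is pulling axis/value pairs out of free text. This is a sketch of the approach, not the project's actual extractor:

```python
# Sketch of transform keyword extraction: find axis names followed by
# numbers anywhere in the prompt (illustrative, not the real parser).
import re

def extract_transform(prompt: str) -> dict:
    found = {}
    pattern = r"\b(x|y|z|pitch|yaw|roll)\b[:\s]+(-?\d+(?:\.\d+)?)"
    for axis, value in re.findall(pattern, prompt, re.IGNORECASE):
        found[axis.lower()] = float(value)
    return found
```

The hard part is exactly what free text doesn't give you: "the mannequin offset" names a known transform but contains no numbers at all, which is why named presets end up living in the system rather than the parser.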
Also: you have to trust the system, which requires understanding the system. When setup_player_actor runs and the hierarchy looks wrong, you can't just undo a drag. You have to understand what the C++ executed, why, and how to correct the JSON response. The debugging layer is different, not easier.
The Design Principle That Keeps Everything Sane
Early on I wrote this down and I've referenced it constantly:
AI is the builder.
NOT the developer.
The AI agent doesn't make game design decisions. It doesn't choose where the camera should be or what the character should feel like to control. Those are my decisions. What the AI does is execute them — faster, without friction, and without requiring me to memorize the Unreal API at 1am.
This distinction matters because it's easy to let the tool start making decisions for you. When the AI builds a default camera rig, the temptation is to accept the defaults because changing them requires another prompt. I've resisted this by keeping my defaults in the system — the movement values, the spring arm length, the eye height. The AI executes my preferences, not its own.
Where This Is Going
The system is roughly 55% complete as a construction layer. Here's the full roadmap:
- [x] Blueprint creation
- [x] Camera rig system
- [x] Mesh system
- [x] Player actor system
- [x] Transform configuration
- [ ] Input bindings (Enhanced Input System)
- [ ] Animation Blueprint hookup
- [ ] Enemy AI (Behavior Trees via prompt)
- [ ] Combat system
- [ ] Narrative AI — memory, identity, evolution ← this is the whole game
The narrative AI layer is the reason any of this exists. Once the construction system is complete enough that I can build and iterate on gameplay systems through prompts, I can redirect my full attention to the thing that makes NONRESOLVED what it is: an entity that knows who it is, remembers what it's been through, and changes.
That's the thing I'm building toward. Everything else is scaffolding.
If You're Thinking About Building Something Like This
A few things I'd tell you:
Start with the C++ plugin, not the Python server. The hardest part wasn't the intent parsing — it was understanding which Unreal APIs to call and in what order to do things without breaking the asset system. Get one action working end-to-end in C++ before you build any abstraction on top of it.
Define your JSON schema before you write any code. Every action should have a clear shape. I wasted time refactoring early because "name" meant different things in different contexts.
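One lightweight way to pin the schema down is a typed class per action, so "name" can only mean one thing per context. A sketch using only the standard library; the field names are illustrative:

```python
# Sketch: one dataclass per action makes the JSON schema explicit and
# catches misuse ("name" meaning different things) at construction time.
from dataclasses import dataclass

@dataclass(frozen=True)
class CreateBlueprint:
    name: str              # the asset name, e.g. "BP_AICharacter_1"
    parent: str = "AActor" # parent class for the new Blueprint

@dataclass(frozen=True)
class SetTransform:
    blueprint: str         # which Blueprint to modify
    component: str         # which scene component to move
    z: float = 0.0
    yaw: float = 0.0

SCHEMAS = {"create_blueprint": CreateBlueprint, "set_transform": SetTransform}

def validate(payload: dict):
    """Turn a raw JSON dict into a typed action, or raise loudly."""
    cls = SCHEMAS[payload["action"]]
    args = {k: v for k, v in payload.items() if k != "action"}
    return cls(**args)  # TypeError on unknown or missing fields
```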
Build clean-before-create into every action. The single best thing in this system is that every add operation removes duplicates first. Without this, you end up with five SpringArms and a nightmare.
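The pattern itself is tiny. Here a list of dicts stands in for the Blueprint's component tree (the real version walks USimpleConstructionScript nodes):

```python
# Sketch of clean-before-create: remove existing components of the same
# class before adding a new one, so add_* is idempotent.
def add_component(components: list, cls: str, name: str) -> list:
    cleaned = [c for c in components if c["class"] != cls]  # clean first
    cleaned.append({"class": cls, "name": name})            # then create
    return cleaned
```

Running the same add twice, or adding on top of an old rig, always converges to exactly one component of that class.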
The constraint is the point. No Unreal Python for core logic. All execution through C++. Structured JSON only. These rules feel restrictive but they're what makes the system reliable. Every time I've thought about bending them, the result has been worse.
I'll keep posting as this develops. Next up is input and animation — the last things standing between "a Blueprint that looks like a player" and "something you can actually control."
If you're building something with a similar approach — AI as construction layer, not design oracle — I'd genuinely love to hear about it. This space feels early and weird and I think that's exactly where the interesting work is.
NONRESOLVED is in active development. This is devlog #1.