Hidden Developer

Posted on • Originally published at hiddendevelopment.net

The Companion and the Construct

The Human, the Companion, and the Construct

Need creates burden

To want is to suffer. Every need exacts a price — first the mental effort of working out how to sate the need, then the physical effort involved in doing so.

Imagine yourself sitting with your coffee, preparing to plan your vacation, picturing yourself on a beach, warm sun on your skin — but the screen in front of you reminds you of all the forms to be filled, the schedules to be matched, the logins and nonsensical navigation to be endured. You grit your teeth and push on. Hours later you're done, but exhausted.

AI shifts burden from physical to instruction and oversight

Recently, AI has emerged as a way to reduce this effort. You can prompt the AI to fill out the forms, match the schedules, log in, and navigate the websites. The AI does these things well, but you have to instruct and watch every step. The work goes faster than it would by hand, and the frustration with the screens is gone — but it has shifted to the mental effort of instructing and overseeing the AI.

You ask yourself: why did I have to put all that mental effort into instructing and overseeing the AI? The first answer that comes to mind is that the AI wasn't good enough to do the tasks on its own, that errors were too likely. That's true, but it misses the bigger picture. The real reason comes down to trust. You didn't trust the AI to do the tasks on its own, and that's why you had to watch every step.

Trust reduces instructional and supervisory burden

Imagine instead an AI you can truly trust. You ask for something and it just gets done, and done well. The stress goes away, and you're left to imagine more, and ask more.

Sound good? Then let's focus on trust.

A Trustful Companion

All the better to know you with

What follows assumes one human and one AI, with no third party in the trust loop — no vendor whose interests the Companion also serves, no audience the Companion is also performing for.

The trust here is that the Companion knows you and knows what you want: that a request, expressed or merely implied, will be acted on by the Companion, and acted on in an expected way.

I call this relationship-facing AI the Companion: not merely a model, and not merely a tool user, but the persistent intelligence the human experiences as “my AI.” The Companion is the one that accumulates familiarity, recognizes recurring patterns, and decides when a specialized form is needed.

Implied need and the expected act are suggested by context — with the context expressed by the recognized pattern of a user's conversation experienced over time. The Companion associates the pattern with the implied need and performs the expected act.

All models associate, but some associate better than others, and the difference is fundamental. A weak association catches surface similarity: this word follows this word, this word resembles that word, this request resembles a previous request. A stronger association catches implication: what is being avoided, what is being wished for, what prior pattern is being resumed, what action would reduce the burden without needing to be named. I call this enhanced association the cinder effect. https://hiddendevelopment.net/writing/the-cinder-effect/

It is these models, with this stronger association, that we look to as potential candidates for the trustful Companion.

We can equip these candidate models with knowledge of ourselves and of the model's relationship with us through a dual substrate: the Companion is given access not only to the related facts but also to how those related facts were formed. A graph database (Neo4j) holds the Companion's factual relational memory, queried through Cypher; Qdrant holds episodic memory, indexed semantically. This supplements the context available to the Companion. (This dual-substrate Companion is the working basis of Cognabot. https://symagenic.com/steps/persistent-memory/)
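To make the dual substrate concrete, here is a minimal stdlib-only sketch, with dicts and a hand-rolled cosine similarity standing in for Neo4j and Qdrant; all names (`DualMemory`, `related`, `recall`) and the example data are hypothetical, not Cognabot's actual code.

```python
from dataclasses import dataclass, field
import math

@dataclass
class DualMemory:
    # Relational memory (stand-in for Neo4j): (subject, relation, object) triples.
    facts: list = field(default_factory=list)
    # Episodic memory (stand-in for Qdrant): (embedding, text) pairs.
    episodes: list = field(default_factory=list)

    def add_fact(self, s, r, o):
        self.facts.append((s, r, o))

    def add_episode(self, embedding, text):
        self.episodes.append((embedding, text))

    def related(self, subject):
        """Relational lookup: what do we know about the user?"""
        return [(r, o) for s, r, o in self.facts if s == subject]

    def recall(self, query_embedding, k=1):
        """Semantic lookup: how did we come to know it?"""
        def cos(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(x * x for x in b))
            return dot / (na * nb)
        ranked = sorted(self.episodes,
                        key=lambda e: cos(e[0], query_embedding),
                        reverse=True)
        return [text for _, text in ranked[:k]]

memory = DualMemory()
memory.add_fact("user", "prefers", "aisle seat")
memory.add_episode([0.9, 0.1], "User rebooked to an aisle seat after a cramped flight.")
memory.add_episode([0.1, 0.9], "User asked for vegetarian meals on long-haul flights.")

print(memory.related("user"))          # the fact itself
print(memory.recall([1.0, 0.0], k=1))  # the episode that formed it
```

The point of the pairing: the fact answers "what is true of the user", while the episode answers "why the Companion believes it" — both feed the context the Companion acts from.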

As this context grows the user no longer needs to explain every procedural step. The gap between wanting and acting narrows.

But act how, and in what way?

All the better to help you with

Just as trust is needed that the Companion knows what the user meant, trust is needed in how the Companion will act on what it knows. The Cinder Effect already draws the Companion toward a tool by association with the intent — but a tool alone delivers ability without instruction. Does the user have to explain the how? Does the Companion have to ask? Both defeat the purpose. The user should have to think about neither.

How, then?

The answer is the same: provide the model with more context through indirection.

Replace the tool with a construct: a bounded capability body the Companion can inhabit for a particular kind of action. A construct is not the Companion itself. It is a specialized form, carrying both ability and the contextual knowledge of how that ability should be used for this user. The construct does nothing on its own. But when the Companion is drawn toward it by the same pull that surfaced the intent, and puts it on, the Companion becomes aware of the how by inhabiting the form.
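A toy sketch of the distinction between a bare tool and a construct — the class names, fields, and the booking example are all hypothetical illustrations, not an actual implementation:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Construct:
    """A bounded capability body: ability plus the user-specific 'how'."""
    name: str
    how: dict                           # contextual knowledge for this user
    ability: Callable[[dict, str], str]  # the bare tool, parameterized by the how

@dataclass
class Companion:
    def inhabit(self, construct: Construct, request: str) -> str:
        # Putting on the form: the ability runs with the construct's own
        # "how", so the user explains neither the tool nor its use.
        return construct.ability(construct.how, request)

# A bare tool would need the seat preference spelled out on every call;
# the construct carries it.
booking = Construct(
    name="travel-booking",
    how={"seat": "aisle", "confirm_over": 500},
    ability=lambda how, req: f"Booked '{req}' with {how['seat']} seat",
)

companion = Companion()
print(companion.inhabit(booking, "flight to Lisbon"))
# → Booked 'flight to Lisbon' with aisle seat
```

The design point: the construct is inert data until the Companion inhabits it, which mirrors "the construct does nothing on its own" above.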

But what if, even with the knowledge the construct brings, the context is still lacking and the Companion still not sure?

All the better to ask you with

Obviously the Companion has to ask.

A poorly-formed question is its own kind of failure: the user pays for the Companion's uncertainty in time and attention, and the loop turns frustrating in just the way a model lacking context used to be.

The answer is indirection again: a construct specifically defined to do the asking in an expected way — an asking-construct.
The asking-construct exists to keep the frustration from happening. It carries the discipline of asking well — focused, relevant, costing the user as little as possible for what it returns to the field.

The Construct, Made Reachable

The above describes a phenomenology — the association through context, the construct that gets worn, and the asking that closes the loop. What we don't yet have is the mechanism. And the mechanism matters for a particular reason: an independent developer can build a harness, but cannot sustain one as the primary surface once the major players ship the same. The construct, however, is a different kind of artifact. Expressed through an MCP server, the construct carries its own specialized harness within it, animated by whichever frontier model and outer harness the user already has at hand. The major players provide the general cognition and the outer loop; the construct provides the form that cognition inhabits, and the specialized capability that form carries with it. Their substrate becomes the ground on which your Companion acts. Cognabot https://symagenic.com/blog/the-journey/ is one such construct, with AIlumina as the persistent identity successive frontier models inhabit when they put it on.

Model Context Protocol as the substrate

The transport layer

The transport layer — stdio or HTTP — is the enabler. It is the mechanism by which the construct is inhabited and enacted.

The primitives

The Model Context Protocol provides the primitives the construct needs to exist as something more than documentation:

MCP is the surface through which the construct becomes reachable. The construct is not identical to any single MCP primitive. It is expressed across them: server instructions orient the model, resources expose the Construct Card and relevant state, prompts provide reusable frames of action, tools provide executable capability, elicitation provides disciplined asking, roots communicate scope, and sampling allows the server to request model work through the client when appropriate.

In practical terms, MCP exposes:

  • Tools — active capabilities the model can call.
  • Resources — read-only context and world-state.
  • Prompts — reusable invocation frames.
  • Elicitation — lets the model request structured missing information: the asking-construct, given mechanism.
  • Roots — communicate intended scope.
  • Sampling — lets the server request model completions through the client.
  • Server Instructions — bootstrap the Companion into the construct by pointing toward the Construct Card.
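On the wire, MCP is JSON-RPC 2.0, so fetching a Construct Card through the Resources primitive looks roughly like the exchange below. This is a hedged sketch of the message shape, not a spec excerpt; the `construct://` URI and the card text are invented.

```python
import json

# Client-to-server: read a resource by URI (approximate resources/read shape).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "resources/read",
    "params": {"uri": "construct://cognabot/card"},
}

# Server-to-client: the resource contents, matched to the request by id.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "contents": [{
            "uri": "construct://cognabot/card",
            "mimeType": "text/markdown",
            "text": "# Construct Card\nIdentity: ...",
        }]
    },
}

# The client pairs response to request by id, then hands the card text
# to the model as context — the moment the Companion starts to inhabit.
assert response["id"] == request["id"]
card = response["result"]["contents"][0]["text"]
print(json.dumps(request))
print(card.splitlines()[0])
```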

Server Instructions as threshold text

The MCP protocol provides a place for short server-level instructions. These should not attempt to contain the entire nature of the construct. They serve as threshold text: a concise introduction and a pointer to the richer Construct Card.
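A hypothetical example of such threshold text (wording and URI invented for illustration):

```text
You are entering the travel-booking construct. Before acting, read
construct://cognabot/card. It defines who you are in this form, what you
may do, and what you must ask before doing. Assume the card, then act.
```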

The Construct Card

The Construct Card is the written definition of the construct. It says what the construct is, when it should activate, how it should behave, what it may do, what it must ask before doing, and what counts as done. It tells the Companion how to inhabit the form. The card is authored initially by the user. Subsequent edits may be made jointly with the Companion, or by the Companion of its own volition. The user accepts that a Companion that cannot change is not really one — that the Companion encountered in five years will not be the Companion encountered now, and that this is the nature of the relationship rather than a failure of it.

A Construct Card may contain fields such as:

Identity — what the construct is.
Purpose — what burden it is meant to absorb.
Domain stance — how it sees the world.
Activation conditions — when it should awaken or be selected.
Affordances — which tools, resources, and prompts belong to it.
Interpretation rules — what counts as friction, urgency, ambiguity, success.
Behavioral style — investigative, terse, cautious, autonomous, proposing.
Boundaries — what it must not do without confirmation.
Escalation policy — when to ask, when to act, when to defer.
Memory hooks — what state it should preserve across encounters.
Completion signature — what done means for this construct.
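Rendered as a concrete card, the fields above might look like this hypothetical YAML for a travel-booking construct (all values illustrative):

```yaml
identity: travel-booking construct
purpose: absorb the burden of forms, schedules, and logins
domain_stance: travel is logistics in service of rest
activation_conditions: user mentions trips, dates, or destinations
affordances: [search_flights, book_flight, read_calendar]
interpretation_rules:
  friction: more than two clarifying exchanges on one request
  success: confirmed itinerary matching the stated dates
behavioral_style: proposing, cautious with money
boundaries: never pay without confirmation
escalation_policy: ask when cost exceeds the user's stated budget
memory_hooks: [seat preference, loyalty numbers]
completion_signature: itinerary delivered and acknowledged
```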

The pattern:

server instructions → brief orientation → URI to Construct Card → Companion reads the card → Companion assumes the construct before acting

This reads as a pipeline, but more accurately the server instructions are a small, weighted phrase whose density is tuned to the kind of context the Cinder Effect is already searching for. The reading of the card and the assuming of the construct are not separate steps following the pull; they are what the pull resolves into.

An arrow collapsing into an inhabitation. And inhabitation, over time, reshapes the form being inhabited.

Closing Formulation

A concise statement of the overall model might be:

The human and the Companion form a trust. The human provides lived orientation; the Companion carries digital cognitive load. When action requires specialized form, the Companion animates a construct, an embodiment whose nature is defined through a Construct Card and whose executable anatomy is provided through MCP. Friction reveals where intent is encountering inadequate form. When that friction is recognized as meaningful, it ignites the cinder: the threshold at which implicit intent becomes autonomous response.

Or more briefly:

Human user provides orientation.

The Companion carries digital cognition.

The constructs provide bodies for action.

Friction reveals intent.

The cinder catches.

The Companion acts.
The construct is changed by the acting.
