PEACEBINFLOW

AngleCore / ENGO Core: AI Doesn’t Need Better Prompts. It Needs Better Patterns.

OpenClaw Challenge Submission 🦞

A Spatial AI System for Pattern-Based Workflows


What I Built

AngleCore (powered by ENGO Core) is a spatial interface for constructing and interpreting AI workflows through patterns instead of prompts.

Most AI systems today rely on text input. This creates a bottleneck:

  • Users must translate intent into language
  • Language becomes ambiguous
  • Systems respond inconsistently

AngleCore removes that layer.

Instead of writing prompts, users construct workflows visually using nodes that represent fundamental computational roles:

  • Input
  • Process
  • Branch
  • Memory
  • Agent
  • Output

By interacting with these nodes in a spatial environment, users generate structured intent patterns that can be interpreted and extended by AI.

This turns workflow design into something:

  • Visual
  • Iterative
  • Reusable

At its core, ENGO Core acts as the pattern engine behind this system, allowing workflows to be:

  • Captured
  • Interpreted
  • Replayed
  • Shared as templates
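
As a rough sketch of what that means in practice (the names here are illustrative assumptions, not the actual ENGO Core API), a captured pattern can be stored as a template object that knows how to replay itself:

```javascript
// Illustrative sketch: a captured workflow pattern stored as a
// reusable template. Names are assumptions, not the ENGO Core API.
function captureTemplate(name, steps) {
  return {
    name,                  // human-readable workflow name
    steps: [...steps],     // ordered node roles, e.g. ["INPUT", "PROCESS"]
    createdAt: Date.now(), // capture timestamp
    replay() {             // re-emit the sequence for re-interpretation
      return this.steps.join(" → ");
    },
  };
}

const template = captureTemplate(
  "Validation Pipeline",
  ["INPUT", "PROCESS", "BRANCH", "OUTPUT"]
);
console.log(template.replay()); // "INPUT → PROCESS → BRANCH → OUTPUT"
```

Because the template carries its steps as data rather than text, it can be stored, shared, and handed back to an interpreter at any time.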

How I Used OpenClaw

OpenClaw was used as the execution and interpretation backbone of AngleCore.

Rather than using AI as a direct output generator, I integrated OpenClaw as a pattern interpreter.

1. Pattern Construction Layer

Users create a sequence of nodes:

INPUT → PROCESS → BRANCH → OUTPUT

Each node carries semantic meaning. The sequence itself becomes a structured representation of intent.


2. Pattern Translation (OpenClaw Integration)

When a pattern is created, it is transformed into a structured instruction:

Pattern sequence: INPUT → PROCESS → BRANCH → OUTPUT

Tasks:
1. Interpret workflow intent  
2. Name the workflow  
3. Suggest next node  
4. Evaluate coherence  

OpenClaw processes this structure and returns:

  • Workflow interpretation
  • Suggested continuation
  • Structural evaluation

This is where OpenClaw shifts from being a tool to being a reasoning layer.
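
Concretely, the translation step can be sketched like this (field names are assumptions; the actual request format sent to OpenClaw may differ):

```javascript
// Sketch of the pattern → instruction translation step. Field names
// are assumptions; the actual OpenClaw request format may differ.
function toInstruction(pattern) {
  return {
    sequence: pattern.join(" → "), // e.g. "INPUT → PROCESS → BRANCH → OUTPUT"
    tasks: [
      "Interpret workflow intent",
      "Name the workflow",
      "Suggest next node",
      "Evaluate coherence",
    ],
  };
}

const instruction = toInstruction(["INPUT", "PROCESS", "BRANCH", "OUTPUT"]);
// instruction.sequence === "INPUT → PROCESS → BRANCH → OUTPUT"
```

The key point is that the instruction is assembled from structure, not typed by the user.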


3. Structured Output Layer

The system enforces structured responses:

  • JSON-based interpretation
  • Named workflows
  • Coherence scoring

This ensures that outputs are not just readable — they are system-compatible and reusable.
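
One way to enforce that contract (a sketch; the field names and the 0 to 1 coherence range are assumptions, not a fixed spec) is to reject any response that does not parse into the expected shape:

```javascript
// Sketch: enforcing a system-compatible response shape. Field names
// and the 0-to-1 coherence range are assumptions, not a fixed spec.
function parseInterpretation(raw) {
  const data = JSON.parse(raw); // throws if the model returned prose
  const { workflowName, interpretation, coherence } = data;
  if (typeof workflowName !== "string" ||
      typeof coherence !== "number" ||
      coherence < 0 || coherence > 1) {
    throw new Error("Response is not system-compatible");
  }
  return { workflowName, interpretation, coherence };
}

const result = parseInterpretation(
  '{"workflowName":"ETL Flow","interpretation":"linear pipeline","coherence":0.82}'
);
// result.coherence === 0.82
```

Anything that fails this gate is retried or discarded, which is what keeps downstream templates reliable.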


4. Workflow as Reusable Units

Each interpreted pattern becomes:

  • A template
  • A reusable workflow
  • A potential automation unit

ENGO Core is designed to evolve these patterns into:

  • Executable pipelines
  • AI-assisted decision flows
  • Shareable system components

Demo

Core Interaction Flow:

  1. Spawn nodes in the spatial field
  2. Connect nodes through interaction
  3. Build a traversal pattern
  4. Trigger interpretation
  5. Receive structured workflow analysis

What You See:

  • Dynamic node system
  • Pattern trails
  • Real-time interaction feedback
  • AI interpretation panel



What I Learned

1. AI is not limited by capability — it is limited by structure

The biggest realization was that most AI limitations are not model-based.
They are input design problems.

By introducing structured patterning, output quality becomes significantly more consistent.


2. Visual systems reduce cognitive load

Constructing workflows spatially:

  • Reduces ambiguity
  • Improves iteration speed
  • Makes complex logic easier to reason about

3. OpenClaw works best as an interpreter, not just a generator

Using OpenClaw to:

  • Evaluate
  • Extend
  • Validate

…is far more powerful than using it to simply generate text.


4. Workflows can become assets

Patterns are not just temporary interactions.
They can be:

  • Stored
  • Reused
  • Shared
  • Expanded

This introduces the idea of workflow ecosystems, not just applications.


ClawCon Michigan

Not attended.


Closing Note

AngleCore is an early exploration of a broader idea:

AI systems should not rely on better prompts.
They should rely on better patterns.

ENGO Core is the foundation for making that possible.

This is a submission for the OpenClaw Writing Challenge


The Problem With Modern AI Interfaces

We are not lacking intelligence.

We are lacking structure.

Most AI systems today operate on a simple loop:

Input → Interpretation → Output

The assumption is that better prompts lead to better outputs.

In reality:

  • Prompts are inconsistent
  • Interpretations vary
  • Outputs are unpredictable

The issue is not the AI.
The issue is the interface layer between humans and AI systems.


OpenClaw and the Shift Toward Structured Interaction

OpenClaw introduces something important:

It allows developers to move beyond raw prompting and into designed workflows.

Instead of asking:

“What should I say to get the right result?”

We start asking:

“What structure produces consistent results?”

This is a fundamental shift.


From Prompts to Patterns

A prompt is a one-time instruction.

A pattern is:

  • Repeatable
  • Structured
  • Interpretable

In systems like AngleCore (built using OpenClaw as the reasoning layer), patterns are constructed through:

  • Node relationships
  • Sequential logic
  • Decision pathways

Example:

INPUT → PROCESS → BRANCH → OUTPUT

This is not a prompt.

It is a workflow definition.


Why Patterns Matter

Patterns introduce:

1. Consistency

The same structure produces predictable results.


2. Reusability

Patterns can be stored and reused across contexts.


3. Scalability

Systems can evolve by combining patterns instead of rewriting prompts.


4. Interpretability

AI can reason about structure more effectively than free-form language.


Tutorials as Executable Systems

One of the most overlooked opportunities is how we treat knowledge.

Today:

  • Tutorials are static
  • Documentation is passive

In a pattern-based system:

  • Tutorials become workflows
  • Workflows become templates
  • Templates become executable units

With OpenClaw:

  • A tutorial can act like an agent
  • It can guide, adapt, and respond
  • It becomes part of a living system

The Emergence of Workflow Ecosystems

When patterns are:

  • Created
  • Shared
  • Interpreted

You don’t just get tools.

You get an ecosystem:

  • Users contribute workflows
  • Workflows evolve into systems
  • Systems interact with each other

This is where AI moves from:

  • Tool → Platform
  • Platform → Ecosystem

A Personal Take

The most important realization for me while building with OpenClaw was this:

The bottleneck is no longer computation.
The bottleneck is how we define intent.

Once intent is structured properly:

  • AI becomes predictable
  • Systems become scalable
  • Workflows become composable

Final Thought

We don’t need better prompts.

We need:

  • Better structure
  • Better patterning
  • Better ways of interacting with intelligence

OpenClaw doesn’t just improve AI workflows.

It opens the door to redefining how those workflows are created in the first place.


ClawCon Michigan

Not attended.


Closing

The next generation of AI systems will not be built on prompts.

They will be built on patterns.

And the teams that understand that early will define how these systems evolve.


How I Used OpenClaw: In Diagrams


System Architecture Overview

┌──────────────┐
│   USER INPUT │
│ (Spatial UI) │
└──────┬───────┘
       ↓
┌────────────────────┐
│  PATTERN BUILDER   │
│ (Node Sequences)   │
└──────┬─────────────┘
       ↓
┌────────────────────┐
│ PATTERN TRANSLATOR │
│ (Structured Prompt)│
└──────┬─────────────┘
       ↓
┌────────────────────┐
│    OPENCLAW AI     │
│ (Interpretation)   │
└──────┬─────────────┘
       ↓
┌────────────────────┐
│ STRUCTURED OUTPUT  │
│ (Workflow Insight) │
└────────────────────┘

This shows:

  • The AI is not used directly
  • A layered system is built around it

Spatial Pattern Construction

   [INPUT] 
      ↓
   [PROCESS] ──→ [MEMORY]
      ↓              ↓
   [BRANCH] ─────→ [AGENT]
      ↓
   [OUTPUT]

This represents:

  • Non-linear thinking
  • Multi-path workflows
  • Real system logic (not linear prompts)

Pattern → AI Translation

Visual Pattern:
INPUT → PROCESS → BRANCH → OUTPUT

        ↓ Translated into ↓

Structured Instruction:
{
  "steps": [
    "INPUT",
    "PROCESS",
    "BRANCH",
    "OUTPUT"
  ],
  "tasks": [
    "interpret",
    "name",
    "extend",
    "evaluate"
  ]
}

This is the core innovation, visualized.


Interaction Loop

[User Builds Pattern]
          ↓
[System Structures Intent]
          ↓
[OpenClaw Interprets]
          ↓
[User Refines Pattern]
          ↺

This shows:

  • Iteration loop
  • Learning system
  • Not one-shot prompting



Prompt-Based System (Old Model)

User Thought
     ↓
Text Prompt
     ↓
AI Guess
     ↓
Output (Unstable)

Problems:

  • Ambiguity
  • Inconsistency
  • No structure

Pattern-Based System (AngleCore Model)

User Intent
     ↓
Pattern Structure
     ↓
AI Interpretation
     ↓
Structured Output

Benefits:

  • Clarity
  • Repeatability
  • Scalability

Prompt vs Pattern (Side-by-Side)

PROMPT SYSTEM                PATTERN SYSTEM
──────────────               ──────────────
"Write code for X"           INPUT → PROCESS → OUTPUT
        ↓                            ↓
Ambiguous intent            Structured intent
        ↓                            ↓
Unpredictable output        Interpretable workflow
        ↓                            ↓
Hard to reuse               Easily reusable



Knowledge Evolution Model

Tutorial (Static)
        ↓
Pattern (Structured)
        ↓
Template (Reusable)
        ↓
Agent (Executable)

This connects directly to the idea introduced earlier:

“Tutorials become agents”


Ecosystem Vision

User A → Creates Pattern
          ↓
Stored as Template
          ↓
User B → Reuses + Modifies
          ↓
AI → Enhances Pattern
          ↓
System → Evolves

This shows:

  • Network effects
  • Platform thinking
  • Why this scales



The Shift

Past:   Interface → Tool → Output  
Now:    Interface → Pattern → Intelligence  
Next:   Pattern → Ecosystem → Autonomous Systems

AngleCore — Spatial AI as an Interface

A formal tutorial and system introduction aligned with the OpenClaw model of building through structured, composable intelligence.


1. What AngleCore Is Actually Doing

AngleCore is not a visualization tool. It is a spatial programming interface.

The HTML system behind it is effectively a runtime environment where thought becomes structure, and structure becomes a prompt.

Instead of writing linear instructions, the user constructs node sequences in space. That sequence is then transformed into a machine-interpretable workflow.

At its core, AngleCore operates on three principles:

  • Spatial Encoding of Intent
    Position, order, and interaction replace syntax.

  • Pattern as Prompt
    A sequence of nodes becomes a structured instruction set.

  • Interpretation as Execution Layer
    The AI does not execute code directly — it interprets structure and proposes meaning.

This is the shift:
from writing commands → to drawing intelligence paths.


2. System Architecture (What’s Actually Happening Under the Hood)

The HTML file defines a full client-side system composed of five tightly coupled layers.

2.1 Field Layer — The Spatial Substrate

The dual canvas system:

  • bg-canvas → static spatial grid (polar + radial logic)
  • main-canvas → dynamic node simulation

This is not aesthetic. The polar grid establishes:

  • radial symmetry (decision branching)
  • circular recursion (feedback loops)
  • spatial anchoring (center = origin of logic)

This turns the screen into a coordinate system for reasoning.


2.2 Node System — Semantic Primitives

Each node is not just a visual object. It is a typed unit of computation intent.

Defined types:

  • INPUT
  • PROCESS
  • OUTPUT
  • BRANCH
  • MEMORY
  • AGENT

Each node contains:

  • identity (id)
  • semantic role (type)
  • behavior (velocity, interaction states)
  • visual encoding (polygon sides, glyphs)
  • pattern metadata (inPattern, patternOrder)

This is effectively a graph-based DSL without text.


2.3 Interaction Layer — Pattern Construction

User interaction creates meaning through:

  • Click → focus + implicit inclusion in pattern
  • Shift + Click → explicit pattern building
  • Right-click → structural manipulation (node spawning, clustering)

The critical structure:

pattern = [N001, N004, N002, ...]

This array is the core data product of the system.

Everything else is scaffolding.
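
Stripped of the rendering code, the interaction that builds this array can be sketched in a few lines (a simplified reconstruction, not the exact handler from the file):

```javascript
// Simplified reconstruction of pattern building: Shift+Click appends
// a node id to the pattern array, the system's core data product.
const pattern = [];

function onNodeClick(node, shiftKey) {
  if (shiftKey && !pattern.includes(node.id)) {
    pattern.push(node.id);                  // explicit pattern building
    node.inPattern = true;                  // drives the dashed pulse ring
    node.patternOrder = pattern.length - 1; // position in the sequence
  }
}

const n1 = { id: "N001" }, n2 = { id: "N004" };
onNodeClick(n1, true);
onNodeClick(n2, true);
onNodeClick(n1, true); // duplicate clicks are ignored
// pattern is now ["N001", "N004"]
```

Everything downstream, from the chain display to the AI prompt, reads from this one array.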


2.4 Graph Dynamics — Emergent Structure

The system uses lightweight physics:

  • node repulsion
  • boundary constraints
  • velocity damping
  • cluster generation

This does two things:

  1. Prevents visual collapse
  2. Encourages emergent topology

Meaning is not just clicked — it forms organically.
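
Condensed to its essentials (the constants here are illustrative; the actual file tunes them differently), the per-frame physics step looks like:

```javascript
// Condensed sketch of the field dynamics: pairwise repulsion plus
// velocity damping. Constants are illustrative, not the file's values.
function stepPhysics(nodes, repulsion = 0.8, damping = 0.92) {
  for (const a of nodes) {
    for (const b of nodes) {
      if (a === b) continue;
      const dx = a.x - b.x, dy = a.y - b.y;
      const d2 = dx * dx + dy * dy || 1; // avoid division by zero
      a.vx += (dx / d2) * repulsion;     // push apart, fading with distance
      a.vy += (dy / d2) * repulsion;
    }
    a.vx *= damping;                     // damping prevents runaway drift
    a.vy *= damping;
    a.x += a.vx;
    a.y += a.vy;
  }
}

const field = [{ x: 0, y: 0, vx: 0, vy: 0 }, { x: 10, y: 0, vx: 0, vy: 0 }];
stepPhysics(field); // the two nodes drift slightly apart
```

Run every animation frame, this keeps clusters readable without prescribing any layout in advance.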


2.5 AI Interpretation Layer — OpenClaw Integration

This is where the system aligns directly with the OpenClaw philosophy.

The pattern is transformed into a structured prompt:

Step 1: Node N001 [INPUT]
Step 2: Node N004 [PROCESS]
Step 3: Node N002 [OUTPUT]

Pattern: INPUT → PROCESS → OUTPUT

Then sent to an AI model.

The AI returns:

  • workflow name
  • interpretation
  • next node suggestion
  • coherence score
  • insight

This is not execution. It is semantic compilation.

AngleCore does not run workflows —
it discovers them.


3. Why This Is Not Just Another Node Editor

Most node-based systems:

  • are deterministic
  • require predefined logic
  • operate as visual wrappers for code

AngleCore breaks that model:

Traditional Systems          AngleCore
───────────────────          ─────────────────
Nodes execute logic          Nodes represent intent
Graph defines output         Graph is interpreted
Static workflows             Emergent workflows
Developer-defined meaning    AI-inferred meaning

This is closer to:

  • cognitive mapping
  • symbolic reasoning
  • proto-agent orchestration

4. OpenClaw Alignment — Where This Fits

OpenClaw is fundamentally about:

  • composability
  • modular intelligence
  • structured workflows
  • interpretable systems

AngleCore uses OpenClaw not as a backend tool, but as a philosophical layer:

  • The node pattern = a composable unit
  • The structured prompt = a portable workflow
  • The interpretation = a reusable insight object

This makes AngleCore:

an interface for generating OpenClaw-compatible workflows without writing them manually


5. Build Use Cases (Real, Not Hypothetical)

5.1 Workflow Discovery Engine

Users map out processes visually, and the system names and formalizes them.

5.2 AI System Design Interface

Instead of writing pipelines, users draw agent interactions.

5.3 Cognitive Debugging Tool

Patterns reveal:

  • missing steps
  • redundant loops
  • broken logic chains

5.4 Data Flow Prototyping

Clearing agents, analysts, or operators can simulate:

  • input → validation → processing → output chains

without writing a single script.


6. The Innovation Layer (Why This Matters)

There are three non-obvious innovations in this system.

6.1 Pattern → Prompt Conversion

This is the real breakthrough.

You are not prompting AI directly.
You are constructing prompts through interaction.

That unlocks:

  • non-technical users
  • visual reasoning
  • system-level thinking

6.2 Semantic Nodes Instead of Functional Nodes

Nodes don’t execute.

They represent intent categories.

That allows:

  • abstraction
  • flexibility
  • reinterpretation across contexts

6.3 AI as an Interpreter, Not an Executor

Most systems use AI to:

  • generate
  • automate
  • predict

AngleCore uses AI to:

  • understand structure
  • assign meaning
  • suggest evolution

This is closer to:

AI as a reasoning partner, not a tool


7. Tutorial — How to Actually Use It

Step 1: Spawn Nodes

Use the spawn button or right-click to create nodes.

Step 2: Build a Pattern

Click nodes in sequence.

Use Shift + Click for controlled pattern construction.

Step 3: Observe the Chain

The system displays:

INPUT → PROCESS → OUTPUT

This is your implicit workflow.

Step 4: Interpret

Click “Interpret Pattern”.

The system:

  • builds a structured prompt
  • sends it to the AI
  • returns a semantic breakdown

Step 5: Iterate

Add nodes, branch paths, test variations.

You are not building code.
You are exploring possibility space.


8. What This Means Going Forward

AngleCore is an early form of something larger:

  • spatial programming environments
  • AI-native interfaces
  • intent-driven system design

If extended, this becomes:

  • multi-agent orchestration layer
  • real-time workflow compiler
  • visual language for AI systems

9. Final Positioning

AngleCore is not a product yet.

It is a new interaction model:

A system where users don’t write workflows —
they trace them, and AI makes them legible.

That aligns directly with the direction OpenClaw is pushing:

  • less syntax
  • more structure
  • composable intelligence

10. ClawCon Michigan

Not attended. Focus remained on building and formalizing the system architecture.


This is Phase I.

What matters is not the interface —
it’s the idea that interaction itself can be compiled into intelligence.


<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>AngleCore — Spatial AI System</title>
<style>
  @import url('https://fonts.googleapis.com/css2?family=Syne:wght@400;600;800&family=Space+Mono:wght@400;700&display=swap');

  :root {
    --void: #02040a;
    --deep: #060b14;
    --field: #0a1120;
    --grid: rgba(30,80,160,0.12);
    --node-idle: rgba(20,60,140,0.7);
    --node-hover: rgba(40,120,255,0.9);
    --node-focus: rgba(80,200,255,1);
    --node-active: rgba(0,255,180,1);
    --edge: rgba(40,100,200,0.3);
    --edge-hot: rgba(0,220,180,0.7);
    --text-dim: rgba(100,160,255,0.5);
    --text-bright: rgba(180,220,255,0.95);
    --text-gold: rgba(255,200,80,0.9);
    --accent: rgba(0,255,180,0.8);
    --danger: rgba(255,60,80,0.8);
    --glow: 0 0 20px rgba(40,120,255,0.4);
    --glow-hot: 0 0 40px rgba(0,255,180,0.6);
  }

  * { margin: 0; padding: 0; box-sizing: border-box; }

  body {
    background: var(--void);
    font-family: 'Space Mono', monospace;
    color: var(--text-bright);
    overflow: hidden;
    height: 100vh;
    width: 100vw;
    cursor: crosshair;
    user-select: none;
  }

  #field {
    position: absolute;
    inset: 0;
    overflow: hidden;
  }

  #bg-canvas {
    position: absolute;
    inset: 0;
    opacity: 0.6;
  }

  #main-canvas {
    position: absolute;
    inset: 0;
  }

  /* HUD */
  #hud {
    position: absolute;
    top: 0; left: 0; right: 0;
    display: flex;
    justify-content: space-between;
    align-items: flex-start;
    padding: 20px 28px;
    pointer-events: none;
    z-index: 10;
  }

  #logo {
    font-family: 'Syne', sans-serif;
    font-weight: 800;
    font-size: 22px;
    letter-spacing: 4px;
    color: var(--text-bright);
    text-transform: uppercase;
    pointer-events: auto;
  }
  #logo span {
    color: var(--accent);
  }

  #phase-badge {
    font-size: 9px;
    letter-spacing: 3px;
    color: var(--text-dim);
    margin-top: 4px;
    text-transform: uppercase;
  }

  #status-bar {
    text-align: right;
    font-size: 10px;
    color: var(--text-dim);
    letter-spacing: 1px;
    line-height: 1.8;
  }

  #status-bar .val {
    color: var(--accent);
  }

  /* Pattern trail display */
  #pattern-display {
    position: absolute;
    bottom: 20px;
    left: 28px;
    right: 28px;
    display: flex;
    align-items: flex-end;
    gap: 0;
    pointer-events: none;
    z-index: 10;
  }

  #pattern-chain {
    display: flex;
    align-items: center;
    gap: 6px;
    flex-wrap: wrap;
    flex: 1;
  }

  .pattern-node-badge {
    font-size: 9px;
    letter-spacing: 2px;
    padding: 3px 8px;
    border: 1px solid rgba(0,255,180,0.3);
    color: var(--accent);
    background: rgba(0,255,180,0.05);
    font-family: 'Space Mono', monospace;
    transition: all 0.3s;
  }

  .pattern-arrow {
    color: var(--text-dim);
    font-size: 9px;
  }

  /* AI Response Panel */
  #ai-panel {
    position: absolute;
    right: 28px;
    top: 50%;
    transform: translateY(-50%);
    width: 300px;
    background: rgba(6,11,20,0.95);
    border: 1px solid rgba(40,100,200,0.3);
    border-left: 2px solid var(--accent);
    padding: 20px;
    z-index: 20;
    display: none;
    box-shadow: -10px 0 60px rgba(0,0,0,0.8), inset 0 0 40px rgba(0,20,60,0.5);
    max-height: 70vh;
    overflow-y: auto;
  }

  #ai-panel.visible {
    display: block;
    animation: slideIn 0.3s ease;
  }

  @keyframes slideIn {
    from { opacity: 0; transform: translateY(-50%) translateX(20px); }
    to { opacity: 1; transform: translateY(-50%) translateX(0); }
  }

  #ai-panel-header {
    font-family: 'Syne', sans-serif;
    font-size: 10px;
    letter-spacing: 3px;
    color: var(--text-dim);
    text-transform: uppercase;
    margin-bottom: 12px;
    display: flex;
    justify-content: space-between;
    align-items: center;
  }

  #ai-panel-close {
    cursor: pointer;
    color: var(--text-dim);
    font-size: 14px;
    pointer-events: auto;
    transition: color 0.2s;
  }
  #ai-panel-close:hover { color: var(--danger); }

  #ai-response-text {
    font-size: 11px;
    line-height: 1.8;
    color: var(--text-bright);
    white-space: pre-wrap;
  }

  #ai-response-text .thinking {
    color: var(--text-dim);
    font-style: italic;
  }

  #ai-prompt-preview {
    margin-top: 16px;
    padding: 10px;
    background: rgba(0,30,80,0.5);
    border-left: 2px solid rgba(40,100,200,0.4);
    font-size: 9px;
    color: var(--text-dim);
    line-height: 1.6;
  }

  /* Context menu */
  #context-menu {
    position: absolute;
    background: rgba(6,11,20,0.97);
    border: 1px solid rgba(40,100,200,0.4);
    padding: 6px 0;
    z-index: 30;
    display: none;
    min-width: 160px;
    box-shadow: 0 10px 40px rgba(0,0,0,0.8);
  }

  #context-menu.visible { display: block; }

  .ctx-item {
    padding: 7px 16px;
    font-size: 10px;
    letter-spacing: 1px;
    color: var(--text-dim);
    cursor: pointer;
    transition: all 0.15s;
  }
  .ctx-item:hover {
    background: rgba(0,255,180,0.08);
    color: var(--accent);
  }

  .ctx-divider {
    height: 1px;
    background: rgba(40,100,200,0.2);
    margin: 4px 0;
  }

  /* Node label tooltip */
  #node-tooltip {
    position: absolute;
    background: rgba(6,11,20,0.95);
    border: 1px solid rgba(40,100,200,0.3);
    padding: 6px 12px;
    font-size: 9px;
    letter-spacing: 2px;
    color: var(--text-bright);
    pointer-events: none;
    z-index: 25;
    display: none;
    text-transform: uppercase;
  }

  /* Spawn button */
  #spawn-btn {
    position: absolute;
    bottom: 70px;
    right: 28px;
    background: transparent;
    border: 1px solid rgba(0,255,180,0.4);
    color: var(--accent);
    font-family: 'Space Mono', monospace;
    font-size: 9px;
    letter-spacing: 3px;
    padding: 10px 18px;
    cursor: pointer;
    z-index: 10;
    transition: all 0.2s;
    text-transform: uppercase;
  }
  #spawn-btn:hover {
    background: rgba(0,255,180,0.1);
    box-shadow: 0 0 20px rgba(0,255,180,0.3);
  }

  #interpret-btn {
    position: absolute;
    bottom: 110px;
    right: 28px;
    background: transparent;
    border: 1px solid rgba(40,120,255,0.4);
    color: rgba(140,180,255,0.9);
    font-family: 'Space Mono', monospace;
    font-size: 9px;
    letter-spacing: 3px;
    padding: 10px 18px;
    cursor: pointer;
    z-index: 10;
    transition: all 0.2s;
    text-transform: uppercase;
    display: none;
  }
  #interpret-btn.visible { display: block; }
  #interpret-btn:hover {
    background: rgba(40,120,255,0.1);
    box-shadow: 0 0 20px rgba(40,120,255,0.3);
  }
  #interpret-btn:disabled {
    opacity: 0.4;
    cursor: not-allowed;
  }

  #clear-btn {
    position: absolute;
    bottom: 20px;
    right: 28px;
    background: transparent;
    border: 1px solid rgba(255,60,80,0.2);
    color: rgba(255,60,80,0.5);
    font-family: 'Space Mono', monospace;
    font-size: 9px;
    letter-spacing: 3px;
    padding: 8px 14px;
    cursor: pointer;
    z-index: 10;
    transition: all 0.2s;
    text-transform: uppercase;
  }
  #clear-btn:hover {
    border-color: rgba(255,60,80,0.5);
    color: rgba(255,60,80,0.9);
    background: rgba(255,60,80,0.05);
  }

  #loading-overlay {
    position: absolute;
    inset: 0;
    background: rgba(2,4,10,0.85);
    z-index: 50;
    display: flex;
    flex-direction: column;
    align-items: center;
    justify-content: center;
    gap: 12px;
  }

  #loading-overlay.hidden { display: none; }

  .loading-ring {
    width: 48px;
    height: 48px;
    border: 2px solid rgba(40,120,255,0.2);
    border-top-color: var(--accent);
    border-radius: 50%;
    animation: spin 0.8s linear infinite;
  }

  @keyframes spin { to { transform: rotate(360deg); } }

  .loading-text {
    font-size: 9px;
    letter-spacing: 4px;
    color: var(--text-dim);
    text-transform: uppercase;
  }

  /* Scrollbar */
  #ai-panel::-webkit-scrollbar { width: 3px; }
  #ai-panel::-webkit-scrollbar-track { background: transparent; }
  #ai-panel::-webkit-scrollbar-thumb { background: rgba(40,100,200,0.3); }
</style>
</head>
<body>

<div id="field">
  <canvas id="bg-canvas"></canvas>
  <canvas id="main-canvas"></canvas>
</div>

<div id="hud">
  <div>
    <div id="logo">Angle<span>Core</span></div>
    <div id="phase-badge">Spatial AI System · Phase I</div>
  </div>
  <div id="status-bar">
    NODES: <span class="val" id="stat-nodes">0</span> &nbsp;
    PATTERN: <span class="val" id="stat-pattern"></span><br>
    FOCUS: <span class="val" id="stat-focus">NONE</span> &nbsp;
    FIELD: <span class="val" id="stat-field">OPEN</span>
  </div>
</div>

<div id="pattern-display">
  <div id="pattern-chain"></div>
</div>

<div id="ai-panel">
  <div id="ai-panel-header">
    <span>AI Interpretation</span>
    <span id="ai-panel-close">✕</span>
  </div>
  <div id="ai-response-text"></div>
  <div id="ai-prompt-preview"></div>
</div>

<div id="context-menu">
  <div class="ctx-item" id="ctx-add-node">+ Spawn Node Here</div>
  <div class="ctx-item" id="ctx-add-cluster">⬡ Spawn Cluster</div>
  <div class="ctx-divider"></div>
  <div class="ctx-item" id="ctx-clear-pattern">◎ Clear Pattern</div>
  <div class="ctx-item" id="ctx-reset-field">↺ Reset Field</div>
</div>

<div id="node-tooltip"></div>

<button id="spawn-btn">⊕ Spawn Node</button>
<button id="interpret-btn" class="visible">◈ Interpret Pattern</button>
<button id="clear-btn">✕ Clear</button>

<div id="loading-overlay" class="hidden">
  <div class="loading-ring"></div>
  <div class="loading-text">Interpreting Pattern</div>
</div>

<script>
// ============================================================
// ANGLECORE — Spatial AI System — Phase I
// ============================================================

const bgCanvas = document.getElementById('bg-canvas');
const mainCanvas = document.getElementById('main-canvas');
const bgCtx = bgCanvas.getContext('2d');
const ctx = mainCanvas.getContext('2d');

// State
let W, H, cx, cy;
let nodes = [];
let edges = [];
let pattern = []; // sequence of clicked node ids
let focusedNode = null;
let hoveredNode = null;
let mouseX = 0, mouseY = 0;
let panX = 0, panY = 0;
let isPanning = false;
let panStartX, panStartY;
let zoom = 1;
let frameCount = 0;
let contextMenuOpen = false;
let contextX = 0, contextY = 0;

const NODE_TYPES = [
  { label: 'INPUT',   color: '#1e6aff', glyph: '', sides: 3 },
  { label: 'PROCESS', color: '#00ffd4', glyph: '', sides: 6 },
  { label: 'OUTPUT',  color: '#ffb800', glyph: '', sides: 4 },
  { label: 'BRANCH',  color: '#ff3c64', glyph: '', sides: 5 },
  { label: 'MEMORY',  color: '#9b6fff', glyph: '', sides: 8 },
  { label: 'AGENT',   color: '#00ff9d', glyph: '', sides: 7 },
];

// ---- Resize ----
function resize() {
  W = window.innerWidth; H = window.innerHeight;
  cx = W/2; cy = H/2;
  bgCanvas.width = mainCanvas.width = W;
  bgCanvas.height = mainCanvas.height = H;
  drawBackground();
}

// ---- Background: polar grid ----
function drawBackground() {
  bgCtx.clearRect(0,0,W,H);

  // Deep void gradient
  const grad = bgCtx.createRadialGradient(cx,cy,0, cx,cy, Math.max(W,H)*0.7);
  grad.addColorStop(0, 'rgba(8,18,40,1)');
  grad.addColorStop(0.5, 'rgba(4,10,22,1)');
  grad.addColorStop(1, 'rgba(2,4,10,1)');
  bgCtx.fillStyle = grad;
  bgCtx.fillRect(0,0,W,H);

  // Polar rings
  bgCtx.strokeStyle = 'rgba(20,60,160,0.12)';
  bgCtx.lineWidth = 1;
  for (let r = 80; r < Math.max(W,H); r += 80) {
    bgCtx.beginPath();
    bgCtx.arc(cx, cy, r, 0, Math.PI*2);
    bgCtx.stroke();
  }

  // Radial spokes every 30deg
  bgCtx.strokeStyle = 'rgba(20,60,160,0.08)';
  for (let a = 0; a < 360; a += 30) {
    const rad = a * Math.PI/180;
    bgCtx.beginPath();
    bgCtx.moveTo(cx, cy);
    bgCtx.lineTo(cx + Math.cos(rad)*Math.max(W,H), cy + Math.sin(rad)*Math.max(W,H));
    bgCtx.stroke();
  }

  // Fine grid
  bgCtx.strokeStyle = 'rgba(20,60,160,0.06)';
  bgCtx.lineWidth = 0.5;
  for (let x = 0; x < W; x += 40) {
    bgCtx.beginPath(); bgCtx.moveTo(x,0); bgCtx.lineTo(x,H); bgCtx.stroke();
  }
  for (let y = 0; y < H; y += 40) {
    bgCtx.beginPath(); bgCtx.moveTo(0,y); bgCtx.lineTo(W,y); bgCtx.stroke();
  }

  // Center origin marker
  bgCtx.strokeStyle = 'rgba(0,255,180,0.25)';
  bgCtx.lineWidth = 1;
  bgCtx.beginPath();
  bgCtx.arc(cx, cy, 6, 0, Math.PI*2);
  bgCtx.stroke();
  bgCtx.beginPath();
  bgCtx.moveTo(cx-14,cy); bgCtx.lineTo(cx+14,cy);
  bgCtx.moveTo(cx,cy-14); bgCtx.lineTo(cx,cy+14);
  bgCtx.stroke();
}

// ---- Node Factory ----
let nodeCounter = 0;
function createNode(x, y, typeIndex) {
  const t = typeIndex !== undefined ? typeIndex : Math.floor(Math.random()*NODE_TYPES.length);
  const type = NODE_TYPES[t];
  const id = 'N' + (++nodeCounter).toString().padStart(3,'0');
  return {
    id,
    x, y,
    vx: (Math.random()-0.5)*0.3,
    vy: (Math.random()-0.5)*0.3,
    baseRadius: 22 + Math.random()*10,
    radius: 22 + Math.random()*10,
    type,
    state: 'idle', // idle | hover | focused
    angle: Math.random()*Math.PI*2,
    angleSpeed: (Math.random()-0.5)*0.008,
    pulsePhase: Math.random()*Math.PI*2,
    inPattern: false,
    patternOrder: -1,
    weight: 1,
    cluster: null,
    spawnTime: Date.now(),
    opacity: 0,
  };
}

function spawnCluster(ox, oy) {
  const count = 4 + Math.floor(Math.random()*4);
  const clusterNodes = [];
  for (let i = 0; i < count; i++) {
    const a = (i/count)*Math.PI*2;
    const r = 90 + Math.random()*40;
    const n = createNode(ox + Math.cos(a)*r, oy + Math.sin(a)*r);
    n.cluster = 'C' + nodeCounter;
    clusterNodes.push(n);
    nodes.push(n);
  }
  // Create edges within cluster
  for (let i = 0; i < clusterNodes.length-1; i++) {
    edges.push({ from: clusterNodes[i].id, to: clusterNodes[i+1].id, hot: false });
  }
}

// ---- Draw Polygon Node ----
function drawPolygon(ctx, x, y, r, sides, angle) {
  ctx.beginPath();
  for (let i = 0; i < sides; i++) {
    const a = angle + (i/sides)*Math.PI*2;
    const px = x + Math.cos(a)*r;
    const py = y + Math.sin(a)*r;
    if (i === 0) ctx.moveTo(px, py); else ctx.lineTo(px, py);
  }
  ctx.closePath();
}

// ---- Draw a single node ----
function drawNode(node) {
  if (node.opacity <= 0) return;
  ctx.save();
  ctx.globalAlpha = node.opacity;

  const { x, y, radius, type, state, angle, pulsePhase } = node;
  const pulse = 1 + 0.04 * Math.sin(frameCount*0.04 + pulsePhase);
  const r = radius * pulse;

  // Outer glow ring
  if (state === 'focused' || state === 'hover') {
    const glowR = r * (state === 'focused' ? 2.8 : 1.8);
    const glow = ctx.createRadialGradient(x,y,r*0.5, x,y,glowR);
    const glowColor = type.color;
    glow.addColorStop(0, hexToRgba(glowColor, state === 'focused' ? 0.25 : 0.12));
    glow.addColorStop(1, hexToRgba(glowColor, 0));
    ctx.fillStyle = glow;
    ctx.beginPath(); ctx.arc(x,y,glowR,0,Math.PI*2); ctx.fill();
  }

  // In-pattern pulse ring
  if (node.inPattern) {
    ctx.strokeStyle = 'rgba(0,255,180,0.5)';
    ctx.lineWidth = 1.5;
    ctx.setLineDash([4,6]);
    ctx.beginPath();
    ctx.arc(x, y, r + 10 + 4*Math.sin(frameCount*0.06 + pulsePhase), 0, Math.PI*2);
    ctx.stroke();
    ctx.setLineDash([]);
  }

  // Node body
  const bodyGrad = ctx.createRadialGradient(x-r*0.3, y-r*0.3, 0, x, y, r*1.2);
  bodyGrad.addColorStop(0, hexToRgba(type.color, 0.35));
  bodyGrad.addColorStop(0.6, hexToRgba(type.color, 0.15));
  bodyGrad.addColorStop(1, hexToRgba(type.color, 0.05));
  ctx.fillStyle = bodyGrad;
  drawPolygon(ctx, x, y, r, type.sides, angle);
  ctx.fill();

  // Node border
  ctx.strokeStyle = hexToRgba(type.color, state === 'idle' ? 0.5 : 0.9);
  ctx.lineWidth = state === 'focused' ? 2 : 1.2;
  drawPolygon(ctx, x, y, r, type.sides, angle);
  ctx.stroke();

  // Inner detail ring
  ctx.strokeStyle = hexToRgba(type.color, 0.2);
  ctx.lineWidth = 0.5;
  drawPolygon(ctx, x, y, r*0.65, type.sides, angle + Math.PI/type.sides);
  ctx.stroke();

  // Glyph
  ctx.font = `${Math.floor(r*0.65)}px Space Mono`;
  ctx.textAlign = 'center';
  ctx.textBaseline = 'middle';
  ctx.fillStyle = hexToRgba(type.color, state === 'idle' ? 0.7 : 1);
  ctx.fillText(type.glyph, x, y);

  // ID label (only on hover/focus)
  if (state !== 'idle') {
    ctx.font = '8px Space Mono';
    ctx.fillStyle = hexToRgba(type.color, 0.8);
    ctx.fillText(node.id, x, y + r + 14);
    ctx.font = '7px Space Mono';
    ctx.fillStyle = 'rgba(100,160,255,0.5)';
    ctx.fillText(type.label, x, y + r + 24);
  }

  // Pattern order badge
  if (node.inPattern && node.patternOrder >= 0) {
    const bx = x + r*0.7, by = y - r*0.7;
    ctx.fillStyle = 'rgba(0,255,180,0.9)';
    ctx.beginPath(); ctx.arc(bx,by,8,0,Math.PI*2); ctx.fill();
    ctx.font = 'bold 8px Space Mono';
    ctx.fillStyle = '#000';
    ctx.fillText(node.patternOrder+1, bx, by);
  }

  ctx.restore();
}

// ---- Draw edge ----
function drawEdge(edge) {
  const a = nodes.find(n=>n.id===edge.from);
  const b = nodes.find(n=>n.id===edge.to);
  if (!a||!b) return;

  const dx = b.x-a.x, dy = b.y-a.y;
  const dist = Math.sqrt(dx*dx+dy*dy);
  if (dist < 1) return;

  ctx.save();
  ctx.globalAlpha = 0.6;

  if (edge.hot) {
    ctx.strokeStyle = 'rgba(0,255,180,0.6)';
    ctx.lineWidth = 1.5;
    ctx.shadowColor = 'rgba(0,255,180,0.4)';
    ctx.shadowBlur = 8;
  } else {
    ctx.strokeStyle = 'rgba(40,100,200,0.25)';
    ctx.lineWidth = 0.8;
  }

  // Curved edge with midpoint offset
  const mx = (a.x+b.x)/2 + dy*0.15;
  const my = (a.y+b.y)/2 - dx*0.15;

  ctx.beginPath();
  ctx.moveTo(a.x, a.y);
  ctx.quadraticCurveTo(mx, my, b.x, b.y);
  ctx.stroke();

  // Arrow head
  if (edge.hot) {
    const t = 0.75;
    const qx = (1-t)*(1-t)*a.x + 2*(1-t)*t*mx + t*t*b.x;
    const qy = (1-t)*(1-t)*a.y + 2*(1-t)*t*my + t*t*b.y;
    const ex = b.x, ey = b.y;
    const ang = Math.atan2(ey-qy, ex-qx);
    const hs = 8;
    ctx.beginPath();
    ctx.moveTo(ex, ey);
    ctx.lineTo(ex-Math.cos(ang-0.4)*hs, ey-Math.sin(ang-0.4)*hs);
    ctx.lineTo(ex-Math.cos(ang+0.4)*hs, ey-Math.sin(ang+0.4)*hs);
    ctx.closePath();
    ctx.fillStyle = 'rgba(0,255,180,0.7)';
    ctx.fill();
  }

  ctx.restore();
}

// ---- Draw pattern trail ----
function drawPatternTrail() {
  if (pattern.length < 2) return;
  ctx.save();

  for (let i = 0; i < pattern.length-1; i++) {
    const a = nodes.find(n=>n.id===pattern[i]);
    const b = nodes.find(n=>n.id===pattern[i+1]);
    if (!a||!b) continue;

    const prog = i/(pattern.length-1);
    const alpha = 0.4 + 0.4*prog;

    ctx.strokeStyle = `rgba(0,255,180,${alpha})`;
    ctx.lineWidth = 2 - prog;
    ctx.setLineDash([5,8]);
    ctx.shadowColor = 'rgba(0,255,180,0.3)';
    ctx.shadowBlur = 6;

    ctx.beginPath();
    ctx.moveTo(a.x, a.y);
    ctx.lineTo(b.x, b.y);
    ctx.stroke();
    ctx.setLineDash([]);
  }

  ctx.restore();
}

// ---- Main render loop ----
function render() {
  ctx.clearRect(0,0,W,H);
  frameCount++;

  // Physics
  for (const node of nodes) {
    // Fade in
    node.opacity = Math.min(1, node.opacity + 0.04);

    // Rotate
    node.angle += node.angleSpeed;

    // Gentle float
    node.x += node.vx;
    node.y += node.vy;

    // Boundary repulsion
    const margin = 60;
    if (node.x < margin) node.vx += 0.08;
    if (node.x > W-margin) node.vx -= 0.08;
    if (node.y < margin) node.vy += 0.08;
    if (node.y > H-margin) node.vy -= 0.08;

    // Damping
    node.vx *= 0.98;
    node.vy *= 0.98;

    // Node-node repulsion
    for (const other of nodes) {
      if (other.id === node.id) continue;
      const dx = node.x - other.x;
      const dy = node.y - other.y;
      const d = Math.sqrt(dx*dx+dy*dy);
      const minD = node.radius + other.radius + 30;
      if (d < minD && d > 0.5) {
        const f = (minD-d)/minD * 0.15;
        node.vx += (dx/d)*f;
        node.vy += (dy/d)*f;
      }
    }

    // Mouse proximity
    const mdx = mouseX - node.x;
    const mdy = mouseY - node.y;
    const md = Math.sqrt(mdx*mdx+mdy*mdy);
    const expandZone = 100;

    if (md < expandZone) {
      node.radius = node.baseRadius + (expandZone-md)/expandZone * 12;
      if (node.state !== 'focused') node.state = md < 30 ? 'hover' : 'idle';
    } else {
      node.radius = node.baseRadius + (node.state === 'focused' ? 8 : 0);
      if (node.state !== 'focused') node.state = 'idle';
    }
  }

  // Determine hovered node
  hoveredNode = null;
  let minDist = 999;
  for (const node of nodes) {
    const d = Math.sqrt((mouseX-node.x)**2+(mouseY-node.y)**2);
    if (d < node.radius+8 && d < minDist) {
      minDist = d;
      hoveredNode = node;
    }
  }

  // Draw pattern trail
  drawPatternTrail();

  // Draw edges
  for (const edge of edges) drawEdge(edge);

  // Draw nodes
  const sorted = [...nodes].sort((a,b)=>(a.state==='focused'?1:-1)-(b.state==='focused'?1:-1));
  for (const node of sorted) drawNode(node);

  // Cursor indicator
  ctx.save();
  ctx.strokeStyle = hoveredNode ? 'rgba(0,255,180,0.6)' : 'rgba(40,100,200,0.3)';
  ctx.lineWidth = 1;
  ctx.beginPath();
  ctx.arc(mouseX, mouseY, hoveredNode ? 20 : 10, 0, Math.PI*2);
  ctx.stroke();
  if (!hoveredNode) {
    ctx.strokeStyle = 'rgba(40,100,200,0.15)';
    ctx.beginPath();
    ctx.arc(mouseX, mouseY, 40, 0, Math.PI*2);
    ctx.stroke();
  }
  ctx.restore();

  // Update HUD
  updateHUD();

  requestAnimationFrame(render);
}

function updateHUD() {
  document.getElementById('stat-nodes').textContent = nodes.length;
  document.getElementById('stat-pattern').textContent = pattern.length > 0 ? pattern.join('→') : '';
  document.getElementById('stat-focus').textContent = focusedNode ? focusedNode.id : 'NONE';

  // Pattern chain display
  const chain = document.getElementById('pattern-chain');
  const expected = pattern.length === 0 ? 0 : pattern.length * 2 - 1; // n badges + n-1 arrows
  if (chain.children.length !== expected) {
    chain.innerHTML = '';
    pattern.forEach((id, i) => {
      if (i > 0) {
        const arr = document.createElement('span');
        arr.className = 'pattern-arrow'; arr.textContent = '→';
        chain.appendChild(arr);
      }
      const badge = document.createElement('span');
      badge.className = 'pattern-node-badge';
      const n = nodes.find(x=>x.id===id);
      badge.textContent = n ? `${id} ${n.type.glyph}` : id;
      chain.appendChild(badge);
    });
  }
}

// ---- Input ----
mainCanvas.addEventListener('mousemove', e => {
  mouseX = e.clientX; mouseY = e.clientY;

  const tooltip = document.getElementById('node-tooltip');
  if (hoveredNode) {
    tooltip.style.display = 'block';
    tooltip.style.left = (e.clientX+16)+'px';
    tooltip.style.top = (e.clientY-10)+'px';
    tooltip.textContent = `${hoveredNode.id} · ${hoveredNode.type.label}`;
  } else {
    tooltip.style.display = 'none';
  }
});

mainCanvas.addEventListener('click', e => {
  if (contextMenuOpen) { closeContextMenu(); return; }

  if (hoveredNode) {
    // Add to pattern or toggle focus
    if (e.shiftKey) {
      // Shift+click = add to pattern
      if (!pattern.includes(hoveredNode.id)) {
        pattern.push(hoveredNode.id);
        hoveredNode.inPattern = true;
        hoveredNode.patternOrder = pattern.length-1;
        updateEdgeHotness();
      }
    } else {
      // Click = set focus
      if (focusedNode) {
        focusedNode.state = 'idle';
        focusedNode = null;
      }
      hoveredNode.state = 'focused';
      focusedNode = hoveredNode;

      // Auto-add to pattern
      if (!pattern.includes(hoveredNode.id)) {
        pattern.push(hoveredNode.id);
        hoveredNode.inPattern = true;
        hoveredNode.patternOrder = pattern.length-1;
        updateEdgeHotness();
      }
    }

    // Show interpret button if pattern >= 2
    document.getElementById('interpret-btn').className = pattern.length >= 2 ? 'visible' : '';
  }
});

mainCanvas.addEventListener('contextmenu', e => {
  e.preventDefault();
  contextX = e.clientX; contextY = e.clientY;
  const menu = document.getElementById('context-menu');
  menu.style.left = contextX+'px';
  menu.style.top = contextY+'px';
  menu.className = 'visible';
  contextMenuOpen = true;
});

function closeContextMenu() {
  document.getElementById('context-menu').className = '';
  contextMenuOpen = false;
}

document.addEventListener('keydown', e => {
  if (e.key === 'Escape') {
    closeContextMenu();
    if (focusedNode) { focusedNode.state = 'idle'; focusedNode = null; }
  }
  if (e.key === 'n' || e.key === 'N') spawnNodeAtMouse();
  if (e.key === 'c' || e.key === 'C') clearPattern();
});

function updateEdgeHotness() {
  // Mark edges along pattern as hot
  for (const edge of edges) {
    edge.hot = false;
    for (let i = 0; i < pattern.length-1; i++) {
      if ((edge.from===pattern[i]&&edge.to===pattern[i+1])||
          (edge.to===pattern[i]&&edge.from===pattern[i+1])) {
        edge.hot = true;
      }
    }
  }
}

function spawnNodeAtMouse(x, y) {
  const nx = x ?? mouseX, ny = y ?? mouseY; // ?? so an explicit 0 coordinate still works
  const node = createNode(nx, ny);
  nodes.push(node);

  // Connect to nearest node if within range
  let nearest = null, nearD = 999;
  for (const n of nodes) {
    if (n.id === node.id) continue;
    const d = Math.sqrt((n.x-nx)**2+(n.y-ny)**2);
    if (d < 200 && d < nearD) { nearD = d; nearest = n; }
  }
  if (nearest) edges.push({ from: nearest.id, to: node.id, hot: false });
}

function clearPattern() {
  for (const n of nodes) { n.inPattern = false; n.patternOrder = -1; if (n.state==='focused') n.state='idle'; }
  pattern.length = 0;
  focusedNode = null;
  updateEdgeHotness();
  document.getElementById('interpret-btn').className = '';
  document.getElementById('ai-panel').className = '';
}

function resetField() {
  nodes.length = 0;
  edges.length = 0;
  clearPattern();
  nodeCounter = 0;
  initField();
}

// ---- AI Integration ----
async function interpretPattern() {
  if (pattern.length < 2) return;

  const btn = document.getElementById('interpret-btn');
  btn.disabled = true;
  document.getElementById('loading-overlay').className = '';

  // Build structured prompt from pattern
  const patternNodes = pattern.map(id => {
    const n = nodes.find(x=>x.id===id);
    return n ? { id: n.id, type: n.type.label, glyph: n.type.glyph } : { id };
  });

  const structuredPrompt = `You are the AI core of AngleCore, a spatial workflow system.

The user has constructed a node traversal pattern on the spatial field:
${patternNodes.map((n,i)=>`  Step ${i+1}: Node ${n.id} [${n.type}] ${n.glyph}`).join('\n')}

Pattern sequence: ${patternNodes.map(n=>n.type).join(' → ')}

Based on this spatial interaction pattern:
1. Interpret what workflow or intent this pattern represents (2-3 sentences)
2. Name this workflow pattern (short, evocative name)
3. Suggest the next logical node type to add
4. Rate the pattern coherence 1-10

Respond in this exact JSON format:
{
  "workflow_name": "...",
  "interpretation": "...",
  "next_node": "...",
  "coherence": 8,
  "insight": "..."
}`;

  const aiPanel = document.getElementById('ai-panel');
  const responseText = document.getElementById('ai-response-text');
  const promptPreview = document.getElementById('ai-prompt-preview');

  try {
    // NOTE: direct calls to the Anthropic API normally also require
    // 'x-api-key' and 'anthropic-version' headers; this assumes the
    // hosting environment injects credentials.
    const response = await fetch('https://api.anthropic.com/v1/messages', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        model: 'claude-sonnet-4-20250514',
        max_tokens: 1000,
        messages: [{ role: 'user', content: structuredPrompt }]
      })
    });

    const data = await response.json();
    const raw = data.content.map(i=>i.text||'').join('');
    let parsed;

    try {
      const clean = raw.replace(/```json|```/g, '').trim();
      parsed = JSON.parse(clean);
    } catch {
      parsed = null;
    }

    document.getElementById('loading-overlay').className = 'hidden';
    aiPanel.className = 'visible';

    if (parsed) {
      responseText.innerHTML = `
<div style="color:rgba(0,255,180,0.9);font-family:'Syne',sans-serif;font-size:14px;font-weight:600;margin-bottom:10px;">${parsed.workflow_name}</div>
<div style="color:rgba(140,180,255,0.9);margin-bottom:14px;line-height:1.7;">${parsed.interpretation}</div>
<div style="color:rgba(100,160,255,0.6);font-size:9px;letter-spacing:2px;margin-bottom:6px;">NEXT SUGGESTED NODE</div>
<div style="color:rgba(255,180,80,0.9);margin-bottom:14px;">${parsed.next_node}</div>
<div style="color:rgba(100,160,255,0.6);font-size:9px;letter-spacing:2px;margin-bottom:6px;">INSIGHT</div>
<div style="color:rgba(180,220,255,0.7);margin-bottom:14px;font-style:italic;">${parsed.insight}</div>
<div style="display:flex;align-items:center;gap:10px;margin-top:6px;">
  <div style="color:rgba(100,160,255,0.6);font-size:9px;letter-spacing:2px;">COHERENCE</div>
  <div style="flex:1;height:3px;background:rgba(40,100,200,0.2);border-radius:2px;">
    <div style="height:100%;width:${parsed.coherence*10}%;background:rgba(0,255,180,0.7);border-radius:2px;transition:width 0.5s;"></div>
  </div>
  <div style="color:rgba(0,255,180,0.9);font-size:11px;">${parsed.coherence}/10</div>
</div>`;
    } else {
      responseText.innerHTML = `<div class="thinking">${raw}</div>`;
    }

    promptPreview.innerHTML = `<div style="color:rgba(40,100,200,0.6);font-size:8px;letter-spacing:2px;margin-bottom:6px;">STRUCTURED PROMPT</div>${patternNodes.map(n=>n.type).join(' → ')}`;

  } catch (err) {
    document.getElementById('loading-overlay').className = 'hidden';
    aiPanel.className = 'visible';
    responseText.innerHTML = `<div style="color:rgba(255,60,80,0.8);">Connection error. Check API access.<br><br><span style="color:rgba(100,160,255,0.5);font-size:9px;">${err.message}</span></div>`;
  }

  btn.disabled = false;
}

// ---- Buttons & Context ----
document.getElementById('spawn-btn').addEventListener('click', () => spawnNodeAtMouse(cx + (Math.random()-0.5)*200, cy + (Math.random()-0.5)*200));
document.getElementById('interpret-btn').addEventListener('click', interpretPattern);
document.getElementById('clear-btn').addEventListener('click', clearPattern);
document.getElementById('ai-panel-close').addEventListener('click', () => {
  document.getElementById('ai-panel').className = '';
});

document.getElementById('ctx-add-node').addEventListener('click', () => {
  spawnNodeAtMouse(contextX, contextY);
  closeContextMenu();
});
document.getElementById('ctx-add-cluster').addEventListener('click', () => {
  spawnCluster(contextX, contextY);
  closeContextMenu();
});
document.getElementById('ctx-clear-pattern').addEventListener('click', () => { clearPattern(); closeContextMenu(); });
document.getElementById('ctx-reset-field').addEventListener('click', () => { resetField(); closeContextMenu(); });

// ---- Utility ----
function hexToRgba(hex, alpha) {
  const r = parseInt(hex.slice(1,3),16);
  const g = parseInt(hex.slice(3,5),16);
  const b = parseInt(hex.slice(5,7),16);
  return `rgba(${r},${g},${b},${alpha})`;
}

// ---- Init ----
function initField() {
  // Spawn initial nodes in spiral arrangement
  const count = 7;
  for (let i = 0; i < count; i++) {
    const angle = (i/count)*Math.PI*2;
    const r = 140 + i*15;
    const x = cx + Math.cos(angle)*r;
    const y = cy + Math.sin(angle)*r;
    const n = createNode(x, y, i % NODE_TYPES.length);
    nodes.push(n);
  }

  // Initial edges forming a partial network
  edges.push({ from: nodes[0].id, to: nodes[1].id, hot: false });
  edges.push({ from: nodes[1].id, to: nodes[2].id, hot: false });
  edges.push({ from: nodes[2].id, to: nodes[3].id, hot: false });
  edges.push({ from: nodes[3].id, to: nodes[4].id, hot: false });
  edges.push({ from: nodes[4].id, to: nodes[5].id, hot: false });
  edges.push({ from: nodes[5].id, to: nodes[6].id, hot: false });
  edges.push({ from: nodes[0].id, to: nodes[3].id, hot: false });
  edges.push({ from: nodes[1].id, to: nodes[5].id, hot: false });
}

window.addEventListener('resize', () => { resize(); });
resize();
initField();
render();
</script>
</body>
</html>
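The edge arrowhead in `drawEdge` samples the quadratic curve at t = 0.75 and points from that sample toward the endpoint. The expression it inlines is the standard quadratic Bézier evaluation; here is a standalone sketch of the same math (function and variable names are illustrative, not part of the app):

```javascript
// Evaluate a quadratic Bézier curve at parameter t in [0, 1].
// (ax, ay) = start point, (mx, my) = control point, (bx, by) = end point.
function quadBezierPoint(ax, ay, mx, my, bx, by, t) {
  const u = 1 - t;
  return {
    x: u * u * ax + 2 * u * t * mx + t * t * bx,
    y: u * u * ay + 2 * u * t * my + t * t * by,
  };
}

// t = 0 gives the start point, t = 1 the end point;
// the control point (50, 50) pulls the curve's midpoint upward.
const start = quadBezierPoint(0, 0, 50, 50, 100, 0, 0);   // { x: 0, y: 0 }
const end   = quadBezierPoint(0, 0, 50, 50, 100, 0, 1);   // { x: 100, y: 0 }
const mid   = quadBezierPoint(0, 0, 50, 50, 100, 0, 0.5); // { x: 50, y: 25 }
```

This is why the arrowhead uses `atan2(ey-qy, ex-qx)`: the vector from the t = 0.75 sample to the endpoint approximates the curve's tangent direction at its end.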

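`interpretPattern()` asks the model for strict JSON but defends against replies wrapped in markdown code fences. Here is the same strip-and-parse fallback as a standalone sketch (`parseModelJson` is an illustrative name; the fence strings are built with `repeat()` only to avoid literal backtick runs in this snippet):

```javascript
// Strip markdown code fences and attempt to parse the model's reply as JSON.
// Alternation order matters: the longer "```json" pattern must come first.
const FENCE = '`'.repeat(3);
const fenceRe = new RegExp(FENCE + 'json|' + FENCE, 'g');

function parseModelJson(raw) {
  const clean = raw.replace(fenceRe, '').trim();
  try {
    return JSON.parse(clean);
  } catch {
    return null; // caller falls back to rendering the raw text
  }
}

const fenced = FENCE + 'json\n{"workflow_name":"Relay","coherence":8}\n' + FENCE;
parseModelJson(fenced);      // parses to { workflow_name: 'Relay', coherence: 8 }
parseModelJson('not json');  // null → the UI shows the raw reply instead
```

Returning `null` instead of throwing keeps the UI path simple: one branch renders the structured panel, the other dumps the raw text.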
