
Nature Forgets by Addition: A Fibonacci-Based Model for Hierarchical Memory Compression and Weighted Fractal Forgetting

Abstract

Digital information—logs, emails, conversation histories, and version control records—accumulates without limit. Conventional systems rely on deletion, but biological memory suggests a different principle: information is not erased but gradually compressed and relegated to deeper layers. This paper introduces Fibonacci Forgetting, a memory model inspired by natural growth patterns and human forgetting. The model uses Fibonacci capacities to trigger overflow-driven bundling and downward abstraction using only addition. We present a minimal implementation, discuss applications, and extend the model into a weighted fractal architecture suitable for Git history management, where newer items naturally carry greater importance. The extended model demonstrates how forgetting can be implemented as a natural, self-similar, and importance-preserving process.


1. Introduction

Human memory reorganizes itself during sleep, assigning weights to experiences and pushing less relevant details into deeper layers. This stands in contrast to digital systems, which typically rely on deletion or fixed-size caches. This paper proposes a biologically inspired alternative: memory compression through addition, grounded in the Fibonacci sequence—a universal structure underlying natural growth, fractals, and efficient packing.


2. Natural Foundations: The Fibonacci Mechanism

The Fibonacci recurrence

F(n+1) = F(n) + F(n-1)

produces structures found throughout nature, including phyllotaxis, spirals, and neural patterns. Because biological systems grow by incremental addition, the Fibonacci sequence emerges as a fundamental organizing principle. We extend this principle to forgetting: compression through additive bundling, not erasure.
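The additive recurrence can be sketched in a few lines; this toy generator (the function name is ours, for illustration) builds the sequence purely by addition, mirroring incremental biological growth:

```python
def fib_sequence(n: int) -> list:
    """Return the first n Fibonacci numbers, built purely by addition."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])  # F(n+1) = F(n) + F(n-1)
    return seq[:n]

print(fib_sequence(8))  # [1, 1, 2, 3, 5, 8, 13, 21]
```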


3. The Fibonacci Forgetting Model

3.1 Layered Architecture

Information is stored in hierarchical layers:

  • Layer 0 — individual items
  • Layer 1 — small bundles
  • Layer 2 — larger bundles
  • Layer 3+ — conceptual aggregates

Each layer has a capacity defined by the Fibonacci sequence:

1, 1, 2, 3, 5, 8, 13, …

3.2 Forgetting by Addition

When a layer exceeds its capacity:

  1. Items are bundled by addition (e.g., counted).
  2. The bundle is pushed to the next layer.
  3. Cascading continues if the lower layer also overflows.

This yields:

  • upper layers — fine-grained, recent
  • lower layers — coarse, older

We call this mechanism Fibonacci Forgetting.
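The three steps above can be traced with a self-contained toy sketch using just three layers with Fibonacci capacities 1, 1, 2 (a deliberately minimal stand-in for the fuller implementation in Section 4):

```python
FIB = [1, 1, 2]              # Fibonacci capacities per layer
layers = [[] for _ in FIB]

def add(item):
    """Add an item to layer 0, then cascade overflows downward by addition."""
    layers[0].append(item)
    level = 0
    while level < len(layers) - 1 and len(layers[level]) > FIB[level]:
        bundle = len(layers[level])     # bundle by addition: items -> count
        layers[level] = []
        layers[level + 1].append(bundle)
        level += 1

for i in range(5):
    add(f"item_{i}")
print(layers)  # [['item_4'], [], [2]]
```

Note how the newest item survives intact at layer 0 while older items have collapsed into a coarse count at the deepest layer: fine-grained and recent on top, coarse and old below.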


4. Minimal Implementation

from dataclasses import dataclass

FIB = [1, 1, 2, 3, 5, 8, 13, 21]

@dataclass
class FibonacciForgettingMemory:
    layers: list

    @classmethod
    def new(cls, depth=6):
        return cls(layers=[[] for _ in range(depth)])

    def add(self, item):
        self.layers[0].append(item)
        self._cascade(0)

    def _cascade(self, level):
        # The deepest layer absorbs everything; no further compression.
        if level >= len(self.layers) - 1:
            return

        limit = FIB[level]

        if len(self.layers[level]) > limit:
            # Bundle by addition: the overflowing items collapse to a count,
            # which is pushed one layer down; cascade again if that overflows.
            bundle = len(self.layers[level])
            self.layers[level] = []
            self.layers[level + 1].append(bundle)
            self._cascade(level + 1)

    def state(self):
        return self.layers

5. Applications

  • email management
  • log compression
  • cache systems
  • LLM long-term memory
  • Git history management

The entire mechanism uses addition only, making it simple and robust.


6. Weighted Fractal Fibonacci Memory

6.1 Overview

To make Fibonacci Forgetting practical for Git, we introduce a weighted fractal extension.

This model assigns Fibonacci-based weights to items (e.g., commits), bundles them while preserving total weight, and forms a self-similar fractal hierarchy.


6.2 Specification

  • Each item (e.g., a commit) is assigned a weight.
  • Weights are automatically selected from the Fibonacci sequence, with newer items receiving larger Fibonacci values.
  • During bundling (cascade), the sum of weights is preserved and treated as the “overall importance” of the bundle.
  • Upper layers contain heavier, fine-grained items; lower layers contain coarser, lighter bundles.
  • This produces a natural hierarchy where recent changes remain prominent.

6.3 Full Code

from typing import Any, List, Dict, Optional

class FractalFibMemoryWithFibWeights:
    """
    Fractal Fibonacci Forgetting with Fibonacci-based weights.
    - Newer items receive larger Fibonacci weights.
    - During bundling, the total weight is preserved and propagated downward.
    - Ideal for Git commit management: recent changes remain heavy, older ones become coarse.
    """

    # Fibonacci weights (reversed so larger weights are assigned first)
    FIB_WEIGHTS = [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377][::-1]

    def __init__(self, level: int = 0, max_depth: int = 10):
        self.level = level
        self.max_depth = max_depth
        self.items: List[Dict] = []  # {'content': Any, 'weight': int}
        self.children: List['FractalFibMemoryWithFibWeights'] = []

        # Layer capacity defined by Fibonacci sequence
        fib_caps = [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
        self.capacity = fib_caps[min(level, len(fib_caps)-1)]

    def add(self, content: Any):
        """Add a new item with an automatically assigned Fibonacci weight."""
        weight_index = min(len(self.items), len(self.FIB_WEIGHTS) - 1)
        weight = self.FIB_WEIGHTS[weight_index]

        self.items.append({
            'content': content,
            'weight': weight
        })

        if len(self.items) > self.capacity:
            self._cascade()

    def _cascade(self):
        if not self.items:
            return

        # Compute total weight of the bundle
        total_weight = sum(item['weight'] for item in self.items)

        bundle = {
            'level': self.level,
            'count': len(self.items),
            'items': self.items[:],
            'total_weight': total_weight,
            'type': 'bundle'
        }
        self.items.clear()

        # Deep-layer handling
        if self.level >= self.max_depth - 2:
            self.children.append(bundle)
            return

        # Reuse the most recent child node so bundles accumulate and can
        # cascade further; create a new one only if none exists yet.
        if not self.children or isinstance(self.children[-1], dict):
            child = FractalFibMemoryWithFibWeights(self.level + 1, self.max_depth)
            self.children.append(child)

        # Pass the bundle downward
        self.children[-1].add(bundle)

    def state(self, depth: int = 0) -> str:
        """Visualize the fractal structure with weights."""
        indent = "  " * depth

        s = f"{indent}Level {self.level} (cap={self.capacity}): {len(self.items)} items\n"
        for item in self.items:
            s += f"{indent}  - {item['content']} (weight={item['weight']})\n"

        for child in self.children:
            if isinstance(child, dict):
                s += f"{indent}  └─ Bundle L{child['level']} (count={child['count']}, total_weight={child['total_weight']})\n"
            else:
                s += child.state(depth + 1)
        return s

    def get_weighted_recent(self, n: int = 10) -> List[Dict]:
        """Retrieve the most recent items sorted by weight (importance)."""
        collected = []

        def collect(node):
            collected.extend(node.items)
            for c in node.children:
                if isinstance(c, dict):
                    collected.append({
                        'content': f"[Bundle {c['count']} items]",
                        'weight': c['total_weight']
                    })
                else:
                    collect(c)

        collect(self)
        collected.sort(key=lambda x: x['weight'], reverse=True)
        return collected[:n]


# Demo
mem = FractalFibMemoryWithFibWeights()

# Add commits (newer commits receive larger weights)
for i in range(80):
    msg = f"commit_{i:03d} - {'fix' if i % 3 == 0 else 'refactor' if i % 5 == 0 else 'feat'}"
    mem.add(msg)

print("=== Fractal Structure (Weighted) ===")
print(mem.state())

print("\n=== Top 10 Most Important Recent Items ===")
recent = mem.get_weighted_recent(10)
for item in recent:
    print(f"- {item['content']} (weight={item['weight']})")

6.4 Summary of Characteristics

  • Weight assignment: newer items receive larger Fibonacci values (ranging from 377 down to 1).
  • Bundle weight propagation: total weight is preserved and passed downward, so deep-layer bundles represent “clusters of past important changes.”
  • Natural forgetting: upper layers store fine-grained items; lower layers store coarse aggregates.
  • Practical benefit: in Git logs, major recent refactors remain prominent.
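The weight-propagation property in the list above can be sketched in isolation: bundling collapses detail, but the bundle's total weight is the exact sum of its members, so importance is conserved across layers. (The commit messages and weight values below are hypothetical.)

```python
commits = [
    {'content': 'feat: parser', 'weight': 34},
    {'content': 'fix: overflow', 'weight': 21},
    {'content': 'refactor: io', 'weight': 13},
]

def bundle(items):
    """Collapse items into one aggregate whose weight is the exact sum."""
    return {'count': len(items),
            'total_weight': sum(i['weight'] for i in items)}

b = bundle(commits)
print(b['total_weight'])       # 68 == 34 + 21 + 13

# Bundling a bundle with a fresh commit still conserves total importance.
outer = bundle([{'weight': b['total_weight']},
                {'content': 'feat: cli', 'weight': 55}])
print(outer['total_weight'])   # 123 == 68 + 55
```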

7. Discussion

Fibonacci Forgetting provides a biologically inspired alternative to deletion-based memory management. The weighted fractal extension shows that forgetting can preserve importance while reducing detail.


8. Conclusion

This paper presented Fibonacci Forgetting and its weighted fractal extension, demonstrating how natural principles can guide memory compression in digital systems.


License

All code in this paper is released under the MIT License.

Copyright (c) 2024 moroc.b13

