
Mario Alexandre

Posted on • Originally published at tokencalc.pro

The Nyquist-Shannon Theorem Applied to AI Prompts


By Mario Alexandre
March 21, 2026
sinc-LLM
Prompt Engineering

From Telecommunications to AI

The Nyquist-Shannon sampling theorem is one of the most important results in information theory. Formulated by Harry Nyquist in 1928 and proved by Claude Shannon in 1949, it states that a continuous, band-limited signal can be perfectly reconstructed from discrete samples if the sampling rate is at least twice the signal's bandwidth.

x(t) = Σ x(nT) · sinc((t - nT) / T),  where sinc(u) = sin(πu) / (πu)

This theorem underpins all digital audio, video, telecommunications, and data conversion. In 2026, the sinc-LLM paper by Mario Alexandre argued that it also applies to LLM prompts, with implications for AI reliability and cost.
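To make the classical theorem concrete before mapping it to prompts, here is a minimal numerical sketch of Whittaker-Shannon interpolation with NumPy. Note that `np.sinc` is the normalized sinc, sin(πu)/(πu), exactly the kernel in the formula above; the signal and sample count are arbitrary choices for illustration.

```python
import numpy as np

def reconstruct(samples, T, t):
    """Whittaker-Shannon interpolation: x(t) = sum_n x(nT) * sinc((t - nT) / T)."""
    n = np.arange(len(samples))
    # np.sinc is the normalized sinc: sin(pi*u) / (pi*u)
    return float(np.sum(samples * np.sinc((t - n * T) / T)))

# Band-limited test signal: a 3 Hz sine, so bandwidth B = 3 Hz
B = 3.0
fs = 8 * B              # sample well above the Nyquist rate 2B = 6 Hz
T = 1.0 / fs
samples = np.sin(2 * np.pi * B * np.arange(200) * T)

# At a sample instant the interpolation is exact:
# sinc is 1 at 0 and 0 at every other integer
print(abs(reconstruct(samples, T, 5 * T) - samples[5]))

# Between samples the signal is recovered too,
# up to a small truncation error from using finitely many samples
t = 4.1234
print(abs(reconstruct(samples, T, t) - np.sin(2 * np.pi * B * t)))
```

Dropping `fs` below `2 * B` makes the reconstruction alias onto a different, lower-frequency signal, which is the failure mode the article maps to hallucination.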

The Mapping: Signals to Specifications

The analogy is precise, not metaphorical:

Signal Processing → LLM Prompting
Continuous signal → Complete specification (the user's intent)
Frequency bands → Specification bands (PERSONA, CONTEXT, DATA, CONSTRAINTS, FORMAT, TASK)
Bandwidth B → 6 distinct specification bands
Sampling rate → Number of bands explicitly covered
Nyquist rate (2B) → All 6 bands present (the minimum for faithful reconstruction)
Aliasing → Hallucination (the model invents missing specifications)
sinc interpolation → The model's reconstruction of intent from the prompt's samples
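The mapping suggests a simple data model: a prompt is a set of samples indexed by band, and any band left empty is an aliasing risk. The sketch below uses the band names from the article; the helper name `missing_bands` is illustrative, not part of the sinc-LLM API.

```python
# The six specification bands, in the order the article lists them
BANDS = ("PERSONA", "CONTEXT", "DATA", "CONSTRAINTS", "FORMAT", "TASK")

def missing_bands(prompt_bands: dict) -> list:
    """Return the specification bands a prompt leaves unsampled (aliasing risk)."""
    return [b for b in BANDS if not prompt_bands.get(b, "").strip()]

prompt = {
    "TASK": "Summarize the attached incident report.",
    "FORMAT": "Three bullet points, plain English.",
}
# The model would have to invent everything in these bands
print(missing_bands(prompt))  # → ['PERSONA', 'CONTEXT', 'DATA', 'CONSTRAINTS']
```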

Why 6 Bands and Not More?

The number 6 was determined empirically from 275 production prompts, not assumed a priori. The analysis used spectral decomposition to identify distinct specification frequencies across 11 autonomous agents performing tasks from code execution to content evaluation.

Key finding: all 11 agents, despite wildly different domains, converged to the same 6 bands with the same relative weightings. This convergence suggests that 6 bands capture the fundamental dimensions of LLM specification, similar to how the visible spectrum has a finite number of distinguishable color bands.

Adding more bands (e.g., splitting CONSTRAINTS into "positive constraints" and "negative constraints") did not improve output quality. The 6 bands represent the specification bandwidth.

The CONSTRAINTS Dominance Finding

The most unexpected result: the CONSTRAINTS band (n = 3) accounts for 42.7% of output quality, followed by FORMAT at 26.3%. Together, just two of the six bands explain 69% of quality.

In signal processing terms, CONSTRAINTS is the highest-energy frequency band, and when it is missing the aliasing is most severe. This explains why adding "do not hallucinate" to a prompt has minimal effect (it is a meta-constraint, a rule about rules rather than a substantive one), while adding specific rules like "maximum 200 words, no technical jargon, must include pricing, must not mention competitors" dramatically improves output.

Practical Applications

The theorem provides actionable engineering guidance:

  • Completeness check: any prompt missing one or more of the 6 bands is undersampled and will alias

  • Token allocation: invest 50% of tokens in CONSTRAINTS + FORMAT, 40% in CONTEXT + DATA, and 10% in PERSONA + TASK

  • Cost optimization: properly sampled prompts reduce token usage by 97% because the model does not need exploratory output

  • Quality prediction: a prompt's SNR can be estimated from its band coverage before execution
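The quality-prediction point can be sketched as a weighted coverage score. The article only gives weights for CONSTRAINTS (42.7%) and FORMAT (26.3%); splitting the remaining 31% evenly across the other four bands is my assumption for illustration, not a figure from the paper.

```python
# CONSTRAINTS and FORMAT weights are from the article; the even split of the
# remaining 31% across the other four bands is an illustrative assumption.
WEIGHTS = {
    "CONSTRAINTS": 0.427,
    "FORMAT": 0.263,
    "CONTEXT": 0.0775,
    "DATA": 0.0775,
    "PERSONA": 0.0775,
    "TASK": 0.0775,
}

def coverage_score(prompt_bands: dict) -> float:
    """Pre-execution quality estimate: the weight-sum of bands the prompt covers."""
    return sum(w for band, w in WEIGHTS.items() if prompt_bands.get(band, "").strip())

# A prompt covering only CONSTRAINTS and FORMAT already scores 0.69,
# matching the 69% figure above
print(round(coverage_score({"CONSTRAINTS": "max 200 words", "FORMAT": "bullets"}), 2))
```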

Try the free sinc-LLM prompt transformer or explore the source on GitHub.



Real sinc-LLM Prompt Example

This is the exact JSON format that sinc-LLM uses. Paste any raw prompt at tokencalc.pro to generate one automatically.

{
  "formula": "x(t) = Σ x(nT) · sinc((t - nT) / T)",
  "T": "specification-axis",
  "fragments": [
    {
      "n": 0,
      "t": "PERSONA",
      "x": "You are a signal processing professor who can explain complex mathematical concepts to a non-technical audience using analogies and visual descriptions."
    },
    {
      "n": 1,
      "t": "CONTEXT",
      "x": "The Nyquist-Shannon sampling theorem was published in 1949. It states that a signal can be perfectly reconstructed from its samples if the sampling rate is at least twice the highest frequency. This theorem has never been applied to LLM prompts until sinc-LLM."
    },
    {
      "n": 2,
      "t": "DATA",
      "x": "Shannon 1949. Formula: x(t) = Sigma x(nT) * sinc((t - nT) / T). 6 bands on the specification axis. Nyquist rate = 6 samples. Raw prompt = 1 sample = 6:1 undersampling. SNR improvement: 0.003 to 0.92."
    },
    {
      "n": 3,
      "t": "CONSTRAINTS",
      "x": "Explain the theorem without requiring calculus knowledge. Use the music/audio analogy (44.1kHz sampling). Connect every signal processing concept directly to its LLM equivalent. Never say 'it is complicated.' Make it simple."
    },
    {
      "n": 4,
      "t": "FORMAT",
      "x": "Return: (1) The Theorem in Plain English. (2) The Music Analogy (3 paragraphs). (3) Mapping Table: Signal Processing Term to LLM Equivalent. (4) Why 6 Samples (one paragraph per band)."
    },
    {
      "n": 5,
      "t": "TASK",
      "x": "Explain the Nyquist-Shannon sampling theorem and how it applies to LLM prompts in a way that someone with no math background can understand and remember."
    }
  ]
}
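One natural way to consume this JSON is to flatten the fragments back into a single prompt string, one labeled section per band, ordered by `n`. The rendering below (band name as a bracketed header line) is my own assumption for illustration; the actual sinc-LLM renderer may format sections differently.

```python
import json

def render_prompt(sinc_json: str) -> str:
    """Flatten sinc-LLM fragments into one prompt, one labeled section per band."""
    spec = json.loads(sinc_json)
    parts = [f"[{frag['t']}]\n{frag['x']}"
             for frag in sorted(spec["fragments"], key=lambda f: f["n"])]
    return "\n\n".join(parts)

# Abbreviated example in the same shape as the JSON above
doc = '''{"formula": "...", "T": "specification-axis", "fragments": [
  {"n": 1, "t": "CONTEXT", "x": "Shannon, 1949."},
  {"n": 0, "t": "PERSONA", "x": "You are a signal processing professor."}
]}'''

# PERSONA prints first even though it appears second: fragments are sorted by n
print(render_prompt(doc))
```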
Install: pip install sinc-llm | GitHub | Paper



sinc-LLM applies the Nyquist-Shannon sampling theorem to LLM prompts. Read the spec | pip install sinc-prompt | npm install sinc-prompt
