Thursday, November 13, 2025

Self-Completing String

Self-Completing String — Grammar Edition (pressure + tail loop, fixed)

Rotating Subject + Predicate formulas or flat random tokens. Halts via Stopword, Loop, or Entropy.
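A minimal sketch of the three halt checks in TypeScript; the stopword list, tail length, and entropy threshold are illustrative placeholders, not the widget's actual settings:

// Halt when the last token is a stopword, when a recent tail state repeats (loop),
// or when token entropy collapses. All constants below are illustrative.
const STOPWORDS = new Set(["end", "stop", "done"]);          // hypothetical stop list

function shannonEntropy(tokens: string[]): number {
  const counts = new Map<string, number>();
  for (const t of tokens) counts.set(t, (counts.get(t) ?? 0) + 1);
  let h = 0;
  for (const c of counts.values()) {
    const p = c / tokens.length;
    h -= p * Math.log2(p);
  }
  return h;
}

function shouldHalt(tokens: string[], seenTails: Set<string>): boolean {
  if (STOPWORDS.has(tokens[tokens.length - 1])) return true;   // Stopword halt
  const tail = tokens.slice(-4).join(" ");
  if (seenTails.has(tail)) return true;                        // Loop halt: tail state repeats
  seenTails.add(tail);
  return tokens.length > 20 && shannonEntropy(tokens) < 1.0;   // Entropy halt
}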
Word banks (big lexicon by part of speech)
Comma or newline separated. Paste thousands.
Formulas (rotating Subject + Predicate)

Wednesday, November 12, 2025

Autocatalytic Grammar

Autocatalytic Grammars - Self-Evolving Text Engine (Viewport-Fit)

Rules generate new rules during derivation. The grammar evolves while you generate text.

Tip: Generate runs a full derivation from the start symbol and lets the grammar evolve as rules fire.
Format: Nonterminal -> production | production. Terminals in quotes, nonterminals bare.
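A rough sketch of the idea in TypeScript, assuming rules in the format above; the 10% spawn chance and the reversed-copy mutation are invented for illustration, not the engine's actual mutation rules:

type Grammar = Map<string, string[][]>;                  // nonterminal -> alternative productions

// Parse one line of the format: Nonterminal -> production | production
function parseRule(line: string): [string, string[][]] {
  const [lhs, rhs] = line.split("->").map(s => s.trim());
  return [lhs, rhs.split("|").map(p => p.trim().split(/\s+/))];
}

// Expand every symbol once; firing a rule can also spawn a brand-new rule.
function step(grammar: Grammar, symbols: string[], rng: () => number): string[] {
  return symbols.flatMap(sym => {
    const prods = grammar.get(sym);
    if (!prods) return [sym.replace(/^"|"$/g, "")];      // quoted terminal: emit as-is
    const prod = prods[Math.floor(rng() * prods.length)];
    if (rng() < 0.1) {                                   // illustrative spawn chance
      const fresh = "X" + grammar.size;                  // new nonterminal name
      grammar.set(fresh, [[...prod].reverse()]);         // toy mutation: reversed copy
      prods.push([fresh]);                               // the grammar rewrites itself
    }
    return prod;
  });
}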
Advanced Settings
Starter Grammar (Editable)
Lexicon
Runtime Stats
Tracks lexicon size, rule count, firings, rules spawned, and mutations.

Tuesday, November 11, 2025

Nonsense Poem Generator

Nonsense Poem Generator (CC + V + CC)

Settings — edit clusters & vowels (optional)
Comma/space separated lists. With phonotactics-lite on, “ng” is coda-only, onsets avoid vowel-final clusters, and codas avoid vowel-initial clusters.

Monday, November 10, 2025

Word Generator

CC-V-CC One-Word Generator + Speak

Consonant-Cluster Word Generator (CC + V + CC)

Pattern: Onset + Vowel + Coda. Click the word to copy • Press Space to generate • Use Speak to hear it.
Settings — edit clusters & vowels (optional)
Comma/space separated. You can add digraphs like ai, ea, oo. “ng” is coda-only when phonotactics-lite is on; a small generation sketch follows the lists below.
Onset clusters (beginnings)
Vowels (nucleus)
Coda clusters (endings)
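A small sketch of the Onset + Vowel + Coda assembly with the phonotactics-lite checks described above; the cluster lists are stand-ins for whatever you paste into the settings:

const ONSETS = ["bl", "br", "gr", "pl", "st", "tr", "sk"];      // stand-in lists
const VOWELS = ["a", "e", "i", "o", "u", "ai", "ea", "oo"];
const CODAS  = ["ck", "ft", "mp", "nd", "ng", "rt", "st"];

const isVowel = (ch: string) => "aeiou".includes(ch);

// Phonotactics-lite: "ng" is coda-only, onsets must not end in a vowel,
// codas must not start with one.
const okOnset = (c: string) => c !== "ng" && !isVowel(c[c.length - 1]);
const okCoda  = (c: string) => !isVowel(c[0]);

const pick = <T>(xs: T[]): T => xs[Math.floor(Math.random() * xs.length)];

function makeWord(): string {
  return pick(ONSETS.filter(okOnset)) + pick(VOWELS) + pick(CODAS.filter(okCoda));
}

console.log(makeWord());   // e.g. "grand" or "stong"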

Sunday, November 9, 2025

Syntax Mandala

Syntax Mandala — Compact Edition

Reflexive Grammar Visualizer — draws its own parse tree.
Use | for alternatives.
Each | is random; add a seed for reproducible runs.
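A sketch of the seeded alternative choice in TypeScript, assuming rules written as Symbol -> alt | alt; the mulberry32 PRNG and the depth cap are illustrative choices, not necessarily what the page uses:

// Small seeded PRNG (mulberry32): the same seed reproduces the same tree.
function mulberry32(seed: number): () => number {
  return () => {
    let t = (seed += 0x6d2b79f5);
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

interface TreeNode { symbol: string; children: TreeNode[] }

// Expand a symbol; each | alternative is picked at random from the seeded stream.
function expand(rules: Record<string, string>, symbol: string, rng: () => number, depth = 0): TreeNode {
  const rhs = rules[symbol];
  if (!rhs || depth > 8) return { symbol, children: [] };        // terminal or depth cap
  const alts = rhs.split("|").map(a => a.trim().split(/\s+/));
  const chosen = alts[Math.floor(rng() * alts.length)];
  return { symbol, children: chosen.map(s => expand(rules, s, rng, depth + 1)) };
}

const tree = expand({ S: "NP VP | VP", NP: "the noun", VP: "verb NP | verb" }, "S", mulberry32(42));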

Saturday, November 8, 2025

Neural Grammar Distiller

Neural Grammar Distiller

Neural Grammar Distiller - Recovered Scripture

Hard-coded Pantheon corpus -> tiny recurrent model -> PMI chunking -> PCFG -> scripture.
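The PMI-chunking stage is the most self-contained piece of that pipeline; a sketch in TypeScript (the threshold is a placeholder, and the recurrent-model and PCFG stages are omitted):

// Pointwise mutual information for adjacent token pairs:
// PMI(x, y) = log2( p(x, y) / (p(x) * p(y)) )
function bigramPMI(tokens: string[]): Map<string, number> {
  const uni = new Map<string, number>();
  const bi = new Map<string, number>();
  for (let i = 0; i < tokens.length; i++) {
    uni.set(tokens[i], (uni.get(tokens[i]) ?? 0) + 1);
    if (i + 1 < tokens.length) {
      const key = tokens[i] + " " + tokens[i + 1];
      bi.set(key, (bi.get(key) ?? 0) + 1);
    }
  }
  const n = tokens.length;
  const pmi = new Map<string, number>();
  for (const [pair, count] of bi) {
    const [x, y] = pair.split(" ");
    const pXY = count / (n - 1);
    pmi.set(pair, Math.log2(pXY / ((uni.get(x)! / n) * (uni.get(y)! / n))));
  }
  return pmi;
}

// Merge adjacent tokens whose PMI clears a threshold into single chunks.
function chunk(tokens: string[], pmi: Map<string, number>, threshold = 2): string[] {
  const out: string[] = [];
  for (let i = 0; i < tokens.length; i++) {
    if (i + 1 < tokens.length && (pmi.get(tokens[i] + " " + tokens[i + 1]) ?? -Infinity) > threshold) {
      out.push(tokens[i] + "_" + tokens[i + 1]);   // high-PMI pair becomes one chunk
      i++;                                         // consume both tokens
    } else {
      out.push(tokens[i]);
    }
  }
  return out;
}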
Options (tuck away)
Recovered Rules (PCFG)
Recovered Scripture (S -> terminals)
Debug / Logs

Friday, November 7, 2025

Context Sensitive Grammar

Entanglement Engine - Context Sensitive Grammar

Top-down · Blogger-safe · single file
Options (steps & speed)
Tip: Context sensitivity comes from multi-token LHS (e.g., A B -> A x B). Use a preset and press Run.
About: Entanglement Engine

A compact emulator for Type-1, context-sensitive rewriting. This Simple Mode hides advanced controls so the poem is easy to run on Blogger.

Deterministic run: applies the leftmost applicable rule once per step. “Run” repeats until Max Steps or no rule applies.

Scripting Language — Quick Guide

1) Tokens

  • Input/output are space-separated tokens: e.g., A B x. Upper/lowercase have no special behavior.
  • The engine starts from the preset’s start string (shown in each example).

2) Rule Syntax

LHS -> RHS
  • One rule per line. The arrow is a literal ->.
  • Alternatives: separate multiple RHS options with | (each option is a separate rule).
  • Epsilon (empty): use a blank RHS, or one of the aliases epsilon, eps, or ε, to delete the LHS.
  • Comments: anything after ; or // on a line is ignored.
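
A sketch of a parser for this rule format in TypeScript; the Rule shape is illustrative, but the arrow, |, comment markers, and epsilon aliases follow the list above:

interface Rule { lhs: string[]; rhs: string[] }

// One "LHS -> RHS" per line; "|" separates alternatives; ";" or "//" starts a comment;
// a blank RHS (or epsilon / eps / ε) means "delete the LHS".
function parseRules(script: string): Rule[] {
  const rules: Rule[] = [];
  for (const raw of script.split("\n")) {
    const line = raw.split(";")[0].split("//")[0].trim();   // strip comments
    if (!line.includes("->")) continue;
    const [lhsText, rhsText] = line.split("->").map(s => s.trim());
    const lhs = lhsText.split(/\s+/);
    for (const option of (rhsText ?? "").split("|")) {
      const body = option.trim();
      const empty = body === "" || ["epsilon", "eps", "ε"].includes(body.toLowerCase());
      rules.push({ lhs, rhs: empty ? [] : body.split(/\s+/) });
    }
  }
  return rules;
}

const rules = parseRules("S -> A S A | B S B | A A | B B\nC B ->    ; removes C B");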

3) Context Sensitivity

The LHS may be multiple tokens. A rule matches only when the entire LHS sequence occurs, so rewrites can depend on neighbors.

A B -> A x B
A x B -> A x x B

“Insert x between A and B” only triggers when A is immediately followed by B.
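Starting from A B, a Run gives A B, then A x B, then A x x B, and then halts: neither LHS occurs in A x x B, since the two x tokens now separate A from B.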

4) Execution Model

  • Step: apply one rewrite at the leftmost match; then print the string.
  • Run: repeat Step until Max Steps or no rules apply.
  • Deterministic: if multiple rules match at the same position, the earlier rule (higher in the list) fires.
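
Putting sections 2 to 4 together, a sketch of the deterministic engine in TypeScript (leftmost match first, earlier rule wins at a tie, stop at max steps); the Rule shape matches the parser sketch above:

interface Rule { lhs: string[]; rhs: string[] }

// Leftmost match: scan positions left to right; at each position try rules in
// listed order, so the earlier rule wins when several match.
function findMatch(tokens: string[], rules: Rule[]): { pos: number; rule: Rule } | null {
  for (let pos = 0; pos < tokens.length; pos++) {
    for (const rule of rules) {
      if (rule.lhs.every((sym, k) => tokens[pos + k] === sym)) return { pos, rule };
    }
  }
  return null;
}

// Step = one rewrite at the leftmost match; Run = repeat until no match or maxSteps.
function run(start: string[], rules: Rule[], maxSteps = 50): string[][] {
  const trace = [start];
  let tokens = start;
  for (let i = 0; i < maxSteps; i++) {
    const m = findMatch(tokens, rules);
    if (!m) break;                     // no rule applies
    tokens = [...tokens.slice(0, m.pos), ...m.rule.rhs, ...tokens.slice(m.pos + m.rule.lhs.length)];
    trace.push(tokens);
  }
  return trace;
}

run(["A", "B"], [
  { lhs: ["A", "B"], rhs: ["A", "x", "B"] },
  { lhs: ["A", "x", "B"], rhs: ["A", "x", "x", "B"] },
]).forEach(t => console.log(t.join(" ")));               // A B, A x B, A x x B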

5) Mini Cheat-Sheet

# basic form
A -> x y z

# multi-token LHS (context required)
A B -> A x B

# alternatives
S -> A S A | B S B | A A | B B

# epsilon (delete)
C B ->    ; removes the sequence "C B"

# comments
A B -> A x B   ; insert x between A and B

6) Tips

  • For strict Type-1 experiments, avoid contracting rules (don’t make RHS shorter than LHS).
  • Put more specific rules earlier if they compete with general ones.
  • Use short tokens for structural symbols (e.g., A, B, x), and words for terminal text if you like (AND, light, debt).
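
For the first tip, a quick static check that a rule set is non-contracting, using the same Rule shape as in the sketches above:

interface Rule { lhs: string[]; rhs: string[] }

// Non-contracting (Type-1 friendly): every RHS must be at least as long as its LHS,
// so epsilon deletions and other shrinking rules get flagged.
const contractingRules = (rules: Rule[]): Rule[] =>
  rules.filter(r => r.rhs.length < r.lhs.length);

contractingRules([
  { lhs: ["A", "B"], rhs: ["A", "x", "B"] },   // ok: grows
  { lhs: ["C", "B"], rhs: [] },                // contracting: flagged
]);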