◈ CAPTAIN'S LOG GENERATOR

Markov chain · trigram (order-2) · trained on 123 TNG logs · vanilla JS

How It Works

A Markov chain models text as a probabilistic sequence: given the last N words, what word comes next? Order-2 (trigram) means each token is predicted from the previous two. Training data: 123 Star Trek: TNG captain's log entries scraped from transcripts. Total corpus: ~5,600 words.
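Building such a chain is just a sliding window over the corpus. A minimal sketch in vanilla JS; `buildChain` and its shape are illustrative, not the page's actual source:

```javascript
// Build an order-2 (trigram) Markov chain: map each two-word state
// to the list of words that followed it in the corpus.
function buildChain(text, order = 2) {
  const words = text.trim().split(/\s+/);
  const chain = new Map();
  for (let i = 0; i + order < words.length; i++) {
    const state = words.slice(i, i + order).join(' ');
    if (!chain.has(state)) chain.set(state, []);
    // Duplicates in the list encode the probability distribution:
    // a successor seen twice is twice as likely to be sampled.
    chain.get(state).push(words[i + order]);
  }
  return chain;
}
```

Keeping repeated successors in a plain array (rather than storing counts) makes weighted sampling a one-liner later: pick a random array element.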

The chain is built entirely in your browser; there is no server round-trip. The training file (logs.txt, 33 KB) is fetched once and cached. Generation runs in microseconds, and each click samples a fresh log entry from the learned probability distribution.
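Generation is then a random walk over that distribution. A sketch, assuming the `Map`-of-arrays chain shape above; `generate` and its parameters are illustrative:

```javascript
// Walk the chain: start from a random seen state, repeatedly sample a
// successor, and slide the two-word window forward until maxWords or a
// dead end (a state that never appeared mid-corpus).
function generate(chain, maxWords = 80) {
  const states = [...chain.keys()];
  let state = states[Math.floor(Math.random() * states.length)];
  const out = state.split(' ');
  while (out.length < maxWords) {
    const nexts = chain.get(state);
    if (!nexts) break; // dead end: no recorded successor
    out.push(nexts[Math.floor(Math.random() * nexts.length)]);
    state = out.slice(-2).join(' '); // slide the order-2 window
  }
  return out.join(' ');
}
```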

Higher order → more coherent output but less variety. Lower order → wilder results that drift off topic sooner. Order-2 (trigram) is the sweet spot for a corpus this size. Source code is in the page's <script> block; training data at logs.txt. Full write-up: the original post.
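The trade-off is measurable: with a lower order, each state has more distinct successors on average, so the walk branches more. A toy illustration (the `branching` helper is hypothetical, not part of the page's code):

```javascript
// Average number of distinct successors per state, for a given order.
// Lower order -> wider states -> wilder, less coherent walks.
function branching(text, order) {
  const words = text.trim().split(/\s+/);
  const chain = new Map();
  for (let i = 0; i + order < words.length; i++) {
    const state = words.slice(i, i + order).join(' ');
    if (!chain.has(state)) chain.set(state, []);
    chain.get(state).push(words[i + order]);
  }
  let total = 0;
  for (const nexts of chain.values()) total += new Set(nexts).size;
  return total / chain.size;
}
```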
