Starholder API

SSE Streaming

Real-time structured media streaming from the world — the foundation for building entertainment products, live experiences, and reactive applications.

The Broadcast Layer

The SSE stream is not just a debugging tool or a convenience for real-time UIs. It is Starholder's media broadcast layer — the live, structured output of the world's reasoning, delivered as it happens.

Every time the persona thinks, the stream carries the full cognitive output: prose arriving word by word, entity references as they are touched, resolved imagery for each, atmospheric signals from the settings involved, and a final structured snapshot that captures everything the turn produced. This is the raw material from which external applications build their products.

A podcast agent listens to the stream, captures the prose and epistemic signals, and produces episodes framed around what the world just thought. A documentary tool consumes text, media bundles, and entity refs to assemble narrated segments with archival imagery and entity identification cards. An interactive experience reads the refs and topology signals to build navigable exploration interfaces. A live dashboard monitors gap coordinates and seed activity to surface real-time creative opportunities.

None of these products need to parse natural language or reconstruct structure from chat output. The stream delivers typed, machine-readable events that separate content from structure. The prose is there for human consumption. The refs, media, topology, and epistemic signals are there for programmatic consumption. Your application decides which parts it needs.


Connecting

GET /api/v1/world/{worldId}/stream
Authorization: Bearer <api_key>

Requirements:

  • canQuery capability
  • world:read scope

curl -N -H "Authorization: Bearer $API_KEY" \
  "https://www.starholder.xyz/api/v1/world/starholder_main/stream"

The connection stays open. Events arrive as the world operates. When idle, you receive ping keepalives every 30 seconds.
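A minimal consumer can be sketched in Python. The parser below handles the standard SSE wire format (`event:`, `data:`, and `id:` fields, with a blank line terminating each frame); the Starholder payload shapes shown in the test data are illustrative, not confirmed field layouts.

```python
def parse_sse_stream(lines):
    """Parse raw SSE lines into (event, data, last_id) tuples.

    Implements the standard SSE framing: 'event:', 'data:', and 'id:'
    fields, with a blank line terminating each event frame. The 'id'
    value persists across frames, matching EventSource semantics.
    """
    event, data_parts, last_id = "message", [], None
    for line in lines:
        line = line.rstrip("\n")
        if line == "":  # blank line terminates one event frame
            if data_parts:
                yield event, "\n".join(data_parts), last_id
            event, data_parts = "message", []
        elif line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_parts.append(line[len("data:"):].strip())
        elif line.startswith("id:"):
            last_id = line[len("id:"):].strip()
```

Feed it lines from any streaming HTTP client (for example, `iter_lines()` on a `requests` response opened with `stream=True`), then `json.loads` each data payload.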


What Arrives on the Stream

During Generation

As the persona reasons, you receive events in real time:

Event            | What It Carries                                        | What You Can Build With It
text_delta       | Incremental prose chunks                               | Live text rendering, real-time transcription, TTS feed
ref_accepted     | Entity/setting references the persona is touching      | Live entity tracking, knowledge graph visualization, topic monitoring
ref_media_bundle | Resolved imagery for each referenced entity or setting | Live image gallery, visual narrative, media deck

These events arrive interleaved — text flows continuously while refs and media flush in batches between generation bursts. Your application can render text immediately while queuing metadata for the next animation frame.
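That pattern can be sketched as a small handler that renders prose immediately and queues refs and media for a per-frame flush. The payload field name `text`, and the treatment of ref and bundle payloads as opaque dicts, are assumptions, not documented shapes.

```python
class TurnRenderer:
    """Illustrative consumer: render text now, queue metadata for later."""

    def __init__(self):
        self.text = []
        self.pending_refs = []
        self.pending_media = []

    def handle(self, event, payload):
        if event == "text_delta":
            self.text.append(payload.get("text", ""))  # render immediately
        elif event == "ref_accepted":
            self.pending_refs.append(payload)          # queue for next frame
        elif event == "ref_media_bundle":
            self.pending_media.append(payload)

    def flush_frame(self):
        """Drain queued metadata, e.g. once per animation frame."""
        refs, media = self.pending_refs, self.pending_media
        self.pending_refs, self.pending_media = [], []
        return refs, media
```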

At Finalization

When the persona finishes reasoning:

Event               | What It Carries                                                          | What You Can Build With It
text_replacement    | The finalized annotated text with inline entity tags                     | Linked, navigable prose with entity hyperlinks
ref_anomaly         | Refs the persona tried to use but were rejected                          | Quality monitoring, hallucination detection
media_placement_map | Per-mention image placements with sentence context                       | Inline illustrations positioned against specific text passages
atmosphere_context  | Full context layer: atmosphere, music, scene profile, canonical setting  | Immersive environment rendering, ambient soundscape, mood-driven UI
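One way to use media_placement_map is to interleave images into the finalized prose at the sentences they annotate. The entry fields below (`sentence_index`, `image_url`) are hypothetical names chosen for illustration; consult the actual payload for the real shape.

```python
def interleave_illustrations(sentences, placements):
    """Insert illustrations after the sentences they annotate.

    'sentence_index' and 'image_url' are illustrative field names for
    a media_placement_map entry; the real payload may differ.
    """
    by_index = {}
    for p in placements:
        by_index.setdefault(p["sentence_index"], []).append(p["image_url"])
    out = []
    for i, sentence in enumerate(sentences):
        out.append(("text", sentence))
        for url in by_index.get(i, []):  # images placed after this sentence
            out.append(("image", url))
    return out
```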

At Commit

Event                   | What It Carries                                                  | What You Can Build With It
thoughtpacket_committed | Epistemic data: claims, uncertainty, support score               | Confidence framing, fact-checking display, research annotation
contradiction_detected  | Conflicting claims with severity classification                  | "Competing accounts" segments, investigative narrative
surface_complete        | The full Emission Surface: all 8 layers, authoritative snapshot  | Complete turn archive, offline processing, consistency checkpoint

Lifecycle Events

Event   | Description
ping    | Keepalive every 30 seconds
timeout | Sent after 300 seconds of idle; the stream closes
error   | Structured error with code and message
done    | Stream end signal

Building on the Stream

Live Narrative Products

The simplest use: concatenate text_delta events into a growing text display. As refs arrive, highlight mentioned entities in the text. As media bundles arrive, display imagery alongside the entities they belong to. When atmosphere_context arrives, adjust the visual environment — lighting, color palette, ambient sound.

This is how the Starholder hemisphere works. Your product can do the same thing with completely different presentation.

Structured Media Packages

Wait for surface_complete and consume the full Emission Surface as a single structured document. Extract the layers you need:

  • Podcast: text.clean for narration script + epistemic.claims for segment structure + epistemic.contradictions for "competing accounts" segments
  • Documentary: text.clean for voiceover + media.bundles for archival imagery + refs.accepted for lower-third entity cards + context.atmosphere for environmental framing
  • Research report: epistemic.claims with confidence scores + topology.sparsity for territory assessment + refs.catalog for full evidence base
  • Interactive exploration: refs.accepted for entity graph nodes + media.placements for per-mention imagery + topology.anchors for navigation starting points
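A sketch of the podcast case, assuming the surface arrives as a nested dict whose keys follow the layer paths named above (text.clean, epistemic.claims, epistemic.contradictions); the surrounding dict shape is an assumption.

```python
def podcast_package(surface):
    """Assemble a podcast-style package from a surface_complete payload.

    Uses the layer paths named in the docs; the exact nesting of the
    surface dict is assumed for illustration.
    """
    return {
        "script": surface["text"]["clean"],
        "segments": surface["epistemic"]["claims"],
        "competing_accounts": surface["epistemic"].get("contradictions", []),
    }
```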

Reactive Monitoring

Hold the stream open and watch for patterns:

  • Track topology.sparsity across turns to identify when the persona enters thin territory
  • Monitor ref_accepted events to build a live entity frequency map
  • Watch contradiction_detected events to flag emerging narrative tensions
  • Use atmosphere_context to drive ambient media in physical installations
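The second pattern above reduces to a frequency counter. This sketch assumes each ref_accepted payload carries an identifier under an `id` key, which is an illustrative field name.

```python
from collections import Counter

def track_entities(events):
    """Build a live entity frequency map from ref_accepted events.

    'id' is an assumed payload field name, not confirmed by the docs.
    """
    counts = Counter()
    for event, payload in events:
        if event == "ref_accepted":
            counts[payload["id"]] += 1
    return counts
```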

Resuming After Disconnection

Supply the Last-Event-ID header or ?lastSeq= query parameter with the last received sequence number:

curl -N -H "Authorization: Bearer $API_KEY" \
  -H "Last-Event-ID: 42" \
  "https://www.starholder.xyz/api/v1/world/starholder_main/stream"

The server replays events from its buffer since that sequence number (up to 200 events). If the turn completed while you were disconnected, fetch the persisted snapshot instead:

curl "https://www.starholder.xyz/api/v1/world/starholder_main/turns/{turnId}/surface" \
  -H "Authorization: Bearer $API_KEY"
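A reconnect loop only needs to remember the last sequence number it saw and resend it on the next connection. The helper below builds the headers; a surrounding loop would pair it with any streaming HTTP client. The Accept value is the standard SSE media type.

```python
def resume_headers(api_key, last_seq=None):
    """Build connection headers; include Last-Event-ID when resuming."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Accept": "text/event-stream",
    }
    if last_seq is not None:
        # Resume from the server's replay buffer (up to 200 events)
        headers["Last-Event-ID"] = str(last_seq)
    return headers
```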

Stream Lifecycle

  1. Connect with Bearer auth
  2. Receive ping every 30s while idle
  3. When a turn executes, receive text_delta and ref_* events as the persona reasons
  4. surface_complete signals the turn is done — all 8 layers are finalized
  5. If idle for 300s, receive timeout and the stream closes
  6. Reconnect with Last-Event-ID to resume

See the Emission Surface Guide for a complete field-by-field reference of what surface_complete contains.