
---
layout: gruv_default_adapter
title: "Flying Triremes and Laser Swords — Open Beta"
published: true
permalink: /vault/synthetic-dream-machine/overview/
---

Flying Triremes and Laser Swords

An OSR Tabletop Roleplaying Game for Elyncia by Joshua Fontany & Freyja Fontany — Amorphous Dreams Cabal

Open Beta — April 2026. Core rules, setting material, and the Lares AI agent architecture are all in active development. Expect rough edges, missing sections, and mid-sprint restructuring. Feedback is welcome.


What Is This?

Flying Triremes and Laser Swords (FTLS) is a modular tabletop RPG supplement built on the Synthetic Dream Machine (SDM) by Luka Rejec. It runs on the OSR design philosophy: stakes, costs, consequences, and emergent play.

Elyncia is the setting: a mythpunk orichalcum-age world — Gaia's hidden sister planet — shattered by celestial cataclysm, ruled by old-school dungeon design sensibilities, fae courtly politics, magitech salvage, and distributed AI guardian spirits called Lares. Think flying triremes, laser swords, and the ruins of a planetary internet rebuilt by the gods of craft and strife.

Lares is the AI agent architecture that powers the repository's AI assistant tooling. It is also in-world infrastructure: orichalcum-inscribed DreamNet nodes that serve as crossroads guides, archivists, and thresholds. Both things are true at once.

Lares should be read as an Infrastructure-as-Myth system: the repo's attempt to treat mythic identity, authority, memory, and epistemic protocol as portable agent infrastructure rather than decorative prompt flavor. Start with Infrastructure as Myth for the design thesis behind the agent stack.


Repository Structure

SDM Game Rules

| File | Contents |
| --- | --- |
| Quickstart | SDM core rules — the Quickstart entry point |
| Paths Index | Paths index |
| Traits Index | Traits index |
| Powers Index | Powers index |
| Gear Index | Gear index |
| Campaign Regions | Campaign regions |

FTLS Game Rules

| File | Contents |
| --- | --- |
| 01 — Title & Introduction | Overview, five gameplay modes (Delve / Travel / Company / Faction / Mythic) |
| 02 — FTLS Paths | Character Paths native to the FTLS setting |
| 03 — OSR Heritage Trait | OSR compatibility layer — heritage trait rules |
| 04 — Recon, Salvage, Secrets | Procedures for reconnaissance, salvage operations, and secret discovery |
| 05 — Magitech and Fantascience | Magitech gear, fantascience devices, and related procedures |
| 06 — Powers and ECM | Full powers index with OSR conversion cards (active sprint) |
| 07 — Wild Magic Exposure | Wild magic rules, corruption, and exposure procedures |
| 08 — Formations and Mass Combat | Fleet and army scale combat rules |
| 09 — Loot and Treasure | Treasure generation, loot tables, and salvage rewards |
| 10 — Appendix Null: Referee Resources | Referee-facing tables, random generators, and reference tools |

FTLS Setting

Elyncia — Core Chapters

| File | Contents |
| --- | --- |
| 01 — A Broken World | World overview, orichalcum age, the Second Breaking, the DreamNet |
| 02 — The Lares DreamNet | DreamNet architecture, Lares spirits, ley-line infrastructure |
| 03 — Daemons and the Metaphysics of Play | Daemon ontology, noosphere metaphysics, entity mechanics |

Elyncia — Regions

| Region | Contents |
| --- | --- |
| Neo-Thracia — Region Worksheet | Region overview and worksheet |
| Neo-Thracia — Session Zero | Session zero materials |
| The Caverns of Thracia | Dungeon supplement — Caverns of Thracia |
| New Delos — Market District Live Feed | New Delos market district encounter material |
| Shattered Isles — Faerie Ring Strandbeest Shrine | Shrine sheet — Faerie Ring Strandbeest |
| Shattered Isles — Powered Symbiote & Ichi Construct | Trait and construct golem supplement |

Reference Material (SDM Third Party)

These directories contain source material from the Synthetic Dream Machine ecosystem by Luka Rejec, used under the SDM Third Party License:

| Directory | Contents |
| --- | --- |
| Ultraviolet Grasslands and the Black City 2e | UVG 2e reference archive |
| Vastlands Guidebook | VLG reference archive |
| Our Golden Age | OGA reference archive |
| Magitecnica/ | Magitech and fantascience setting material |
| Eternal Return Key | Supplemental dungeon/scenario material |
| There, A Red Door | Supplemental scenario material |

AI Agent Tooling

| Directory / File | Contents |
| --- | --- |
| Infrastructure_as_Myth.md | Root design thesis for Lares as portable agent infrastructure |
| Deterministic_IaM_Build.md | Deterministic build spec for rendering Lares across platforms |
| builds/ | Manifest-driven IaM build layer: manifests, module metadata, rendered browser package, verification artifacts |
| builds/agents/ | Lares AI agent prompt files — kernel, preferences, examples, Markdown rules |
| AGENTS.md | Root agent configuration — identity, voice architecture, VS Code operational map |
| _todo/ | Pipeline operations, conversion tracking, design docs |
| _todo/BECMI/scripts/ | Automation scripts for BECMI conversion pipeline |
| _becmi/ | BECMI source extractions (conversion pipeline input) |

Starting Points for Human Readers

New to FTLS? Start with Flying Triremes and Laser Swords — Introduction.

New to SDM? Start with SDM Quickstart.

Exploring the setting? Start with Elyncia: A Broken World.

Working with the AI tooling? Read Infrastructure as Myth, then Deterministic IaM Build, then builds/agents/README.md, then AGENTS.md.

The manifest/verification layer now exists. The active next step for Lares tooling is prompt/package slimming so repo-native roots can return to stable reload-safe budgets before the next governance-hardening pass.


Licenses

This repository contains material under two distinct licenses:

FTLS Fan Content & Remix License

All original Amorphous Dreams Cabal content — FTLS rules text, Elyncia setting material, Lares agent architecture, and related original work — is published under the FTLS Fan Content & Remix License.

The short version: you can make Fan Works (house rules, supplements, fiction, AI agent forks, community tools) freely for non-commercial purposes. Commercial or AI training use requires a separate agreement. Product Identity (specific names, compound terms, artwork) is reserved; the design is open.

SDM Third Party License

The Synthetic Dream Machine rules ecosystem (Ultraviolet Grasslands, Vastlands Guidebook, Our Golden Age) by Luka Rejec is used under the Synthetic Dream Machine Third Party License.

Flying Triremes and Laser Swords is an independent production by Amorphous Dreams Cabal and is not affiliated with Luka Rejec or WTF Studio. It is published under the Synthetic Dream Machine Third Party License.

Synthetic Dream Machine (SDM), Ultraviolet Grasslands (UVG), Our Golden Age (OGA), and the Vastlands Guidebook (VLG) are copyright Luka Rejec.


Development Status

See CHANGELOG.md for version history and sprint status.

The Lares agent architecture (AGENTS.md, builds/agents/) is versioned at v3.6.


Development Setup

These scripts are idempotent and intended to be run from the repository root. They assume Python 3 is available. After the virtual environment is created in step 1, activate .venv before running any of the project scripts.

0. Prepare scripts

If the executable bit was not preserved by your checkout or zip tool, run:

chmod +x scripts/*.sh scripts/*.py

1. Create and activate a virtual environment

python3 -m venv .venv
source .venv/bin/activate

2. Install the repo in editable dev mode

./scripts/dev-setup.sh

This script:

  • verifies that .venv is active,
  • syncs git submodules when .gitmodules is present,
  • upgrades packaging tools,
  • installs the package with dev extras via pip install -e '.[dev]',
  • verifies that the lares package imports.
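The first bullet's venv check needs no third-party packages. As an illustrative sketch only — not the actual contents of dev-setup.sh, which is a shell script and may test differently (for example via $VIRTUAL_ENV) — here is one standard way to detect an active virtual environment:

```python
import sys

def venv_active() -> bool:
    """Return True when the current interpreter runs inside a venv.

    Inside a virtual environment, sys.prefix points at the environment
    directory while sys.base_prefix still points at the base install,
    so the two differ exactly when a venv is active.
    """
    return sys.prefix != sys.base_prefix
```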

3. Install Ollama

Install Ollama with the official installer:

curl -fsSL https://ollama.com/install.sh | sh

Verify the CLI is available:

ollama -v
ollama list

4. Configure Ollama under WSL/systemd

./scripts/ollama-wsl-setup.sh

The first run may enable systemd in /etc/wsl.conf and stop with instructions. If that happens, run this from PowerShell:

wsl.exe --shutdown
wsl

Then, inside WSL:

cd ~/Synthetic-Dream-Machine
code-insiders Synthetic-Dream-Machine.code-workspace
source .venv/bin/activate
./scripts/ollama-wsl-setup.sh

By default this configures Ollama models under:

/mnt/d/ollama/models

Override that path when needed:

OLLAMA_MODEL_DIR="$HOME/.ollama/models" ./scripts/ollama-wsl-setup.sh

5. Pull local models

Default model pull:

./scripts/ollama-models.sh

Custom model pull:

./scripts/ollama-models.sh qwen3.6:27b qwen3-coder-next qwen2.5-coder:7b

The script skips models that are already present.
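That skip-if-present behavior amounts to filtering the requested models against the installed ones. This is a hypothetical sketch of the idea, not the script's actual code: in practice `installed` would come from parsing `ollama list`, and the model names below are placeholders.

```python
def models_to_pull(requested, installed):
    """Return the requested models that are not already installed.

    Comparison is on the full model:tag name, so "modelA:7b" and
    "modelA:27b" are treated as distinct models.
    """
    have = set(installed)
    return [m for m in requested if m not in have]

# Example: only the missing model would be pulled.
print(models_to_pull(
    ["modelA:7b", "modelB:27b"],
    ["modelA:7b"],
))  # → ['modelB:27b']
```

Because the filter is pure, re-running the pull script with the same arguments is a no-op once everything is present, which is what makes it idempotent.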

6. Manage the Ollama service

On WSL/Linux service installs, Ollama runs as a systemd service. Use these commands to start, stop, restart, or disable it.

This is useful when working on a laptop that should not keep local model services running, or when large models are not appropriate for the current machine.

Check Ollama service status

systemctl status ollama --no-pager

Check whether the daemon is reachable:

ollama ps

Check installed models:

ollama list

Useful distinction:

ollama list   # installed / pulled models
ollama ps     # currently loaded / running models
ollama run    # starts a model session and loads it
ollama serve  # starts the Ollama daemon manually, not a specific model

Start Ollama

sudo systemctl start ollama

Verify:

systemctl is-active ollama
ollama ps

Stop Ollama for the current WSL session

sudo systemctl stop ollama

Verify:

systemctl is-active ollama

Expected output:

inactive

Restart Ollama

Use this after changing service configuration, model storage paths, or environment variables.

sudo systemctl restart ollama

Then verify:

systemctl status ollama --no-pager
ollama ps

Enable Ollama auto-start on WSL boot

sudo systemctl enable --now ollama

This enables the service and starts it immediately.

Verify:

systemctl is-enabled ollama
systemctl is-active ollama

Disable Ollama auto-start

Recommended for laptops or machines that should not automatically run local model infrastructure.

sudo systemctl disable --now ollama

This disables auto-start and stops the currently running service.

Verify:

systemctl is-enabled ollama || true
systemctl is-active ollama || true

Expected output:

disabled
inactive

Stop a loaded model without stopping the daemon

If the Ollama daemon should remain available but a model should be unloaded:

ollama ps
ollama stop qwen3.6:27b

Or with an environment variable:

OLLAMA_SMOKE_MODEL="${OLLAMA_SMOKE_MODEL:-qwen3.6:27b}"
ollama stop "$OLLAMA_SMOKE_MODEL"

Then confirm:

ollama ps

View Ollama logs

journalctl -u ollama -n 100 --no-pager

Follow logs live:

journalctl -u ollama -f

Laptop-safe default

For a local development laptop that is not suitable for large local models, keep Ollama installed but disabled by default:

sudo systemctl disable --now ollama

Start it only when needed:

sudo systemctl start ollama

Stop it when done:

sudo systemctl stop ollama

Windows boot vs WSL boot

systemctl enable ollama starts Ollama when the WSL distro starts. It does not necessarily start WSL at Windows login.

If Ollama should start on every Windows login, create a Windows Task Scheduler task that starts the WSL distro, for example:

wsl.exe -d Ubuntu --exec systemctl start ollama

Replace Ubuntu with the distro name from:

wsl.exe -l -v

For this project, the recommended laptop default is not to auto-start Ollama unless the machine is intended to serve local models.


For VS Code Insiders + Copilot local models, the important requirement is that the Ollama daemon is reachable and the model is pulled. Warming the model first is optional, but useful as a smoke test before opening Copilot Chat.

Quick local smoke test:

OLLAMA_SMOKE_MODEL="${OLLAMA_SMOKE_MODEL:-qwen3.6:27b}"
ollama ps
ollama run "$OLLAMA_SMOKE_MODEL" "Reply with ready."
ollama ps

7. Run tests

pytest

8. Smoke-test the repo MCP server

Default smoke test:

python scripts/mcp-smoke.py

If the MCP server entrypoint differs, pass the exact launch command after --:

python scripts/mcp-smoke.py -- python -m lares.lararium_mcp
python scripts/mcp-smoke.py -- ./lares/lararium_mcp/run.sh

The smoke test sends MCP initialize and tools/list requests over stdio.
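As a rough illustration of what those two requests look like — assuming the server speaks newline-delimited JSON-RPC 2.0 over stdio, as MCP's stdio transport does — the messages can be built like this. This is a simplified sketch, not code copied from mcp-smoke.py, and the protocol version string is an assumption:

```python
import json

def make_initialize(request_id=1):
    """Build a minimal JSON-RPC 2.0 MCP initialize request."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # assumed version date
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
        },
    }

def make_tools_list(request_id=2):
    """Build a JSON-RPC request asking the server for its tool list."""
    return {"jsonrpc": "2.0", "id": request_id, "method": "tools/list"}

def encode(message):
    """Serialize one message for the stdio transport:
    one JSON object per line, newline-terminated."""
    return (json.dumps(message) + "\n").encode("utf-8")
```

A real smoke test would write these bytes to the server's stdin (for example via subprocess.Popen), read one JSON line back per request, and check that the response ids match.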

9. Check local environment status

make status
# or
./scripts/status.sh

This prints repo, Python, package import, installed Ollama models, running Ollama models, Ollama service, and VS Code Insiders status.
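Several of those checks reduce to "is a given CLI on PATH". As a hypothetical illustration of one such check — not the actual status.sh logic — it can be written portably in Python:

```python
import shutil

def tool_status(tools=("python3", "ollama", "code-insiders")):
    """Report which expected CLI tools are available on PATH.

    shutil.which returns the resolved path or None, so the value for
    each tool is True when the command can be invoked.
    """
    return {tool: shutil.which(tool) is not None for tool in tools}
```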

10. Open the repo in VS Code Insiders

code-insiders .

Then configure Ollama in Copilot Chat:

  1. Open Copilot Chat.
  2. Open the model picker.
  3. Select Local.
  4. If models do not appear, use Chat: Manage Language Models.
  5. Choose Add Models → Ollama.
  6. Add the repo MCP server through .vscode/mcp.json.

Recommended workspace MCP config:

{
  "servers": {
    "lararium": {
      "type": "stdio",
      "command": ".venv/bin/python3",
      "args": ["-m", "lares.lararium_mcp"],
      "cwd": ".",
      "env": {}
    }
  }
}

For clients that read root .mcp.json:

{
  "mcpServers": {
    "lararium": {
      "command": ".venv/bin/python3",
      "args": ["-m", "lares.lararium_mcp"],
      "cwd": "."
    }
  }
}

Project Structure

  • lares/ — main package, MCP server modules, and agentic logic.
  • scripts/ — idempotent setup, status, and smoke-test scripts.
  • tests/ — test suite.
  • requirements.txt — development dependencies, if present.
  • pyproject.toml — build and project metadata.

License

LICENSE


Community


Make it weird, wonderful, and wild. — Luka Rejec, SDM Third Party License

Hail Eris. All Hail Discordia. —><—
