# Governed Chat Command Language — Implementation Plan
For agentic workers: REQUIRED: Use superpowers:subagent-driven-development (if subagents available) or superpowers:executing-plans to implement this plan. Steps use checkbox (`- [ ]`) syntax for tracking.
Goal: Replace the hardcoded constraint UI with a slash command language, pinned constraint bar, and inline system messages — backed by a unified ConstraintStore and migrated API contract.
Architecture: Three-layer change: (1) governor engine gains Include/Coverage constraints and public API fixes, (2) dashboard backend migrates from profile-based to inline-constraint generation with rekeyed cache, (3) dashboard frontend replaces ConstraintPanel with command parser + constraint bar + system messages, all driven by a single ConstraintStore.
Tech Stack: Python (diffusion_governors, FastAPI, Pydantic), TypeScript/React (Zustand, Tailwind CSS), pytest, Vitest
Spec: docs/superpowers/specs/2026-03-16-governed-chat-command-language-design.md
## File Map
### New files

| Path | Responsibility |
|---|---|
| `packages/governors/src/diffusion_governors/include.py` | `IncludeConstraint` + `CoverageConstraint` classes |
| `packages/governors/tests/test_include.py` | Tests for Include/Coverage constraints |
| `packages/dashboard/frontend/src/store/constraintStore.ts` | Zustand store — single source of truth for active constraints |
| `packages/dashboard/frontend/src/commands/parser.ts` | Command parser — `/` prefix detection, verb dispatch, argument parsing |
| `packages/dashboard/frontend/src/commands/registry.ts` | Command registry — verb definitions, syntax, defaults, help text |
| `packages/dashboard/frontend/src/commands/normalize.ts` | Phoneme normalization — ASCII → IPA mapping for command input |
| `packages/dashboard/frontend/src/commands/compiler.ts` | Compile per-entry store → merged constraint array for API |
| `packages/dashboard/frontend/src/commands/__tests__/parser.test.ts` | Parser unit tests |
| `packages/dashboard/frontend/src/commands/__tests__/compiler.test.ts` | Compiler unit tests |
| `packages/dashboard/frontend/src/commands/__tests__/normalize.test.ts` | Normalization unit tests |
| `packages/dashboard/frontend/src/components/ConstraintBar/index.tsx` | Pinned constraint bar with dismissible chips |
| `packages/dashboard/frontend/src/components/ConstraintBar/ConstraintChip.tsx` | Individual chip component with color coding + dismiss |
| `packages/dashboard/frontend/src/components/ChatWindow/SystemMessage.tsx` | Inline system message component (✓/✗/⚠ prefixed) |
### Modified files

| Path | Change |
|---|---|
| `packages/governors/src/diffusion_governors/__init__.py` | Export `MinPairBoost`, `MaxOppositionBoost`, `MSHStage`, `IncludeConstraint`, `CoverageConstraint` |
| `packages/governors/src/diffusion_governors/constraints.py` | Deprecation note on `Density` |
| `packages/dashboard/server/schemas.py` | Add `IncludeConstraint`, `CoverageConstraint`, `MSHConstraint`, `MinPairBoostConstraint`, `MaxOppositionBoostConstraint` models; add `mechanism` to `BoundConstraint`; update `GenerateRequest`/`GenerateSingleRequest` to accept constraints instead of `profile_id`; update `Turn` to record constraints |
| `packages/dashboard/server/governor.py` | Rekey `GovernorCache` by constraint hash; wire new constraint types in `_to_dg_constraint()`; pass step/total_steps in `HFGovernorProcessor` |
| `packages/dashboard/server/routes/generate.py` | Accept inline constraints instead of `profile_id`; record constraint snapshot on `Turn` |
| `packages/dashboard/server/main.py` | Instantiate `ProfileStore` with persistence path |
| `packages/dashboard/server/sessions.py` | Update `append_turn()` for new `Turn` shape |
| `packages/dashboard/frontend/src/types.ts` | Add new constraint types, `StoreEntry` type, `SystemMessage` type; update `Turn` |
| `packages/dashboard/frontend/src/api/client.ts` | Update `generate()`/`generateSingle()` to send inline constraints |
| `packages/dashboard/frontend/src/App.tsx` | Replace ConstraintPanel with ConstraintBar; wire ConstraintStore |
| `packages/dashboard/frontend/src/hooks/useSession.ts` | Send constraints from store instead of `profile_id` |
| `packages/dashboard/frontend/src/hooks/useProfile.ts` | Load profile → `store.load()`; save → `store.snapshot()` |
| `packages/dashboard/frontend/src/components/ChatWindow/index.tsx` | Render system messages in message list |
| `packages/dashboard/frontend/src/components/ChatWindow/ChatInput.tsx` | Detect `/` prefix, route to command parser |
| `packages/dashboard/frontend/src/components/ChatWindow/MessageList.tsx` | Interleave `SystemMessage` components |
| `packages/dashboard/frontend/src/components/CompliancePanel/ActiveConstraints.tsx` | Read from ConstraintStore instead of profile |
### Deleted files (migration cleanup)

| Path | Reason |
|---|---|
| `packages/dashboard/frontend/src/components/ConstraintPanel/index.tsx` | Replaced by ConstraintBar + command input |
| `packages/dashboard/frontend/src/components/ConstraintPanel/PhonemeToggles.tsx` | Replaced by `/exclude` command |
| `packages/dashboard/frontend/src/components/ConstraintPanel/NormSliders.tsx` | Replaced by norm commands |
| `packages/dashboard/frontend/src/components/ConstraintPanel/ComplexityBounds.tsx` | Replaced by `/complexity` command |
| `packages/dashboard/frontend/src/components/ConstraintPanel/ProfileActions.tsx` | Folded into profile selector |
## Chunk 1: Governor Engine — Include/Coverage + Prerequisite Fixes
### Task 1: Export missing constraints from `__init__.py`
Files:
- Modify: packages/governors/src/diffusion_governors/__init__.py
- [ ] Step 1: Read current exports
Read packages/governors/src/diffusion_governors/__init__.py to see what's currently exported.
- [ ] Step 2: Add missing exports
Add `MinPairBoost`, `MaxOppositionBoost`, `MSHStage` to the imports and `__all__` list:

```python
from .constraints import (
    Exclude,
    ExcludeInClusters,
    Bound,
    NormCovered,
    Complexity,
    MSHStage,
    Density,
    MinPairBoost,
    MaxOppositionBoost,
    VocabOnly,
    STOP_WORDS,
    ESSENTIAL_ENGLISH,
)
```
- [ ] Step 3: Verify imports work
Run: cd packages/governors && python -c "from diffusion_governors import MinPairBoost, MaxOppositionBoost, MSHStage; print('OK')"
Expected: OK
- [ ] Step 4: Commit
git add packages/governors/src/diffusion_governors/__init__.py
git commit -m "fix: export MinPairBoost, MaxOppositionBoost, MSHStage from diffusion_governors"
### Task 2: Implement IncludeConstraint
Files:
- Create: packages/governors/src/diffusion_governors/include.py
- Create: packages/governors/tests/test_include.py
- Modify: packages/governors/src/diffusion_governors/__init__.py
- [ ] Step 1: Write failing test for IncludeConstraint
```python
# packages/governors/tests/test_include.py
import torch
import pytest

from diffusion_governors import Governor, GovernorContext
from diffusion_governors.include import IncludeConstraint
from diffusion_governors.lookups import TokenFeatures, PhonoFeatures


def _make_lookup(token_phonemes: dict[int, list[str]]) -> dict[str, dict]:
    """Build minimal lookup: token_id -> TokenFeatures with phonemes."""
    lookup = {}
    for tid, phonemes in token_phonemes.items():
        tf = TokenFeatures(
            word=f"tok{tid}",
            phono=PhonoFeatures(
                phonemes=phonemes,
                ipa="".join(phonemes),
                syllable_count=1,
                syllable_shapes=["CVC"],
                wcm=0,
                cluster_phonemes=[],
                syllables=[],
                biphone_avg=0.0,
                pos_seg_avg=0.0,
            ),
            norms={},
            vocab_memberships=set(),
        )
        lookup[str(tid)] = tf.to_dict()
    return lookup


class TestIncludeConstraint:
    """IncludeConstraint boosts tokens containing target phonemes."""

    def test_builds_logit_boost(self):
        """Include compiles to a LogitBoost mechanism."""
        lookup = _make_lookup({
            0: ["k", "æ", "t"],  # "cat" — has /k/
            1: ["d", "ɔ", "ɡ"],  # "dog" — no /k/
            2: ["k", "ɪ", "k"],  # "kick" — has /k/
        })
        constraint = IncludeConstraint(phonemes={"k"}, strength=2.0)
        gov = Governor.from_constraints(constraint, vocab_size=4, device="cpu", lookup=lookup)
        assert len(gov.boosts) == 1
        assert len(gov.gates) == 0

    def test_boosts_target_phoneme_tokens(self):
        """Tokens containing target phonemes get boosted logits."""
        lookup = _make_lookup({
            0: ["k", "æ", "t"],  # has /k/
            1: ["d", "ɔ", "ɡ"],  # no /k/
            2: ["k", "ɪ", "k"],  # has /k/
        })
        constraint = IncludeConstraint(phonemes={"k"}, strength=2.0)
        gov = Governor.from_constraints(constraint, vocab_size=4, device="cpu", lookup=lookup)
        logits = torch.zeros(1, 4)
        ctx = GovernorContext(device="cpu")
        result = gov(logits, ctx)
        # Tokens 0 and 2 should be boosted, token 1 should not
        assert result[0, 0].item() > 0   # "cat" boosted
        assert result[0, 1].item() == 0  # "dog" not boosted
        assert result[0, 2].item() > 0   # "kick" boosted

    def test_default_strength(self):
        """Default strength is 2.0."""
        constraint = IncludeConstraint(phonemes={"k"})
        assert constraint.strength == 2.0

    def test_multiple_phonemes(self):
        """Include with multiple phonemes boosts tokens containing any of them."""
        lookup = _make_lookup({
            0: ["k", "æ", "t"],  # has /k/
            1: ["s", "ɪ", "t"],  # has /s/
            2: ["d", "ɔ", "ɡ"],  # neither
        })
        constraint = IncludeConstraint(phonemes={"k", "s"}, strength=2.0)
        gov = Governor.from_constraints(constraint, vocab_size=4, device="cpu", lookup=lookup)
        logits = torch.zeros(1, 4)
        ctx = GovernorContext(device="cpu")
        result = gov(logits, ctx)
        assert result[0, 0].item() > 0   # has /k/
        assert result[0, 1].item() > 0   # has /s/
        assert result[0, 2].item() == 0  # neither
```
- [ ] Step 2: Run test to verify it fails
Run: cd packages/governors && python -m pytest tests/test_include.py -v
Expected: FAIL — ImportError: cannot import name 'IncludeConstraint'
- [ ] Step 3: Implement IncludeConstraint
```python
# packages/governors/src/diffusion_governors/include.py
"""Include and Coverage constraints — phoneme boosting and targeting."""
from __future__ import annotations

from .boosts import LogitBoost
from .constraints import Constraint
from .core import Mechanism


class IncludeConstraint(Constraint):
    """Boost tokens containing target phonemes.

    Compiles to LogitBoost — adds +strength to logits of matching tokens.
    Does not block anything; just shifts the distribution toward targets.
    """

    def __init__(self, phonemes: set[str], strength: float = 2.0):
        self.phonemes = phonemes
        self.strength = strength

    @property
    def mechanism_kind(self) -> str:
        return "boost"

    def build(self, **kwargs) -> Mechanism:
        vocab_size = kwargs["vocab_size"]
        lookup = kwargs["lookup"]
        scores: dict[int, float] = {}
        for tid_str, feats in lookup.items():
            tid = int(tid_str)
            phono = feats.get("phono")
            if phono and set(phono.get("phonemes", [])) & self.phonemes:
                scores[tid] = self.strength
        return LogitBoost.from_scores(scores, vocab_size)
```
- [ ] Step 4: Export from `__init__.py`
Add to packages/governors/src/diffusion_governors/__init__.py:

```python
from .include import IncludeConstraint
```
- [ ] Step 5: Run tests to verify they pass
Run: cd packages/governors && python -m pytest tests/test_include.py -v
Expected: All 4 tests PASS
- [ ] Step 6: Commit
git add packages/governors/src/diffusion_governors/include.py \
packages/governors/tests/test_include.py \
packages/governors/src/diffusion_governors/__init__.py
git commit -m "feat: add IncludeConstraint — phoneme boost for governed generation"
### Task 3: Implement CoverageConstraint
Files:
- Modify: packages/governors/src/diffusion_governors/include.py
- Modify: packages/governors/tests/test_include.py
- Modify: packages/governors/src/diffusion_governors/__init__.py
This is the stateful projection that tracks phoneme hit rate across generation and adjusts boost strength dynamically. It reads GovernorContext.token_ids to compute running coverage.
- [ ] Step 1: Write failing test for CoverageConstraint
Add to packages/governors/tests/test_include.py:
```python
from diffusion_governors.include import CoverageConstraint


class TestCoverageConstraint:
    """CoverageConstraint tracks phoneme rate and adjusts boost dynamically."""

    def test_builds_projection(self):
        """Coverage compiles to a projection (not a gate or boost)."""
        lookup = _make_lookup({
            0: ["k", "æ", "t"],
            1: ["d", "ɔ", "ɡ"],
        })
        constraint = CoverageConstraint(phonemes={"k"}, target_rate=0.2)
        gov = Governor.from_constraints(constraint, vocab_size=4, device="cpu", lookup=lookup)
        assert len(gov.projections) == 1
        assert len(gov.gates) == 0
        assert len(gov.boosts) == 0

    def test_boosts_when_below_target(self):
        """When current coverage is below target, target tokens are boosted."""
        lookup = _make_lookup({
            0: ["k", "æ", "t"],  # has /k/
            1: ["d", "ɔ", "ɡ"],  # no /k/
            2: ["b", "æ", "t"],  # no /k/
        })
        constraint = CoverageConstraint(phonemes={"k"}, target_rate=0.5)
        gov = Governor.from_constraints(constraint, vocab_size=4, device="cpu", lookup=lookup)
        # Simulate: 3 tokens generated so far, none with /k/ → 0% coverage,
        # target 50%. Should boost /k/ tokens.
        generated_ids = torch.tensor([[1, 2, 1]])  # all non-/k/ tokens
        logits = torch.zeros(1, 4)
        ctx = GovernorContext(device="cpu", token_ids=generated_ids)
        result = gov(logits, ctx)
        # Token 0 (has /k/) should be boosted relative to token 1 (no /k/)
        assert result[0, 0].item() > result[0, 1].item()

    def test_no_boost_when_at_target(self):
        """When current coverage meets target, no additional boost applied."""
        lookup = _make_lookup({
            0: ["k", "æ", "t"],  # has /k/
            1: ["d", "ɔ", "ɡ"],  # no /k/
        })
        constraint = CoverageConstraint(phonemes={"k"}, target_rate=0.5)
        gov = Governor.from_constraints(constraint, vocab_size=4, device="cpu", lookup=lookup)
        # Simulate: 2 tokens generated, 1 with /k/ → 50% coverage = target
        generated_ids = torch.tensor([[0, 1]])  # 50/50
        logits = torch.zeros(1, 4)
        ctx = GovernorContext(device="cpu", token_ids=generated_ids)
        result = gov(logits, ctx)
        # Boost should be minimal or zero when at target
        diff = abs(result[0, 0].item() - result[0, 1].item())
        assert diff < 0.5  # near-zero difference

    def test_default_target_rate(self):
        """target_rate is required (no default)."""
        with pytest.raises(TypeError):
            CoverageConstraint(phonemes={"k"})
```
- [ ] Step 2: Run test to verify it fails
Run: cd packages/governors && python -m pytest tests/test_include.py::TestCoverageConstraint -v
Expected: FAIL — ImportError: cannot import name 'CoverageConstraint'
- [ ] Step 3: Implement CoverageConstraint
Add to packages/governors/src/diffusion_governors/include.py:
```python
import torch
from torch import Tensor

from .constraints import Constraint
from .core import GovernorContext, Mechanism


class _CoverageProjection:
    """Stateful projection that adjusts boost based on running phoneme coverage."""

    def __init__(self, target_ids: set[int], content_ids: set[int],
                 target_rate: float, max_boost: float, vocab_size: int):
        self.target_ids = target_ids
        self.content_ids = content_ids  # all token IDs with phono data
        self.target_rate = target_rate
        self.max_boost = max_boost
        self.vocab_size = vocab_size

    def apply(self, logits: Tensor, ctx: GovernorContext) -> Tensor:
        boost_strength = self.max_boost
        if ctx.token_ids is not None and ctx.token_ids.shape[1] > 0:
            generated = ctx.token_ids[0].tolist()
            total_content = sum(1 for tid in generated if tid in self.content_ids)
            target_hits = sum(1 for tid in generated if tid in self.target_ids)
            if total_content > 0:
                current_rate = target_hits / total_content
                gap = self.target_rate - current_rate
                if gap <= 0:
                    boost_strength = 0.0
                else:
                    boost_strength = self.max_boost * min(gap / self.target_rate, 1.0)
        bias = torch.zeros(logits.shape[-1], device=logits.device)
        for tid in self.target_ids:
            if tid < logits.shape[-1]:
                bias[tid] = boost_strength
        return logits + bias


class CoverageConstraint(Constraint):
    """Target a phoneme coverage rate across generated text.

    Stateful projection: tracks how many content tokens contain target
    phonemes and adjusts boost strength to converge toward target_rate.
    Coverage is approximate — generation is stochastic.
    """

    def __init__(self, phonemes: set[str], target_rate: float, max_boost: float = 3.0):
        self.phonemes = phonemes
        self.target_rate = target_rate
        self.max_boost = max_boost

    @property
    def mechanism_kind(self) -> str:
        return "projection"

    def build(self, **kwargs) -> Mechanism:
        lookup = kwargs["lookup"]
        vocab_size = kwargs["vocab_size"]
        target_ids: set[int] = set()
        content_ids: set[int] = set()
        for tid_str, feats in lookup.items():
            tid = int(tid_str)
            phono = feats.get("phono")
            if phono and phono.get("phonemes"):
                content_ids.add(tid)
                if set(phono["phonemes"]) & self.phonemes:
                    target_ids.add(tid)
        return _CoverageProjection(
            target_ids=target_ids,
            content_ids=content_ids,
            target_rate=self.target_rate,
            max_boost=self.max_boost,
            vocab_size=vocab_size,
        )
```
Note: _CoverageProjection stores precomputed target_ids and content_ids sets — NOT the full lookup dict. The lookup is only read during build() to identify which token IDs are content tokens.
- [ ] Step 4: Export from `__init__.py`
Add `CoverageConstraint` to the imports in `__init__.py`.
- [ ] Step 5: Run tests
Run: cd packages/governors && python -m pytest tests/test_include.py -v
Expected: All 8 tests PASS (4 Include + 4 Coverage)
- [ ] Step 6: Run full governor test suite
Run: cd packages/governors && python -m pytest tests/ -v
Expected: All existing tests still pass
- [ ] Step 7: Commit
git add packages/governors/src/diffusion_governors/include.py \
packages/governors/tests/test_include.py \
packages/governors/src/diffusion_governors/__init__.py
git commit -m "feat: add CoverageConstraint — stateful phoneme coverage targeting"
### Task 4: Add deprecation note to Density
Files:
- Modify: packages/governors/src/diffusion_governors/constraints.py
- [ ] Step 1: Add deprecation docstring to Density class
Read packages/governors/src/diffusion_governors/constraints.py, find the Density class, and add a deprecation note to its docstring:
```python
# Add to Density class docstring:
# .. deprecated::
#    Use CoverageConstraint from include.py instead. CoverageConstraint provides
#    a more intuitive API (target rate percentage) and the same phoneme coverage
#    targeting behavior.
```
- [ ] Step 2: Commit
git add packages/governors/src/diffusion_governors/constraints.py
git commit -m "docs: deprecate Density in favor of CoverageConstraint"
## Chunk 2: Schema + API Contract Migration
### Task 5: Add new constraint schema types
Files:
- Modify: packages/dashboard/server/schemas.py
- Modify: packages/dashboard/server/tests/test_schemas.py
- [ ] Step 1: Read current schemas
Read packages/dashboard/server/schemas.py to understand existing constraint type definitions and the Constraint union type.
- [ ] Step 2: Write failing tests for new schema types
Add to packages/dashboard/server/tests/test_schemas.py:
```python
def test_include_constraint_schema():
    data = {"type": "include", "phonemes": ["k", "s"], "strength": 2.0}
    c = IncludeConstraint(**data)
    assert c.type == "include"
    assert c.phonemes == ["k", "s"]
    assert c.strength == 2.0


def test_include_constraint_default_strength():
    data = {"type": "include", "phonemes": ["k"]}
    c = IncludeConstraint(**data)
    assert c.strength == 2.0


def test_coverage_constraint_schema():
    data = {"type": "coverage", "phonemes": ["k"], "target_rate": 0.2}
    c = CoverageConstraint(**data)
    assert c.target_rate == 0.2


def test_msh_constraint_schema():
    data = {"type": "msh", "max_stage": 3}
    c = MSHConstraint(**data)
    assert c.max_stage == 3


def test_minpair_boost_schema():
    data = {"type": "boost_minpair", "target": "s", "contrast": "ʃ", "strength": 2.0}
    c = MinPairBoostConstraint(**data)
    assert c.target == "s"


def test_maxopp_boost_schema():
    data = {"type": "boost_maxopp", "target": "s", "contrast": "m", "strength": 2.0}
    c = MaxOppositionBoostConstraint(**data)
    assert c.target == "s"


def test_bound_constraint_mechanism_field():
    data = {"type": "bound", "norm": "aoa_kuperman", "max": 5.0, "mechanism": "cdd"}
    c = BoundConstraint(**data)
    assert c.mechanism == "cdd"


def test_bound_constraint_mechanism_default():
    data = {"type": "bound", "norm": "aoa_kuperman", "max": 5.0}
    c = BoundConstraint(**data)
    assert c.mechanism == "gate"
```
- [ ] Step 3: Run tests to verify they fail
Run: cd packages/dashboard && python -m pytest server/tests/test_schemas.py -v -k "include or coverage or msh or minpair or maxopp or mechanism"
Expected: FAIL — names not defined
- [ ] Step 4: Add new Pydantic models to schemas.py
Add the following models to packages/dashboard/server/schemas.py and update the Constraint discriminated union:
```python
class IncludeConstraint(BaseModel):
    type: Literal["include"] = "include"
    phonemes: list[str]
    strength: float = 2.0


class CoverageConstraint(BaseModel):
    type: Literal["coverage"] = "coverage"
    phonemes: list[str]
    target_rate: float


class MSHConstraint(BaseModel):
    type: Literal["msh"] = "msh"
    max_stage: int


class MinPairBoostConstraint(BaseModel):
    type: Literal["boost_minpair"] = "boost_minpair"
    target: str
    contrast: str
    strength: float = 2.0


class MaxOppositionBoostConstraint(BaseModel):
    type: Literal["boost_maxopp"] = "boost_maxopp"
    target: str
    contrast: str
    strength: float = 2.0
```
Add a `mechanism: Literal["gate", "cdd"] = "gate"` field to the existing BoundConstraint.
Update the Constraint union to include all new types.
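If the union uses Pydantic's discriminated-union machinery, the updated shape can be sketched as follows. This is a minimal sketch with only two members shown and locally defined stand-in models — the real union in schemas.py lists every constraint model and imports them from the module:

```python
# Sketch only: discriminated union dispatching on the `type` field.
from typing import Annotated, Literal, Union

from pydantic import BaseModel, Field, TypeAdapter


class ExcludeConstraint(BaseModel):  # stand-in for the existing model
    type: Literal["exclude"] = "exclude"
    phonemes: list[str]


class IncludeConstraint(BaseModel):  # one of the new models from Step 4
    type: Literal["include"] = "include"
    phonemes: list[str]
    strength: float = 2.0


# Pydantic picks the member model by inspecting the `type` discriminator.
Constraint = Annotated[
    Union[ExcludeConstraint, IncludeConstraint],
    Field(discriminator="type"),
]

adapter = TypeAdapter(Constraint)
parsed = adapter.validate_python({"type": "include", "phonemes": ["k"]})
```

The discriminator makes `constraints: list[Constraint]` in the request models parse each dict to the right class without try/except chains.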
- [ ] Step 5: Run tests
Run: cd packages/dashboard && python -m pytest server/tests/test_schemas.py -v
Expected: All tests PASS
- [ ] Step 6: Commit
git add packages/dashboard/server/schemas.py packages/dashboard/server/tests/test_schemas.py
git commit -m "feat: add Include, Coverage, MSH, MinPair, MaxOpp constraint schemas"
### Task 6: Wire new constraints in governor.py
Files:
- Modify: packages/dashboard/server/governor.py
- Modify: packages/dashboard/server/tests/test_governor.py
- [ ] Step 1: Read current governor.py
Read packages/dashboard/server/governor.py to understand _to_dg_constraint() and GovernorCache.
- [ ] Step 2: Write failing tests
Add tests for each new constraint type mapping in _to_dg_constraint(). Also test that HFGovernorProcessor passes step/total_steps.
- [ ] Step 3: Wire new types in `_to_dg_constraint()`
Add cases for: include, coverage, msh, boost_minpair, boost_maxopp. Pass mechanism through for bound.

```python
elif c.type == "include":
    return dg.IncludeConstraint(phonemes=set(c.phonemes), strength=c.strength)
elif c.type == "coverage":
    return dg.CoverageConstraint(phonemes=set(c.phonemes), target_rate=c.target_rate)
elif c.type == "msh":
    return dg.MSHStage(max_stage=c.max_stage)
elif c.type == "boost_minpair":
    return dg.MinPairBoost(target=c.target, contrast=c.contrast, boost=c.strength)
elif c.type == "boost_maxopp":
    return dg.MaxOppositionBoost(target=c.target, contrast=c.contrast, boost=c.strength)
```

For `bound`, pass the mechanism through: `dg.Bound(norm=c.norm, min=c.min, max=c.max, mechanism=c.mechanism)`
- [ ] Step 4: Fix HFGovernorProcessor to pass step/total_steps
Update HFGovernorProcessor.__call__() to track step count and pass it to GovernorContext.
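One possible shape for the step tracking, sketched below. The `GovernorContext` here is a local dataclass stand-in, and the processor's constructor signature is an assumption — verify both against the real classes before copying:

```python
from dataclasses import dataclass
from typing import Any


@dataclass
class GovernorContext:  # local stand-in for diffusion_governors.GovernorContext
    device: str
    token_ids: Any = None
    step: int = 0
    total_steps: int = 0


class HFGovernorProcessor:
    """Hypothetical sketch: count calls so each invocation sees its step."""

    def __init__(self, governor, total_steps: int, device: str = "cpu"):
        self.governor = governor
        self.total_steps = total_steps
        self.device = device
        self._step = 0  # advances once per __call__

    def __call__(self, input_ids, scores):
        ctx = GovernorContext(
            device=self.device,
            token_ids=input_ids,
            step=self._step,
            total_steps=self.total_steps,
        )
        self._step += 1  # advance after building the context
        return self.governor(scores, ctx)


# Usage with a dummy governor that records the step it saw
seen = []
proc = HFGovernorProcessor(lambda scores, ctx: seen.append(ctx.step) or scores,
                           total_steps=3)
proc(None, "logits")
proc(None, "logits")
```

Because the processor is stateful, a fresh instance (or a reset) is needed per generation run.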
- [ ] Step 5: Run tests
Run: cd packages/dashboard && python -m pytest server/tests/test_governor.py -v
Expected: All tests PASS
- [ ] Step 6: Commit
git add packages/dashboard/server/governor.py packages/dashboard/server/tests/test_governor.py
git commit -m "feat: wire Include, Coverage, MSH, MinPair, MaxOpp into governor bridge"
### Task 7: Migrate API contract — inline constraints
Files:
- Modify: packages/dashboard/server/schemas.py
- Modify: packages/dashboard/server/routes/generate.py
- Modify: packages/dashboard/server/governor.py
- Modify: packages/dashboard/server/sessions.py
- Modify: packages/dashboard/server/tests/test_api.py
- [ ] Step 1: Update GenerateRequest and GenerateSingleRequest
Replace profile_id: str with constraints: list[Constraint] = [] in both request models.
- [ ] Step 2: Update Turn model
Replace profile_id: str with constraints: list[Constraint] = [] in the Turn model.
- [ ] Step 3: Rekey GovernorCache
Replace GovernorCache keying from profile.id to a hash of the sorted, merged constraint list. Add a _constraint_hash() static method that serializes and hashes the constraint array.
```python
@staticmethod
def _constraint_hash(constraints: list) -> str:
    """Order-independent hash of constraint set."""
    serialized = sorted(c.model_dump_json() for c in constraints)
    return hashlib.sha256("\x00".join(serialized).encode()).hexdigest()[:16]
```
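The order-independence property of this scheme can be sanity-checked with plain dicts, where `json.dumps(..., sort_keys=True)` stands in for `model_dump_json()`:

```python
import hashlib
import json


def constraint_hash(constraints: list[dict]) -> str:
    # Same scheme as _constraint_hash, over plain dicts for illustration
    serialized = sorted(json.dumps(c, sort_keys=True) for c in constraints)
    return hashlib.sha256("\x00".join(serialized).encode()).hexdigest()[:16]


a = [{"type": "exclude", "phonemes": ["r"]}, {"type": "msh", "max_stage": 3}]
b = list(reversed(a))  # same constraints, different order → same cache key
```

Sorting the serialized forms before joining is what makes `[exclude, msh]` and `[msh, exclude]` share one cache entry.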
- [ ] Step 4: Update generate routes
Update both /api/generate and /api/generate-single to:
- Accept constraints from request body
- Build governor from constraint list (via cache)
- Record constraint snapshot on Turn
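Put together, the route body might look roughly like this. Every name below — `handle_generate`, `build_governor`, `run_model`, the turn dict shape — is an injected stand-in for illustration, not the project's actual API:

```python
def handle_generate(req_constraints, build_governor, run_model, turns):
    """Hypothetical sketch of the migrated flow; all parameters are stand-ins."""
    governor = build_governor(req_constraints)  # cache keyed by constraint hash
    text = run_model(governor)
    # Record the constraint snapshot on the Turn instead of a profile_id
    turns.append({"role": "assistant", "text": text,
                  "constraints": list(req_constraints)})
    return text


# Usage with trivial fakes
turns = []
reply = handle_generate(
    [{"type": "include", "phonemes": ["k"]}],
    build_governor=lambda c: "governor",
    run_model=lambda g: "ok",
    turns=turns,
)
```

The key contract change: the turn carries the full constraint snapshot, so replaying a session never depends on a mutable profile.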
- [ ] Step 5: Update sessions.py append_turn()
Update to accept constraints instead of profile_id.
- [ ] Step 6: Fix ProfileStore persistence path
In main.py, update ProfileStore() to ProfileStore(custom_path="data/profiles.json").
- [ ] Step 7: Update tests
Update test_api.py to send constraints instead of profile_id.
- [ ] Step 8: Run full test suite
Run: cd packages/dashboard && python -m pytest server/tests/ -v
Expected: All tests PASS
- [ ] Step 9: Commit
git add packages/dashboard/server/ packages/dashboard/tests/
git commit -m "feat: migrate API from profile_id to inline constraints with hashed cache"
## Chunk 3: Frontend — ConstraintStore + Command Parser
### Task 8: Define frontend types
Files:
- Modify: packages/dashboard/frontend/src/types.ts
- [ ] Step 1: Read current types.ts
Read packages/dashboard/frontend/src/types.ts to see existing constraint and turn types.
- [ ] Step 2: Add StoreEntry types and SystemMessage type
Add per-entry store types that map to the spec's granular model:
```typescript
// Per-entry constraint store types (granular, one per command)
export type StoreEntry =
  | { type: "exclude"; phoneme: string }
  | { type: "exclude_clusters"; phoneme: string }
  | { type: "include"; phoneme: string; strength: number }
  | { type: "coverage"; phoneme: string; targetRate: number }
  | { type: "bound"; norm: string; direction: "min" | "max"; value: number; mechanism?: "gate" | "cdd" }
  | { type: "complexity_wcm"; max: number }
  | { type: "complexity_syllables"; max: number }
  | { type: "complexity_shapes"; shapes: string[] }
  | { type: "msh"; maxStage: number }
  | { type: "boost_minpair"; target: string; contrast: string; strength: number }
  | { type: "boost_maxopp"; target: string; contrast: string; strength: number };

// System message in chat feed
export interface SystemMessage {
  id: string;
  type: "system";
  prefix: "success" | "removed" | "error" | "warning";
  text: string;
  timestamp: number;
}
```
Add new API constraint types: IncludeConstraint, CoverageConstraint, MSHConstraint, MinPairBoostConstraint, MaxOppositionBoostConstraint. Update Constraint union. Update Turn to use constraints instead of profileId.
- [ ] Step 3: Commit
git add packages/dashboard/frontend/src/types.ts
git commit -m "feat: add StoreEntry, SystemMessage, and new constraint types"
### Task 9: Implement ConstraintStore
Files:
- Create: packages/dashboard/frontend/src/store/constraintStore.ts
- [ ] Step 1: Install Zustand (if not present)
Run: cd packages/dashboard/frontend && npm ls zustand 2>/dev/null || npm install zustand
- [ ] Step 2: Implement ConstraintStore
```typescript
// packages/dashboard/frontend/src/store/constraintStore.ts
import { create } from "zustand";
import type { StoreEntry } from "../types";

interface ConstraintStoreState {
  entries: StoreEntry[];
  add: (entry: StoreEntry) => void;
  remove: (type: string, match?: Partial<StoreEntry>) => void;
  clear: () => void;
  load: (entries: StoreEntry[]) => void;
  snapshot: () => StoreEntry[];
}

export const useConstraintStore = create<ConstraintStoreState>((set, get) => ({
  entries: [],
  add: (entry) =>
    set((state) => {
      // Check for duplicates
      const isDuplicate = state.entries.some(
        (e) => JSON.stringify(e) === JSON.stringify(entry)
      );
      if (isDuplicate) return state;
      return { entries: [...state.entries, entry] };
    }),
  remove: (type, match) =>
    set((state) => ({
      entries: state.entries.filter((e) => {
        if (e.type !== type) return true;
        if (!match) return false; // remove all of this type
        // Match specific fields
        return !Object.entries(match).every(
          ([k, v]) => (e as Record<string, unknown>)[k] === v
        );
      }),
    })),
  clear: () => set({ entries: [] }),
  load: (entries) => set({ entries }),
  snapshot: () => get().entries,
}));
```
- [ ] Step 3: Commit
git add packages/dashboard/frontend/src/store/constraintStore.ts
git commit -m "feat: add ConstraintStore — Zustand store for active constraints"
### Task 10: Implement phoneme normalization
Files:
- Create: packages/dashboard/frontend/src/commands/normalize.ts
- Create: packages/dashboard/frontend/src/commands/__tests__/normalize.test.ts
- [ ] Step 1: Write failing tests
```typescript
// packages/dashboard/frontend/src/commands/__tests__/normalize.test.ts
import { describe, it, expect } from "vitest";
import { normalizePhoneme, isValidPhoneme } from "../normalize";

describe("normalizePhoneme", () => {
  it("passes through IPA symbols unchanged", () => {
    expect(normalizePhoneme("ɹ")).toBe("ɹ");
    expect(normalizePhoneme("θ")).toBe("θ");
    expect(normalizePhoneme("ʃ")).toBe("ʃ");
  });

  it("normalizes common ASCII shortcuts", () => {
    expect(normalizePhoneme("r")).toBe("ɹ");
    expect(normalizePhoneme("g")).toBe("ɡ");
    expect(normalizePhoneme("th")).toBe("θ");
    expect(normalizePhoneme("sh")).toBe("ʃ");
    expect(normalizePhoneme("zh")).toBe("ʒ");
    expect(normalizePhoneme("ch")).toBe("tʃ");
    expect(normalizePhoneme("ng")).toBe("ŋ");
    expect(normalizePhoneme("dh")).toBe("ð");
  });

  it("is case-insensitive", () => {
    expect(normalizePhoneme("R")).toBe("ɹ");
    expect(normalizePhoneme("TH")).toBe("θ");
  });
});

describe("isValidPhoneme", () => {
  it("returns true for valid English phonemes", () => {
    expect(isValidPhoneme("ɹ")).toBe(true);
    expect(isValidPhoneme("k")).toBe(true);
    expect(isValidPhoneme("æ")).toBe(true);
  });

  it("returns false for invalid phonemes", () => {
    expect(isValidPhoneme("blorp")).toBe(false);
    expect(isValidPhoneme("x")).toBe(false);
  });
});
```
- [ ] Step 2: Run test to verify it fails
Run: cd packages/dashboard/frontend && npx vitest run src/commands/__tests__/normalize.test.ts
- [ ] Step 3: Implement normalize.ts
Map ASCII shortcuts → IPA. Include the full English phoneme inventory for validation. Reference: packages/data/src/phonolex_data/mappings/arpa_to_ipa.json for the canonical mapping.
- [ ] Step 4: Run tests
Run: cd packages/dashboard/frontend && npx vitest run src/commands/__tests__/normalize.test.ts
Expected: All PASS
- [ ] Step 5: Commit
git add packages/dashboard/frontend/src/commands/normalize.ts \
packages/dashboard/frontend/src/commands/__tests__/normalize.test.ts
git commit -m "feat: add phoneme normalization — ASCII to IPA for command input"
### Task 11: Implement command registry
Files:
- Create: packages/dashboard/frontend/src/commands/registry.ts
- [ ] Step 1: Implement registry
The registry defines all available commands: verb name, syntax pattern, default values, help text, norm key mapping.
```typescript
// packages/dashboard/frontend/src/commands/registry.ts
export interface NormCommand {
  verb: string;
  normKey: string;
  defaultDirection: "min" | "max";
  description: string;
}

export const NORM_COMMANDS: NormCommand[] = [
  { verb: "aoa", normKey: "aoa_kuperman", defaultDirection: "max", description: "Age of acquisition" },
  { verb: "concreteness", normKey: "concreteness", defaultDirection: "min", description: "Concreteness rating" },
  { verb: "valence", normKey: "valence", defaultDirection: "min", description: "Emotional valence" },
  { verb: "arousal", normKey: "arousal", defaultDirection: "max", description: "Arousal level" },
  { verb: "imageability", normKey: "imageability", defaultDirection: "min", description: "Imageability rating" },
  { verb: "familiarity", normKey: "familiarity", defaultDirection: "min", description: "Word familiarity" },
  { verb: "frequency", normKey: "log_frequency", defaultDirection: "min", description: "Log word frequency" },
];

export const VERBS = [
  "exclude", "exclude-clusters", "include", "complexity", "msh",
  ...NORM_COMMANDS.map((n) => n.verb),
  "bound", "boost", "remove", "clear", "show", "help",
] as const;

export type Verb = (typeof VERBS)[number];

export function isVerb(s: string): s is Verb {
  return (VERBS as readonly string[]).includes(s);
}

export function getNormCommand(verb: string): NormCommand | undefined {
  return NORM_COMMANDS.find((n) => n.verb === verb);
}
```
- [ ] Step 2: Commit
git add packages/dashboard/frontend/src/commands/registry.ts
git commit -m "feat: add command registry — verb definitions, norm mappings, help text"
### Task 12: Implement command parser
Files:
- Create: packages/dashboard/frontend/src/commands/parser.ts
- Create: packages/dashboard/frontend/src/commands/__tests__/parser.test.ts
- [ ] Step 1: Write failing tests
Test all major command forms:
```typescript
// packages/dashboard/frontend/src/commands/__tests__/parser.test.ts
import { describe, it, expect } from "vitest";
import { parseCommand } from "../parser";

describe("parseCommand", () => {
  it("returns null for non-command input", () => {
    expect(parseCommand("hello world")).toBeNull();
    expect(parseCommand("")).toBeNull();
  });

  it("parses /exclude with phonemes", () => {
    const result = parseCommand("/exclude r s");
    expect(result).toEqual({
      type: "add",
      entries: [
        { type: "exclude", phoneme: "ɹ" },
        { type: "exclude", phoneme: "s" },
      ],
      confirmation: "Exclude /ɹ/, /s/",
    });
  });

  it("parses /include with bare boost", () => {
    const result = parseCommand("/include k");
    expect(result).toEqual({
      type: "add",
      entries: [{ type: "include", phoneme: "k", strength: 2.0 }],
      confirmation: "Include /k/",
    });
  });

  it("parses /include with coverage target", () => {
    const result = parseCommand("/include k 20%");
    expect(result).toEqual({
      type: "add",
      entries: [{ type: "coverage", phoneme: "k", targetRate: 0.2 }],
      confirmation: "Include /k/ ~20% (approximate)",
    });
  });

  it("parses named norm with default direction", () => {
    const result = parseCommand("/aoa 5");
    expect(result).toEqual({
      type: "add",
      entries: [{ type: "bound", norm: "aoa_kuperman", direction: "max", value: 5.0 }],
      confirmation: "AoA ≤ 5.0",
    });
  });

  it("parses named norm with explicit direction", () => {
    const result = parseCommand("/concreteness min 3.5");
    expect(result).toEqual({
      type: "add",
      entries: [{ type: "bound", norm: "concreteness", direction: "min", value: 3.5 }],
      confirmation: "Concreteness ≥ 3.5",
    });
  });

  it("parses /remove", () => {
    const result = parseCommand("/remove exclude ɹ");
    expect(result).toEqual({
      type: "remove",
      targetType: "exclude",
      match: { phoneme: "ɹ" },
      confirmation: "Removed Exclude /ɹ/",
    });
  });

  it("parses /clear", () => {
    const result = parseCommand("/clear");
    expect(result).toEqual({
      type: "clear",
      confirmation: "All constraints cleared",
    });
  });

  it("parses /msh", () => {
    const result = parseCommand("/msh 3");
    expect(result).toEqual({
      type: "add",
      entries: [{ type: "msh", maxStage: 3 }],
      confirmation: "MSH ≤ 3",
    });
  });

  it("parses /complexity wcm", () => {
    const result = parseCommand("/complexity wcm max 8");
    expect(result).toEqual({
      type: "add",
      entries: [{ type: "complexity_wcm", max: 8 }],
      confirmation: "WCM ≤ 8",
    });
  });

  it("parses /boost minpair", () => {
    const result = parseCommand("/boost minpair s sh");
    expect(result).toEqual({
      type: "add",
      entries: [{ type: "boost_minpair", target: "s", contrast: "ʃ", strength: 2.0 }],
      confirmation: "MinPair /s/ ~ /ʃ/",
    });
  });

  it("returns error for unknown command", () => {
    const result = parseCommand("/blorp");
    expect(result?.type).toBe("error");
  });

  it("returns error for invalid phoneme", () => {
    const result = parseCommand("/exclude blorp");
    expect(result?.type).toBe("error");
  });

  it("parses /show", () => {
    const result = parseCommand("/show");
    expect(result).toEqual({ type: "show" });
  });

  it("parses /help", () => {
    const result = parseCommand("/help");
    expect(result).toEqual({ type: "help" });
  });

  it("parses /help with verb", () => {
    const result = parseCommand("/help exclude");
    expect(result).toEqual({ type: "help", verb: "exclude" });
  });
});
```
- [ ] Step 2: Run tests to verify they fail
Run: `cd packages/dashboard/frontend && npx vitest run src/commands/__tests__/parser.test.ts`
- [ ] Step 3: Implement parser.ts
The parser detects the `/` prefix, extracts the verb, and dispatches to verb-specific handlers. Each handler returns a `ParseResult`:
```typescript
export type ParseResult =
  | { type: "add"; entries: StoreEntry[]; confirmation: string }
  | { type: "remove"; targetType: string; match?: Partial<StoreEntry>; confirmation: string }
  | { type: "clear"; confirmation: string }
  | { type: "show" }
  | { type: "help"; verb?: string }
  | { type: "error"; message: string };

export function parseCommand(input: string): ParseResult | null {
  const trimmed = input.trim();
  if (!trimmed.startsWith("/")) return null;
  const parts = trimmed.slice(1).split(/\s+/);
  const verb = parts[0]?.toLowerCase();
  const args = parts.slice(1);
  // Dispatch to verb handler...
}
```
Implement handlers for each verb using the grammar from the spec. Use `normalizePhoneme()` for phoneme args and `getNormCommand()` for norm verbs.
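As a sketch of one such handler, here is a possible shape for `/exclude` under a stubbed `normalizePhoneme` (the inline ASCII→IPA table below is illustrative only — the real mapping lives in commands/normalize.ts and covers the full inventory):

```typescript
// Hypothetical handler for "/exclude r s". Types are narrowed for illustration.
type StoreEntry = { type: "exclude"; phoneme: string };
type ParseResult =
  | { type: "add"; entries: StoreEntry[]; confirmation: string }
  | { type: "error"; message: string };

// Stand-in for commands/normalize.ts — a tiny sample mapping, not the real one.
const ASCII_TO_IPA: Record<string, string> = { r: "ɹ", sh: "ʃ", s: "s", k: "k" };

function normalizePhoneme(raw: string): string | null {
  return ASCII_TO_IPA[raw] ?? null; // null signals an unknown phoneme
}

function handleExclude(args: string[]): ParseResult {
  const phonemes: string[] = [];
  for (const arg of args) {
    const p = normalizePhoneme(arg);
    if (p === null) return { type: "error", message: `Unknown phoneme: ${arg}` };
    phonemes.push(p);
  }
  return {
    type: "add",
    // One store entry per phoneme, so each gets its own chip.
    entries: phonemes.map((p) => ({ type: "exclude", phoneme: p })),
    confirmation: `Exclude ${phonemes.map((p) => `/${p}/`).join(", ")}`,
  };
}
```

This matches the expected shape in the `/exclude r s` test case above: ASCII input is normalized before entries are built, and a single bad phoneme fails the whole command.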
- [ ] Step 4: Run tests
Run: `cd packages/dashboard/frontend && npx vitest run src/commands/__tests__/parser.test.ts`
Expected: All PASS
- [ ] Step 5: Commit
```shell
git add packages/dashboard/frontend/src/commands/parser.ts \
    packages/dashboard/frontend/src/commands/__tests__/parser.test.ts
git commit -m "feat: add command parser — full grammar for all constraint commands"
```
Task 13: Implement constraint compiler¶
Files:
- Create: packages/dashboard/frontend/src/commands/compiler.ts
- Create: packages/dashboard/frontend/src/commands/__tests__/compiler.test.ts
The compiler merges per-entry store items into the API constraint format.
- [ ] Step 1: Write failing tests
Test merging of exclude entries, complexity entries, and passthrough of other types.
- [ ] Step 2: Implement compiler
```typescript
// Merges StoreEntry[] → Constraint[] for the API
export function compileConstraints(entries: StoreEntry[]): Constraint[] { ... }
```
Groups excludes into one `{ type: "exclude", phonemes: [...] }`, merges complexity sub-entries, passes other types through. Also auto-inserts NormCovered gates: when compiling bound entries, collect all referenced norm keys and emit a single `{ type: "norm_covered", norms: [...] }` constraint alongside the bounds. This is invisible to the user (no chip, no system message) but ensures tokens without norm data don't slip through bounds.
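A sketch of that merge pass under narrowed types (the real `StoreEntry`/`Constraint` unions carry more variants — include, coverage, msh, complexity — which are omitted here):

```typescript
// Narrowed-type sketch of compileConstraints: merge excludes, pass bounds
// through, and auto-insert the NormCovered gate.
type StoreEntry =
  | { type: "exclude"; phoneme: string }
  | { type: "bound"; norm: string; direction: "min" | "max"; value: number };

type Constraint =
  | { type: "exclude"; phonemes: string[] }
  | { type: "bound"; norm: string; direction: "min" | "max"; value: number }
  | { type: "norm_covered"; norms: string[] };

function compileConstraints(entries: StoreEntry[]): Constraint[] {
  const excluded: string[] = [];
  const bounds: Constraint[] = [];
  const norms = new Set<string>();
  for (const e of entries) {
    if (e.type === "exclude") {
      excluded.push(e.phoneme); // per-phoneme entries collapse into one constraint
    } else {
      bounds.push(e);
      norms.add(e.norm);
    }
  }
  const out: Constraint[] = [];
  if (excluded.length > 0) out.push({ type: "exclude", phonemes: excluded });
  out.push(...bounds);
  // Invisible gate: no chip, no system message, but tokens lacking norm
  // data cannot slip through the bounds.
  if (norms.size > 0) out.push({ type: "norm_covered", norms: [...norms] });
  return out;
}
```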
- [ ] Step 3: Run tests, commit
Chunk 4: Frontend — Constraint Bar + System Messages¶
Note: Chunk 4 depends on Chunk 3 (ConstraintStore + parser types). Do not start Chunk 4 until Task 9 (ConstraintStore) is complete.
Task 14: Implement ConstraintChip component¶
Files:
- Create: packages/dashboard/frontend/src/components/ConstraintBar/ConstraintChip.tsx
- [ ] Step 1: Implement chip with color coding and dismiss
Color scheme from spec: red/warm for exclusion, blue for bounds, green for include/boost, neutral for structural.
```typescript
interface ConstraintChipProps {
  label: string;
  category: "exclusion" | "bound" | "include" | "structural";
  onDismiss: () => void;
}
```
- [ ] Step 2: Commit
Task 15: Implement ConstraintBar component¶
Files:
- Create: packages/dashboard/frontend/src/components/ConstraintBar/index.tsx
- [ ] Step 1: Implement bar with chips from store
Reads from useConstraintStore. Maps entries to chips with labels (per spec chip label table). Shows coherence warning when entries.length >= 3. Per-phoneme chips for exclude/include entries.
- [ ] Step 2: Commit
Task 16: Implement SystemMessage component¶
Files:
- Create: packages/dashboard/frontend/src/components/ChatWindow/SystemMessage.tsx
- [ ] Step 1: Implement styled system message
```typescript
interface SystemMessageProps {
  prefix: "success" | "removed" | "error" | "warning";
  text: string;
}
```
Prefix mapping: success → ✓ green, removed → ✗ red, error → ✗ red, warning → ⚠ amber. Styled distinctly from user/assistant messages (compact, left-aligned, muted background).
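One way to express that prefix mapping is a pure lookup the component renders from (the glyphs are from the spec; the Tailwind class names here are assumptions, to be adjusted to the project's palette):

```typescript
// Prefix → glyph + styling lookup for SystemMessage. Class names are
// illustrative placeholders, not confirmed project styles.
type Prefix = "success" | "removed" | "error" | "warning";

const PREFIX_STYLE: Record<Prefix, { glyph: string; className: string }> = {
  success: { glyph: "✓", className: "text-green-600" },
  removed: { glyph: "✗", className: "text-red-600" },
  error:   { glyph: "✗", className: "text-red-600" },
  warning: { glyph: "⚠", className: "text-amber-600" },
};

// Plain-text rendering of a system message line.
function formatSystemMessage(prefix: Prefix, text: string): string {
  return `${PREFIX_STYLE[prefix].glyph} ${text}`;
}
```

Keeping the mapping as data rather than a `switch` makes it trivial to unit-test and to extend if the spec adds prefixes later.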
- [ ] Step 2: Commit
Task 17: Wire command input into ChatInput¶
Files:
- Modify: packages/dashboard/frontend/src/components/ChatWindow/ChatInput.tsx
- [ ] Step 1: Read current ChatInput.tsx
- [ ] Step 2: Add command detection and dispatch
When the user submits text starting with /:
1. Call parseCommand(text)
2. If result is add: call store.add() for each entry, emit system message
3. If result is remove: call store.remove(), emit system message
4. If result is clear: call store.clear(), emit system message
5. If result is error: emit error system message
6. If result is null: proceed with normal chat message send
Do NOT send command text as a chat message to the model.
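The branching above can be sketched as a pure function with injected dependencies, so it is testable without React (`Deps` and the narrowed `ParseResult` here are illustrative; `show`/`help` handling is omitted, and the "removed" prefix for `/clear` is an assumption):

```typescript
// Dispatch sketch for ChatInput submit. All collaborators are injected.
type ParseResult =
  | { type: "add"; entries: unknown[]; confirmation: string }
  | { type: "remove"; confirmation: string }
  | { type: "clear"; confirmation: string }
  | { type: "error"; message: string };

interface Deps {
  parse: (text: string) => ParseResult | null;
  addEntries: (entries: unknown[]) => void;
  removeEntry: () => void;
  clearAll: () => void;
  emitSystem: (prefix: "success" | "removed" | "error", text: string) => void;
  sendChat: (text: string) => void;
}

function handleSubmit(text: string, deps: Deps): void {
  const result = deps.parse(text);
  if (result === null) {
    deps.sendChat(text); // not a command: normal chat send
    return;
  }
  // Command text is never forwarded to the model.
  switch (result.type) {
    case "add":
      deps.addEntries(result.entries);
      deps.emitSystem("success", result.confirmation);
      break;
    case "remove":
      deps.removeEntry();
      deps.emitSystem("removed", result.confirmation);
      break;
    case "clear":
      deps.clearAll();
      deps.emitSystem("removed", result.confirmation); // prefix choice assumed
      break;
    case "error":
      deps.emitSystem("error", result.message);
      break;
  }
}
```

In the real component, `parse` is `parseCommand`, the store callbacks come from `useConstraintStore`, and `emitSystem` appends to the system-message state.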
- [ ] Step 3: Commit
Task 18: Wire system messages into MessageList¶
Files:
- Modify: packages/dashboard/frontend/src/components/ChatWindow/MessageList.tsx
- Modify: packages/dashboard/frontend/src/components/ChatWindow/index.tsx
- [ ] Step 1: Add system message state
Track system messages in the chat window (local state or store). Interleave with user/assistant messages by timestamp.
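The interleaving itself reduces to a timestamp sort over both message sources (a minimal sketch; `TimedMessage` is an assumed shape, not the project's actual message type):

```typescript
// Merge chat and system messages into one render timeline by timestamp.
interface TimedMessage {
  timestamp: number; // e.g. Date.now() at creation
  kind: "user" | "assistant" | "system";
  text: string;
}

function interleave(chat: TimedMessage[], system: TimedMessage[]): TimedMessage[] {
  return [...chat, ...system].sort((a, b) => a.timestamp - b.timestamp);
}
```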
- [ ] Step 2: Render SystemMessage components in MessageList
- [ ] Step 3: Commit
Task 19: Wire ConstraintBar into App layout¶
Files:
- Modify: packages/dashboard/frontend/src/App.tsx
- [ ] Step 1: Read current App.tsx layout
- [ ] Step 2: Add ConstraintBar between header and chat area
Replace the left ConstraintPanel with the pinned ConstraintBar above the chat window. This changes the layout from three-column (constraint | chat | compliance) to two-column (chat | compliance) with a top bar.
- [ ] Step 3: Commit
Chunk 5: Frontend — API Integration + Migration¶
Task 20: Update API client¶
Files:
- Modify: packages/dashboard/frontend/src/api/client.ts
- [ ] Step 1: Read current client.ts
- [ ] Step 2: Update generate() and generateSingle()
Change from sending profile_id to sending constraints array. Use the compiler to convert store entries to API constraints before sending.
- [ ] Step 3: Commit
Task 21: Update useSession hook¶
Files:
- Modify: packages/dashboard/frontend/src/hooks/useSession.ts
- [ ] Step 1: Read current useSession.ts
- [ ] Step 2: Wire ConstraintStore into sendMessage
When sending a message, read useConstraintStore.getState().snapshot(), compile with compileConstraints(), and pass as constraints in the generate request.
- [ ] Step 3: Commit
Task 22: Update useProfile hook for store loading¶
Files:
- Modify: packages/dashboard/frontend/src/hooks/useProfile.ts
- [ ] Step 1: Read current useProfile.ts
- [ ] Step 2: Wire profile selection to store.load()
When a profile is selected, convert its constraints array into StoreEntry[] (one entry per phoneme, one per bound, etc.) and call store.load(). Discard the profile name.
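That conversion might look like the following under narrowed types (the real `Constraint` variants come from the API client; only exclude and bound are shown):

```typescript
// Hypothetical profile → store conversion: one entry per phoneme, one per bound.
type Constraint =
  | { type: "exclude"; phonemes: string[] }
  | { type: "bound"; norm: string; direction: "min" | "max"; value: number };

type StoreEntry =
  | { type: "exclude"; phoneme: string }
  | { type: "bound"; norm: string; direction: "min" | "max"; value: number };

function profileToEntries(constraints: Constraint[]): StoreEntry[] {
  const entries: StoreEntry[] = [];
  for (const c of constraints) {
    if (c.type === "exclude") {
      // Split the merged API form back into per-phoneme entries (one chip each).
      for (const p of c.phonemes) entries.push({ type: "exclude", phoneme: p });
    } else {
      entries.push(c); // bounds map 1:1
    }
  }
  return entries;
}
```

Note the symmetry with the compiler: loading splits merged constraints into per-item entries, and `compileConstraints` re-merges them before each request.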
- [ ] Step 3: Commit
Task 23: Update CompliancePanel to read from store¶
Files:
- Modify: packages/dashboard/frontend/src/components/CompliancePanel/ActiveConstraints.tsx
- [ ] Step 1: Read current ActiveConstraints.tsx
- [ ] Step 2: Read from ConstraintStore instead of profile
Use useConstraintStore() to get active entries. Render all constraint types (no more "Unknown" fallthrough for vocab_only).
- [ ] Step 3: Commit
Task 24: Wire GenerateWindow to ConstraintStore¶
Files:
- Modify: packages/dashboard/frontend/src/components/GenerateWindow/index.tsx
- [ ] Step 1: Read current GenerateWindow
Read packages/dashboard/frontend/src/components/GenerateWindow/index.tsx — currently calls generateSingle(prompt, profileId).
- [ ] Step 2: Wire to ConstraintStore
Import useConstraintStore and compileConstraints. Replace profileId with compiled constraints from the store. Both chat and generate modes now read from the same store.
- [ ] Step 3: Commit
```shell
git add packages/dashboard/frontend/src/components/GenerateWindow/
git commit -m "feat: wire GenerateWindow to ConstraintStore for inline constraints"
```
Task 25: Handle session schema backward compatibility¶
Files:
- Modify: packages/dashboard/server/schemas.py
- Modify: packages/dashboard/server/sessions.py
- [ ] Step 1: Make `constraints` optional with fallback
In the Turn model, make `constraints` optional with a default of `[]`. Add a validator that converts legacy `profile_id` to an empty constraint list. This prevents existing persisted sessions from failing validation on load.
- [ ] Step 2: Update SessionCreateRequest
Remove profile_id from SessionCreateRequest — sessions are no longer tied to a profile at creation time. Constraints are provided per-generation, not per-session.
- [ ] Step 3: Update the `model.enrich_tokens()` call in generate routes
In routes/generate.py, the `enrich_tokens()` call currently receives `profile.constraints`. Update it to receive the constraint list from the request body instead.
- [ ] Step 4: Run tests, commit
```shell
cd packages/dashboard && python -m pytest server/tests/ -v
git add packages/dashboard/server/
git commit -m "fix: session schema backward compat — constraints optional with legacy fallback"
```
Task 26: Remove old ConstraintPanel¶
Files:
- Delete: packages/dashboard/frontend/src/components/ConstraintPanel/index.tsx
- Delete: packages/dashboard/frontend/src/components/ConstraintPanel/PhonemeToggles.tsx
- Delete: packages/dashboard/frontend/src/components/ConstraintPanel/NormSliders.tsx
- Delete: packages/dashboard/frontend/src/components/ConstraintPanel/ComplexityBounds.tsx
- Delete: packages/dashboard/frontend/src/components/ConstraintPanel/ProfileActions.tsx
- Modify: packages/dashboard/frontend/src/components/ConstraintPanel/ProfileSelector.tsx — keep as standalone, wire to store
- [ ] Step 1: Remove deleted files and update imports
- [ ] Step 2: Move ProfileSelector to be accessible from ConstraintBar or a dropdown
- [ ] Step 3: Verify no broken imports
Run: `cd packages/dashboard/frontend && npx tsc --noEmit`
- [ ] Step 4: Commit
```shell
git add -A packages/dashboard/frontend/src/components/ConstraintPanel/
git commit -m "refactor: remove old ConstraintPanel — replaced by command language + constraint bar"
```
Task 27: End-to-end verification¶
- [ ] Step 1: Start dashboard backend
Run: `cd packages/dashboard && python -m uvicorn server.main:app --reload`
- [ ] Step 2: Start dashboard frontend
Run: `cd packages/dashboard/frontend && npm run dev`
- [ ] Step 3: Test command flow
  - Type `/exclude r` → verify chip appears in bar, system message in chat
  - Type `/aoa 5` → verify second chip, system message
  - Verify coherence warning appears (3+ constraints: add `/concreteness 3`)
  - Click × on a chip → verify removal, system message
  - Type a chat message → verify generation uses active constraints
  - Type `/clear` → verify all chips removed
  - Select a built-in profile → verify store populated, chips appear
- [ ] Step 4: Run full test suites

```shell
cd packages/governors && python -m pytest tests/ -v
cd packages/dashboard && python -m pytest server/tests/ -v
cd packages/dashboard/frontend && npx vitest run
```
- [ ] Step 5: Commit any fixes from E2E testing
- [ ] Step 6: Final commit

```shell
git commit -m "feat: governed chat command language — complete implementation"
```
Chunk 6: Autocomplete + IPA Keyboard (Deferred)¶
This chunk implements the hybrid input model (autocomplete dropdown + IPA keyboard toggle). It builds on the parser from Chunk 3 but is not required for the core command language to function — users can type commands with raw IPA or ASCII shortcuts.
Deferred to a follow-up session. The core command language (Chunks 1-5) is fully functional without autocomplete.
Prerequisites¶
- Vitest setup: The dashboard frontend may not have a Vitest config. Before running frontend tests, check for `packages/dashboard/frontend/vitest.config.ts` and create one if missing (model it on `packages/web/workers/vitest.config.ts`).
- Zustand install: Run `cd packages/dashboard/frontend && npm install zustand` before Task 9.
- mkdir: Create the `packages/dashboard/frontend/src/store/` and `packages/dashboard/frontend/src/commands/` directories before writing files there.
Dependency Graph¶
```text
Chunk 1 (Engine) ───→ Chunk 2 (Schema/API) ───────────┐
                                                      ├──→ Chunk 5 (Integration)
Chunk 3 (Store/Parser) ─┬─→ Chunk 4 (Bar/Messages) ───┘
                        │
                        └──→ Chunk 6 (Autocomplete — deferred)
```
Parallelism: Chunks 1 and 3 can run in parallel. Chunk 4 depends on Chunk 3 (needs ConstraintStore). Chunk 2 depends on Chunk 1. Chunk 5 depends on all others.