# AGENTS.md - tricu Project Guide

For AI agents and contributors working in this repository.
## 1. Build & Test

```shell
# Tests
nix flake check

# Full build
nix build .#
```

> ⚠️ **Never call `cabal` directly.** Rule of thumb: if it builds, links, or tests, it goes through `nix`.
## 2. Project Overview

tricu (pronounced "tree-shoe") is a programming-language experiment written in Haskell. It implements Triage Calculus, an extension of Barry Jay's Tree Calculus, with lambda-abstraction sugar that gets eliminated back to pure tree calculus terms.

### Core types (in `src/Research.hs`)
| Type | Description |
|---|---|
| `T = Leaf \| Stem T \| Fork T T` | Tree Calculus term (the runtime value) |
| `TricuAST` | Parsed AST with `SDef`, `SApp`, `SLambda`, etc. |
| `LToken` | Lexer tokens |
| `Node` / `MerkleHash` | Content-addressed Merkle DAG nodes |
### Source modules
| Module | Purpose |
|---|---|
| `Main.hs` | CLI entry point (`cmdargs`); three modes: `repl`, `eval`, `decode` |
| `Eval.hs` | Interpreter: `evalTricu`, `result`, `evalSingle` |
| `Parser.hs` | Megaparsec parser → `TricuAST` |
| `Lexer.hs` | Megaparsec lexer → `LToken` |
| `FileEval.hs` | File loading, module imports, `!import` |
| `REPL.hs` | Interactive Read-Eval-Print Loop (haskeline) |
| `Research.hs` | Core types, `apply` reduction, booleans, marshalling (`ofString`, `ofNumber`), output formatters (`toAscii`, `toTernaryString`, `decodeResult`) |
| `ContentStore.hs` | SQLite-backed term persistence |
| `Wire.hs` | Arboricx portable wire format: encode/decode/import/export of Merkle-DAG bundle blobs |
### File extensions

- `.hs`: Haskell source
- `.tri`: tricu language source (used in `lib/`, `test/`, `demos/`)
## 3. Test Suite

Tests live in `test/Spec.hs` and use Tasty + HUnit.

```shell
nix flake check
```

### Test groups
| Group | What it covers |
|---|---|
| `lexer` | Megaparsec lexer: identifiers, keywords, strings, escapes, invalid tokens |
| `parser` | Parser: defs, lambdas, applications, lists, comments, parentheses |
| `simpleEvaluation` | Core `apply` reduction rules, variable substitution, immutability |
| `lambdas` | Lambda elimination, SKI calculus, higher-order functions, currying, shadowing, free vars |
| `providedLibraries` | `lib/list.tri`: triage, booleans, list ops (`head`, `tail`, `map`, `emptyList?`, `append`, `equal?`) |
| `fileEval` | Loading `.tri` files, multi-file context, decode |
| `modules` | `!import`, cyclic deps, namespacing, multi-level imports, unresolved vars, local namespaces |
| `demos` | `demos/*.tri`: structural equality, `toSource`, `size`, level-order traversal |
| `decoding` | `decodeResult`: `Leaf`, numbers, strings, lists, mixed |
| `elimLambdaSingle` | Lambda elimination: eta reduction, `SDef` binding, semantics preservation |
| `stressElimLambda` | Lambda elimination stress test: 200 vars, 800-body curried lambda |
### Suggesting tests

You do not write or modify tests. The user writes tests to constrain your outputs. You must make your code conform to the tests, or suggest modifications to them. If the user gives you explicit permission to implement a test, you may proceed.
## 4. tricu Language Quick Reference

```
t                        → Leaf (the base term)
t t                      → Stem Leaf
t t t                    → Fork Leaf Leaf
x = t                    → Define term x = Leaf
id = (a : a)             → Lambda identity (eliminates to tree calculus)
head (map f xs)          → From lib/list.tri
!import "./path.tri" NS  → Import file under namespace
-- line comment
```
**CRITICAL:** When working with recursion in tricu files:

- Put consumed data first in recursive workers.
- Let the data shape drive recursion.
- Do not let counters unroll over abstract input.
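The guidance above can be illustrated with a Haskell analogy (illustrative only; tricu workers express the same shape with triage rather than pattern matching): when the consumed data comes first and its shape drives the recursion, termination follows structurally from the data, with no counter to unroll.

```haskell
-- Haskell analogy for data-shape-driven recursion (not tricu code).
data T = Leaf | Stem T | Fork T T

-- The consumed tree is the first (and only) argument; each recursive
-- call is on a strictly smaller subterm, so the worker must terminate.
size :: T -> Int
size Leaf       = 1
size (Stem a)   = 1 + size a
size (Fork a b) = 1 + size a + size b

main :: IO ()
main = print (size (Fork Leaf (Stem Leaf)))  -- prints: 4
```

A counter-driven worker (`go n x = ... go (n - 1) ...`) inverts this: progress depends on `n`, not on the input's shape, which is exactly the pattern to avoid over abstract input.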
## 5. Output Formats

The `eval` command accepts `--form` (shorthand `-t`):
| Format | Value | Description |
|---|---|---|
| `tree` | `TreeCalculus` | Simple `t` form (default) |
| `fsl` | `FSL` | Full `show` representation |
| `ast` | `AST` | Parsed AST representation |
| `ternary` | `Ternary` | Ternary string encoding |
| `ascii` | `Ascii` | ASCII-art tree diagram |
| `decode` | `Decode` | Human-readable (strings, numbers, lists) |
## 6. Content Addressing

Each `T` term is content-addressed via a Merkle DAG:
```
NLeaf      → 0x00
NStem(h)   → 0x01 || h (32 bytes)
NFork(l,r) → 0x02 || l (32 bytes) || r (32 bytes)

hash = SHA256("arboricx.merkle.node.v1" <> 0x00 <> serialized_node)
```
This is stored in SQLite via `ContentStore.hs`. Hash suffixes on identifiers (e.g., `foo_abc123...`) are validated: 16–64 hex characters (SHA256).
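A minimal Haskell sketch of the serialization scheme and suffix check above, assuming the `NLeaf`/`NStem`/`NFork` tags and the preimage formula exactly as documented (the real code lives in `ContentStore.hs`/`Wire.hs` and also computes the SHA-256 digest, which is omitted here):

```haskell
-- Hypothetical sketch of node serialization; names are illustrative.
import qualified Data.ByteString as BS
import qualified Data.ByteString.Char8 as C8
import Data.Char (isHexDigit)

type Hash = BS.ByteString  -- 32-byte SHA-256 digest

data Node = NLeaf | NStem Hash | NFork Hash Hash

-- Tag byte followed by child hashes, per the table above.
serializeNode :: Node -> BS.ByteString
serializeNode NLeaf       = BS.singleton 0x00
serializeNode (NStem h)   = BS.cons 0x01 h
serializeNode (NFork l r) = BS.cons 0x02 (l <> r)

-- Domain-separated hash preimage, per the formula above.
preimage :: Node -> BS.ByteString
preimage n =
  C8.pack "arboricx.merkle.node.v1" <> BS.singleton 0x00 <> serializeNode n

-- Identifier hash suffixes must be 16-64 hex characters.
isValidHashSuffix :: String -> Bool
isValidHashSuffix s =
  let n = length s in n >= 16 && n <= 64 && all isHexDigit s

main :: IO ()
main = print (BS.length (preimage NLeaf))  -- prints: 25 (23-byte tag + 0x00 + 1-byte node)
```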
## 7. Arboricx Portable Bundles (`.arboricx`)

Portable executable bundles are generated via `Wire.hs`. See `docs/arboricx-bundle-format.md` for the full binary format spec.

```shell
# Export a bundle from the content store
./result/bin/tricu export -o myterm.arboricx myterm

# Import terms, then export a bundle (requires TRICU_DB_PATH)
./result/bin/tricu import -f lib/list.tri
TRICU_DB_PATH=/tmp/tricu.db ./result/bin/tricu export -o list_ops.arboricx append
```
## 8. Directory Layout

```
tricu/
├── flake.nix            # Nix flake: packages, tests, devShell
├── tricu.cabal          # Cabal package (used via callCabal2nix)
├── src/                 # Haskell modules
│   ├── Main.hs
│   ├── Eval.hs
│   ├── Parser.hs
│   ├── Lexer.hs
│   ├── FileEval.hs
│   ├── REPL.hs
│   ├── Research.hs
│   ├── ContentStore.hs
│   └── Wire.hs          # Arboricx portable wire format
├── test/
│   ├── Spec.hs          # Tasty + HUnit tests
│   ├── *.tri            # tricu test programs
│   └── local-ns/        # Module namespace test files
├── lib/
│   ├── base.tri
│   ├── list.tri
│   └── patterns.tri
├── demos/
│   ├── equality.tri
│   ├── size.tri
│   ├── toSource.tri
│   ├── levelOrderTraversal.tri
│   └── patternMatching.tri
└── AGENTS.md            # This file
```
## 9. JS Arboricx Runtime

A JavaScript implementation of the Arboricx portable bundle runtime lives in `ext/js/`. It is a reference implementation, not a tricu source parser. It reads `.arboricx` files produced by the Haskell toolchain, verifies Merkle node hashes, reconstructs tree values, and reduces them.

From the project root:

```shell
node ext/js/src/cli.js inspect test/fixtures/id.arboricx
node ext/js/src/cli.js run test/fixtures/true.arboricx
```
The JS runtime implements:

- Bundle binary format parsing (header, section directory, manifest, nodes)
- SHA-256 Merkle node hash verification against canonical payloads
- Closure verification (all child references present)
- Tree reconstruction from the node DAG
- Core `apply` reduction rules
- Basic codecs (`decodeResult`)
- CLI: `inspect` and `run` commands
## 10. Content Store Workflow (Custom DB)

The content store location is controlled by the `TRICU_DB_PATH` environment variable. When set, `eval` mode automatically loads all stored terms into the initial environment, so you can call any previously imported/evaluated term by name.
```shell
# Use a local DB
export TRICU_DB_PATH=/tmp/tricu-local.db

# Import terms from the standard library
./result/bin/tricu import -f lib/list.tri

# Now use them in eval mode
echo "not? (t t)" | ./result/bin/tricu eval -t decode
# Output: t
echo "not? (t t t)" | ./result/bin/tricu eval -t decode
# Output: Stem Leaf
echo "equal? (t t) (t t t)" | ./result/bin/tricu eval -t decode
# Output: t

# Check what's in the store
./result/bin/tricu
t> !definitions
```
Without `TRICU_DB_PATH` set, `eval` uses only the terms defined in the input file(s).
## 11. Development Tips

- **REPL:** `nix run .#` starts the interactive tricu REPL.
- **Evaluate files:** `nix run .# -- eval -f demos/equality.tri`
- **GHC options:** `-threaded -rtsopts -with-rtsopts=-N` for the parallel runtime. Use the `-N` RTS flag for multi-core.
- `upx` is in the devShell for binary compression if needed.
## 12. Viewing Haskell Dependency Docs from Nix

When you need Haddock documentation for a Haskell dependency available in Nixpkgs, build the package's doc output directly with `^doc`. Replace `megaparsec` below with the dependency name you need:

```shell
nix build "nixpkgs#haskellPackages.megaparsec^doc"
```

View the available documentation files:

```shell
find ./result-doc -type f \( -name '*.html' -o -name '*.haddock' \) | sort
```