Initial JS runtime and Arborix Implementation
AGENTS.md (183 lines changed)
@@ -1,4 +1,4 @@
-# AGENTS.md — tricu Project Guide
+# AGENTS.md - tricu Project Guide

> For AI agents and contributors working in this repository.

@@ -38,11 +38,12 @@ nix build .#
| `REPL.hs` | Interactive Read-Eval-Print Loop (haskeline) |
| `Research.hs` | Core types, `apply` reduction, booleans, marshalling (`ofString`, `ofNumber`), output formatters (`toAscii`, `toTernaryString`, `decodeResult`) |
| `ContentStore.hs` | SQLite-backed term persistence |
+| `Wire.hs` | Arborix portable wire format — encode/decode/import/export of Merkle-DAG bundle blobs |

### File extensions

-- `.hs` — Haskell source
-- `.tri` — tricu language source (used in `lib/`, `test/`, `demos/`)
+- `.hs` - Haskell source
+- `.tri` - tricu language source (used in `lib/`, `test/`, `demos/`)

## 3. Test Suite

@@ -56,15 +57,15 @@ nix flake check # or: nix build .#test

| Group | What it covers |
|-------|----------------|
-| `lexer` | Megaparsec lexer — identifiers, keywords, strings, escapes, invalid tokens |
-| `parser` | Parser — defs, lambda, applications, lists, comments, parentheses |
+| `lexer` | Megaparsec lexer - identifiers, keywords, strings, escapes, invalid tokens |
+| `parser` | Parser - defs, lambda, applications, lists, comments, parentheses |
| `simpleEvaluation` | Core `apply` reduction rules, variable substitution, immutability |
| `lambdas` | Lambda elimination, SKI calculus, higher-order functions, currying, shadowing, free vars |
-| `providedLibraries` | `lib/list.tri` — triage, booleans, list ops (`head`, `tail`, `map`, `emptyList?`, `append`, `equal?`) |
+| `providedLibraries` | `lib/list.tri` - triage, booleans, list ops (`head`, `tail`, `map`, `emptyList?`, `append`, `equal?`) |
| `fileEval` | Loading `.tri` files, multi-file context, decode |
| `modules` | `!import`, cyclic deps, namespacing, multi-level imports, unresolved vars, local namespaces |
-| `demos` | `demos/*.tri` — structural equality, `toSource`, `size`, level-order traversal |
-| `decoding` | `decodeResult` — Leaf, numbers, strings, lists, mixed |
+| `demos` | `demos/*.tri` - structural equality, `toSource`, `size`, level-order traversal |
+| `decoding` | `decodeResult` - Leaf, numbers, strings, lists, mixed |
| `elimLambdaSingle` | Lambda elimination: eta reduction, SDef binding, semantics preservation |
| `stressElimLambda` | Lambda elimination stress test: 200 vars, 800-body curried lambda |

@@ -112,12 +113,121 @@ NLeaf → 0x00
NStem(h) → 0x01 || h (32 bytes)
NFork(l,r) → 0x02 || l (32 bytes) || r (32 bytes)

-hash = SHA256("tricu.merkle.node.v1" <> 0x00 <> serialized_node)
+hash = SHA256("arborix.merkle.node.v1" <> 0x00 <> serialized_node)
```

This is stored in SQLite via `ContentStore.hs`. Hash suffixes on identifiers (e.g., `foo_abc123...`) are validated: 16–64 hex characters (SHA256).

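To make the hashing rule concrete, here is a minimal Node.js sketch (added for illustration, not part of the commit). The domain string follows this section; note that the JS runtime's `ext/js/src/merkle.js` below still uses the older `tricu.merkle.node.v1` tag, and `manifest.js` accepts either domain.

```js
// Sketch: hash = SHA256(domain <> 0x00 <> serialized_node)
import { createHash } from "node:crypto";

const DOMAIN = "arborix.merkle.node.v1"; // merkle.js below uses "tricu.merkle.node.v1"

function hashNodePayload(payload) {
  const preimage = Buffer.concat([Buffer.from(DOMAIN), Buffer.from([0x00]), payload]);
  return createHash("sha256").update(preimage).digest("hex");
}

// A Leaf serializes to the single byte 0x00:
console.log(hashNodePayload(Buffer.from([0x00]))); // 64 hex characters
```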
-## 7. Directory Layout
+## 7. Arborix Portable Wire Format

The **Arborix wire format** (module `Wire.hs`) defines a portable binary bundle for exchanging Tree Calculus terms, their Merkle DAGs, and associated metadata. It is versioned and schema-driven.

### Header

```
+------------------+------------------+------------------+------------------+
| Magic (8 bytes)  | Major (2 bytes)  | Minor (2 bytes)  | Section Count    |
|                  |                  |                  | (4 bytes)        |
+------------------+------------------+------------------+------------------+
| Flags (8 bytes)  | Dir Offset (8 bytes)                                   |
+------------------+--------------------------------------------------------+
```

- **Magic**: `ARBORIX\0` (`0x41 0x52 0x42 0x4f 0x52 0x49 0x58 0x00`)
- **Header length**: 32 bytes
- **Major version**: `1` | **Minor version**: `0`

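A minimal sketch of reading these header fields with Node.js `Buffer` accessors (illustrative; the full parser is `parseBundle` in `ext/js/src/bundle.js`, included later in this commit):

```js
// Sketch: field offsets follow the 32-byte header layout above.
function readHeader(buf) {
  return {
    magic: buf.slice(0, 8),                  // "ARBORIX\0"
    major: buf.readUint16BE(8),              // must be 1
    minor: buf.readUint16BE(10),
    sectionCount: buf.readUint32BE(12),
    flags: buf.readBigUInt64BE(16),
    dirOffset: Number(buf.readBigUInt64BE(24)),
  };
}
```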
### Section Directory

Immediately follows the header. Each section entry is 60 bytes:

```
+------------------+-------------------+------------------+--------------------+
| Type (4 bytes)   | Version (2 bytes) | Flags (2 bytes)  | Compression (2)    |
+------------------+-------------------+------------------+--------------------+
| Digest Algo (2)  | Offset (8 bytes)  | Length (8 bytes) | SHA256 digest (32) |
+------------------+-------------------+------------------+--------------------+
```

Known section types:

| Type | Name | Required | Description |
|------|----------|----------|-------------|
| 1 | manifest | Yes | JSON manifest metadata |
| 2 | nodes | Yes | Binary Merkle node payloads |

### Section 1 — Manifest (JSON)

The manifest describes the bundle's semantics, exports, and schema. Key fields:

| Field | Value | Description |
|-------|-------|-------------|
| `schema` | `"arborix.bundle.manifest.v1"` | Manifest schema version |
| `bundleType` | `"tree-calculus-executable-object"` | Bundle category |
| `tree.calculus` | `"tree-calculus.v1"` | Tree calculus version |
| `tree.nodeHash.algorithm` | `"sha256"` | Hash algorithm |
| `tree.nodeHash.domain` | `"arborix.merkle.node.v1"` | Hash domain string |
| `tree.nodePayload` | `"arborix.merkle.payload.v1"` | Payload encoding |
| `runtime.semantics` | `"tree-calculus.v1"` | Evaluation semantics |
| `runtime.abi` | `"arborix.abi.tree.v1"` | Runtime ABI |
| `closure` | `"complete"` | Bundle must be a complete DAG |
| `roots` | `[{"hash": "...", "role": "..."}]` | Named root hashes |
| `exports` | `[{"name": "...", "root": "..."}]` | Export aliases for roots |
| `metadata.createdBy` | `"arborix"` | Originator |

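Putting the fields together, a manifest for a single-export bundle has this shape (illustrative values; the hash is abbreviated, borrowing the root hash that appears in the `id.tri.bundle` test fixture):

```js
// Illustrative manifest object matching the table above.
const exampleManifest = {
  schema: "arborix.bundle.manifest.v1",
  bundleType: "tree-calculus-executable-object",
  tree: {
    calculus: "tree-calculus.v1",
    nodeHash: { algorithm: "sha256", domain: "arborix.merkle.node.v1" },
    nodePayload: "arborix.merkle.payload.v1",
  },
  runtime: { semantics: "tree-calculus.v1", abi: "arborix.abi.tree.v1" },
  closure: "complete",
  roots: [{ hash: "039cc9aa...", role: "default" }],
  exports: [{ name: "main", root: "039cc9aa..." }],
  metadata: { createdBy: "arborix" },
};
```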
### Section 2 — Nodes (Binary)

```
+------------------+-------------------+-------------------+-----------------+
| Node Count (8)   | Hash (32 bytes)   | Payload Len (4)   | Payload (N)     |
+------------------+-------------------+-------------------+-----------------+
```

Each node entry contains:
- 32-byte Merkle hash (hex-encoded in identifiers, raw in binary)
- 4-byte big-endian payload length
- N bytes of serialized node payload (`0x00` for Leaf, `0x01 || hash` for Stem, `0x02 || left || right` for Fork)

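A sketch of walking these entries (illustrative; the production decoder is `parseNodeSection` in `ext/js/src/merkle.js`, which also validates hash and payload lengths):

```js
// Sketch: iterate (hash, payload) pairs in the nodes section.
function* nodeEntries(data) {
  const count = Number(data.readBigUInt64BE(0)); // 8-byte node count
  let off = 8;
  for (let i = 0; i < count; i++) {
    const hash = data.slice(off, off + 32).toString("hex"); // 32-byte Merkle hash
    const len = data.readUint32BE(off + 32);                // 4-byte payload length
    const payload = data.slice(off + 36, off + 36 + len);
    off += 36 + len;
    yield { hash, payload };
  }
}
```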
### Bundle verification flow

1. Check magic bytes
2. Validate major version
3. Parse section directory
4. For each section: verify SHA256 digest against actual bytes
5. Decode JSON manifest
6. Decode binary node entries into Merkle DAG
7. Verify all root hashes present in manifest exist in node map
8. Verify export root hashes present
9. Verify children references are complete (no dangling nodes)
10. Reject unknown critical sections

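The JS runtime's `inspect` command follows this flow. A condensed sketch using the modules introduced later in this commit (paths assume a script run from the repo root):

```js
// Condensed verification sketch; parseBundle internally covers steps 1-4 and 10.
import { readFileSync } from "node:fs";
import { parseManifest, parseNodeSection } from "./ext/js/src/bundle.js";
import { validateManifest } from "./ext/js/src/manifest.js";
import {
  parseNodeSection as parseNodes,
  verifyNodeHashes,
  verifyClosure,
} from "./ext/js/src/merkle.js";

const buf = readFileSync("test/fixtures/id.tri.bundle");
const manifest = parseManifest(buf);                   // step 5 (plus 1-4, 10 inside parseBundle)
validateManifest(manifest);                            // schema/profile checks
const { nodeMap } = parseNodes(parseNodeSection(buf)); // step 6
if (!verifyNodeHashes(nodeMap).verified) throw new Error("node hash mismatch");
if (!verifyClosure(nodeMap).complete) throw new Error("dangling child reference"); // roughly steps 7-9
```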
### Data types (Wire.hs)

| Type | Purpose |
|------|---------|
| `Bundle` | Top-level bundle: version, roots, nodes map, manifest |
| `BundleManifest` | JSON metadata: schema, tree spec, runtime spec, roots, exports |
| `TreeSpec` | Tree calculus version + hash algorithm + payload encoding |
| `NodeHashSpec` | Hash algorithm and domain string |
| `RuntimeSpec` | Semantics, evaluation order, ABI, capabilities |
| `BundleRoot` | Root hash + role (`"default"` or `"root"`) |
| `BundleExport` | Export name + root hash + kind + ABI |
| `BundleMetadata` | Optional package, version, description, license, createdBy |
| `ClosureMode` | `ClosureComplete` or `ClosurePartial` |

### Key functions

| Function | Signature | Purpose |
|----------|-----------|---------|
| `encodeBundle` | `Bundle → ByteString` | Serialize bundle to wire bytes |
| `decodeBundle` | `ByteString → Either String Bundle` | Parse wire bytes into Bundle |
| `verifyBundle` | `Bundle → Either String ()` | Validate DAG, manifest, roots |
| `collectReachableNodes` | `Connection → MerkleHash → IO [(MerkleHash, ByteString)]` | Traverse DAG from root |
| `exportBundle` | `Connection → [MerkleHash] → IO ByteString` | Build bundle from content store |
| `exportNamedBundle` | `Connection → [(Text, MerkleHash)] → IO ByteString` | Build with named roots |
| `importBundle` | `Connection → ByteString → IO [MerkleHash]` | Import bundle into content store |

## 8. Directory Layout

```
tricu/
@@ -131,7 +241,8 @@ tricu/
│ ├── FileEval.hs
│ ├── REPL.hs
│ ├── Research.hs
-│ └── ContentStore.hs
+│ ├── ContentStore.hs
+│ └── Wire.hs # Arborix portable wire format
├── test/
│ ├── Spec.hs # Tasty + HUnit tests
│ ├── *.tri # tricu test programs
@@ -149,7 +260,55 @@ tricu/
└── AGENTS.md # This file
```

-## 8. Development Tips
+## 9. JS Arborix Runtime

A JavaScript implementation of the Arborix portable bundle runtime lives in `ext/js/`.
It is a reference implementation — not a tricu source parser. It reads `.tri.bundle` files produced by the Haskell toolchain, verifies Merkle node hashes, reconstructs tree values, and reduces them.

From the project root:
```bash
node ext/js/src/cli.js inspect test/fixtures/id.tri.bundle
node ext/js/src/cli.js run test/fixtures/true.tri.bundle
```

The JS runtime implements:
- Bundle binary format parsing (header, section directory, manifest, nodes)
- SHA-256 Merkle node hash verification against canonical payloads
- Closure verification (all child references present)
- Tree reconstruction from the node DAG
- Core `apply` reduction rules
- Basic codecs (`decodeResult`)
- CLI: `inspect` and `run` commands

## 10. Content Store Workflow (Custom DB)

The content store location is controlled by the `TRICU_DB_PATH` environment variable. When set, `eval` mode automatically loads all stored terms into the initial environment, so you can call any previously imported or evaluated term by name.

```bash
# Use a local DB
export TRICU_DB_PATH=/tmp/tricu-local.db

# Import terms from the standard library
./result/bin/tricu import -f lib/list.tri

# Now use them in eval mode
echo "not? (t t)" | ./result/bin/tricu eval -t decode
# Output: t

echo "not? (t t t)" | ./result/bin/tricu eval -t decode
# Output: Stem Leaf

echo "equal? (t t) (t t t)" | ./result/bin/tricu eval -t decode
# Output: t

# Check what's in the store
./result/bin/tricu
t> !definitions
```

Without `TRICU_DB_PATH` set, `eval` uses only the terms defined in the input file(s).

## 11. Development Tips

- **REPL:** `nix run .#` starts the interactive tricu REPL.
- **Evaluate files:** `nix run .# -- eval -f demos/equality.tri`

ext/bundle-runtime-profile-v1.md (new file, 49 lines)
@@ -0,0 +1,49 @@
1. Scope
This profile defines the minimum required behavior for runtimes that execute tricu bundles.

2. Non-goals
No tricu source parsing.
No lambda elimination.
No module system.
No package manager.
No local DB requirement.
No authoring names beyond bundle exports.

3. Required bundle sections
Header
Manifest/exports
Merkle nodes

4. Optional/skippable sections
Source, debug, package metadata, signatures, provenance, etc.

5. Entrypoint selection
Explicit export name first.
Else export named main.
Else single default root.
Else error.

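For reference, the JS runtime implements this fallback chain in `selectExport` (`ext/js/src/manifest.js`). An illustrative usage sketch (the manifest values are made up; the import path assumes a script living in `ext/`):

```js
// Sketch: explicit name, else "main", else the sole export, else error.
import { selectExport } from "./js/src/manifest.js";

const manifest = {
  exports: [
    { name: "main", root: "aaaa..." }, // illustrative root hashes
    { name: "id", root: "bbbb..." },
  ],
};

selectExport(manifest, "id").name; // "id"   (explicit name wins)
selectExport(manifest).name;       // "main" (fallback to "main")
```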
6. Node payload format
Leaf/Stem/Fork byte layouts.

7. Hash verification
Domain string and payload hashing rules.

8. Closure verification
All referenced child hashes must exist.

9. Runtime representation
Suggested JS representation, but not normative.

10. Reduction semantics
The six Tree Calculus apply rules.

11. Codecs for v1
Raw tree required.
String/bool codecs are optional or experimental.

12. Required error cases
Bad magic/version, missing export, hash mismatch, malformed payload, missing child.

13. Test fixtures
List of bundles the implementation must pass.
ext/js/package.json (new file, 17 lines)
@@ -0,0 +1,17 @@
{
  "name": "arborix-runtime",
  "version": "0.1.0",
  "description": "Arborix portable bundle runtime — JavaScript reference implementation",
  "type": "module",
  "main": "src/bundle.js",
  "bin": {
    "arborix-run": "src/cli.js"
  },
  "scripts": {
    "test": "node --test test/*.test.js",
    "inspect": "node src/cli.js inspect",
    "run": "node src/cli.js run"
  },
  "keywords": ["arborix", "tree-calculus", "trie", "runtime"],
  "license": "MIT"
}
ext/js/src/bundle.js (new file, 188 lines)
@@ -0,0 +1,188 @@
/**
 * bundle.js — Parse an Arborix portable bundle binary into a JavaScript object.
 *
 * Format (v1):
 *   Header (32 bytes):
 *     Magic        8B  "ARBORIX\0"
 *     Major        2B  u16 BE (must be 1)
 *     Minor        2B  u16 BE
 *     SectionCount 4B  u32 BE
 *     Flags        8B  u64 BE
 *     DirOffset    8B  u64 BE
 *   Section Directory (SectionCount × 60 bytes):
 *     Type         4B  u32 BE
 *     Version      2B  u16 BE
 *     Flags        2B  u16 BE (bit 0 = critical)
 *     Compression  2B  u16 BE
 *     DigestAlgo   2B  u16 BE
 *     Offset       8B  u64 BE
 *     Length       8B  u64 BE
 *     SHA256Digest 32B raw
 */

import { createHash } from "node:crypto";

// ── Constants ───────────────────────────────────────────────────────────────

const MAGIC = Buffer.from([0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00]); // "ARBORIX\0"
const HEADER_LENGTH = 32;
const SECTION_ENTRY_LENGTH = 60;
const SECTION_MANIFEST = 1;
const SECTION_NODES = 2;
const FLAG_CRITICAL = 0x0001;
const COMPRESSION_NONE = 0;
const DIGEST_SHA256 = 1;
const MAJOR_VERSION = 1;
const MINOR_VERSION = 0;

// ── Helpers ─────────────────────────────────────────────────────────────────

function readU16BE(buf, offset) {
  return buf.readUint16BE(offset);
}
function readU32BE(buf, offset) {
  return buf.readUint32BE(offset);
}
function readU64BE(buf, offset) {
  return buf.readBigUInt64BE(offset);
}

function sha256(data) {
  return createHash("sha256").update(data).digest();
}

// ── Public API ──────────────────────────────────────────────────────────────

/**
 * Parse a bundle Buffer into a Bundle object.
 *
 * Returns { version, sectionCount, sections } where sections maps
 * section type numbers to parsed section info (offset, length, data).
 */
export function parseBundle(buffer) {
  if (buffer.length < HEADER_LENGTH) {
    throw new Error("bundle too short for header");
  }

  // Check magic
  if (!buffer.slice(0, 8).equals(MAGIC)) {
    throw new Error("invalid magic: expected ARBORIX\\0");
  }

  // Parse header
  const major = readU16BE(buffer, 8);
  const minor = readU16BE(buffer, 10);
  const sectionCount = readU32BE(buffer, 12);

  if (major !== MAJOR_VERSION) {
    throw new Error(
      `unsupported bundle major version: ${major} (expected ${MAJOR_VERSION})`
    );
  }

  const dirOffset = Number(readU64BE(buffer, 24));

  // Parse section directory
  const dirStart = dirOffset;
  const dirEnd = dirStart + sectionCount * SECTION_ENTRY_LENGTH;

  if (buffer.length < dirEnd) {
    throw new Error("bundle truncated in section directory");
  }

  const entries = [];
  for (let i = 0; i < sectionCount; i++) {
    const off = dirStart + i * SECTION_ENTRY_LENGTH;
    const entry = {
      type: readU32BE(buffer, off),
      version: readU16BE(buffer, off + 4),
      flags: readU16BE(buffer, off + 6),
      compression: readU16BE(buffer, off + 8),
      digestAlgorithm: readU16BE(buffer, off + 10),
      offset: Number(readU64BE(buffer, off + 12)),
      length: Number(readU64BE(buffer, off + 20)),
      digest: buffer.slice(off + 28, off + 28 + 32),
    };
    entries.push(entry);
  }

  // Validate sections
  for (const entry of entries) {
    const isCritical = (entry.flags & FLAG_CRITICAL) !== 0;
    const isKnown =
      entry.type === SECTION_MANIFEST || entry.type === SECTION_NODES;
    if (isCritical && !isKnown) {
      throw new Error(`unknown critical section type: ${entry.type}`);
    }
    if (entry.compression !== COMPRESSION_NONE) {
      throw new Error(
        `unsupported compression codec in section ${entry.type}`
      );
    }
    if (entry.digestAlgorithm !== DIGEST_SHA256) {
      throw new Error(
        `unsupported digest algorithm in section ${entry.type}`
      );
    }
  }

  // Verify section digests and extract data
  const sections = new Map();
  for (const entry of entries) {
    if (entry.offset < 0 || entry.length < 0) {
      throw new Error(`section ${entry.type} has negative offset/length`);
    }
    if (buffer.length < entry.offset + entry.length) {
      throw new Error(
        `section ${entry.type} extends beyond bundle end`
      );
    }

    const data = buffer.slice(entry.offset, entry.offset + entry.length);

    // Verify digest
    const computed = sha256(data);
    if (!computed.equals(entry.digest)) {
      throw new Error(
        `section digest mismatch for section type ${entry.type}`
      );
    }

    sections.set(entry.type, {
      ...entry,
      data,
    });
  }

  // Check required sections
  if (!sections.has(SECTION_MANIFEST)) {
    throw new Error("missing required section: manifest");
  }
  if (!sections.has(SECTION_NODES)) {
    throw new Error("missing required section: nodes");
  }

  return {
    version: `${major}.${minor}`,
    sectionCount,
    sections,
  };
}

/**
 * Convenience: parse and return just the manifest JSON.
 */
export function parseManifest(buffer) {
  const bundle = parseBundle(buffer);
  const manifestEntry = bundle.sections.get(SECTION_MANIFEST);
  return JSON.parse(manifestEntry.data.toString("utf-8"));
}

/**
 * Convenience: parse and return the node section binary.
 */
export function parseNodeSection(buffer) {
  const bundle = parseBundle(buffer);
  const nodesEntry = bundle.sections.get(SECTION_NODES);
  return nodesEntry.data;
}
ext/js/src/cli.js (new file, 249 lines)
@@ -0,0 +1,249 @@
#!/usr/bin/env node
/**
 * cli.js — Minimal CLI for inspecting and running Arborix bundles.
 *
 * Usage:
 *   node cli.js inspect <bundle>
 *   node cli.js run <bundle> [exportName] [input]
 */

import { readFileSync } from "node:fs";
import {
  parseBundle,
  parseManifest,
  parseNodeSection as parseNodeSectionBundle,
} from "./bundle.js";
import {
  validateManifest,
  selectExport,
  printManifestInfo,
} from "./manifest.js";
import {
  parseNodeSection as parseNodeSectionMerkle,
  verifyNodeHashes,
  verifyClosure,
  verifyRootClosure,
} from "./merkle.js";
import { isTree, apply, triage, isFork, isStem } from "./tree.js";
import { decodeResult, formatTree } from "./codecs.js";

// ── Commands ────────────────────────────────────────────────────────────────

function cmdInspect(bundlePath) {
  const buffer = readFileSync(bundlePath);
  try {
    const manifest = parseManifest(buffer);
    validateManifest(manifest);

    const nodeSectionBytes = parseNodeSectionBundle(buffer);
    const { nodeMap } = parseNodeSectionMerkle(nodeSectionBytes);

    console.log(`Bundle: ${bundlePath}`);
    console.log("");

    printManifestInfo(manifest, "  ");

    console.log(`  Nodes: ${nodeMap.size}`);

    // Verify hashes
    const { verified: hashesOk, mismatches } = verifyNodeHashes(nodeMap);
    console.log(`  Hash verification: ${hashesOk ? "OK" : "FAIL"}`);
    for (const m of mismatches) {
      console.log(`    MISMATCH ${m.type} ${m.hash.substring(0, 16)}... expected ${m.expected.substring(0, 16)}...`);
    }

    // Verify closure
    const { complete: closureOk, missing } = verifyClosure(nodeMap);
    console.log(`  Closure verification: ${closureOk ? "OK" : "FAIL"}`);
    for (const m of missing) {
      console.log(`    MISSING ${m.parent.substring(0, 16)}... → ${m.child.substring(0, 16)}...`);
    }

    // Verify root closure for each export
    for (const exp of manifest.exports || []) {
      const { complete, missingRoots } = verifyRootClosure(
        nodeMap,
        exp.root
      );
      if (!complete) {
        console.log(
          `  Root closure for "${exp.name}": FAIL — missing: ${missingRoots
            .map((r) => r.substring(0, 16) + "...")
            .join(", ")}`
        );
      }
    }

    console.log("");
    console.log("Inspection complete.");
  } catch (e) {
    console.error(`Error: ${e.message}`);
    process.exit(1);
  }
}

function cmdRun(bundlePath, exportName, inputArg) {
  const buffer = readFileSync(bundlePath);
  try {
    const manifest = parseManifest(buffer);
    validateManifest(manifest);

    const selectedExport = selectExport(manifest, exportName);

    const nodeSectionBytes = parseNodeSectionBundle(buffer);
    const { nodeMap } = parseNodeSectionMerkle(nodeSectionBytes);

    // Verify hashes
    const { verified, mismatches } = verifyNodeHashes(nodeMap);
    if (!verified) {
      console.error(
        `Node hash mismatch:\n  ${mismatches
          .map((m) => `  ${m.type}: ${m.hash} (expected ${m.expected})`)
          .join("\n")}`
      );
      process.exit(1);
    }

    // Reconstruct the tree for the selected export
    const root = buildTreeFromNodeMap(nodeMap, selectedExport.root);
    if (!isTree(root)) {
      console.error("Reconstructed root is not a valid tree value");
      process.exit(1);
    }

    // Apply input if provided
    let term = root;
    if (inputArg !== undefined) {
      // TODO: parse input (string/number) into a tree
      // For now, just run the term as-is
    }

    // Reduce with fuel limit
    const finalTerm = reduce(term, 1_000_000);

    // Print result as tree calculus form
    console.log(formatTree(finalTerm));
  } catch (e) {
    console.error(`Error: ${e.message}`);
    process.exit(1);
  }
}

// ── Tree reconstruction ─────────────────────────────────────────────────────

/**
 * Reconstruct a tree from a node map.
 *
 * Node map: Map<hexHash, { type, childHash?, leftHash?, rightHash? }>
 *
 * Returns the tree representation: [] for Leaf, [child] for Stem, [right, left] for Fork.
 * Uses memoization to avoid re-processing nodes.
 */
export function buildTreeFromNodeMap(nodeMap, hash, memo = new Map()) {
  if (memo.has(hash)) return memo.get(hash);

  const node = nodeMap.get(hash);
  if (!node) {
    throw new Error(`missing node in bundle: ${hash}`);
  }

  let tree;
  switch (node.type) {
    case "leaf":
      tree = [];
      break;
    case "stem":
      tree = [buildTreeFromNodeMap(nodeMap, node.childHash, memo)];
      break;
    case "fork":
      tree = [
        buildTreeFromNodeMap(nodeMap, node.rightHash, memo),
        buildTreeFromNodeMap(nodeMap, node.leftHash, memo),
      ];
      break;
    default:
      throw new Error(`unknown node type: ${node.type}`);
  }

  memo.set(hash, tree);
  return tree;
}

// ── Reduction ───────────────────────────────────────────────────────────────

/**
 * Reduce a term to normal form with a fuel limit.
 * Uses the stack-based approach from the TS evaluator.
 */
export function reduce(term, fuel) {
  const stack = [term];
  let remaining = fuel;

  while (stack.length >= 2 && remaining-- > 0) {
    // Pop right (top), then left
    const b = stack.pop(); // right
    const a = stack.pop(); // left

    if (stack.length >= 2) {
      // Push a back for potential further reduction
      stack.push(a);
    }

    const result = apply(a, b);

    if (isTree(result)) {
      // If result is a value, push it. But if it's a Fork/Stem,
      // we need to push its components for further reduction.
      if (isFork(result)) {
        // Push right first (so it's popped second), then left
        stack.push(result[1]); // left
        stack.push(result[0]); // right
      } else if (isStem(result)) {
        stack.push(result[0]); // child
      } else {
        stack.push(result); // Leaf
      }
    } else {
      // Not a tree — push as-is (shouldn't happen after buildTree)
      stack.push(result);
    }
  }

  if (remaining <= 0) {
    throw new Error("reduction step limit exceeded");
  }

  return stack[0];
}

// ── Main ────────────────────────────────────────────────────────────────────

const args = process.argv.slice(2);
const command = args[0];

switch (command) {
  case "inspect": {
    if (args.length < 2) {
      console.error("Usage: node cli.js inspect <bundle>");
      process.exit(1);
    }
    cmdInspect(args[1]);
    break;
  }
  case "run": {
    if (args.length < 2) {
      console.error("Usage: node cli.js run <bundle> [exportName] [input]");
      process.exit(1);
    }
    cmdRun(args[1], args[2], args[3]);
    break;
  }
  default:
    console.log("Arborix JS Runtime");
    console.log("");
    console.log("Usage:");
    console.log("  node cli.js inspect <bundle>");
    console.log("  node cli.js run <bundle> [exportName] [input]");
    break;
}
ext/js/src/codecs.js (new file, 135 lines)
@@ -0,0 +1,135 @@
/**
 * codecs.js — Minimal codecs for decoding tree results.
 *
 * Implements: decodeResult (from Research.hs)
 *   - Leaf → "t"
 *   - Numbers: toNumber
 *   - Strings: toString
 *   - Lists: toList
 *   - Fallback: raw tree format
 */

// ── toNumber ────────────────────────────────────────────────────────────────

/**
 * Decode a tree as a binary number (least-significant bit first).
 * Leaf = 0, Fork(Leaf, rest) = 2*n, Fork(Stem Leaf, rest) = 2*n+1.
 */
export function toNumber(t) {
  if (!Array.isArray(t)) return null;
  if (t.length === 0) return 0; // Leaf = 0
  if (t.length !== 2) return null; // must be Fork

  const [right, left] = t;
  // Fork structure: [right, left]
  // left child determines bit: Leaf = 0, Stem(Leaf) = 1
  let bit;
  if (Array.isArray(left) && left.length === 0) {
    bit = 0; // Leaf
  } else if (Array.isArray(left) && left.length === 1) {
    const child = left[0];
    if (Array.isArray(child) && child.length === 0) {
      bit = 1; // Stem(Leaf) = 1
    } else {
      return null; // Stem of something other than Leaf
    }
  } else {
    return null;
  }

  const rest = toNumber(right);
  if (rest === null) return null;

  return bit + 2 * rest;
}

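// Worked example (added for clarity; Fork = [right, left]):
//   one = [[], [[]]]        bit 1 (left is Stem(Leaf)), rest Leaf → 1
//   two = [[[], [[]]], []]  bit 0 (left is Leaf), rest one        → 0 + 2*1 = 2
// so toNumber([[], [[]]]) === 1 and toNumber([[[], [[]]], []]) === 2.
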
// ── toString ────────────────────────────────────────────────────────────────

/**
 * Decode a tree as a list of numbers (characters).
 * Fork(x, rest) = x : list.
 */
export function toList(t) {
  if (!Array.isArray(t)) return null;
  if (t.length === 0) return []; // Leaf = empty list
  if (t.length !== 2) return null; // must be Fork

  const [right, left] = t;
  const rest = toList(right);
  if (rest === null) return null;

  return [left, ...rest];
}

/**
 * Decode a tree as a string.
 */
export function toString(t) {
  const list = toList(t);
  if (list === null) return null;
  try {
    return list.map((ch) => String.fromCharCode(ch)).join("");
  } catch {
    return null;
  }
}

// ── decodeResult ────────────────────────────────────────────────────────────

/**
 * Decode a tree result using multiple strategies:
 *   1. Leaf → "t"
 *   2. String (if all chars are printable)
 *   3. Number
 *   4. List
 *   5. Raw tree format
 */
export function decodeResult(t) {
  if (!Array.isArray(t)) {
    return String(t);
  }

  // Leaf
  if (t.length === 0) {
    return "t";
  }

  // Try string first (list of char codes)
  const list = toList(t);
  if (list !== null && list.length > 0) {
    const str = list.map((n) => {
      if (n < 32 || n > 126) return null;
      return String.fromCharCode(n);
    }).join("");
    if (str) return `"${str}"`;
  }

  // Try number
  const num = toNumber(t);
  if (num !== null) {
    return String(num);
  }

  // Try list (elements are trees)
  if (t.length === 2) {
    const elements = toList(t);
    if (elements !== null) {
      const decoded = elements.map((e) => decodeResult(e));
      return `[${decoded.join(", ")}]`;
    }
  }

  // Raw tree format
  return formatTree(t);
}

/**
 * Format a tree as a parenthesized expression.
 */
export function formatTree(t) {
  if (!Array.isArray(t)) return String(t);
  if (t.length === 0) return "Leaf";
  if (t.length === 1) return `Stem(${formatTree(t[0])})`;
  if (t.length === 2) return `Fork(${formatTree(t[1])}, ${formatTree(t[0])})`;
  return `[${t.map(formatTree).join(", ")}]`;
}
ext/js/src/manifest.js (new file, 167 lines)
@@ -0,0 +1,167 @@
/**
 * manifest.js — Minimal manifest parsing and export lookup.
 *
 * The manifest is a JSON object with fields:
 *   schema, bundleType, tree, runtime, closure, roots, exports,
 *   imports, sections, metadata
 *
 * We parse only what we need for runtime entrypoint selection.
 */

/**
 * Validate the manifest against the runtime profile requirements.
 * Throws on violation.
 */
export function validateManifest(manifest) {
  if (manifest.schema !== "arborix.bundle.manifest.v1") {
    throw new Error(
      `unsupported manifest schema: ${manifest.schema}`
    );
  }
  if (manifest.bundleType !== "tree-calculus-executable-object") {
    throw new Error(
      `unsupported bundle type: ${manifest.bundleType}`
    );
  }

  const tree = manifest.tree;
  if (tree.calculus !== "tree-calculus.v1") {
    throw new Error(`unsupported calculus: ${tree.calculus}`);
  }
  if (tree.nodeHash.algorithm !== "sha256") {
    throw new Error(
      `unsupported node hash algorithm: ${tree.nodeHash.algorithm}`
    );
  }
  if (tree.nodeHash.domain !== "tricu.merkle.node.v1" && tree.nodeHash.domain !== "arborix.merkle.node.v1") {
    throw new Error(
      `unsupported node hash domain: ${tree.nodeHash.domain}`
    );
  }
  if (tree.nodePayload !== "arborix.merkle.payload.v1") {
    throw new Error(`unsupported node payload: ${tree.nodePayload}`);
  }

  const runtime = manifest.runtime;
  if (runtime.semantics !== "tree-calculus.v1") {
    throw new Error(`unsupported runtime semantics: ${runtime.semantics}`);
  }
  if (runtime.abi !== "arborix.abi.tree.v1") {
    throw new Error(`unsupported runtime ABI: ${runtime.abi}`);
  }
  if (runtime.capabilities && runtime.capabilities.length > 0) {
    throw new Error(
      `host/runtime capabilities not supported: ${runtime.capabilities.join(", ")}`
    );
  }

  if (manifest.closure !== "complete") {
    throw new Error("bundle v1 requires closure = complete");
  }
  if (manifest.imports && manifest.imports.length > 0) {
    throw new Error("bundle v1 requires an empty imports list");
  }
  if (!manifest.roots || manifest.roots.length === 0) {
    throw new Error("manifest has no roots");
  }
  if (!manifest.exports || manifest.exports.length === 0) {
    throw new Error("manifest has no exports");
  }

  for (const exp of manifest.exports) {
    if (!exp.name) {
      throw new Error("manifest export has empty name");
    }
    if (!exp.root) {
      throw new Error("manifest export has empty root");
    }
  }
}

/**
 * Select an export hash given a requested name.
 *
 * Selection strategy:
 *   1. Explicit export name
 *   2. Export named "main"
 *   3. Single export (auto-select)
 *   4. Error if multiple exports and no "main"
 */
export function selectExport(manifest, requestedName) {
  const exports = manifest.exports || [];

  // Strategy 1: explicit name
  if (requestedName) {
    const found = exports.find((e) => e.name === requestedName);
    if (found) {
      return found;
    }
    throw new Error(
      `requested export "${requestedName}" not found. Available: ${exports.map((e) => e.name).join(", ")}`
    );
  }

  // Strategy 2: prefer "main"
  const mainExport = exports.find((e) => e.name === "main");
  if (mainExport) {
    return mainExport;
  }

  // Strategy 3: single export
  if (exports.length === 1) {
    return exports[0];
  }

  // Strategy 4: multiple exports, require explicit
  throw new Error(
    `multiple exports available but none named "main": ${exports.map((e) => e.name).join(", ")}. Specify an export name.`
  );
}

/**
 * Get all root hashes from the manifest.
 */
export function getRootHashes(manifest) {
  return (manifest.roots || []).map((r) => r.hash);
}

/**
 * Get all export names.
 */
export function getExportNames(manifest) {
  return (manifest.exports || []).map((e) => e.name);
}

/**
 * Print manifest summary info.
 */
export function printManifestInfo(manifest, indent = "") {
  const tree = manifest.tree;
  const runtime = manifest.runtime;

  console.log(`${indent}Schema: ${manifest.schema}`);
  console.log(`${indent}Bundle type: ${manifest.bundleType}`);
  console.log(`${indent}Closure: ${manifest.closure}`);
  console.log(`${indent}Tree calculus: ${tree.calculus}`);
  console.log(`${indent}Hash algo: ${tree.nodeHash.algorithm}`);
  console.log(`${indent}Hash domain: ${tree.nodeHash.domain}`);
  console.log(`${indent}Runtime: ${runtime.semantics}`);
  console.log(`${indent}ABI: ${runtime.abi}`);
  console.log(`${indent}Evaluation: ${runtime.evaluation || "N/A"}`);
  console.log("");
  console.log(`${indent}Roots (${getRootHashes(manifest).length}):`);
  for (const root of getRootHashes(manifest)) {
    console.log(`${indent}  ${root.substring(0, 16)}...`);
  }
  console.log("");
  console.log(`${indent}Exports (${getExportNames(manifest).length}):`);
  for (const name of getExportNames(manifest)) {
    console.log(`${indent}  ${name}`);
  }

  const meta = manifest.metadata;
  if (meta && meta.createdBy) {
    console.log("");
    console.log(`${indent}Created by: ${meta.createdBy}`);
  }
}
ext/js/src/merkle.js (new file, 276 lines)
@@ -0,0 +1,276 @@
/**
 * merkle.js — Node payload decoding and hash verification.
 *
 * Node payload format:
 *   Leaf: 0x00
 *   Stem: 0x01 || child_hash (32 bytes raw)
 *   Fork: 0x02 || left_hash (32 bytes raw) || right_hash (32 bytes raw)
 *
 * Hash computation:
 *   hash = SHA256( "tricu.merkle.node.v1" || 0x00 || node_payload )
 */

import { createHash } from "node:crypto";

// ── Constants ───────────────────────────────────────────────────────────────

const DOMAIN_TAG = "tricu.merkle.node.v1";
const HASH_LENGTH = 32; // raw hash bytes
const HEX_LENGTH = 64; // hex-encoded hash length

// ── Helpers ─────────────────────────────────────────────────────────────────

function rawToHex(buf) {
  if (buf.length !== HASH_LENGTH) {
    throw new Error(`raw hash must be ${HASH_LENGTH} bytes, got ${buf.length}`);
  }
  return buf.toString("hex");
}

function hexToRaw(hex) {
  const buf = Buffer.from(hex, "hex");
  if (buf.length !== HASH_LENGTH) {
    throw new Error(`hex hash must decode to ${HASH_LENGTH} bytes`);
  }
  return buf;
}

function sha256(data) {
  return createHash("sha256").update(data).digest();
}

function nodeHash(prefix, payload) {
  return sha256(Buffer.concat([Buffer.from(prefix), Buffer.from([0x00]), payload]));
}

// ── Node payload types ──────────────────────────────────────────────────────

/**
 * Deserialize a node payload into { type, childHash, leftHash, rightHash }.
 *
 *   type:      "leaf" | "stem" | "fork"
 *   childHash: hex string (for stem)
 *   leftHash:  hex string (for fork)
 *   rightHash: hex string (for fork)
 */
export function deserializePayload(payload) {
  if (payload.length === 0) {
    throw new Error("empty payload");
  }

  const type = payload.readUInt8(0);

  switch (type) {
    case 0x00:
      if (payload.length !== 1) {
        throw new Error(
          `invalid leaf payload: expected 1 byte, got ${payload.length}`
        );
      }
      return { type: "leaf" };

    case 0x01:
      if (payload.length !== 1 + HASH_LENGTH) {
        throw new Error(
          `invalid stem payload: expected ${1 + HASH_LENGTH} bytes, got ${payload.length}`
        );
      }
      return {
        type: "stem",
        childHash: rawToHex(payload.slice(1, 1 + HASH_LENGTH)),
      };

    case 0x02:
      if (payload.length !== 1 + 2 * HASH_LENGTH) {
        throw new Error(
          `invalid fork payload: expected ${1 + 2 * HASH_LENGTH} bytes, got ${payload.length}`
        );
      }
      return {
        type: "fork",
        leftHash: rawToHex(payload.slice(1, 1 + HASH_LENGTH)),
        rightHash: rawToHex(payload.slice(1 + HASH_LENGTH, 1 + 2 * HASH_LENGTH)),
      };

    default:
      throw new Error(
        `invalid merkle node payload: unknown type 0x${type.toString(16)}`
      );
  }
}

/**
 * Compute the canonical payload bytes for a given tree node structure.
 */
export function serializeNode(node) {
  switch (node.type) {
    case "leaf":
      return Buffer.from([0x00]);
    case "stem":
      return Buffer.concat([Buffer.from([0x01]), hexToRaw(node.childHash)]);
    case "fork":
      return Buffer.concat([
        Buffer.from([0x02]),
        hexToRaw(node.leftHash),
        hexToRaw(node.rightHash),
      ]);
    default:
      // Guard against silently returning undefined for unknown node types.
      throw new Error(`cannot serialize unknown node type: ${node.type}`);
  }
}

/**
 * Compute the Merkle hash of a node from its type and parameters.
 */
export function computeNodeHash(node) {
  const payload = serializeNode(node);
  const hash = nodeHash(DOMAIN_TAG, payload);
  return hash.toString("hex");
}

// ── Node section parsing ────────────────────────────────────────────────────

/**
 * Parse the node section binary into a Map<hexHash, { type, payload, node }>.
 *
 * Node section format:
 *   nodeCount (8B u64 BE)
 *   entries[]:
 *     hash (32B raw)
 *     payloadLen (4B u32 BE)
 *     payload (payloadLen bytes)
 */
export function parseNodeSection(data) {
  if (data.length < 8) {
    throw new Error("node section too short for count");
  }

  const nodeCount = Number(data.readBigUInt64BE(0));
  let offset = 8;

  const nodeMap = new Map();
  const errors = [];

  for (let i = 0; i < nodeCount; i++) {
    // Read hash
    if (offset + HASH_LENGTH > data.length) {
      errors.push(`node ${i}: not enough bytes for hash`);
      break;
    }
    const hash = rawToHex(data.slice(offset, offset + HASH_LENGTH));
    offset += HASH_LENGTH;

    // Read payload length
    if (offset + 4 > data.length) {
      errors.push(`node ${i} (${hash}): not enough bytes for payload length`);
      break;
    }
    const payloadLen = data.readUint32BE(offset);
    offset += 4;

    // Read payload
    if (offset + payloadLen > data.length) {
      errors.push(`node ${i} (${hash}): payload extends beyond section end`);
      break;
    }
    const payload = data.slice(offset, offset + payloadLen);
    offset += payloadLen;

    // Deserialize payload
    let node;
    try {
      node = deserializePayload(payload);
    } catch (e) {
      errors.push(`node ${i} (${hash}): ${e.message}`);
      continue;
    }

    nodeMap.set(hash, {
      hash,
      payload,
      ...node,
    });
  }

  if (errors.length > 0) {
    throw new Error(
      `node section parse errors:\n  ${errors.join("\n  ")}`
    );
  }

  return { nodeMap, count: nodeCount };
}

// ── Verification ────────────────────────────────────────────────────────────

/**
 * Verify all node hashes match their payloads.
 * Returns { verified, mismatches }
 */
export function verifyNodeHashes(nodeMap) {
  const mismatches = [];

  for (const [hash, node] of nodeMap) {
    const expected = computeNodeHash(node);
    if (hash !== expected) {
      mismatches.push({
        hash,
        expected,
        type: node.type,
      });
    }
  }

  return { verified: mismatches.length === 0, mismatches };
}

/**
 * Verify that all child references exist in the node map (closure).
 * Returns { complete, missing } where missing is an array of { parent, child }.
 */
export function verifyClosure(nodeMap) {
  const missing = [];

  for (const [hash, node] of nodeMap) {
    if (node.type === "stem") {
      if (!nodeMap.has(node.childHash)) {
        missing.push({ parent: hash, child: node.childHash });
      }
    } else if (node.type === "fork") {
      if (!nodeMap.has(node.leftHash)) {
        missing.push({ parent: hash, child: node.leftHash });
      }
      if (!nodeMap.has(node.rightHash)) {
        missing.push({ parent: hash, child: node.rightHash });
      }
    }
  }

  return { complete: missing.length === 0, missing };
}

/**
 * Verify closure for a specific root hash (transitive reachability).
 * Returns { complete, missingRoots }.
 */
export function verifyRootClosure(nodeMap, rootHash) {
  const visited = new Set();
  const missingRoots = [];

  function visit(hash) {
    if (visited.has(hash)) return;
    if (!nodeMap.has(hash)) {
      missingRoots.push(hash);
      return;
    }
    visited.add(hash);
    const node = nodeMap.get(hash);
    if (node.type === "stem") {
      visit(node.childHash);
    } else if (node.type === "fork") {
      visit(node.leftHash);
      visit(node.rightHash);
    }
  }

  visit(rootHash);
  return { complete: missingRoots.length === 0, missingRoots };
}
ext/js/src/tree.js (new file, 125 lines)
@@ -0,0 +1,125 @@
/**
 * tree.js — Runtime tree representation.
 *
 * The JS tree uses a simple array representation matching the
 * TypeScript reference evaluator:
 *
 *   Leaf = []
 *   Stem = [child]        (array length === 1)
 *   Fork = [right, left]  (array length === 2)
 *
 * This is a "flattened stack" representation: when reduced, terms
 * become arrays and the evaluator pops three elements at a time.
 */

/**
 * Check if a value is a Leaf (empty array).
 */
export function isLeaf(t) {
  return Array.isArray(t) && t.length === 0;
}

/**
 * Check if a value is a Stem (single element array).
 */
export function isStem(t) {
  return Array.isArray(t) && t.length === 1;
}

/**
 * Check if a value is a Fork (two element array).
 */
export function isFork(t) {
  return Array.isArray(t) && t.length === 2;
}

/**
 * Check if a value is a valid tree calculus value (Leaf, Stem, or Fork).
 */
export function isTree(t) {
  return isLeaf(t) || isStem(t) || isFork(t);
}

/**
 * Triage a tree: classify it as Leaf/Stem/Fork.
 * The tree must be in normal form (no reducible redexes).
 *
 * Returns { kind: "leaf"|"stem"|"fork", ...rest }
 */
export function triage(t) {
  if (!Array.isArray(t)) {
    throw new Error("not a tree (not an array)");
  }
  if (t.length === 0) return { kind: "leaf" };
  if (t.length === 1) return { kind: "stem", child: t[0] };
  if (t.length === 2) return { kind: "fork", right: t[0], left: t[1] };
  throw new Error(`not a value/binary tree: length ${t.length}`);
}

/**
 * Apply the Tree Calculus apply rules.
 *
 * apply(a, b) computes the application of term a to term b.
 *
 * Rules:
 *   apply(Fork(Leaf, a), _) = a
 *   apply(Fork(Stem(a), b), c) = apply(apply(a, c), apply(b, c))
 *   apply(Fork(Fork, _, _), Leaf) = left of inner Fork
 *   apply(Fork(Fork, _, _), Stem) = right of inner Fork
 *   apply(Fork(Fork, _, _), Fork) = apply(apply(c, u), v) where c=Fork(u,v)
 *   apply(Leaf, b) = Stem(b)
 *   apply(Stem(a), b) = Fork(a, b)
 *
 * For Fork, the inner structure is [right, left], so:
 *   a = right, b = left
 */
export function apply(a, b) {
  // apply(Fork(Leaf, a), _) = a
  // Fork = [right, left] = [Leaf, a] → left child is Leaf
  if (isFork(a) && isLeaf(a[1])) {
    return a[0]; // return right child
  }

  // apply(Fork(Stem(a), b), c)
  if (isFork(a) && isStem(a[1])) {
    const stemChild = a[1][0]; // left child of fork
    const right = a[0]; // right child of fork
    const innerA = stemChild;
    const innerB = right;
    const appliedA = apply(innerA, b);
    const appliedB = apply(innerB, b);
    return apply(appliedA, appliedB);
  }

  // apply(Fork(Fork, _, _), Leaf)
  if (isFork(a) && isFork(a[1]) && isLeaf(b)) {
    return a[1][0]; // right child of inner fork (which is left child)
  }

  // apply(Fork(Fork, _, _), Stem)
  if (isFork(a) && isFork(a[1]) && isStem(b)) {
    return a[1][1]; // left child of inner fork
  }

  // apply(Fork(Fork, _, _), Fork)
  if (isFork(a) && isFork(a[1]) && isFork(b)) {
    // b = Fork(u, v) = [v, u]
    const u = b[0];
    const v = b[1];
    // apply(apply(c, u), v) where c = inner fork
    const applied = apply(apply(a[1], u), v);
    return applied;
  }

  // apply(Leaf, b) = Stem(b)
  if (isLeaf(a)) {
    return [b];
  }

  // apply(Stem(a), b) = Fork(a, b)
  if (isStem(a)) {
    return [b, a[0]]; // [right, left]
  }

  throw new Error("apply: undefined reduction for terms");
}
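// Usage example (added for clarity): applying Leaf to Leaf yields Stem(Leaf),
// and applying that Stem to Leaf yields Fork(Leaf, Leaf):
//   apply([], [])   → [[]]       (Stem(Leaf))
//   apply([[]], []) → [[], []]   (Fork = [right, left] = [Leaf, Leaf])
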
ext/js/test/bundle.test.js (new file, 67 lines)
@@ -0,0 +1,67 @@
import { readFileSync } from "node:fs";
import { strictEqual, ok, throws } from "node:assert";
import { describe, it } from "node:test";
import {
  parseBundle,
  parseManifest,
  parseNodeSection as bundleParseNodeSection,
} from "../src/bundle.js";
import {
  verifyNodeHashes,
  parseNodeSection as parseNodes,
} from "../src/merkle.js";

const fixtureDir = "test/fixtures";

describe("bundle parsing", () => {
  it("valid bundle parses header and sections", () => {
    const bundle = parseBundle(
      readFileSync(`${fixtureDir}/id.tri.bundle`)
    );
    strictEqual(bundle.version, "1.0");
    strictEqual(bundle.sectionCount, 2);
    ok(bundle.sections.has(1)); // manifest
    ok(bundle.sections.has(2)); // nodes
  });

  it("parseManifest returns valid JSON", () => {
    const manifest = parseManifest(
      readFileSync(`${fixtureDir}/id.tri.bundle`)
    );
    strictEqual(manifest.schema, "arborix.bundle.manifest.v1");
    strictEqual(manifest.bundleType, "tree-calculus-executable-object");
    strictEqual(manifest.closure, "complete");
    strictEqual(manifest.tree.calculus, "tree-calculus.v1");
    strictEqual(manifest.tree.nodeHash.algorithm, "sha256");
    strictEqual(manifest.runtime.semantics, "tree-calculus.v1");
    strictEqual(manifest.runtime.abi, "arborix.abi.tree.v1");
  });
});

describe("hash verification", () => {
  it("valid bundle nodes verify", () => {
    const data = bundleParseNodeSection(
      readFileSync(`${fixtureDir}/id.tri.bundle`)
    );
    const { nodeMap } = parseNodes(data);
    const { verified } = verifyNodeHashes(nodeMap);
    ok(verified, "all node hashes should verify");
  });
});

describe("errors", () => {
  it("bad magic fails", () => {
    const buf = Buffer.alloc(32, 0);
    buf.write("WRONGMAG", 0, 8);
    throws(() => parseBundle(buf), /invalid magic/);
  });

  it("unsupported version fails", () => {
    const buf = Buffer.alloc(32, 0);
    buf.write("ARBORIX\0", 0, 8);
    buf.writeUInt16BE(2, 8); // major version 2
    throws(() => parseBundle(buf), /unsupported bundle major version/);
  });
});
ext/js/test/merkle.test.js (new file, 148 lines)
@@ -0,0 +1,148 @@
|
||||
import { readFileSync } from "node:fs";
|
||||
import { strictEqual, ok } from "node:assert";
|
||||
import { describe, it } from "node:test";
|
||||
import { parseNodeSection } from "../src/bundle.js";
|
||||
import {
|
||||
verifyNodeHashes,
|
||||
verifyClosure,
|
||||
verifyRootClosure,
|
||||
deserializePayload,
|
||||
computeNodeHash,
|
||||
} from "../src/merkle.js";
|
||||
|
||||
describe("merkle — deserializePayload", () => {
|
||||
it("Leaf (0x00)", () => {
|
||||
const result = deserializePayload(Buffer.from([0x00]));
|
||||
strictEqual(result.type, "leaf");
|
||||
});
|
||||
|
||||
it("Stem (0x01 + 32 bytes)", () => {
|
||||
const childHash = Buffer.alloc(32, 0xab);
|
||||
const payload = Buffer.concat([Buffer.from([0x01]), childHash]);
|
||||
const result = deserializePayload(payload);
|
||||
strictEqual(result.type, "stem");
|
||||
strictEqual(result.childHash, "ab".repeat(32));
|
||||
});
|
||||
|
||||
it("Fork (0x02 + 64 bytes)", () => {
|
||||
const left = Buffer.alloc(32, 0x01);
|
||||
const right = Buffer.alloc(32, 0x02);
|
||||
const payload = Buffer.concat([Buffer.from([0x02]), left, right]);
|
||||
const result = deserializePayload(payload);
|
||||
strictEqual(result.type, "fork");
|
||||
strictEqual(result.leftHash, "01".repeat(32));
|
||||
strictEqual(result.rightHash, "02".repeat(32));
|
||||
});
|
||||
|
||||
it("Leaf with extra bytes fails", () => {
|
||||
throws(() => deserializePayload(Buffer.from([0x00, 0x00])), /invalid leaf/);
|
||||
});
|
||||
|
||||
it("Unknown type fails", () => {
|
||||
throws(() => deserializePayload(Buffer.from([0xff])), /unknown type/);
|
||||
});
|
||||
});
|
||||
|
||||
describe("merkle — computeNodeHash", () => {
|
||||
it("Leaf hash is correct length", () => {
|
||||
const leaf = { type: "leaf" };
|
||||
const hash = computeNodeHash(leaf);
|
||||
strictEqual(hash.length, 64);
|
||||
});
|
||||
});
|
||||
|
||||
describe("merkle — node section parsing", () => {
|
||||
const fixtureDir = "test/fixtures";
|
||||
|
||||
it("parses id.tri.bundle with correct node count", () => {
|
||||
const data = parseNodeSection(
|
||||
readFileSync(`${fixtureDir}/id.tri.bundle`)
|
||||
);
|
||||
const { nodeMap } = parseNodes(data);
|
||||
strictEqual(nodeMap.size, 4);
|
||||
});
|
||||
|
||||
it("parses true.tri.bundle with correct node count", () => {
|
||||
const data = parseNodeSection(
|
||||
readFileSync(`${fixtureDir}/true.tri.bundle`)
|
||||
);
|
||||
const { nodeMap } = parseNodes(data);
|
||||
strictEqual(nodeMap.size, 2);
|
||||
});
|
||||
});

describe("merkle — hash verification", () => {
  const fixtureDir = "test/fixtures";

  it("id.tri.bundle nodes all verify", () => {
    const data = parseNodeSection(readFileSync(`${fixtureDir}/id.tri.bundle`));
    const { nodeMap } = parseNodes(data);
    const { verified, mismatches } = verifyNodeHashes(nodeMap);
    ok(verified, "id.tri.bundle node hashes should verify");
    strictEqual(mismatches.length, 0);
  });

  it("corrupted node payload fails hash verification", () => {
    const data = parseNodeSection(readFileSync(`${fixtureDir}/id.tri.bundle`));
    const { nodeMap } = parseNodes(data);

    // Find a stem node to corrupt
    let stemKey = null;
    for (const [key, node] of nodeMap) {
      if (node.type === "stem") { stemKey = key; break; }
    }
    ok(stemKey, "should find a stem node to corrupt");

    const stem = nodeMap.get(stemKey);
    // Corrupt the child hash so serializeNode produces a different payload
    const corrupted = {
      ...stem,
      childHash: "00".repeat(32),
      payload: Buffer.concat([Buffer.from([0x01]), Buffer.alloc(32, 0x00)]),
    };
    nodeMap.set(stemKey, corrupted);

    const { verified, mismatches } = verifyNodeHashes(nodeMap);
    ok(!verified, "corrupted stem should fail hash verification");
    ok(mismatches.length > 0, "should have mismatches");
  });
});
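
Since the node map is keyed by hash, verification reduces to recomputing each node's hash and comparing it with its key. A sketch matching the `{ verified, mismatches }` shape the tests expect:

```
// Sketch only: recompute-and-compare over a hash-keyed node map.
export function verifyNodeHashes(nodeMap) {
  const mismatches = [];
  for (const [hash, node] of nodeMap) {
    if (computeNodeHash(node) !== hash) mismatches.push(hash);
  }
  return { verified: mismatches.length === 0, mismatches };
}
```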

describe("merkle — closure verification", () => {
  const fixtureDir = "test/fixtures";

  it("id.tri.bundle has complete closure", () => {
    const data = parseNodeSection(readFileSync(`${fixtureDir}/id.tri.bundle`));
    const { nodeMap } = parseNodes(data);
    const { complete, missing } = verifyClosure(nodeMap);
    ok(complete, "id.tri.bundle should have complete closure");
    strictEqual(missing.length, 0);
  });

  it("verifyRootClosure checks transitive reachability", () => {
    const data = parseNodeSection(readFileSync(`${fixtureDir}/id.tri.bundle`));
    const { nodeMap } = parseNodes(data);
    const rootHash = "039cc9aacf5be78ec1975713e6ad154a36988e3f3df18589b0d0c801d0825d78";
    const { complete, missingRoots } = verifyRootClosure(nodeMap, rootHash);
    ok(complete, "root should be reachable");
    strictEqual(missingRoots.length, 0);
  });
});
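
Closure checking needs no hashing at all: a bundle is closed when every child hash any node references is itself a key in the map. A sketch matching the `{ complete, missing }` shape used above:

```
// Sketch only: every referenced child hash must be present in the map.
export function verifyClosure(nodeMap) {
  const missing = [];
  for (const node of nodeMap.values()) {
    const children =
      node.type === "stem" ? [node.childHash]
      : node.type === "fork" ? [node.leftHash, node.rightHash]
      : [];
    for (const child of children) {
      if (!nodeMap.has(child) && !missing.includes(child)) missing.push(child);
    }
  }
  return { complete: missing.length === 0, missing };
}
```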

// Helper import (hoisted to the top of the module under ESM semantics)
import { parseNodeSection as parseNodes } from "../src/merkle.js";

// Assertion helper: the wrapped call must throw, and the message must match.
// (The original returned booleans, so a non-throwing call passed silently.)
function throws(fn, expected) {
  try {
    fn();
  } catch (e) {
    ok(expected.test(e.message), `error message should match ${expected}`);
    return;
  }
  throw new Error("expected function to throw");
}

80	ext/js/test/reduce.test.js	Normal file
@@ -0,0 +1,80 @@
import { strictEqual, ok } from "node:assert";
import { describe, it } from "node:test";
import { apply, isLeaf, isStem, isFork } from "../src/tree.js";
import { reduce } from "../src/cli.js";

describe("tree — basic types", () => {
  it("Leaf is empty array", () => {
    ok(isLeaf([]));
    ok(!isStem([]));
    ok(!isFork([]));
  });

  it("Stem is single-element array", () => {
    ok(isStem([[]]));
    ok(!isLeaf([[]]));
  });

  it("Fork is two-element array", () => {
    ok(isFork([[], []]));
    ok(!isLeaf([[], []]));
  });
});

describe("tree — apply rules", () => {
  // Encoding: Leaf = [], Stem = [child], Fork = [right, left]

  it("apply(Leaf, b) = Stem(b)", () => {
    const b = []; // Leaf
    const result = apply([], b);
    ok(isStem(result), "Stem(b) should be a Stem");
    strictEqual(result[0], b);
  });

  it("apply(Stem(a), b) = Fork(a, b)", () => {
    const a = []; // Leaf
    const b = []; // Leaf
    const result = apply([a], b);
    ok(isFork(result), "Fork(a, b) should be a Fork");
    // Fork = [right, left] = [b, a]
    strictEqual(result[0], b);
    strictEqual(result[1], a);
  });

  it("apply(Fork(Leaf, a), _) = a", () => {
    // Fork(Leaf, a) = [a, Leaf]
    const a = []; // Leaf
    const result = apply([a, []], []);
    strictEqual(result, a);
    ok(isLeaf(result));
  });
});
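
These assertions only exercise the K rule; a complete `apply` must also implement the S and F (triage) rules of tree calculus. The sketch below is inferred from the tests plus the standard reduction rules, under the `[right, left]` encoding; it is not copied from `src/tree.js`:

```
// Sketch only. Encoding: Leaf = [], Stem(a) = [a], Fork(a, b) = [b, a].
export function apply(f, z) {
  if (isLeaf(f)) return [z];                 // t z -> Stem(z)
  if (isStem(f)) return [z, f[0]];           // (t a) z -> Fork(a, z)
  const [q, p] = f;                          // f = Fork(p, q), stored as [q, p]
  if (isLeaf(p)) return q;                   // K: t t q z -> q
  if (isStem(p))                             // S: t (t x) q z -> (q z) (x z)
    return apply(apply(q, z), apply(p[0], z));
  const [x, w] = p;                          // p = Fork(w, x), stored as [x, w]
  return apply(apply(z, w), x);              // F: t (t w x) q z -> z w x
}
```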

describe("tree — reduction", () => {
  it("reduces Leaf to Leaf", () => {
    const result = reduce([], 100);
    ok(isLeaf(result));
  });

  it("reduces Stem Leaf to Stem Leaf", () => {
    const result = reduce([[]], 100);
    ok(isStem(result));
    ok(isLeaf(result[0]));
  });

  it("reduces Fork Leaf Leaf to Fork Leaf Leaf", () => {
    const result = reduce([[], []], 100);
    ok(isFork(result));
    ok(isLeaf(result[0]));
    ok(isLeaf(result[1]));
  });

  it("S combinator applied to Leaf reduces", () => {
    // S = t (t (t t)) t = Fork (Stem (Stem Leaf)) Leaf
    // In array form ([right, left]): [[], [[[]]]]
    const s = [[], [[[]]]];
    const leaf = [];
    const result = reduce([s, leaf], 100);
    ok(Array.isArray(result), "S Leaf should reduce to an array");
  });
});

84	ext/js/test/run-bundle.test.js	Normal file
@@ -0,0 +1,84 @@
import { readFileSync } from "node:fs";
import { strictEqual, ok, throws } from "node:assert";
import { describe, it } from "node:test";
import { parseManifest, parseNodeSection as bundleParseNodeSection } from "../src/bundle.js";
import { validateManifest, selectExport } from "../src/manifest.js";
import { verifyNodeHashes, parseNodeSection as parseNodes } from "../src/merkle.js";
import { buildTreeFromNodeMap } from "../src/cli.js";

const fixtureDir = "test/fixtures";

describe("run bundle — id.tri.bundle", () => {
  const bundle = readFileSync(`${fixtureDir}/id.tri.bundle`);
  const manifest = parseManifest(bundle);
  const nodeSectionData = bundleParseNodeSection(bundle);
  const { nodeMap } = parseNodes(nodeSectionData);

  it("manifest validates", () => {
    validateManifest(manifest);
  });

  it("node hashes verify", () => {
    const { verified } = verifyNodeHashes(nodeMap);
    ok(verified);
  });

  it("export 'id' is selectable", () => {
    const exp = selectExport(manifest, "id");
    strictEqual(exp.name, "id");
  });

  it("tree reconstructs as a Fork", () => {
    const exp = selectExport(manifest, "id");
    const tree = buildTreeFromNodeMap(nodeMap, exp.root);
    ok(Array.isArray(tree));
    // In Haskell, id compiles to S = t (t (t t)) t = Fork (Stem (Stem Leaf)) Leaf,
    // i.e. [[], [[[]]]] in the [right, left] array encoding (4 distinct nodes,
    // matching the nodeMap.size assertion in merkle.test.js).
    ok(tree.length >= 2, "tree should be a Fork (length >= 2)");
  });
});
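
`buildTreeFromNodeMap` plausibly walks the Merkle DAG from the root hash and rebuilds the array encoding; a sketch under that assumption (node field names follow the merkle tests above):

```
// Sketch only: recursive DAG walk from the root hash; shared subtrees are
// duplicated into plain arrays, which is fine for these small fixtures.
export function buildTreeFromNodeMap(nodeMap, rootHash) {
  const node = nodeMap.get(rootHash);
  if (!node) throw new Error(`missing node: ${rootHash}`);
  if (node.type === "leaf") return [];
  if (node.type === "stem") return [buildTreeFromNodeMap(nodeMap, node.childHash)];
  // fork, in the [right, left] encoding
  return [
    buildTreeFromNodeMap(nodeMap, node.rightHash),
    buildTreeFromNodeMap(nodeMap, node.leftHash),
  ];
}
```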

describe("run bundle — true.tri.bundle", () => {
  const bundle = readFileSync(`${fixtureDir}/true.tri.bundle`);
  const manifest = parseManifest(bundle);
  const nodeSectionData = bundleParseNodeSection(bundle);
  const { nodeMap } = parseNodes(nodeSectionData);

  it("manifest validates", () => {
    validateManifest(manifest);
  });

  it("export 'const' is selectable", () => {
    const exp = selectExport(manifest, "const");
    strictEqual(exp.name, "const");
  });

  it("tree reconstructs", () => {
    const exp = selectExport(manifest, "const");
    const tree = buildTreeFromNodeMap(nodeMap, exp.root);
    ok(Array.isArray(tree));
  });
});

describe("run bundle — missing export", () => {
  const bundle = readFileSync(`${fixtureDir}/id.tri.bundle`);
  const manifest = parseManifest(bundle);

  it("nonexistent export fails clearly", () => {
    throws(() => selectExport(manifest, "nonexistent"), /not found/);
  });
});

describe("run bundle — auto-select", () => {
  // true.tri.bundle has only one export, so it should auto-select
  const bundle = readFileSync(`${fixtureDir}/true.tri.bundle`);
  const manifest = parseManifest(bundle);

  it("single export auto-selects", () => {
    const exp = selectExport(manifest, undefined);
    ok(exp, "should auto-select the only export");
  });
});
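
`selectExport` behavior is fully pinned down by the describes that use it: select by name, auto-select a sole export when no name is given, and fail with a "not found" error otherwise. A sketch (the `manifest.exports` field name is an assumption):

```
// Sketch only: export selection as exercised by the tests above.
export function selectExport(manifest, name) {
  const exports = manifest.exports ?? []; // field name assumed
  if (name === undefined) {
    if (exports.length === 1) return exports[0];
    throw new Error("no export name given and bundle has multiple exports");
  }
  const exp = exports.find((e) => e.name === name);
  if (!exp) throw new Error(`export '${name}' not found in manifest`);
  return exp;
}
```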

src/ContentStore.hs
@@ -9,6 +9,7 @@ import Data.Maybe (catMaybes, fromMaybe)
import Data.Text (Text)
import Database.SQLite.Simple
import System.Directory (createDirectoryIfMissing, getXdgDirectory, XdgDirectory(..))
import System.Environment (getEnv, lookupEnv)
import System.FilePath ((</>), takeDirectory)

import qualified Data.Map as Map

@@ -71,6 +72,10 @@ newContentStore = do

getContentStorePath :: IO FilePath
getContentStorePath = do
  maybeLocalPath <- lookupEnv "TRICU_DB_PATH"
  case maybeLocalPath of
    Just p  -> return p
    Nothing -> do
      dataDir <- getXdgDirectory XdgData "tricu"
      return $ dataDir </> "content-store.db"

src/FileEval.hs
@@ -1,18 +1,32 @@
module FileEval where
module FileEval
  ( preprocessFile
  , evaluateFile
  , evaluateFileWithContext
  , evaluateFileResult
  , compileFile
  ) where

import Eval
import Eval (evalTricu)
import Lexer
import Parser
import Research
import ContentStore (initContentStore, storeTerm, hashTerm)
import Wire (exportNamedBundle)

import Control.Monad ()
import Data.List (partition)
import Data.Maybe (mapMaybe)
import Data.Maybe (fromMaybe, mapMaybe)
import System.Environment (setEnv)
import System.FilePath (takeDirectory, normalise, (</>))
import System.IO ()
import System.IO (hPutStrLn, stderr)
import System.Exit (die)
import Database.SQLite.Simple (Connection, close)

import qualified Data.ByteString.Lazy as BL
import qualified Data.Map as Map
import qualified Data.Set as Set
import qualified Data.Text as T

extractMain :: Env -> Either String T
extractMain env =
@@ -152,3 +166,26 @@ isPrefixed name = '.' `elem` name
nsVariable :: String -> String -> String
nsVariable "" name = name
nsVariable moduleName name = moduleName ++ "." ++ name

-- | Compile a tricu source file to a standalone Arborix bundle.
-- Uses a temp content store so it does not collide with the global one.
compileFile :: FilePath -> FilePath -> Maybe T.Text -> IO ()
compileFile inputPath outputPath maybeExportName = do
  -- Evaluate the file to get the full environment
  env <- evaluateFile inputPath
  -- Look up the export name: prefer the explicit name, then fall back to "main"
  let name = fromMaybe "main" (T.unpack <$> maybeExportName)
  case Map.lookup name env of
    Nothing   -> die $ "No definition '" ++ name ++ "' found in " ++ inputPath
    Just term -> do
      -- Create a temp content store
      setEnv "TRICU_DB_PATH" "/tmp/tricu-compile.db"
      conn <- initContentStore
      -- Store the term in the temp store
      _ <- storeTerm conn [name] term
      -- Export the bundle (exportNamedBundle returns already-encoded bytes)
      bundleData <- exportNamedBundle conn [(T.pack name, hashTerm term)]
      BL.writeFile outputPath (BL.fromStrict bundleData)
      close conn
      putStrLn $ "Compiled " ++ inputPath ++ " -> " ++ outputPath
      putStrLn $ "  export: " ++ name

42	src/Main.hs
@@ -1,6 +1,6 @@
module Main where

import ContentStore (initContentStore, termNames, hashToTerm, parseNameList)
import ContentStore (getContentStorePath, initContentStore, termNames, hashToTerm, loadEnvironment, parseNameList)
import Eval (evalTricu, mainResult, result)
import FileEval
import Parser (parseTricu)
@@ -16,6 +16,7 @@ import qualified Data.Text as T
import Data.Version (showVersion)
import Paths_tricu (version)
import System.Console.CmdArgs
import System.Environment (lookupEnv)
import System.IO (hPutStrLn, stderr)
import System.Exit (die)
import Text.Megaparsec ()
@@ -30,6 +31,7 @@ data TricuArgs
  = Repl
  | Evaluate { file :: [FilePath], form :: EvaluatedForm }
  | TDecode { file :: [FilePath] }
  | Compile { inputFile :: FilePath, outFile :: FilePath, exportNameOpt :: String }
  | Export { hash :: String, exportNameOpt :: String, outFile :: FilePath }
  | Import { inFile :: FilePath }
  deriving (Show, Data, Typeable)

@@ -86,10 +88,23 @@ importMode = Import
  &= explicit
  &= name "import"

compileMode :: TricuArgs
compileMode = Compile
  { inputFile = def &= help "Path to the tricu source file (.tri) to compile."
                &= name "f" &= typ "FILE"
  , outFile = def &= help "Output bundle file path (.tri.bundle)."
                &= name "o" &= typ "FILE"
  , exportNameOpt = def &= help "Definition name to use as the bundle root. Defaults to 'main'."
                &= name "x" &= typ "NAME"
  }
  &= help "Compile a tricu source file into a standalone Arborix portable bundle."
  &= explicit
  &= name "compile"

main :: IO ()
main = do
  let versionStr = "tricu Evaluator and REPL " ++ showVersion version
  cmdArgsParsed <- cmdArgs $ modes [replMode, evaluateMode, decodeMode, exportMode, importMode]
  cmdArgsParsed <- cmdArgs $ modes [replMode, evaluateMode, decodeMode, compileMode, exportMode, importMode]
    &= help "tricu: Exploring Tree Calculus"
    &= program "tricu"
    &= summary versionStr
@@ -100,10 +115,26 @@ main = do
      putStrLn "You may exit with `CTRL+D` or the `!exit` command."
      repl
    Evaluate { file = filePaths, form = outputForm } -> do
      maybeDbPath <- lookupEnv "TRICU_DB_PATH"
      evalResult <- case filePaths of
        [] -> runTricuT <$> getContents
        [] -> do
          initialEnv <- case maybeDbPath of
            Just dbPath -> do
              conn <- initContentStore
              env <- loadEnvironment conn
              close conn
              return env
            Nothing -> return Map.empty
          input <- getContents
          pure $ runTricuTEnv initialEnv input
        (filePath:restFilePaths) -> do
          initialEnv <- evaluateFile filePath
          initialEnv <- case maybeDbPath of
            Just _ -> do
              conn <- initContentStore
              env <- loadEnvironment conn
              close conn
              return env
            Nothing -> return Map.empty
          finalEnv <- foldM evaluateFileWithContext initialEnv restFilePaths
          pure $ mainResult finalEnv
      let fRes = formatT outputForm evalResult
@@ -128,6 +159,9 @@ main = do
      putStrLn $ "Imported " ++ show (length roots) ++ " root(s):"
      mapM_ (\r -> putStrLn $ "  " ++ unpack r) roots
      close conn
    Compile { inputFile = inputFile', outFile = outFile', exportNameOpt = exportNameArg } ->
      let exportName = if null exportNameArg then Nothing else Just (T.pack exportNameArg)
      in compileFile inputFile' outFile' exportName
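
Taken together with `compileFile`, the new mode should be invocable roughly as follows; the flag names come from `compileMode` above, the output lines from `compileFile`, and the paths are hypothetical:

```
$ tricu compile -f demos/id.tri -o id.tri.bundle -x main
Compiled demos/id.tri -> id.tri.bundle
  export: main
```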

runTricu :: String -> String
runTricu = formatT TreeCalculus . runTricuT

89	src/Wire.hs
@@ -64,7 +64,7 @@ bundleMinorVersion = 0

-- | Header magic for the portable executable-object container.
bundleMagic :: ByteString
bundleMagic = BS.pack [0x54, 0x52, 0x49, 0x43, 0x55, 0x42, 0x4e, 0x44] -- "TRICUBND"
bundleMagic = BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00] -- "ARBORIX\0"

headerLength :: Int
headerLength = 32
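
A JS-side reader would reject foreign containers by checking the same eight magic bytes before parsing the rest of the 32-byte header, whose field layout is not shown in this diff. A sketch with a hypothetical `checkMagic` helper:

```
// Sketch only: magic check for the Arborix portable bundle container.
const BUNDLE_MAGIC = Buffer.from([0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00]); // "ARBORIX\0"

export function checkMagic(bundle) {
  if (bundle.length < 32 || !bundle.subarray(0, 8).equals(BUNDLE_MAGIC)) {
    throw new Error("invalid magic");
  }
}
```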

@@ -83,13 +83,6 @@ compressionNone, digestSha256 :: Word16
compressionNone = 0
digestSha256 = 1

-- | Backwards compatibility for the original experimental node-list format.
legacyMagic :: ByteString
legacyMagic = BS.pack [0x54, 0x52, 0x49, 0x43, 0x55] -- "TRICU"

legacyWireVersion :: Word8
legacyWireVersion = 0x01

-- | Closure declaration. V1 only accepts complete bundles for import.
data ClosureMode = ClosureComplete | ClosurePartial
  deriving (Show, Eq, Ord, Generic)

@@ -200,7 +193,7 @@ instance FromJSON BundleExport where
    <$> o .: "name"
    <*> o .: "root"
    <*> o .:? "kind" .!= "term"
    <*> o .:? "abi" .!= "tricu.abi.tree.v1"
    <*> o .:? "abi" .!= "arborix.abi.tree.v1"
    <*> o .:? "input"
    <*> o .:? "output"

@@ -302,12 +295,10 @@ encodeBundle bundle =
    (fromIntegral sectionCount) 0 dirOffset
  in header <> manifestEntry <> nodesEntry <> manifestBytes <> nodeSection

-- | Decode portable Bundle v1 bytes, with fallback support for the previous
-- experimental TRICU node-list format.
-- | Decode portable Bundle v1 bytes.
decodeBundle :: ByteString -> Either String Bundle
decodeBundle bs
  | BS.take (BS.length bundleMagic) bs == bundleMagic = decodePortableBundle bs
  | BS.take (BS.length legacyMagic) bs == legacyMagic = decodeLegacyBundle bs
  | otherwise = Left "invalid magic"

-- ---------------------------------------------------------------------------

@@ -439,20 +430,20 @@ decodeSectionEntries count bytes = reverse <$> go count bytes []

defaultManifest :: [(Text, MerkleHash)] -> Int -> BundleManifest
defaultManifest namedRoots nodeCount = BundleManifest
  { manifestSchema = "tricu.bundle.manifest.v1"
  { manifestSchema = "arborix.bundle.manifest.v1"
  , manifestBundleType = "tree-calculus-executable-object"
  , manifestTree = TreeSpec
      { treeCalculus = "tree-calculus.v1"
      , treeNodeHash = NodeHashSpec
          { nodeHashAlgorithm = "sha256"
          , nodeHashDomain = "tricu.merkle.node.v1"
          , nodeHashDomain = "arborix.merkle.node.v1"
          }
      , treeNodePayload = "tricu.merkle.payload.v1"
      , treeNodePayload = "arborix.merkle.payload.v1"
      }
  , manifestRuntime = RuntimeSpec
      { runtimeSemantics = "tree-calculus.v1"
      , runtimeEvaluation = "normal-order"
      , runtimeAbi = "tricu.abi.tree.v1"
      , runtimeAbi = "arborix.abi.tree.v1"
      , runtimeCapabilities = []
      }
  , manifestClosure = ClosureComplete
@@ -462,7 +453,7 @@ defaultManifest namedRoots nodeCount = BundleManifest
  , manifestSections = object
      [ "nodes" .= object
          [ "count" .= nodeCount
          , "payload" .= ("tricu.merkle.payload.v1" :: Text)
          , "payload" .= ("arborix.merkle.payload.v1" :: Text)
          ]
      ]
  , manifestMetadata = BundleMetadata
@@ -470,7 +461,7 @@ defaultManifest namedRoots nodeCount = BundleManifest
      , metadataVersion = Nothing
      , metadataDescription = Nothing
      , metadataLicense = Nothing
      , metadataCreatedBy = Just "tricu"
      , metadataCreatedBy = Just "arborix"
      }
  }
  where
@@ -480,7 +471,7 @@ defaultManifest namedRoots nodeCount = BundleManifest
      { exportName = name
      , exportRoot = h
      , exportKind = "term"
      , exportAbi = "tricu.abi.tree.v1"
      , exportAbi = "arborix.abi.tree.v1"
      , exportInput = Nothing
      , exportOutput = Nothing
      }
@@ -529,59 +520,7 @@ decodeNodeEntries count bs = go count bs Map.empty
          Left $ "duplicate node entry: " ++ unpack h
        go (n - 1) after (Map.insert h payload acc)

-- ---------------------------------------------------------------------------
-- Legacy bundle decoding (read-only compatibility)
-- ---------------------------------------------------------------------------

decodeLegacyBundle :: ByteString -> Either String Bundle
decodeLegacyBundle bs
  | BS.length bs < 14 = Left "bundle too short"
  | BS.take 5 bs /= legacyMagic = Left "invalid legacy magic"
  | BS.index bs 5 /= legacyWireVersion =
      Left $ "unsupported legacy wire version: " ++ show (BS.index bs 5)
  | otherwise = do
      (rootCount, rest) <- decode32be "root_count" $ BS.drop 6 bs
      (nodeCount, rest') <- decode32be "node_count" rest
      let rootBytesLen = fromIntegral rootCount * 32
      if BS.length rest' < rootBytesLen
        then Left "bundle truncated in root hashes"
        else do
          let rawRoots = BS.take rootBytesLen rest'
              afterRoots = BS.drop rootBytesLen rest'
              roots =
                [ rawToMerkleHash (BS.take 32 (BS.drop (i * 32) rawRoots))
                | i <- [0 :: Int .. fromIntegral rootCount - 1]
                ]
              namedRoots = zip (defaultExportNames $ length roots) roots
          nodes <- decodeLegacyNodeEntries nodeCount afterRoots
          let manifest = defaultManifest namedRoots (Map.size nodes)
          return Bundle
            { bundleVersion = 1
            , bundleRoots = roots
            , bundleNodes = nodes
            , bundleManifest = manifest
            , bundleManifestBytes = BL.toStrict (encode manifest)
            }

decodeLegacyNodeEntries :: Word32 -> ByteString -> Either String (Map MerkleHash ByteString)
decodeLegacyNodeEntries count bs = fst <$> go count bs Map.empty
  where
    go 0 rest acc = Right (acc, rest)
    go n bytes acc
      | BS.length bytes < 36 =
          Left "not enough bytes for node entry header (hash + length)"
      | otherwise = do
          let (hashBytes, rest) = BS.splitAt 32 bytes
          (plen, rest') <- decode32be "payload_len" rest
          let payloadLen = fromIntegral plen
          if BS.length rest' < payloadLen
            then Left "payload extends beyond legacy bundle end"
            else do
              let (payload, after) = BS.splitAt payloadLen rest'
                  h = rawToMerkleHash hashBytes
              when (Map.member h acc) $
                Left $ "duplicate node entry: " ++ unpack h
              go (n - 1) after (Map.insert h payload acc)

-- ---------------------------------------------------------------------------
-- Bundle verification

@@ -611,7 +550,7 @@ verifyBundle bundle = do

verifyManifest :: BundleManifest -> Either String ()
verifyManifest manifest = do
  when (manifestSchema manifest /= "tricu.bundle.manifest.v1") $
  when (manifestSchema manifest /= "arborix.bundle.manifest.v1") $
    Left $ "unsupported manifest schema: " ++ unpack (manifestSchema manifest)
  when (manifestBundleType manifest /= "tree-calculus-executable-object") $
    Left $ "unsupported bundle type: " ++ unpack (manifestBundleType manifest)
@@ -622,13 +561,13 @@ verifyManifest manifest = do
    Left $ "unsupported calculus: " ++ unpack (treeCalculus treeSpec)
  when (nodeHashAlgorithm hashSpec /= "sha256") $
    Left $ "unsupported node hash algorithm: " ++ unpack (nodeHashAlgorithm hashSpec)
  when (nodeHashDomain hashSpec /= "tricu.merkle.node.v1") $
  when (nodeHashDomain hashSpec /= "arborix.merkle.node.v1") $
    Left $ "unsupported node hash domain: " ++ unpack (nodeHashDomain hashSpec)
  when (treeNodePayload treeSpec /= "tricu.merkle.payload.v1") $
  when (treeNodePayload treeSpec /= "arborix.merkle.payload.v1") $
    Left $ "unsupported node payload: " ++ unpack (treeNodePayload treeSpec)
  when (runtimeSemantics runtimeSpec /= "tree-calculus.v1") $
    Left $ "unsupported runtime semantics: " ++ unpack (runtimeSemantics runtimeSpec)
  when (runtimeAbi runtimeSpec /= "tricu.abi.tree.v1") $
  when (runtimeAbi runtimeSpec /= "arborix.abi.tree.v1") $
    Left $ "unsupported runtime ABI: " ++ unpack (runtimeAbi runtimeSpec)
  unless (null $ runtimeCapabilities runtimeSpec) $
    Left "host/runtime capabilities are not supported by bundle v1"

12	test/Spec.hs
@@ -686,7 +686,7 @@ wireTests = testGroup "Wire Tests"
          , "main = id t"
          ]
      wireData <- exportBundle srcConn [termHash]
      BS.take 8 wireData @?= BS.pack [0x54, 0x52, 0x49, 0x43, 0x55, 0x42, 0x4e, 0x44]
      BS.take 8 wireData @?= BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00]
      case decodeBundle wireData of
        Left err -> assertFailure $ "decodeBundle failed: " ++ err
        Right bundle -> do
@@ -694,15 +694,15 @@ wireTests = testGroup "Wire Tests"
              tree = manifestTree manifest
              hashSpec = treeNodeHash tree
              runtime = manifestRuntime manifest
          manifestSchema manifest @?= "tricu.bundle.manifest.v1"
          manifestSchema manifest @?= "arborix.bundle.manifest.v1"
          manifestBundleType manifest @?= "tree-calculus-executable-object"
          manifestClosure manifest @?= ClosureComplete
          treeCalculus tree @?= "tree-calculus.v1"
          treeNodePayload tree @?= "tricu.merkle.payload.v1"
          treeNodePayload tree @?= "arborix.merkle.payload.v1"
          nodeHashAlgorithm hashSpec @?= "sha256"
          nodeHashDomain hashSpec @?= "tricu.merkle.node.v1"
          nodeHashDomain hashSpec @?= "arborix.merkle.node.v1"
          runtimeSemantics runtime @?= "tree-calculus.v1"
          runtimeAbi runtime @?= "tricu.abi.tree.v1"
          runtimeAbi runtime @?= "arborix.abi.tree.v1"
          runtimeCapabilities runtime @?= []
          bundleRoots bundle @?= [termHash]
          map exportRoot (manifestExports manifest) @?= [termHash]
@@ -723,7 +723,7 @@ wireTests = testGroup "Wire Tests"
          exportName exported @?= "validateEmail"
          exportRoot exported @?= termHash
          exportKind exported @?= "term"
          exportAbi exported @?= "tricu.abi.tree.v1"
          exportAbi exported @?= "arborix.abi.tree.v1"
        exports -> assertFailure $ "Expected one export, got: " ++ show exports
      close srcConn

BIN	test/fixtures/equalQ.tri.bundle	vendored	Normal file
Binary file not shown.

BIN	test/fixtures/false.tri.bundle	vendored	Normal file
Binary file not shown.

BIN	test/fixtures/id.tri.bundle	vendored	Normal file
Binary file not shown.

2	test/fixtures/notQ.tri	vendored	Normal file
@@ -0,0 +1,2 @@
!import "base.tri" _
main = not?

BIN	test/fixtures/notQ.tri.bundle	vendored	Normal file
Binary file not shown.

BIN	test/fixtures/true.tri.bundle	vendored	Normal file
Binary file not shown.