From 343ecbf4c416912407570f00454ad78930f2dc0e Mon Sep 17 00:00:00 2001 From: James Eversole Date: Fri, 8 May 2026 09:12:20 -0500 Subject: [PATCH] Arborix -> Arboricx rename --- AGENTS.md | 26 +- ...-cbor-v1.md => arboricx-bundle-cbor-v1.md} | 32 +- ext/js/package.json | 8 +- ext/js/src/bundle.js | 8 +- ext/js/src/cbor.js | 2 +- ext/js/src/cli.js | 4 +- ext/js/src/manifest.js | 8 +- ext/js/src/merkle.js | 4 +- ext/js/test/bundle.test.js | 32 +- ext/js/test/merkle.test.js | 42 +-- ext/js/test/run-bundle.test.js | 22 +- flake.nix | 3 - lib/{arborix.tri => arboricx.tri} | 66 ++-- src/FileEval.hs | 2 +- src/Main.hs | 8 +- src/Research.hs | 6 +- src/Server.hs | 12 +- src/Wire.hs | 22 +- test/Spec.hs | 332 +++++++++--------- test/fixtures/false.arboricx | Bin 0 -> 707 bytes test/fixtures/false.arborix | Bin 701 -> 0 bytes test/fixtures/id.arboricx | Bin 0 -> 946 bytes test/fixtures/id.arborix | Bin 940 -> 0 bytes test/fixtures/map.arboricx | Bin 0 -> 5728 bytes test/fixtures/map.arborix | Bin 5722 -> 0 bytes test/fixtures/notQ.arboricx | Bin 0 -> 1180 bytes test/fixtures/notQ.arborix | Bin 1174 -> 0 bytes test/fixtures/true.arboricx | Bin 0 -> 776 bytes test/fixtures/true.arborix | Bin 770 -> 0 bytes 29 files changed, 315 insertions(+), 324 deletions(-) rename docs/{arborix-bundle-cbor-v1.md => arboricx-bundle-cbor-v1.md} (89%) rename lib/{arborix.tri => arboricx.tri} (91%) create mode 100644 test/fixtures/false.arboricx delete mode 100644 test/fixtures/false.arborix create mode 100644 test/fixtures/id.arboricx delete mode 100644 test/fixtures/id.arborix create mode 100644 test/fixtures/map.arboricx delete mode 100644 test/fixtures/map.arborix create mode 100644 test/fixtures/notQ.arboricx delete mode 100644 test/fixtures/notQ.arborix create mode 100644 test/fixtures/true.arboricx delete mode 100644 test/fixtures/true.arborix diff --git a/AGENTS.md b/AGENTS.md index d843ef9..bcb4467 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -40,7 +40,7 @@ nix build .# | `REPL.hs` | 
Interactive Read-Eval-Print Loop (haskeline) | | `Research.hs` | Core types, `apply` reduction, booleans, marshalling (`ofString`, `ofNumber`), output formatters (`toAscii`, `toTernaryString`, `decodeResult`) | | `ContentStore.hs` | SQLite-backed term persistence | -| `Wire.hs` | Arborix portable wire format — encode/decode/import/export of Merkle-DAG bundle blobs | +| `Wire.hs` | Arboricx portable wire format — encode/decode/import/export of Merkle-DAG bundle blobs | ### File extensions @@ -123,14 +123,14 @@ NLeaf → 0x00 NStem(h) → 0x01 || h (32 bytes) NFork(l,r) → 0x02 || l (32 bytes) || r (32 bytes) -hash = SHA256("arborix.merkle.node.v1" <> 0x00 <> serialized_node) +hash = SHA256("arboricx.merkle.node.v1" <> 0x00 <> serialized_node) ``` This is stored in SQLite via `ContentStore.hs`. Hash suffixes on identifiers (e.g., `foo_abc123...`) are validated: 16–64 hex characters (SHA256). -## 7. Arborix Portable Wire Format +## 7. Arboricx Portable Wire Format -The **Arborix wire format** (module `Wire.hs`) defines a portable binary bundle for exchanging Tree Calculus terms, their Merkle DAGs, and associated metadata. It is versioned and schema-driven. +The **Arboricx wire format** (module `Wire.hs`) defines a portable binary bundle for exchanging Tree Calculus terms, their Merkle DAGs, and associated metadata. It is versioned and schema-driven. ### Header @@ -143,7 +143,7 @@ The **Arborix wire format** (module `Wire.hs`) defines a portable binary bundle +------------------+-----------------+------------------+ ``` -- **Magic**: `ARBORIX\0` (`0x41 0x52 0x42 0x4f 0x52 0x49 0x58 0x00`) +- **Magic**: `ARBORICX` (`0x41 0x52 0x42 0x4f 0x52 0x49 0x43 0x58`) - **Header length**: 32 bytes - **Major version**: `1` | **Minor version**: `0` @@ -172,18 +172,18 @@ The manifest describes the bundle's semantics, exports, and schema. 
Key fields: | Field | Value | Description | |-------|-------|-------------| -| `schema` | `"arborix.bundle.manifest.v1"` | Manifest schema version | +| `schema` | `"arboricx.bundle.manifest.v1"` | Manifest schema version | | `bundleType` | `"tree-calculus-executable-object"` | Bundle category | | `tree.calculus` | `"tree-calculus.v1"` | Tree calculus version | | `tree.nodeHash.algorithm` | `"sha256"` | Hash algorithm | -| `tree.nodeHash.domain` | `"arborix.merkle.node.v1"` | Hash domain string | -| `tree.nodePayload` | `"arborix.merkle.payload.v1"` | Payload encoding | +| `tree.nodeHash.domain` | `"arboricx.merkle.node.v1"` | Hash domain string | +| `tree.nodePayload` | `"arboricx.merkle.payload.v1"` | Payload encoding | | `runtime.semantics` | `"tree-calculus.v1"` | Evaluation semantics | -| `runtime.abi` | `"arborix.abi.tree.v1"` | Runtime ABI | +| `runtime.abi` | `"arboricx.abi.tree.v1"` | Runtime ABI | | `closure` | `"complete"` | Bundle must be a complete DAG | | `roots` | `[{"hash": "...", "role": "..."}]` | Named root hashes | | `exports` | `[{"name": "...", "root": "..."}]` | Export aliases for roots | -| `metadata.createdBy` | `"arborix"` | Originator | +| `metadata.createdBy` | `"arboricx"` | Originator | ### Section 2 — Nodes (Binary) @@ -252,7 +252,7 @@ tricu/ │ ├── REPL.hs │ ├── Research.hs │ ├── ContentStore.hs -│ └── Wire.hs # Arborix portable wire format +│ └── Wire.hs # Arboricx portable wire format ├── test/ │ ├── Spec.hs # Tasty + HUnit tests │ ├── *.tri # tricu test programs @@ -270,9 +270,9 @@ tricu/ └── AGENTS.md # This file ``` -## 9. JS Arborix Runtime +## 9. JS Arboricx Runtime -A JavaScript implementation of the Arborix portable bundle runtime lives in `ext/js/`. +A JavaScript implementation of the Arboricx portable bundle runtime lives in `ext/js/`. It is a reference implementation — not a tricu source parser. 
It reads `.tri.bundle` files produced by the Haskell toolchain, verifies Merkle node hashes, reconstructs tree values, and reduces them. From project root: diff --git a/docs/arborix-bundle-cbor-v1.md b/docs/arboricx-bundle-cbor-v1.md similarity index 89% rename from docs/arborix-bundle-cbor-v1.md rename to docs/arboricx-bundle-cbor-v1.md index b9f9441..d2cf1bc 100644 --- a/docs/arborix-bundle-cbor-v1.md +++ b/docs/arboricx-bundle-cbor-v1.md @@ -1,8 +1,8 @@ -# Arborix Portable Bundle v1 (CBOR Manifest Profile) +# Arboricx Portable Bundle v1 (CBOR Manifest Profile) Status: **Draft, implementation-aligned** (derived from `src/Wire.hs` as of 2026-05-07) -This document specifies the **actual on-wire format and validation behavior** currently implemented by `tricu` for Arborix bundles, with a focus on the newer CBOR manifest path. +This document specifies the **actual on-wire format and validation behavior** currently implemented by `tricu` for Arboricx bundles, with a focus on the newer CBOR manifest path. --- @@ -38,7 +38,7 @@ A bundle is a byte stream: | Field | Size | Encoding | Value / Notes | |---|---:|---|---| -| Magic | 8 | raw bytes | `41 52 42 4f 52 49 58 00` (`"ARBORIX\0"`) | +| Magic | 8 | raw bytes | `41 52 42 4f 52 49 43 58` (`"ARBORICX"`) | | Major | 2 | u16 BE | Must be `1` | | Minor | 2 | u16 BE | Currently `0` | | SectionCount | 4 | u32 BE | Number of section directory entries | @@ -143,18 +143,18 @@ Unknown metadata keys are ignored.
Writers in `Wire.hs` currently emit: -- `schema = "arborix.bundle.manifest.v1"` +- `schema = "arboricx.bundle.manifest.v1"` - `bundleType = "tree-calculus-executable-object"` - `tree.calculus = "tree-calculus.v1"` - `tree.nodeHash.algorithm = "sha256"` -- `tree.nodeHash.domain = "arborix.merkle.node.v1"` -- `tree.nodePayload = "arborix.merkle.payload.v1"` +- `tree.nodeHash.domain = "arboricx.merkle.node.v1"` +- `tree.nodePayload = "arboricx.merkle.payload.v1"` - `runtime.semantics = "tree-calculus.v1"` - `runtime.evaluation = "normal-order"` -- `runtime.abi = "arborix.abi.tree.v1"` +- `runtime.abi = "arboricx.abi.tree.v1"` - `runtime.capabilities = []` - `closure = "complete"` -- `metadata.createdBy = "arborix"` +- `metadata.createdBy = "arboricx"` --- @@ -249,17 +249,17 @@ These are important design gaps visible from current code. Status: **resolved in current codebase**. What was wrong: -- Manifest declared `tree.nodeHash.domain = "arborix.merkle.node.v1"`. +- Manifest declared `tree.nodeHash.domain = "arboricx.merkle.node.v1"`. - Hashing implementation previously used `"tricu.merkle.node.v1"`. Current state: -- Haskell hashing now uses `"arborix.merkle.node.v1"`. -- JS reference runtime hashing now uses `"arborix.merkle.node.v1"`. -- JS manifest validation now requires `"arborix.merkle.node.v1"`. +- Haskell hashing now uses `"arboricx.merkle.node.v1"`. +- JS reference runtime hashing now uses `"arboricx.merkle.node.v1"`. +- JS manifest validation now requires `"arboricx.merkle.node.v1"`. Remaining recommendation: - Keep hash-domain constants centralized/shared to prevent future drift. -- Add explicit test vectors for Leaf/Stem/Fork hashes under the Arborix domain. +- Add explicit test vectors for Leaf/Stem/Fork hashes under the Arboricx domain. ### Gap B: CBOR decode is order-strict, not generic-map tolerant @@ -334,6 +334,6 @@ A conforming v1 reader/writer for this profile should: To stabilize interoperability, add: -1. 
`docs/arborix-bundle-test-vectors.md` (golden header/manifest/nodes + expected hashes). -2. `docs/arborix-bundle-errors.md` (normative error codes/strings). -3. `docs/arborix-bundle-evolution.md` (rules for minor/major upgrades, capability negotiation, extra sections). +1. `docs/arboricx-bundle-test-vectors.md` (golden header/manifest/nodes + expected hashes). +2. `docs/arboricx-bundle-errors.md` (normative error codes/strings). +3. `docs/arboricx-bundle-evolution.md` (rules for minor/major upgrades, capability negotiation, extra sections). diff --git a/ext/js/package.json b/ext/js/package.json index bf9c806..a9ba01a 100644 --- a/ext/js/package.json +++ b/ext/js/package.json @@ -1,17 +1,17 @@ { - "name": "arborix-runtime", + "name": "arboricx-runtime", "version": "0.1.0", - "description": "Arborix portable bundle runtime — JavaScript reference implementation", + "description": "Arboricx portable bundle runtime — JavaScript reference implementation", "type": "module", "main": "src/bundle.js", "bin": { - "arborix-run": "src/cli.js" + "arboricx-run": "src/cli.js" }, "scripts": { "test": "node --test test/*.test.js", "inspect": "node src/cli.js inspect", "run": "node src/cli.js run" }, - "keywords": ["arborix", "tree-calculus", "trie", "runtime"], + "keywords": ["arboricx", "tree-calculus", "trie", "runtime"], "license": "MIT" } diff --git a/ext/js/src/bundle.js b/ext/js/src/bundle.js index 593c6eb..911b1a5 100644 --- a/ext/js/src/bundle.js +++ b/ext/js/src/bundle.js @@ -1,9 +1,9 @@ /** - * bundle.js — Parse an Arborix portable bundle binary into a JavaScript object. + * bundle.js — Parse an Arboricx portable bundle binary into a JavaScript object. 
* * Format (v1): * Header (32 bytes): - * Magic 8B "ARBORIX\0" + * Magic 8B "ARBORICX" * Major 2B u16 BE (must be 1) * Minor 2B u16 BE * SectionCount 4B u32 BE @@ -27,7 +27,7 @@ import { decodeCbor } from "./cbor.js"; // ── Constants ─────────────────────────────────────────────────────────────── -const MAGIC = Buffer.from([0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00]); // "ARBORIX\0" +const MAGIC = Buffer.from([0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58]); // "ARBORICX" const HEADER_LENGTH = 32; const SECTION_ENTRY_LENGTH = 60; const SECTION_MANIFEST = 1; @@ -69,7 +69,7 @@ export function parseBundle(buffer) { // Check magic if (!buffer.slice(0, 8).equals(MAGIC)) { - throw new Error("invalid magic: expected ARBORIX\\0"); + throw new Error("invalid magic: expected ARBORICX"); } // Parse header diff --git a/ext/js/src/cbor.js b/ext/js/src/cbor.js index 9b31f2b..352619f 100644 --- a/ext/js/src/cbor.js +++ b/ext/js/src/cbor.js @@ -1,5 +1,5 @@ /** - * cbor.js — Minimal CBOR decoder for the Arborix manifest format. + * cbor.js — Minimal CBOR decoder for the Arboricx manifest format. * * Decodes the canonical CBOR produced by the Haskell cborg library: * - Maps: major type 5 (0xa0 + length) diff --git a/ext/js/src/cli.js b/ext/js/src/cli.js index 843284a..67e51a0 100644 --- a/ext/js/src/cli.js +++ b/ext/js/src/cli.js @@ -1,6 +1,6 @@ #!/usr/bin/env node /** - * cli.js — Minimal CLI for inspecting and running Arborix bundles. + * cli.js — Minimal CLI for inspecting and running Arboricx bundles. * * Usage: * node cli.js inspect @@ -240,7 +240,7 @@ switch (command) { break; } default: - console.log("Arborix JS Runtime"); + console.log("Arboricx JS Runtime"); console.log(""); console.log("Usage:"); console.log(" node cli.js inspect "); diff --git a/ext/js/src/manifest.js b/ext/js/src/manifest.js index 64a5e91..f6bef29 100644 --- a/ext/js/src/manifest.js +++ b/ext/js/src/manifest.js @@ -13,7 +13,7 @@ * Throws on violation. 
*/ export function validateManifest(manifest) { - if (manifest.schema !== "arborix.bundle.manifest.v1") { + if (manifest.schema !== "arboricx.bundle.manifest.v1") { throw new Error( `unsupported manifest schema: ${manifest.schema}` ); @@ -33,12 +33,12 @@ export function validateManifest(manifest) { `unsupported node hash algorithm: ${tree.nodeHash.algorithm}` ); } - if (tree.nodeHash.domain !== "arborix.merkle.node.v1") { + if (tree.nodeHash.domain !== "arboricx.merkle.node.v1") { throw new Error( `unsupported node hash domain: ${tree.nodeHash.domain}` ); } - if (tree.nodePayload !== "arborix.merkle.payload.v1") { + if (tree.nodePayload !== "arboricx.merkle.payload.v1") { throw new Error(`unsupported node payload: ${tree.nodePayload}`); } @@ -46,7 +46,7 @@ export function validateManifest(manifest) { if (runtime.semantics !== "tree-calculus.v1") { throw new Error(`unsupported runtime semantics: ${runtime.semantics}`); } - if (runtime.abi !== "arborix.abi.tree.v1") { + if (runtime.abi !== "arboricx.abi.tree.v1") { throw new Error(`unsupported runtime ABI: ${runtime.abi}`); } if (runtime.capabilities && runtime.capabilities.length > 0) { diff --git a/ext/js/src/merkle.js b/ext/js/src/merkle.js index 57f938c..c218f88 100644 --- a/ext/js/src/merkle.js +++ b/ext/js/src/merkle.js @@ -7,14 +7,14 @@ * Fork: 0x02 || left_hash (32 bytes raw) || right_hash (32 bytes raw) * * Hash computation: - * hash = SHA256( "arborix.merkle.node.v1" || 0x00 || node_payload ) + * hash = SHA256( "arboricx.merkle.node.v1" || 0x00 || node_payload ) */ import { createHash } from "node:crypto"; // ── Constants ─────────────────────────────────────────────────────────────── -const DOMAIN_TAG = "arborix.merkle.node.v1"; +const DOMAIN_TAG = "arboricx.merkle.node.v1"; const HASH_LENGTH = 32; // raw hash bytes const HEX_LENGTH = 64; // hex-encoded hash length diff --git a/ext/js/test/bundle.test.js b/ext/js/test/bundle.test.js index 26b2830..c253a25 100644 --- a/ext/js/test/bundle.test.js +++ 
b/ext/js/test/bundle.test.js @@ -19,7 +19,7 @@ const fixtureDir = "../../test/fixtures"; describe("bundle parsing", () => { it("valid bundle parses header and sections", () => { const bundle = parseBundle( - readFileSync(`${fixtureDir}/id.arborix`) + readFileSync(`${fixtureDir}/id.arboricx`) ); strictEqual(bundle.version, "1.0"); strictEqual(bundle.sectionCount, 2); @@ -29,23 +29,23 @@ describe("bundle parsing", () => { it("parseManifest returns valid manifest", () => { const manifest = parseManifest( - readFileSync(`${fixtureDir}/id.arborix`) + readFileSync(`${fixtureDir}/id.arboricx`) ); - strictEqual(manifest.schema, "arborix.bundle.manifest.v1"); + strictEqual(manifest.schema, "arboricx.bundle.manifest.v1"); strictEqual(manifest.bundleType, "tree-calculus-executable-object"); strictEqual(manifest.closure, "complete"); strictEqual(manifest.tree.calculus, "tree-calculus.v1"); strictEqual(manifest.tree.nodeHash.algorithm, "sha256"); - strictEqual(manifest.tree.nodeHash.domain, "arborix.merkle.node.v1"); + strictEqual(manifest.tree.nodeHash.domain, "arboricx.merkle.node.v1"); strictEqual(manifest.runtime.semantics, "tree-calculus.v1"); - strictEqual(manifest.runtime.abi, "arborix.abi.tree.v1"); + strictEqual(manifest.runtime.abi, "arboricx.abi.tree.v1"); }); }); describe("hash verification", () => { it("valid bundle nodes verify", () => { const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arborix`) + readFileSync(`${fixtureDir}/id.arboricx`) ); const { nodeMap } = parseNodes(data); const { verified } = verifyNodeHashes(nodeMap); @@ -62,20 +62,20 @@ describe("errors", () => { it("unsupported version fails", () => { const buf = Buffer.alloc(32, 0); - buf.write("ARBORIX\0", 0, 8); + buf.write("ARBORICX", 0, 8); buf.writeUInt16BE(2, 8); // major version 2 throws(() => parseBundle(buf), /unsupported bundle major version/); }); it("bad section digest fails", () => { - const buf = readFileSync(`${fixtureDir}/id.arborix`); + const buf = 
readFileSync(`${fixtureDir}/id.arboricx`); // Corrupt one byte in the manifest section buf[152] ^= 0x01; throws(() => parseBundle(buf), /digest mismatch/); }); it("truncated bundle fails", () => { - const buf = readFileSync(`${fixtureDir}/id.arborix`); + const buf = readFileSync(`${fixtureDir}/id.arboricx`); const truncated = buf.slice(0, 40); throws(() => parseBundle(truncated), /truncated/); }); @@ -83,33 +83,33 @@ describe("errors", () => { it("missing nodes section fails", () => { // Build a bundle with only manifest entry in the directory (1 section instead of 2) const header = Buffer.alloc(32, 0); - header.write("ARBORIX\0", 0, 8); + header.write("ARBORICX", 0, 8); header.writeUInt16BE(1, 8); // major version header.writeUInt16BE(0, 10); // minor version header.writeUInt32BE(1, 12); // 1 section // Build a manifest JSON const manifestObj = { - schema: "arborix.bundle.manifest.v1", + schema: "arboricx.bundle.manifest.v1", bundleType: "tree-calculus-executable-object", tree: { calculus: "tree-calculus.v1", nodeHash: { algorithm: "sha256", - domain: "arborix.merkle.node.v1" + domain: "arboricx.merkle.node.v1" }, - nodePayload: "arborix.merkle.payload.v1" + nodePayload: "arboricx.merkle.payload.v1" }, runtime: { semantics: "tree-calculus.v1", evaluation: "normal-order", - abi: "arborix.abi.tree.v1", + abi: "arboricx.abi.tree.v1", capabilities: [] }, closure: "complete", roots: [{ hash: Buffer.alloc(32).toString("hex"), role: "default" }], - exports: [{ name: "root", root: Buffer.alloc(32).toString("hex"), kind: "term", abi: "arborix.abi.tree.v1" }], - metadata: { createdBy: "arborix" } + exports: [{ name: "root", root: Buffer.alloc(32).toString("hex"), kind: "term", abi: "arboricx.abi.tree.v1" }], + metadata: { createdBy: "arboricx" } }; const manifestJson = JSON.stringify(manifestObj); const manifestBytes = Buffer.from(manifestJson); diff --git a/ext/js/test/merkle.test.js b/ext/js/test/merkle.test.js index bb233a5..bbafe10 100644 --- 
a/ext/js/test/merkle.test.js +++ b/ext/js/test/merkle.test.js @@ -51,35 +51,35 @@ describe("merkle — computeNodeHash", () => { strictEqual(hash.length, 64); }); - it("Leaf hash matches expected Arborix domain", () => { + it("Leaf hash matches expected Arboricx domain", () => { const leaf = { type: "leaf" }; const hash = computeNodeHash(leaf); - strictEqual(hash, "e54db458aa8e94782f7c61ad6c1f19a1c0c6fca7ffe53674f0d2bc5ff7ab02ff"); + strictEqual(hash, "92b8a9796dbeafbcd36757535876256392170d137bf36b319d77f11a37112158"); }); }); describe("merkle — node section parsing", () => { const fixtureDir = "../../test/fixtures"; - it("parses id.arborix with correct node count", () => { + it("parses id.arboricx with correct node count", () => { const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arborix`) + readFileSync(`${fixtureDir}/id.arboricx`) ); const { nodeMap } = parseNodeSection(data); strictEqual(nodeMap.size, 4); }); - it("parses true.arborix with correct node count", () => { + it("parses true.arboricx with correct node count", () => { const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/true.arborix`) + readFileSync(`${fixtureDir}/true.arboricx`) ); const { nodeMap } = parseNodeSection(data); strictEqual(nodeMap.size, 2); }); - it("parses false.arborix with correct node count", () => { + it("parses false.arboricx with correct node count", () => { const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/false.arborix`) + readFileSync(`${fixtureDir}/false.arboricx`) ); const { nodeMap } = parseNodeSection(data); strictEqual(nodeMap.size, 1); @@ -89,29 +89,29 @@ describe("merkle — node section parsing", () => { describe("merkle — hash verification", () => { const fixtureDir = "../../test/fixtures"; - it("id.arborix nodes all verify", () => { + it("id.arboricx nodes all verify", () => { const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arborix`) + readFileSync(`${fixtureDir}/id.arboricx`) ); const { nodeMap } = 
parseNodeSection(data); const { verified, mismatches } = verifyNodeHashes(nodeMap); - ok(verified, "id.arborix node hashes should verify"); + ok(verified, "id.arboricx node hashes should verify"); strictEqual(mismatches.length, 0); }); - it("true.arborix nodes all verify", () => { + it("true.arboricx nodes all verify", () => { const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/true.arborix`) + readFileSync(`${fixtureDir}/true.arboricx`) ); const { nodeMap } = parseNodeSection(data); const { verified, mismatches } = verifyNodeHashes(nodeMap); - ok(verified, "true.arborix node hashes should verify"); + ok(verified, "true.arboricx node hashes should verify"); strictEqual(mismatches.length, 0); }); it("corrupted node payload fails hash verification", () => { const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arborix`) + readFileSync(`${fixtureDir}/id.arboricx`) ); const { nodeMap } = parseNodeSection(data); // Find a stem node to corrupt @@ -137,23 +137,23 @@ describe("merkle — hash verification", () => { describe("merkle — closure verification", () => { const fixtureDir = "../../test/fixtures"; - it("id.arborix has complete closure", () => { + it("id.arboricx has complete closure", () => { const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arborix`) + readFileSync(`${fixtureDir}/id.arboricx`) ); const { nodeMap } = parseNodeSection(data); const { complete, missing } = verifyClosure(nodeMap); - ok(complete, "id.arborix should have complete closure"); + ok(complete, "id.arboricx should have complete closure"); strictEqual(missing.length, 0); }); it("verifyRootClosure checks transitive reachability", () => { const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arborix`) + readFileSync(`${fixtureDir}/id.arboricx`) ); const { nodeMap } = parseNodeSection(data); // Use the actual root hash from the fixture's manifest - const manifest = parseManifest(readFileSync(`${fixtureDir}/id.arborix`)); + const 
manifest = parseManifest(readFileSync(`${fixtureDir}/id.arboricx`)); const rootHash = manifest.exports[0].root; const { complete, missingRoots } = verifyRootClosure(nodeMap, rootHash); ok(complete, "root should be reachable"); @@ -162,7 +162,7 @@ describe("merkle — closure verification", () => { it("parseNodeSection returns correct node count", () => { const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arborix`) + readFileSync(`${fixtureDir}/id.arboricx`) ); const result = parseNodeSection(data); strictEqual(result.count, 4); diff --git a/ext/js/test/run-bundle.test.js b/ext/js/test/run-bundle.test.js index 4a87d45..d50d315 100644 --- a/ext/js/test/run-bundle.test.js +++ b/ext/js/test/run-bundle.test.js @@ -9,8 +9,8 @@ import { buildTreeFromNodeMap } from "../src/cli.js"; const fixtureDir = "../../test/fixtures"; -describe("run bundle — id.arborix", () => { - const bundle = readFileSync(`${fixtureDir}/id.arborix`); +describe("run bundle — id.arboricx", () => { + const bundle = readFileSync(`${fixtureDir}/id.arboricx`); const manifest = parseManifest(bundle); const nodeSectionData = bundleParseNodeSection(bundle); const { nodeMap } = parseNodes(nodeSectionData); @@ -37,8 +37,8 @@ describe("run bundle — id.arborix", () => { }); }); -describe("run bundle — true.arborix", () => { - const bundle = readFileSync(`${fixtureDir}/true.arborix`); +describe("run bundle — true.arboricx", () => { + const bundle = readFileSync(`${fixtureDir}/true.arboricx`); const manifest = parseManifest(bundle); const nodeSectionData = bundleParseNodeSection(bundle); const { nodeMap } = parseNodes(nodeSectionData); @@ -61,8 +61,8 @@ describe("run bundle — true.arborix", () => { }); }); -describe("run bundle — false.arborix", () => { - const bundle = readFileSync(`${fixtureDir}/false.arborix`); +describe("run bundle — false.arboricx", () => { + const bundle = readFileSync(`${fixtureDir}/false.arboricx`); const manifest = parseManifest(bundle); const nodeSectionData = 
bundleParseNodeSection(bundle); const { nodeMap } = parseNodes(nodeSectionData); @@ -83,8 +83,8 @@ describe("run bundle — false.arborix", () => { }); }); -describe("run bundle — notQ.arborix", () => { - const bundle = readFileSync(`${fixtureDir}/notQ.arborix`); +describe("run bundle — notQ.arboricx", () => { + const bundle = readFileSync(`${fixtureDir}/notQ.arboricx`); const manifest = parseManifest(bundle); const nodeSectionData = bundleParseNodeSection(bundle); const { nodeMap } = parseNodes(nodeSectionData); @@ -100,7 +100,7 @@ describe("run bundle — notQ.arborix", () => { }); describe("run bundle — missing export", () => { - const bundle = readFileSync(`${fixtureDir}/id.arborix`); + const bundle = readFileSync(`${fixtureDir}/id.arboricx`); const manifest = parseManifest(bundle); it("nonexistent export fails clearly", () => { @@ -109,8 +109,8 @@ describe("run bundle — missing export", () => { }); describe("run bundle — auto-select", () => { - // true.arborix has only one export, should auto-select - const bundle = readFileSync(`${fixtureDir}/true.arborix`); + // true.arboricx has only one export, should auto-select + const bundle = readFileSync(`${fixtureDir}/true.arboricx`); const manifest = parseManifest(bundle); it("single export auto-selects", () => { diff --git a/flake.nix b/flake.nix index acc7401..cd2a501 100644 --- a/flake.nix +++ b/flake.nix @@ -36,8 +36,6 @@ checks.${packageName} = tricuPackageTests; checks.default = tricuPackageTests; - defaultPackage = self.packages.${system}.default; - devShells.default = pkgs.mkShell { buildInputs = with pkgs; [ haskellPackages.cabal-install @@ -51,7 +49,6 @@ tricuPackage ]; }; - devShell = self.devShells.${system}.default; packages.${containerPackageName} = pkgs.dockerTools.buildImage { name = "tricu"; diff --git a/lib/arborix.tri b/lib/arboricx.tri similarity index 91% rename from lib/arborix.tri rename to lib/arboricx.tri index 4873ac8..8eca16b 100644 --- a/lib/arborix.tri +++ b/lib/arboricx.tri @@ -3,11 +3,11 
@@ !import "bytes.tri" !Local !import "binary.tri" !Local -arborixMagic = [(65) (82) (66) (79) (82) (73) (88) (0)] -arborixMajorVersion = [(0) (1)] -arborixMinorVersion = [(0) (0)] -arborixManifestSectionId = [(0) (0) (0) (1)] -arborixNodesSectionId = [(0) (0) (0) (2)] +arboricxMagic = [(65) (82) (66) (79) (82) (73) (67) (88)] +arboricxMajorVersion = [(0) (1)] +arboricxMinorVersion = [(0) (0)] +arboricxManifestSectionId = [(0) (0) (0) (1)] +arboricxNodesSectionId = [(0) (0) (0) (2)] errMissingSection = 4 errUnsupportedVersion = 5 @@ -20,10 +20,10 @@ nodePayloadLeafTag = 0 nodePayloadStemTag = 1 nodePayloadForkTag = 2 -readArborixMagic = (bs : expectBytes arborixMagic bs) +readArboricxMagic = (bs : expectBytes arboricxMagic bs) -readArborixHeader = (bs : - bindResult (readArborixMagic bs) +readArboricxHeader = (bs : + bindResult (readArboricxMagic bs) (_ afterMagic : bindResult (readBytes 2 afterMagic) (majorVersion afterMajor : @@ -296,12 +296,12 @@ beBytesToNat = (bytes : u32BEBytesToNat = beBytesToNat u64BEBytesToNat = beBytesToNat -arborixHeaderMajorVersion = (header : +arboricxHeaderMajorVersion = (header : matchPair (majorVersion _ : majorVersion) header) -arborixHeaderMinorVersion = (header : +arboricxHeaderMinorVersion = (header : matchPair (_ payload : matchPair @@ -309,7 +309,7 @@ arborixHeaderMinorVersion = (header : payload) header) -arborixHeaderSectionCount = (header : +arboricxHeaderSectionCount = (header : matchPair (_ payload : matchPair @@ -320,7 +320,7 @@ arborixHeaderSectionCount = (header : payload) header) -arborixHeaderFlags = (header : +arboricxHeaderFlags = (header : matchPair (_ payload : matchPair @@ -334,7 +334,7 @@ arborixHeaderFlags = (header : payload) header) -arborixHeaderDirOffset = (header : +arboricxHeaderDirOffset = (header : matchPair (_ payload : matchPair @@ -348,22 +348,22 @@ arborixHeaderDirOffset = (header : payload) header) -validateArborixHeader = (header rest : +validateArboricxHeader = (header rest : matchBool (ok 
header rest) (err errUnsupportedVersion rest) (and? - (bytesEq? arborixMajorVersion (arborixHeaderMajorVersion header)) - (bytesEq? arborixMinorVersion (arborixHeaderMinorVersion header)))) + (bytesEq? arboricxMajorVersion (arboricxHeaderMajorVersion header)) + (bytesEq? arboricxMinorVersion (arboricxHeaderMinorVersion header)))) -readArborixContainer = (bs : - bindResult (readArborixHeader bs) +readArboricxContainer = (bs : + bindResult (readArboricxHeader bs) (header afterHeader : - bindResult (validateArborixHeader header afterHeader) + bindResult (validateArboricxHeader header afterHeader) (validHeader afterValidHeader : bindResult (readSectionDirectory - (u32BEBytesToNat (arborixHeaderSectionCount validHeader)) - (bytesDrop (u64BEBytesToNat (arborixHeaderDirOffset validHeader)) bs)) + (u32BEBytesToNat (arboricxHeaderSectionCount validHeader)) + (bytesDrop (u64BEBytesToNat (arboricxHeaderDirOffset validHeader)) bs)) (directory afterDirectory : bindResult (validateSectionDirectory directory afterDirectory) (validDirectory afterValidDirectory : @@ -403,21 +403,21 @@ sectionBytesOrErr = (sectionId directory containerBytes rest : (_ _ : err errMissingSection rest) (lookupSectionRecord sectionId directory)) -readArborixSectionBytes = (sectionId bs : - bindResult (readArborixContainer bs) +readArboricxSectionBytes = (sectionId bs : + bindResult (readArboricxContainer bs) (container afterContainer : matchPair (_ directory : sectionBytesOrErr sectionId directory bs afterContainer) container)) -readArborixRequiredSections = (bs : - bindResult (readArborixContainer bs) +readArboricxRequiredSections = (bs : + bindResult (readArboricxContainer bs) (container afterContainer : matchPair (_ directory : - bindResult (sectionBytesOrErr arborixManifestSectionId directory bs afterContainer) + bindResult (sectionBytesOrErr arboricxManifestSectionId directory bs afterContainer) (manifestBytes _ : - bindResult (sectionBytesOrErr arborixNodesSectionId directory bs afterContainer) + 
bindResult (sectionBytesOrErr arboricxNodesSectionId directory bs afterContainer) (nodesBytes _ : ok (pair manifestBytes nodesBytes) afterContainer))) container)) @@ -602,12 +602,12 @@ readNodesSectionComplete = (bs : (err errUnexpectedBytes afterNodesSection) (bytesNil? afterNodesSection))) -readArborixNodesSection = (bs : - bindResult (readArborixContainer bs) +readArboricxNodesSection = (bs : + bindResult (readArboricxContainer bs) (container afterContainer : matchPair (_ directory : - bindResult (sectionBytesOrErr arborixNodesSectionId directory bs afterContainer) + bindResult (sectionBytesOrErr arboricxNodesSectionId directory bs afterContainer) (nodesBytes _ : bindResult (readNodesSectionComplete nodesBytes) (nodesSection _ : ok nodesSection afterContainer))) @@ -645,10 +645,10 @@ nodeHashToTree = y (self nodeHash nodeRecords : (_ _ : err errMissingNode t) (lookupNodeRecord nodeHash nodeRecords)) -readArborixTreeFromHash = (rootHash bs : - bindResult (readArborixNodesSection bs) +readArboricxTreeFromHash = (rootHash bs : + bindResult (readArboricxNodesSection bs) (nodesSection afterContainer : bindResult (nodeHashToTree rootHash (nodesSectionRecords nodesSection)) (tree _ : ok tree afterContainer))) -readArborixExecutableFromHash = readArborixTreeFromHash +readArboricxExecutableFromHash = readArboricxTreeFromHash diff --git a/src/FileEval.hs b/src/FileEval.hs index f19e783..e0d4d64 100644 --- a/src/FileEval.hs +++ b/src/FileEval.hs @@ -162,7 +162,7 @@ nsVariable :: String -> String -> String nsVariable "" name = name nsVariable moduleName name = moduleName ++ "." ++ name --- | Compile a tricu source file to a standalone Arborix bundle. +-- | Compile a tricu source file to a standalone Arboricx bundle. -- Uses a temp content store so it does not collide with the global one. -- Supports multiple named exports; each is stored separately in the -- temp store so that resolveExportTarget can look them up by name. 
diff --git a/src/Main.hs b/src/Main.hs index 64aac81..16ba94d 100644 --- a/src/Main.hs +++ b/src/Main.hs @@ -97,7 +97,7 @@ compileMode = Compile , names = def &= help "Definition name(s) to export as bundle roots (comma-separated or repeated -x). Defaults to 'main'." &= name "x" &= typ "NAME" } - &= help "Compile a tricu source file into a standalone Arborix portable bundle." + &= help "Compile a tricu source file into a standalone Arboricx portable bundle." &= explicit &= name "compile" @@ -106,7 +106,7 @@ serveMode = Serve { host = "127.0.0.1" &= help "Host to bind the server to." &= name "h" &= typ "HOST" , port = 8787 &= help "HTTP port to listen on." &= name "p" &= typ "PORT" } - &= help "Start a read-only HTTP server for exporting Arborix bundles." + &= help "Start a read-only HTTP server for exporting Arboricx bundles." &= explicit &= name "server" @@ -182,10 +182,10 @@ main = do let exportNames = if null namesArg then [] else map T.pack namesArg in compileFile compileInputFile compileOutFile exportNames Serve { host = hostStr, port = portNum } -> do - putStrLn $ "Starting Arborix bundle server on " ++ hostStr ++ ":" ++ show portNum + putStrLn $ "Starting Arboricx bundle server on " ++ hostStr ++ ":" ++ show portNum putStrLn $ " GET /bundle/hash/:hash -- primary endpoint" putStrLn $ " GET /bundle/name/:name -- convenience endpoint" - putStrLn $ " Content-Type: application/vnd.arborix.bundle" + putStrLn $ " Content-Type: application/vnd.arboricx.bundle" runServer hostStr portNum runTricu :: String -> String diff --git a/src/Research.hs b/src/Research.hs index e20b581..3a0eebb 100644 --- a/src/Research.hs +++ b/src/Research.hs @@ -85,12 +85,12 @@ serializeNode (NFork l r) = BS.pack [0x02] <> go (decode (encodeUtf8 l)) <> go ( go (Right bs) = bs -- | Hash a node per the Merkle content-addressing spec. 
--- hash = SHA256( "arborix.merkle.node.v1" <> 0x00 <> node_payload ) +-- hash = SHA256( "arboricx.merkle.node.v1" <> 0x00 <> node_payload ) nodeHash :: Node -> MerkleHash nodeHash node = decodeUtf8 (encode (sha256WithPrefix (serializeNode node))) where sha256WithPrefix payload = convert . (hash :: BS.ByteString -> Digest SHA256) $ utf8Tag <> BS.pack [0x00] <> payload - utf8Tag = BS.pack $ map fromIntegral $ BS.unpack "arborix.merkle.node.v1" + utf8Tag = BS.pack $ map fromIntegral $ BS.unpack "arboricx.merkle.node.v1" -- | Deserialize a Node from canonical bytes. deserializeNode :: BS.ByteString -> Node @@ -138,7 +138,7 @@ toBytes t = case toList t of Left err -> Left err Right bs -> BS.pack <$> mapM toByte bs --- | Convert a canonical Arborix node payload (ByteString) to a Tree +-- | Convert a canonical Arboricx node payload (ByteString) to a Tree -- representation (a list of Byte trees). nodePayloadToTreeBytes :: BS.ByteString -> T nodePayloadToTreeBytes = ofBytes diff --git a/src/Server.hs b/src/Server.hs index 795f701..c2c55ea 100644 --- a/src/Server.hs +++ b/src/Server.hs @@ -23,7 +23,7 @@ import Data.ByteString.Char8 (unpack) import Data.ByteString.Lazy (fromStrict) import qualified Data.Text as T --- | Start an HTTP server that serves Arborix bundles from the +-- | Start an HTTP server that serves Arboricx bundles from the -- local content store. -- -- This is a read-only export surface. Clients fetch bundle bytes @@ -133,10 +133,10 @@ rootsHandler request respond = do (fromStrict bundleData) -- | GET /bundle/name/:name --- Resolve a stored term name, export it as an Arborix bundle, +-- Resolve a stored term name, export it as an Arboricx bundle, -- and return the raw bundle bytes. -- --- Sets @Content-Type@ and @X-Arborix-Root-Hash@ headers. +-- Sets @Content-Type@ and @X-Arboricx-Root-Hash@ headers. -- Returns 404 when the name does not resolve to any stored term. 
nameHandler :: Text -> IO Response nameHandler nameText = do @@ -155,7 +155,7 @@ nameHandler nameText = do return $ responseLBS status200 (bundleHeaders th cd) (fromStrict bundleData) -- | GET /bundle/hash/:hash --- Resolve a full Merkle hash and export the root as an Arborix +-- Resolve a full Merkle hash and export the root as an Arboricx -- bundle. -- -- - Malformed hash (non-hex or < 16 chars): 400 @@ -207,8 +207,8 @@ textResponse status body = bundleHeaders :: Text -> Text -> [Header] bundleHeaders root cd = - [ (hContentType, encodeUtf8 "application/vnd.arborix.bundle") - , ("X-Arborix-Root-Hash", encodeUtf8 root) + [ (hContentType, encodeUtf8 "application/vnd.arboricx.bundle") + , ("X-Arboricx-Root-Hash", encodeUtf8 root) , ("Content-Disposition", encodeUtf8 cd) ] diff --git a/src/Wire.hs b/src/Wire.hs index 836a5b8..b6ed741 100644 --- a/src/Wire.hs +++ b/src/Wire.hs @@ -71,7 +71,7 @@ bundleMinorVersion = 0 -- | Header magic for the portable executable-object container. bundleMagic :: ByteString -bundleMagic = BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00] -- "ARBORIX\0" +bundleMagic = BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58] -- "ARBORICX" headerLength :: Int headerLength = 32 @@ -563,20 +563,20 @@ decodeSectionEntries count bytes = reverse <$> go count bytes [] defaultManifest :: [(Text, MerkleHash)] -> BundleManifest defaultManifest namedRoots = BundleManifest - { manifestSchema = "arborix.bundle.manifest.v1" + { manifestSchema = "arboricx.bundle.manifest.v1" , manifestBundleType = "tree-calculus-executable-object" , manifestTree = TreeSpec { treeCalculus = "tree-calculus.v1" , treeNodeHash = NodeHashSpec { nodeHashAlgorithm = "sha256" - , nodeHashDomain = "arborix.merkle.node.v1" + , nodeHashDomain = "arboricx.merkle.node.v1" } - , treeNodePayload = "arborix.merkle.payload.v1" + , treeNodePayload = "arboricx.merkle.payload.v1" } , manifestRuntime = RuntimeSpec { runtimeSemantics = "tree-calculus.v1" , runtimeEvaluation = 
"normal-order" - , runtimeAbi = "arborix.abi.tree.v1" + , runtimeAbi = "arboricx.abi.tree.v1" , runtimeCapabilities = [] } , manifestClosure = ClosureComplete @@ -587,7 +587,7 @@ defaultManifest namedRoots = BundleManifest , metadataVersion = Nothing , metadataDescription = Nothing , metadataLicense = Nothing - , metadataCreatedBy = Just "arborix" + , metadataCreatedBy = Just "arboricx" } } where @@ -597,7 +597,7 @@ defaultManifest namedRoots = BundleManifest { exportName = name , exportRoot = h , exportKind = "term" - , exportAbi = "arborix.abi.tree.v1" + , exportAbi = "arboricx.abi.tree.v1" } -- --------------------------------------------------------------------------- @@ -672,7 +672,7 @@ verifyBundle bundle = do verifyManifest :: BundleManifest -> Either String () verifyManifest manifest = do - when (manifestSchema manifest /= "arborix.bundle.manifest.v1") $ + when (manifestSchema manifest /= "arboricx.bundle.manifest.v1") $ Left $ "unsupported manifest schema: " ++ unpack (manifestSchema manifest) when (manifestBundleType manifest /= "tree-calculus-executable-object") $ Left $ "unsupported bundle type: " ++ unpack (manifestBundleType manifest) @@ -683,13 +683,13 @@ verifyManifest manifest = do Left $ "unsupported calculus: " ++ unpack (treeCalculus treeSpec) when (nodeHashAlgorithm hashSpec /= "sha256") $ Left $ "unsupported node hash algorithm: " ++ unpack (nodeHashAlgorithm hashSpec) - when (nodeHashDomain hashSpec /= "arborix.merkle.node.v1") $ + when (nodeHashDomain hashSpec /= "arboricx.merkle.node.v1") $ Left $ "unsupported node hash domain: " ++ unpack (nodeHashDomain hashSpec) - when (treeNodePayload treeSpec /= "arborix.merkle.payload.v1") $ + when (treeNodePayload treeSpec /= "arboricx.merkle.payload.v1") $ Left $ "unsupported node payload: " ++ unpack (treeNodePayload treeSpec) when (runtimeSemantics runtimeSpec /= "tree-calculus.v1") $ Left $ "unsupported runtime semantics: " ++ unpack (runtimeSemantics runtimeSpec) - when (runtimeAbi runtimeSpec 
/= "arborix.abi.tree.v1") $ + when (runtimeAbi runtimeSpec /= "arboricx.abi.tree.v1") $ Left $ "unsupported runtime ABI: " ++ unpack (runtimeAbi runtimeSpec) when (not (null (runtimeCapabilities runtimeSpec))) $ Left "unsupported runtime capabilities" diff --git a/test/Spec.hs b/test/Spec.hs index 23a119d..1176cb7 100644 --- a/test/Spec.hs +++ b/test/Spec.hs @@ -786,7 +786,7 @@ wireTests = testGroup "Wire Tests" , "main = id t" ] wireData <- exportBundle srcConn [termHash] - BS.take 8 wireData @?= BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00] + BS.take 8 wireData @?= BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58] case decodeBundle wireData of Left err -> assertFailure $ "decodeBundle failed: " ++ err Right bundle -> do @@ -794,15 +794,15 @@ wireTests = testGroup "Wire Tests" tree = manifestTree manifest hashSpec = treeNodeHash tree runtime = manifestRuntime manifest - manifestSchema manifest @?= "arborix.bundle.manifest.v1" + manifestSchema manifest @?= "arboricx.bundle.manifest.v1" manifestBundleType manifest @?= "tree-calculus-executable-object" manifestClosure manifest @?= ClosureComplete treeCalculus tree @?= "tree-calculus.v1" - treeNodePayload tree @?= "arborix.merkle.payload.v1" + treeNodePayload tree @?= "arboricx.merkle.payload.v1" nodeHashAlgorithm hashSpec @?= "sha256" - nodeHashDomain hashSpec @?= "arborix.merkle.node.v1" + nodeHashDomain hashSpec @?= "arboricx.merkle.node.v1" runtimeSemantics runtime @?= "tree-calculus.v1" - runtimeAbi runtime @?= "arborix.abi.tree.v1" + runtimeAbi runtime @?= "arboricx.abi.tree.v1" runtimeCapabilities runtime @?= [] bundleRoots bundle @?= [termHash] map exportRoot (manifestExports manifest) @?= [termHash] @@ -823,7 +823,7 @@ wireTests = testGroup "Wire Tests" exportName exported @?= "validateEmail" exportRoot exported @?= termHash exportKind exported @?= "term" - exportAbi exported @?= "arborix.abi.tree.v1" + exportAbi exported @?= "arboricx.abi.tree.v1" exports -> assertFailure $ "Expected one 
export, got: " ++ show exports close srcConn @@ -1064,9 +1064,9 @@ u32 n = [0,0,0,n] u64 :: Integer -> [Integer] u64 n = [0,0,0,0,0,0,0,n] -arborixHeaderBytes :: Integer -> [Integer] -arborixHeaderBytes sectionCount = - [65,82,66,79,82,73,88,0] +arboricxHeaderBytes :: Integer -> [Integer] +arboricxHeaderBytes sectionCount = + [65,82,66,79,82,73,67,88] ++ u16 1 ++ u16 0 ++ u32 sectionCount @@ -1107,7 +1107,7 @@ simpleContainerBytes :: [Integer] -> [Integer] -> [Integer] simpleContainerBytes manifestBytes nodesBytes = let manifestOffset = 152 nodesOffset = manifestOffset + fromIntegral (length manifestBytes) - in arborixHeaderBytes 2 + in arboricxHeaderBytes 2 ++ manifestEntryBytes manifestOffset (fromIntegral $ length manifestBytes) ++ nodesEntryBytes nodesOffset (fromIntegral $ length nodesBytes) ++ manifestBytes @@ -1115,12 +1115,12 @@ simpleContainerBytes manifestBytes nodesBytes = singleSectionContainerBytes :: [Integer] -> [Integer] -> [Integer] singleSectionContainerBytes sectionType sectionBytes = - arborixHeaderBytes 1 + arboricxHeaderBytes 1 ++ sectionEntryBytes sectionType 92 (fromIntegral $ length sectionBytes) ++ sectionBytes -arborixHeaderT :: Integer -> T -arborixHeaderT sectionCount = +arboricxHeaderT :: Integer -> T +arboricxHeaderT sectionCount = pairT (bytesT [0,1]) (pairT (bytesT [0,0]) (pairT (bytesT $ u32 sectionCount) @@ -1615,120 +1615,114 @@ binaryReaderTests = testGroup "Binary Reader Tests" result env @?= errT eofT (bytesT [1,2,3]) -- ------------------------------------------------------------------------ - -- Arborix magic recognition + -- Arboricx magic recognition -- ------------------------------------------------------------------------ - , testCase "readArborixMagic: accepts magic and preserves rest" $ do - let input = "readArborixMagic [(65) (82) (66) (79) (82) (73) (88) (0) (1) (2)]" - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxMagic: accepts magic and preserves rest" $ do + let input = "readArboricxMagic 
((append arboricxMagic) [(1) (2)])" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT unitT (bytesT [1,2]) - , testCase "readArborixMagic: rejects wrong magic preserving input" $ do - let input = "readArborixMagic [(65) (82) (66) (79) (82) (73) (88) (1) (9)]" - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxMagic: rejects wrong magic preserving input" $ do + let input = "readArboricxMagic [(65) (83) (66) (79) (82) (73) (67) (88) (1) (9)]" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) - result env @?= errT unexpectedBytesT (bytesT [65,82,66,79,82,73,88,1,9]) + result env @?= errT unexpectedBytesT (bytesT [65,83,66,79,82,73,67,88,1,9]) - , testCase "readArborixMagic: short input returns EOF preserving input" $ do - let input = "readArborixMagic [(65) (82) (66) (79)]" - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxMagic: short input returns EOF preserving input" $ do + let input = "readArboricxMagic [(65) (82) (66) (79)]" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT [65,82,66,79]) -- ------------------------------------------------------------------------ - -- Arborix header parsing + -- Arboricx header parsing -- ------------------------------------------------------------------------ - , testCase "readArborixHeader: parses portable header" $ do - let input = "readArborixHeader " ++ bytesExpr (arborixHeaderBytes 0) - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxHeader: parses portable header" $ do + let input = "readArboricxHeader " ++ bytesExpr (arboricxHeaderBytes 0) + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) - result env @?= okT (arborixHeaderT 0) (bytesT []) + result env @?= okT (arboricxHeaderT 0) (bytesT []) - , testCase "readArborixHeader: preserves 
trailing bytes" $ do - let input = "readArborixHeader " ++ bytesExpr (arborixHeaderBytes 0 ++ [9,8]) - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxHeader: preserves trailing bytes" $ do + let input = "readArboricxHeader " ++ bytesExpr (arboricxHeaderBytes 0 ++ [9,8]) + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) - result env @?= okT (arborixHeaderT 0) (bytesT [9,8]) + result env @?= okT (arboricxHeaderT 0) (bytesT [9,8]) - , testCase "readArborixHeader: rejects wrong magic preserving input" $ do - let input = "readArborixHeader [(65) (82) (66) (79) (82) (73) (88) (1) (0) (1)]" - library <- evaluateFile "./lib/arborix.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT unexpectedBytesT (bytesT [65,82,66,79,82,73,88,1,0,1]) - - , testCase "readArborixHeader: short input returns EOF preserving input" $ do - let input = "readArborixHeader [(65) (82)]" - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxHeader: short input returns EOF preserving input" $ do + let input = "readArboricxHeader [(65) (82)]" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT [65,82]) -- ------------------------------------------------------------------------ - -- Arborix section directory record parsing + -- Arboricx section directory record parsing -- ------------------------------------------------------------------------ , testCase "readSectionRecord: parses portable section entry" $ do let input = "readSectionRecord " ++ bytesExpr (nodesEntryBytes 16 32) - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT (sectionRecordT nodesSectionIdBytes 16 32) (bytesT []) , testCase "readSectionRecord: preserves trailing bytes" $ do let input = "readSectionRecord " ++ bytesExpr (nodesEntryBytes 16 32 ++ 
[9,8]) - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT (sectionRecordT nodesSectionIdBytes 16 32) (bytesT [9,8]) , testCase "readSectionRecord: empty input returns EOF" $ do let input = "readSectionRecord []" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT []) , testCase "readSectionRecord: short section id returns EOF preserving input" $ do let input = "readSectionRecord [(0)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT [0]) , testCase "readSectionRecord: missing section version returns EOF preserving unread bytes" $ do let input = "readSectionRecord [(0) (2)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT [0,2]) , testCase "readSectionRecord: short section version returns EOF preserving unread bytes" $ do let input = "readSectionRecord [(0) (2) (0) (0) (0)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT [0]) , testCase "readSectionRecord: missing length returns EOF preserving unread length bytes" $ do let input = "readSectionRecord [(0) (2) (0) (0) (0) (16)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT []) , testCase "readSectionRecord: short section flags returns EOF preserving unread bytes" $ do let input = "readSectionRecord [(0) (2) (0) (0) (0) (16) (0) (0) (0)]" - library <- evaluateFile "./lib/arborix.tri" + library <- 
evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT [0]) -- ------------------------------------------------------------------------ - -- Arborix section directory parsing + -- Arboricx section directory parsing -- ------------------------------------------------------------------------ , testCase "readSectionDirectory: zero records preserves input" $ do let input = "readSectionDirectory 0 [(9) (8)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT (ofList []) (bytesT [9,8]) , testCase "readSectionDirectory: reads requested records and preserves trailing bytes" $ do let input = "readSectionDirectory 2 " ++ bytesExpr (manifestEntryBytes 10 20 ++ nodesEntryBytes 30 40 ++ [9]) - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT (ofList @@ -1739,171 +1733,171 @@ binaryReaderTests = testGroup "Binary Reader Tests" , testCase "readSectionDirectory: truncated record returns EOF" $ do let input = "readSectionDirectory 2 [(0) (1) (0) (0) (0) (10) (0) (0) (0) (20) (0) (2) (0) (0)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT [0,0]) -- ------------------------------------------------------------------------ - -- Arborix section lookup and raw byte slicing + -- Arboricx section lookup and raw byte slicing -- ------------------------------------------------------------------------ , testCase "lookupSectionRecord: finds record by raw section id" $ do let input = "lookupSectionRecord " ++ bytesExpr nodesSectionIdBytes ++ " [(" ++ "pair " ++ bytesExpr manifestSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " 
++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 10) ++ " (pair " ++ bytesExpr (u64 20) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ") (" ++ "pair " ++ bytesExpr nodesSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 30) ++ " (pair " ++ bytesExpr (u64 40) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ")]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= justT (sectionRecordT nodesSectionIdBytes 30 40) , testCase "lookupSectionRecord: missing section id returns nothing" $ do let input = "lookupSectionRecord " ++ bytesExpr [0,0,0,3] ++ " [(" ++ "pair " ++ bytesExpr manifestSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 10) ++ " (pair " ++ bytesExpr (u64 20) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ") (" ++ "pair " ++ bytesExpr nodesSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 30) ++ " (pair " ++ bytesExpr (u64 40) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ")]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= nothingT , testCase "byteSlice: extracts requested byte range" $ do let input = "byteSlice 2 3 [(10) (11) (12) (13) (14) (15)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= bytesT [12,13,14] , testCase "byteSlice: overlong length returns remaining bytes" $ do let input = "byteSlice 4 9 [(10) (11) (12) (13) (14) (15)]" - library <- 
evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= bytesT [14,15] -- ------------------------------------------------------------------------ - -- Arborix minimal container parsing foundation + -- Arboricx minimal container parsing foundation -- ------------------------------------------------------------------------ , testCase "u32BEBytesToNat: decodes zero" $ do let input = "u32BEBytesToNat [(0) (0) (0) (0)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofNumber 0 , testCase "u32BEBytesToNat: decodes small section count" $ do let input = "u32BEBytesToNat [(0) (0) (0) (2)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofNumber 2 , testCase "u64BEBytesToNat: decodes small node count" $ do let input = "u64BEBytesToNat [(0) (0) (0) (0) (0) (0) (0) (2)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofNumber 2 , testCase "u64BEBytesToNat: decodes fixture-scale offset" $ do let input = "u64BEBytesToNat [(0) (0) (0) (0) (0) (0) (3) (214)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofNumber 982 - , testCase "readArborixContainer: reads header directory and preserves payload" $ do - let input = "readArborixContainer " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204]) - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxContainer: reads header directory and preserves payload" $ do + let input = "readArboricxContainer " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204]) + library <- evaluateFile 
"./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT (pairT - (arborixHeaderT 2) + (arboricxHeaderT 2) (ofList [ sectionRecordT manifestSectionIdBytes 152 3 , sectionRecordT nodesSectionIdBytes 155 4 ])) (bytesT [101,102,103,201,202,203,204]) - , testCase "readArborixContainer: truncated directory returns EOF" $ do - let input = "readArborixContainer " ++ bytesExpr (arborixHeaderBytes 1 ++ [0,0]) - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxContainer: truncated directory returns EOF" $ do + let input = "readArboricxContainer " ++ bytesExpr (arboricxHeaderBytes 1 ++ [0,0]) + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT [0,0]) - , testCase "readArborixContainer: rejects unsupported major version" $ do - let badHeader = [65,82,66,79,82,73,88,0] ++ u16 2 ++ u16 0 ++ u32 0 ++ u64 0 ++ u64 32 - input = "readArborixContainer " ++ bytesExpr badHeader - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxContainer: rejects unsupported major version" $ do + let badHeader = [65,82,66,79,82,73,67,88] ++ u16 2 ++ u16 0 ++ u32 0 ++ u64 0 ++ u64 32 + input = "readArboricxContainer " ++ bytesExpr badHeader + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT unsupportedVersionT (bytesT []) - , testCase "readArborixContainer: rejects unsupported minor version" $ do - let badHeader = [65,82,66,79,82,73,88,0] ++ u16 1 ++ u16 1 ++ u32 0 ++ u64 0 ++ u64 32 - input = "readArborixContainer " ++ bytesExpr badHeader - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxContainer: rejects unsupported minor version" $ do + let badHeader = [65,82,66,79,82,73,67,88] ++ u16 1 ++ u16 1 ++ u32 0 ++ u64 0 ++ u64 32 + input = "readArboricxContainer " ++ bytesExpr badHeader + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) 
result env @?= errT unsupportedVersionT (bytesT []) - , testCase "readArborixContainer: rejects duplicate section ids" $ do - let input = "readArborixContainer " ++ bytesExpr (arborixHeaderBytes 2 ++ manifestEntryBytes 152 1 ++ manifestEntryBytes 153 1 ++ [9]) - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxContainer: rejects duplicate section ids" $ do + let input = "readArboricxContainer " ++ bytesExpr (arboricxHeaderBytes 2 ++ manifestEntryBytes 152 1 ++ manifestEntryBytes 153 1 ++ [9]) + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT duplicateSectionT (bytesT [9]) , testCase "extractSectionBytes: uses raw offset and length fields" $ do let input = "extractSectionBytes " ++ sectionRecordExpr nodesSectionIdBytes 3 4 ++ " " ++ bytesExpr [10,11,12,13,14,15,16,17] - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= bytesT [13,14,15,16] , testCase "lookupSectionBytes: finds section and extracts raw bytes" $ do let input = "lookupSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " [" ++ sectionRecordExpr manifestSectionIdBytes 1 2 ++ " " ++ sectionRecordExpr nodesSectionIdBytes 4 3 ++ "] " ++ bytesExpr [10,11,12,13,14,15,16,17] - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= justT (bytesT [14,15,16]) , testCase "lookupSectionBytes: missing section returns nothing" $ do let input = "lookupSectionBytes " ++ bytesExpr [0,0,0,3] ++ " [" ++ sectionRecordExpr manifestSectionIdBytes 1 2 ++ " " ++ sectionRecordExpr nodesSectionIdBytes 4 3 ++ "] " ++ bytesExpr [10,11,12,13,14,15,16,17] - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= nothingT , testCase 
"extractSectionBytesResult: rejects out-of-bounds section" $ do let input = "extractSectionBytesResult " ++ sectionRecordExpr nodesSectionIdBytes 6 4 ++ " " ++ bytesExpr [10,11,12,13,14,15,16,17] ++ " []" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT []) - , testCase "readArborixSectionBytes: extracts requested section from container" $ do - let input = "readArborixSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204]) - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxSectionBytes: extracts requested section from container" $ do + let input = "readArboricxSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204]) + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT (bytesT [201,202,203,204]) (bytesT [101,102,103,201,202,203,204]) - , testCase "readArborixSectionBytes: missing section returns missing-section err" $ do - let input = "readArborixSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " " ++ bytesExpr (singleSectionContainerBytes manifestSectionIdBytes [101,102,103]) - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxSectionBytes: missing section returns missing-section err" $ do + let input = "readArboricxSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " " ++ bytesExpr (singleSectionContainerBytes manifestSectionIdBytes [101,102,103]) + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT missingSectionT (bytesT [101,102,103]) - , testCase "readArborixRequiredSections: extracts manifest and nodes bytes" $ do - let input = "readArborixRequiredSections " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204]) - library <- 
evaluateFile "./lib/arborix.tri" + , testCase "readArboricxRequiredSections: extracts manifest and nodes bytes" $ do + let input = "readArboricxRequiredSections " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204]) + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT (pairT (bytesT [101,102,103]) (bytesT [201,202,203,204])) (bytesT [101,102,103,201,202,203,204]) - , testCase "readArborixRequiredSections: missing nodes section returns missing-section err" $ do - let input = "readArborixRequiredSections " ++ bytesExpr (singleSectionContainerBytes manifestSectionIdBytes [101,102,103]) - library <- evaluateFile "./lib/arborix.tri" + , testCase "readArboricxRequiredSections: missing nodes section returns missing-section err" $ do + let input = "readArboricxRequiredSections " ++ bytesExpr (singleSectionContainerBytes manifestSectionIdBytes [101,102,103]) + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT missingSectionT (bytesT [101,102,103]) - , testCase "readArborixRequiredSections: out-of-bounds section returns EOF" $ do + , testCase "readArboricxRequiredSections: out-of-bounds section returns EOF" $ do let manifestBytes = [101,102,103] nodesBytes = [201,202,203,204] - badContainer = arborixHeaderBytes 2 ++ manifestEntryBytes 152 3 ++ nodesEntryBytes 155 9 ++ manifestBytes ++ nodesBytes - input = "readArborixRequiredSections " ++ bytesExpr badContainer - library <- evaluateFile "./lib/arborix.tri" + badContainer = arboricxHeaderBytes 2 ++ manifestEntryBytes 152 3 ++ nodesEntryBytes 155 9 ++ manifestBytes ++ nodesBytes + input = "readArboricxRequiredSections " ++ bytesExpr badContainer + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT [101,102,103,201,202,203,204]) -- ------------------------------------------------------------------------ - -- Arborix raw 
nodes section parsing + -- Arboricx raw nodes section parsing -- ------------------------------------------------------------------------ , testCase "readNodeRecord: parses hash length and raw payload" $ do let input = "readNodeRecord [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (3) (101) (102) (103) (9)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT (pairT (bytesT [1..32]) @@ -1913,13 +1907,13 @@ binaryReaderTests = testGroup "Binary Reader Tests" , testCase "readNodeRecord: truncated payload returns EOF preserving unread payload" $ do let input = "readNodeRecord [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (3) (101) (102)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT eofT (bytesT [101,102]) , testCase "readNodesSection: parses node count and records" $ do let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (0) (9)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT (pairT (bytesT [0,0,0,0,0,0,0,1]) @@ -1932,37 +1926,37 @@ binaryReaderTests = testGroup "Binary Reader Tests" , testCase "readNodesSectionComplete: rejects trailing bytes inside nodes section" $ do let input = "readNodesSectionComplete [(0) (0) (0) (0) (0) (0) (0) (0) (9)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile 
"./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT unexpectedBytesT (bytesT [9]) , testCase "readNodesSection: rejects duplicate node hashes" $ do let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (0) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (0) (9)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT duplicateNodeT (bytesT [9]) , testCase "nodePayloadValid?: accepts leaf stem and fork payload shapes" $ do let input = "[(nodePayloadValid? [(0)]) (nodePayloadValid? [(1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)]) (nodePayloadValid? [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)])]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofList [trueT, trueT, trueT] , testCase "nodePayloadValid?: rejects invalid payload shapes" $ do let input = "[(nodePayloadValid? []) (nodePayloadValid? [(9)]) (nodePayloadValid? [(1) (1)]) (nodePayloadValid? 
[(2) (1) (2)])]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofList [falseT, falseT, falseT, falseT] , testCase "node payload child accessors expose raw hashes" $ do let input = "[(nodePayloadStemChildHash [(1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)]) (nodePayloadForkLeftHash [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)]) (nodePayloadForkRightHash [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)])]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofList [bytesT [1..32], bytesT [1..32], bytesT [33..64]] , testCase "lookupNodeRecord: finds record by raw node hash" $ do let input = "lookupNodeRecord [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] (pair 
[(0) (0) (0) (1)] [(0)]))]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= justT (pairT (bytesT [33..64]) @@ -1971,7 +1965,7 @@ binaryReaderTests = testGroup "Binary Reader Tests" , testCase "nodeRecordChildHashes: extracts stem and fork references" $ do let input = "[(nodeRecordChildHashes (pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (33)] [(1) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)]))) (nodeRecordChildHashes (pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (65)] [(2) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64) (65) (66) (67) (68) (69) (70) (71) (72) (73) (74) (75) (76) (77) (78) (79) (80) (81) (82) (83) (84) (85) (86) (87) (88) (89) (90) (91) (92) (93) (94) (95) (96)])))]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofList [ ofList [bytesT [33..64]] @@ -1980,20 +1974,20 @@ binaryReaderTests = testGroup "Binary Reader Tests" , testCase "readNodesSection: rejects invalid node payload shape" $ do let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (9)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu 
input) result env @?= errT invalidNodePayloadT (bytesT []) , testCase "readNodesSection: rejects missing child node" $ do let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (33) (1) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64) (9)]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= errT missingNodeT (bytesT [9]) - , testCase "readArborixNodesSection: extracts and parses raw nodes section" $ do + , testCase "readArboricxNodesSection: extracts and parses raw nodes section" $ do let nodesBytes = u64 1 ++ [1..32] ++ u32 1 ++ [0] - input = "readArborixNodesSection " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes) - library <- evaluateFile "./lib/arborix.tri" + input = "readArboricxNodesSection " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes) + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT (pairT (bytesT [0,0,0,0,0,0,0,1]) @@ -2005,186 +1999,186 @@ binaryReaderTests = testGroup "Binary Reader Tests" (bytesT ([101,102,103] ++ nodesBytes)) -- ------------------------------------------------------------------------ - -- Arborix node DAG reconstruction + -- Arboricx node DAG reconstruction -- ------------------------------------------------------------------------ , testCase "nodeHashToTree: reconstructs leaf node" $ do let input = "nodeHashToTree [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) 
(21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)]))]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT Leaf Leaf , testCase "nodeHashToTree: reconstructs stem node" $ do let input = "nodeHashToTree [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] (pair [(0) (0) (0) (33)] [(1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)]))]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT (Stem Leaf) Leaf , testCase "nodeHashToTree: reconstructs fork node" $ do let input = "nodeHashToTree [(65) (66) (67) (68) (69) (70) (71) (72) (73) (74) (75) (76) (77) (78) (79) (80) (81) (82) (83) (84) (85) (86) (87) (88) (89) (90) (91) (92) (93) (94) (95) (96)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(65) (66) (67) (68) (69) (70) (71) (72) (73) (74) (75) (76) (77) (78) (79) (80) (81) (82) (83) (84) (85) (86) (87) (88) 
(89) (90) (91) (92) (93) (94) (95) (96)] (pair [(0) (0) (0) (65)] [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)]))]" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT (Fork Leaf Leaf) Leaf - , testCase "readArborixTreeFromHash: reconstructs tree from bundle bytes" $ do + , testCase "readArboricxTreeFromHash: reconstructs tree from bundle bytes" $ do let nodesBytes = u64 1 ++ [1..32] ++ u32 1 ++ [0] - input = "readArborixTreeFromHash " ++ bytesExpr [1..32] ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes) - library <- evaluateFile "./lib/arborix.tri" + input = "readArboricxTreeFromHash " ++ bytesExpr [1..32] ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes) + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT Leaf (bytesT ([101,102,103] ++ nodesBytes)) - , testCase "readArborixExecutableFromHash: alias reconstructs tree" $ do + , testCase "readArboricxExecutableFromHash: alias reconstructs tree" $ do let nodesBytes = u64 1 ++ [1..32] ++ u32 1 ++ [0] - input = "readArborixExecutableFromHash " ++ bytesExpr [1..32] ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes) - library <- evaluateFile "./lib/arborix.tri" + input = "readArboricxExecutableFromHash " ++ bytesExpr [1..32] ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes) + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= okT Leaf (bytesT ([101,102,103] ++ nodesBytes)) - , testCase "readArborixNodesSection: reads id fixture bundle" $ do - fixtureBytes <- 
BS.readFile "test/fixtures/id.arborix" + , testCase "readArboricxNodesSection: reads id fixture bundle" $ do + fixtureBytes <- BS.readFile "test/fixtures/id.arboricx" case decodeBundle fixtureBytes of Left err -> assertFailure $ "decodeBundle failed: " ++ err Right _ -> do - let input = "matchResult (code rest : code) (nodes rest : 0) (readArborixNodesSection " + let input = "matchResult (code rest : code) (nodes rest : 0) (readArboricxNodesSection " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofNumber 0 - , testCase "readArborixNodesSection: reads notQ fixture bundle" $ do - fixtureBytes <- BS.readFile "test/fixtures/notQ.arborix" + , testCase "readArboricxNodesSection: reads notQ fixture bundle" $ do + fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx" case decodeBundle fixtureBytes of Left err -> assertFailure $ "decodeBundle failed: " ++ err Right _ -> do - let input = "matchResult (code rest : code) (nodes rest : 0) (readArborixNodesSection " + let input = "matchResult (code rest : code) (nodes rest : 0) (readArboricxNodesSection " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofNumber 0 - , testCase "readArborixNodesSection: reads map fixture bundle" $ do - fixtureBytes <- BS.readFile "test/fixtures/map.arborix" + , testCase "readArboricxNodesSection: reads map fixture bundle" $ do + fixtureBytes <- BS.readFile "test/fixtures/map.arboricx" case decodeBundle fixtureBytes of Left err -> assertFailure $ "decodeBundle failed: " ++ err Right _ -> do - let input = "matchResult (code rest : code) (nodes rest : 0) (readArborixNodesSection " + let input = "matchResult (code rest : code) (nodes rest : 0) 
(readArboricxNodesSection " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofNumber 0 - , testCase "readArborixExecutableFromHash: reconstructs id fixture root" $ do - fixtureBytes <- BS.readFile "test/fixtures/id.arborix" + , testCase "readArboricxExecutableFromHash: reconstructs id fixture root" $ do + fixtureBytes <- BS.readFile "test/fixtures/id.arboricx" case decodeBundle fixtureBytes of Left err -> assertFailure $ "decodeBundle failed: " ++ err Right bundle -> case bundleRoots bundle of [] -> assertFailure "fixture has no roots" (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : 0) (readArborixExecutableFromHash " + let input = "matchResult (code rest : code) (tree rest : 0) (readArboricxExecutableFromHash " ++ bytesExpr (hexTextBytes rootHash) ++ " " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofNumber 0 - , testCase "readArborixExecutableFromHash: reconstructs notQ fixture root" $ do - fixtureBytes <- BS.readFile "test/fixtures/notQ.arborix" + , testCase "readArboricxExecutableFromHash: reconstructs notQ fixture root" $ do + fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx" case decodeBundle fixtureBytes of Left err -> assertFailure $ "decodeBundle failed: " ++ err Right bundle -> case bundleRoots bundle of [] -> assertFailure "fixture has no roots" (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : 0) (readArborixExecutableFromHash " + let input = "matchResult (code rest : code) (tree rest : 0) (readArboricxExecutableFromHash " ++ bytesExpr (hexTextBytes rootHash) ++ " " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")" - library <- evaluateFile 
"./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofNumber 0 - , testCase "readArborixExecutableFromHash: reconstructs map fixture root" $ do - fixtureBytes <- BS.readFile "test/fixtures/map.arborix" + , testCase "readArboricxExecutableFromHash: reconstructs map fixture root" $ do + fixtureBytes <- BS.readFile "test/fixtures/map.arboricx" case decodeBundle fixtureBytes of Left err -> assertFailure $ "decodeBundle failed: " ++ err Right bundle -> case bundleRoots bundle of [] -> assertFailure "fixture has no roots" (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : 0) (readArborixExecutableFromHash " + let input = "matchResult (code rest : code) (tree rest : 0) (readArboricxExecutableFromHash " ++ bytesExpr (hexTextBytes rootHash) ++ " " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= ofNumber 0 - , testCase "readArborixExecutableFromHash: executes id fixture root" $ do - fixtureBytes <- BS.readFile "test/fixtures/id.arborix" + , testCase "readArboricxExecutableFromHash: executes id fixture root" $ do + fixtureBytes <- BS.readFile "test/fixtures/id.arboricx" case decodeBundle fixtureBytes of Left err -> assertFailure $ "decodeBundle failed: " ++ err Right bundle -> case bundleRoots bundle of [] -> assertFailure "fixture has no roots" (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : tree 42) (readArborixExecutableFromHash " + let input = "matchResult (code rest : code) (tree rest : tree 42) (readArboricxExecutableFromHash " ++ bytesExpr (hexTextBytes rootHash) ++ " " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) 
result env @?= ofNumber 42 - , testCase "readArborixExecutableFromHash: executes notQ fixture on true" $ do - fixtureBytes <- BS.readFile "test/fixtures/notQ.arborix" + , testCase "readArboricxExecutableFromHash: executes notQ fixture on true" $ do + fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx" case decodeBundle fixtureBytes of Left err -> assertFailure $ "decodeBundle failed: " ++ err Right bundle -> case bundleRoots bundle of [] -> assertFailure "fixture has no roots" (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : tree true) (readArborixExecutableFromHash " + let input = "matchResult (code rest : code) (tree rest : tree true) (readArboricxExecutableFromHash " ++ bytesExpr (hexTextBytes rootHash) ++ " " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= falseT - , testCase "readArborixExecutableFromHash: executes notQ fixture on false" $ do - fixtureBytes <- BS.readFile "test/fixtures/notQ.arborix" + , testCase "readArboricxExecutableFromHash: executes notQ fixture on false" $ do + fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx" case decodeBundle fixtureBytes of Left err -> assertFailure $ "decodeBundle failed: " ++ err Right bundle -> case bundleRoots bundle of [] -> assertFailure "fixture has no roots" (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : tree false) (readArborixExecutableFromHash " + let input = "matchResult (code rest : code) (tree rest : tree false) (readArboricxExecutableFromHash " ++ bytesExpr (hexTextBytes rootHash) ++ " " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= trueT - , testCase "readArborixExecutableFromHash: executes map 
fixture root" $ do - fixtureBytes <- BS.readFile "test/fixtures/map.arborix" + , testCase "readArboricxExecutableFromHash: executes map fixture root" $ do + fixtureBytes <- BS.readFile "test/fixtures/map.arboricx" case decodeBundle fixtureBytes of Left err -> assertFailure $ "decodeBundle failed: " ++ err Right bundle -> case bundleRoots bundle of [] -> assertFailure "fixture has no roots" (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : head (tail (tree (a : (t t t)) [(t) (t) (t)]))) (readArborixExecutableFromHash " + let input = "matchResult (code rest : code) (tree rest : head (tail (tree (a : (t t t)) [(t) (t) (t)]))) (readArboricxExecutableFromHash " ++ bytesExpr (hexTextBytes rootHash) ++ " " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")" - library <- evaluateFile "./lib/arborix.tri" + library <- evaluateFile "./lib/arboricx.tri" let env = evalTricu library (parseTricu input) result env @?= Fork Leaf Leaf ] diff --git a/test/fixtures/false.arboricx b/test/fixtures/false.arboricx new file mode 100644 index 0000000000000000000000000000000000000000..8ee3588c10a4c5ddaa31ffe53e556e57432f40b8 GIT binary patch literal 707 zcma)4y=xRf6u&*vrVznEM2tniW)CB%o#0zVryY> zg;;2jGhk8)21PI!tyF9T1zYtm@B=I?g0pu?B#P)8cpvkA?>D~(bGBK((41+sDnX^R z>cIbU6Fvg9_Kd2ap2hwI&s_`J<#ir3HH_3N7t&d>FZy;+SdVJIO#2qw~ole`vX ziVCbrP&`KKYTc>!@aj^LVt&{e#E^g>naFI2Ig-plBrw!b8;PT3|1R4Bqe}Z73Q75qG=PzNnS@J*gm&`ciPm(Jm!KXqy>iM9Wm5 zLo`}Dd#z8I->fy!d%RZ2r}sDBCYR@1-AU3L8{NP7tuu9J;mi1`eMefd0vMZxuKPAZ5R1KFF@bk|JxZuj)qU9-%B2OoEnKtOMj zxZn#3_&^X3Ui9GO;>mMDJZeBuL1FFL%_31myP>+e>ifR_3i`;*^vRiHtsn@4YSo7E zSE&0m9PTk92(FFV;kT>v;_$vD@n-e=^F3m+{c*jHvzx28k1y`JwEN7Lxpj}LZRge& zM%*B}{3NFN&&|Craq#}#{nM|~509=t-tl^E?Nk53&cVXDTZv-=HI$QJV`F96h>MIX zY-q@2f{q&91MN|1wwGf$MHVq)pje>_7h#F4Acz%4CT=4mUiEK?F(BUt_gW$|9FKw% zD-u-dA0(j@C*a_rX2OjInRUlTG}?|gQV~3{Q=O-vrws6N;x|k_s(V;!i;SekzI1*n zpECD{hc&}&>a{qa_r=}*BpS}2TbFk5*&9_g_e!W{8aBCE%91|!MrC8?1 
xdUl^@pwagX4a0xvbY#Xkq1Au$Ei@8vAXt{nA`tTFUb;2k*8d3q<<{fE;3t6B5eEPO diff --git a/test/fixtures/id.arboricx b/test/fixtures/id.arboricx new file mode 100644 index 0000000000000000000000000000000000000000..1e289cb5d590c8fd2fa352b5671d98652d73174a GIT binary patch literal 946 zcma)4O=uHA6i(70iuNE1LjNoDU@?&j{uD%Qt6=n?O+p*Rs*{v}0Uf6RFwe3|+%+jZ~dc;Hp&;>Od%cK_Qm=9Y?~ z!AxLg;)=q#R&rgLU)-ASY1k(E^4W#@x~Xf=TeDrPb$a*2efdnBST*pmr+X}7VI2&V ztVSZt1;t5k*ft3RuYpV&0ZV!lEz#0dFcpVny%Yc*j2O0=ZFwLG*p?{Fz{A5)z>*Z| zcl0Gdd<9%-=_V&|2wD0FMNCs2rEWwl9c|ySGeWq5sM)*h!~hXfP6`agj1}@Rlwur_ zWYwRbc&VUpny^ht4H%`C8cc1m^_Pn%BoMQaq}*gC7Y1S;E(i#W!c;CxtAn@TUx_k| z;_Aq#q+s=Hm~qP%pkr>t8At%ZxwOuY5M50cYG?|cbU$4;RNJ>_&F0R`yXO5pq3O=L zC#P<@uQvN<7n_LS3^W1}WHYHjGR}o!j}j9ZplVze!++?+s7Z(fVf@p$Zh%AtNiOMi*Oy*58sS+nT-I&*1j_F?ah-8<{tJD0h0m%C78 z(h^8S|Jg$inw43-^^nwq6a`698NDQgAc7TyAYp1`v%4-~5!(!$IWy<`zVjWJy5{<( z=6x-SqAHo?$;RKILZ)hV=Mjq1nJcT`uTF1!+qP?Y@lZNgTJ&Lf?M>y?aCKUby*}=a zULGu7EpeWOT+g|A?y4(I?1DJ-PvMjWZn@w9J`GXLp^ohX7c@tpC5bQ`24~)fE<*Za e=TX5IJPXz|e?DU@{;| z0RL|#446$5Dfu=+G?Mo1?hXafnGivs_}`sa%MGG_BsMRAOJTry6pmb*EjtlP&yuBiN4Y)TnfwyBjM>x4v!6! 
z@k|Db4s&6c&17;p;kg8|Xl6d&U&`bhGBu_g=1pl$zxK84A{$o)j^INN>2=4y)yHEt z5!g&ROd`NUh)3s=U;&HC7LCp$1V9W})W+l@mlfd9C;J@pQoJcNYiu++DVUDiOBcu?A2cF%C;@n2V%p<2`^;Q{g0b$ zk*e<0F-mQ=xx;H7_yg@kCQpB$%PV->c1a*0fsT?hTbn=McsibtFzaoj)rAjFsizKaefv<#%~I%ecvFe2G_+(^`t`>W&63jajG_Eu zr^NAXA`mO_Q-bsT@te!5WO3Wxm@1Is2IEK`<5JPTtW~nVqIzOaHfXwvL8azcuhF)u z-NT#aU$sW`#+F}N`aEV)7b|trqJ1z++SYDOug#~P!m zYKKGK-qOw-#?qe_#}xF%UbIbXQuaK-pLED6q;}ExdX?IqxEYWLBpjC>6b3P>Z&$XH zyghvaHSw3WtWyY?rRit!XB79!srjhKRCK;I*lDEp@#8Nai82-;Ey<@_QqY3|k!X4ym8_SER}L{-vmKk#v(xgJPd8Sj?s$e|(oT=b1x-v?nUq)$byCuqJ3e&ba8#x80URiWE*-La~^--SJ*G1VqnK*V z;u6KR?kiHt^zfa{*gM9K;${Y&?`*OO^dYwgqc(cIv~{%eafoSh!LcWne!s=hOLt^I z4B_Se+b7)vLbSx%+lWC=38#B$nP;0pDNqU zZ#;5(MI?{i-G(L|yH=0&W@1OD(b1j`^=+ZqI;dkB*7uwlE z_4*^NF?`Y7=xNZbK=@ikf1_AaR;T$zm7jikPPo6HNlO zuP7W3-z`QjQcbn|&j%%|{AT6-qs%~>uJ+IgjS1&&zqx`A&=N($ab~fy1hZeRKIC#* z$B;=OEx}vfdeT{ctltjTQ*@1M&11?RG;1H!XZ=|)(vhEUFSv8l}pK~9qR=+#0JLf;K&n|OE^5da}z^0L7E%n(a zADHQsN!#rx{nGGYAw{q4w8ZThLC^$N0|9CFt?=KQqjN-KDMs9%c+;`ak=$9hP*$H8 z>SNzJtUwKk3+#kC3K}*(6ss%Wvg+mWr%k8LrKFjztaFc2RJRWzAfN;SvM&S#w6{SI z1_4<_ub*=ITKtKuLTvyc@-JNB@N*0X@{*?Y)?U|GjuOCE8t;C-%w&co;&PMFJ zL?*@EG1Cmadg;cd?3bZ>ZK$MvW(*bNK_GiUNM=W@Y}&16GjY^|P;8!u=IGu9$>>pz zFja@SdDT1rlb{3<|1aLFhmloBDi;amwh_3n*CjV8b#6WJlqUnP6E~pkmNI=`}DaLlS8(KlV>DU#~D=jXj-64cuRVqGe?kKwQ4v`4>tRQcgUY6U_vFNu%-?peoD zLXS&eyG&-<3{ z*-vcndleJ@t3hc+$xpjC_lro>Q2l*_g`tLqOB=M5&c44FJlClBK0+ec#Q2GevIbEy zdMn6Y5T&3U0cr;s;fhE=F$dKJ2nopOKqU2r?3moN#2}jn)jwY3<@uzG{rR=-b6#@! zF7C{+J$gyP2zTD~S_T1;(``qU>^E}^tGvE9Ee!uts{@?#HMz+T1ZO=Hdu_4&L zzL47)YHPTXRWoxzHDn28;@PtQjbD%2deqg-XiQ2A@VYVU3Qw@5%R~ps^4E z6)1`#dlJdGKvhqg9|jMp5>S3PFNi`_4bbrU_LF@O2?`Xak|9V1jl!d8cp60o3WU%! 
zD#{ZNsSrJVAT$;8^?5e-`vB74hO@A+enbppfudlua45Vti;#*Xcu}w@O>G@743U7s z`2~CzLV(D=EFoVfuqOI`{j@;^;fW|rpvZrXA$|LuHO!k#^P}PjP!5j5S{3USP5B=U zK2QJ(PeW00L_fSAkxW40RfuE^L`I`Lag^`Ktb5h3;#rt@G>XLfipNoL5ao(D8c(Fq z$PgAyB#`hB6@th_B9#)6g~76xmJiO77=>aSpVu+wn+VbE_+;<}VoPRL%Tx6n)D z7_G4wG7%4XV<0aS4Nvuk0!c(NYjzIC4@H1j&xlm`vI`tvoF4{5g~){eryEOvs3;7I zips>H$que5OzSeI$ML@hs)eA6@Zutw^-Y_3J4_wwT-OWtl>+3B>$HKhMy#BN?VTJwtD;SJ z4==vGVvk&9Mjl-H`9YfcOM%~Wf3FKu_p4aDQf)KIx$k^T`w67u)R&?*%g^ycGQunp zNyI3U-m+JIHojIyHd&;#)Um{FgnV*O)tMBvVv!pbszOprk|vvr-IsikGb?}ga6K>h zrO0t_Llk0dJSw9sV?G{MdhF))%j43MdzR{3kLgY(U6D}@-7nh|)8uCUYQkruKc)ov z%JS{=LPa63X1J@`C5qCHtR^@pL)fpXG!=iaX#J-eD)TP37%>q2(x)N9c( zi;$TX)5F!<`bN&XGRNfA-&)+q)|As_Ti1KeB|*`{?%O*J=2mMy4#p!gk;2mLa{_Pq zYJ}?FG}`}?dqmY#N8~YSD)Pinn(H@@f4-Ryll*!_dPG;&^{!Q7;e^E=(uVby4ZIv( zfAQK;UZqpGkrmLMfGh#qJYqs1C$MkMe3JZx9>;6;eq;VN!Bh2RZee+!^ypJnUOjte zrKhQGs3aGL>zRLNRlZ1@&1zqJ|8P(VZ(W|$01N`M*@@~)4ue)qw=pe3vet0Kw3x;w zwY1x;@TP!$U(L0JG#F&iBF?_*xQ(`C%?Gubm`zgqEvDCo{qd=6yF{LO&xMwgj2THL zdql;j)zAG(sQ%SkO6yV0f-&o5^Ag+n`@AP%o0YabbiefWXrE1jKgvp7b+Cc0D2S00 z)-V`U#@A8lbl@z#oE#~!jqAQZo?0U&zRQv22-lp+OYila2|tH>IBUqK$uB)U^rn>m zLvQW79?9wR#|r9f4;HJQ>9iEoUC4n+K!%^dE`BJeiL2zl`jg;bi@=)=`3C(gMV-T! zi(mfZ(bl*lNW`v4JnqOh{k3lTaamo0y-`?!-I{T%IlyFbL>G zfy@CfH~Wmi^)1GLVdK!qY}C}{@N0<%nH7mQEB*F0=dbmJrnN6&w24RAQxEPxnC)8j z!a99Iyz--10;kTHOX3g9YyqAoITjqT$D8@~&_4FHobE)vzoa;RpZFK8oAG*U&wdh2 zay`&da9caCc&}M-$EQQy&mhz8^Ur4T>MMkec|I`d#UStTa=zc{aPKGY&G+mpu{vEs zOV~J*_;S98F_94O25v$YOwxh#FCbA;a67qwRG6vxNFLM|yDeo|-9bikbJQ?v@gNec zE@u!G+m?K7GWJxl21#=VaX((3K~J;$mQurqf^6M3u9`P!Gms^>gY? 
z__jToc3d+p*9s~{%o0jJfmQ-?VHSxbVr`Yl<2!vUTn*tEFPd(@qh83prQN}8E7;1@ z#~oO;0t&UTQqacznXJd)F4*!5j(5JeVmv9N5#u}@WZY4$fx zw36m;Wme&E6=PKQ2S+q@&qYo7wp5qRH0eHVElMZH4CpYxN!RH0r{sz5 zC|K#c^V79QR{pY!JrzFa+B*8iTZ&(CxMxvsHYa#5Le0VOfmRr^x=)(3>f;i0Xh$R1 zd1v-$a$j;ANK!ycv z3~V#d+3X8BSDHzGK0q4svQ{m-a_Z;xlvCRbo?(4~JFS{CXf06v3zLA#bx^znxgcx~ zP~-+323RR5M}hnh$4`Y0^ zvnH$dxr}9W7>9U`29o~4d^=6sTOWRUA$9BAcrmOLd+a7&6@yoyr}Am2`I{d8emYog-Jj* R2jzWGPXLpE>Qhh={4azV%y$3) diff --git a/test/fixtures/notQ.arboricx b/test/fixtures/notQ.arboricx new file mode 100644 index 0000000000000000000000000000000000000000..80c94bec872b255cfe3ca663b5473e728a7547d1 GIT binary patch literal 1180 zcma)4Ye*DP6yDvyB()`y0&U5qC&Vrps1Q;3#3d@*q!uLY?#$hp*}XG&x^qW&DZ()A z@h34Ve^j6utPtt;0I6897b2p91pSaeD$GLGvInF^cX!dSi17}bbMKt*JLmi6PNBQ# zfZJ8vG;9X>NT-hBRb7{w-r-gSQuKeF~j z^e;6?WjM=XuH{OSYe|XP`eK#-f{bF9>)BE09FM&>-9DAt?a!YZzWVZ3WG?K}I1$Pq zm<@;*6Nw8ty}H69=#+sX`JhIfwYmP}tICiHgX<}Q(7^%3=}6ZcFbFxF0xyCN?Da#A z^6~mjaUP&9g0o(tf_YdDG_hR*NMIAGDEl-K?8wXa@mL0uQn#Q)h9tn0#2Jhki029r zLKyJDtiM5OvcPZx(G@DmutU}9ci42ef6CM%{cqO{fnjM^R z{j4eCfXa?Yq7u~V1rA|NCs5?DtRhGuBp6febUQCHXP%t)<_GTC{d>zssvg=`)=%Xh z+~w~4*0p`(;Ndyz*L&t0Ji!PGJoEt_Q2_>3Oc=X^SAYyz4bwzeLZ5)7@H~Y?{;zOR zh7|CCf;NdG5Kzb$g~WM>$-l&ul3hG|(p4%C_H|lev*I1 zaCUP`PiIJuc0C>yjvVsTZsJy~c{IeD6k3cX^?qEjxk-)qhSlE7*0U9Fq8lQU+kPDTu(R~qmP`F_ zf0DlGNF;4!EOSt6w!33Z(~S)Sw$|!G6dMxmG?qO{3^7t_G{lI{hL}x8cbrgS^i7QH M7%5$nT_Zcc0gFoJP5=M^ literal 0 HcmV?d00001 diff --git a/test/fixtures/notQ.arborix b/test/fixtures/notQ.arborix deleted file mode 100644 index 3d1b7e848b211c75299e34a738fa78728fdebe35..0000000000000000000000000000000000000000 GIT binary patch literal 0 HcmV?d00001 literal 1174 zcma)4ZAcVR6rJ6`K#4$MQ6p9>{nc(MDd8V=p-j;%6H`)_w=;KVW_M=ZHuGkgr4ThW zA+w-wD98^qlSCu*M+JruC`BMi2z?q7C4pdJifC_lTcRSyH}K}Zch0%@+{-JhaFtgS zS2K)-2`_78|GlNr&=UD#42HQLg)Kk6CuKLpKWpLj`}GU&2+n8aF<&UpTf{)RsUjrL*ZF=Hz_UP`bUiJK46d&v&jF?szp;02vvn zNcCVjww4yLaw;jW z&^lN4p!#mdWn<52bWgIUbdEh;?OUJAB@0uJ+)D%2^)};p;#yAM(B^}uUVO6g!*Jkb 
z`S91ivC(8#{JXjSA+wQ0inNs7nAl!0ur)7BOZ?DotG`)bsr6pSNVD-t?z)FHrDjge z^pPT8n%gJbAD4IE%A8!8K70vU-n!S;$BRmreO0da8BNJ%otWVp8kgBP&2U=8!kBeJ KN$D1|Og{nDW$S|g diff --git a/test/fixtures/true.arboricx b/test/fixtures/true.arboricx new file mode 100644 index 0000000000000000000000000000000000000000..565d1abfd49dcae168fe4f84d6dabd07cb80f16d GIT binary patch literal 776 zcma)4O=uHA6i(vB9z3K)gqG4&5kyU-#UBt6Of6b3l^8IB2ybWJ?oM`QXPuo*H~v8U z0l`~)5HCHn_E7QQRk8G9Jqca{BHr}i4}#bhFACDxq^(qu`Uc+5yzhPUJ(v@v;%I4P zsFF&j63evwZ;lGX7~Pck2Bh8K2jUOV(V+{*E)* z3kMA@vvD?aZPd_8N2#6OKJsL0n$;0z1RVkRZwUn;6t10I29(&xTZ0(f33ii&{6^g6XE3WEGt>uxrYCjF{RfZ7d`dMc@lWBdKn)Z$OfjMmG}2<-UE6T= xmt;26YjFMPi6;;1V*c@5(>Z^xQrk@?c5Lgqy6g>1jxBZ{$@W(^>l>1EY6UfVB0K;9 literal 0 HcmV?d00001 diff --git a/test/fixtures/true.arborix b/test/fixtures/true.arborix deleted file mode 100644 index 4fb2275df6a9b6ec254a7aa2b84c317cebf5b044..0000000000000000000000000000000000000000 GIT binary patch literal 0 HcmV?d00001 literal 770 zcma)4O=uHA6yD^f2TvYaskUOzRHq+q+TVL% z=;ctw;*#sR$y+V1qdK0AZ{PK?_G-46+FYCOdZ!PK%^f>(w72?VcQ6z#}kB20V~T08dFpQ4w&NZY$e7-7s%8If2k#IWr9P zWsh*FBZWSZBIIZgm6S$LRH$#q%JzJ{cH@2S%hK)#{U3hKpSg|KKjl{HZ;lTy6~;23 z`>u3QB{_N&^AK_4VH`?j*;6zCfwm`2GyI26kp+|*RN`O0eSroj7?@;4AsD2YvbQ$h y+W$x{oPM62xp^nvGY(IAcjx5mx69M@g+cRuu9~YoN!Bec;cPUL1?^01q_heJ3@*q3