Arborix -> Arboricx rename

2026-05-08 09:12:20 -05:00
parent e3117e3ac8
commit 343ecbf4c4
29 changed files with 315 additions and 324 deletions


@@ -40,7 +40,7 @@ nix build .#
 | `REPL.hs` | Interactive Read-Eval-Print Loop (haskeline) |
 | `Research.hs` | Core types, `apply` reduction, booleans, marshalling (`ofString`, `ofNumber`), output formatters (`toAscii`, `toTernaryString`, `decodeResult`) |
 | `ContentStore.hs` | SQLite-backed term persistence |
-| `Wire.hs` | Arborix portable wire format — encode/decode/import/export of Merkle-DAG bundle blobs |
+| `Wire.hs` | Arboricx portable wire format — encode/decode/import/export of Merkle-DAG bundle blobs |
 ### File extensions
@@ -123,14 +123,14 @@ NLeaf → 0x00
 NStem(h) → 0x01 || h (32 bytes)
 NFork(l,r) → 0x02 || l (32 bytes) || r (32 bytes)
-hash = SHA256("arborix.merkle.node.v1" <> 0x00 <> serialized_node)
+hash = SHA256("arboricx.merkle.node.v1" <> 0x00 <> serialized_node)
 ```
 This is stored in SQLite via `ContentStore.hs`. Hash suffixes on identifiers (e.g., `foo_abc123...`) are validated: 64 hex characters (SHA256).
-## 7. Arborix Portable Wire Format
+## 7. Arboricx Portable Wire Format
-The **Arborix wire format** (module `Wire.hs`) defines a portable binary bundle for exchanging Tree Calculus terms, their Merkle DAGs, and associated metadata. It is versioned and schema-driven.
+The **Arboricx wire format** (module `Wire.hs`) defines a portable binary bundle for exchanging Tree Calculus terms, their Merkle DAGs, and associated metadata. It is versioned and schema-driven.
 ### Header
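As a standalone illustration of the node serialization and domain-separated hashing in the hunk above — a sketch only, with illustrative helper names; the project's real implementations live in `src/Wire.hs` and the JS runtime's `merkle.js`:

```javascript
import { createHash } from "node:crypto";

// Domain tag per the manifest: tree.nodeHash.domain = "arboricx.merkle.node.v1"
const DOMAIN_TAG = "arboricx.merkle.node.v1";

// Serialize a node payload: Leaf -> 0x00, Stem -> 0x01 || h, Fork -> 0x02 || l || r
function serializeNode(node) {
  switch (node.type) {
    case "leaf":
      return Buffer.from([0x00]);
    case "stem":
      // node.child: 32-byte raw hash of the single subtree
      return Buffer.concat([Buffer.from([0x01]), node.child]);
    case "fork":
      // node.left / node.right: 32-byte raw hashes of the two subtrees
      return Buffer.concat([Buffer.from([0x02]), node.left, node.right]);
    default:
      throw new Error(`unknown node type: ${node.type}`);
  }
}

// hash = SHA256(domain_tag || 0x00 || serialized_node), hex-encoded
function computeNodeHash(node) {
  const h = createHash("sha256");
  h.update(DOMAIN_TAG);
  h.update(Buffer.from([0x00]));
  h.update(serializeNode(node));
  return h.digest("hex");
}
```

The test suite's golden Leaf hash under the Arboricx domain is the reference point for checking any reimplementation of this scheme.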
@@ -143,7 +143,7 @@ The **Arborix wire format** (module `Wire.hs`) defines a portable binary bundle
 +------------------+-----------------+------------------+
 ```
-- **Magic**: `ARBORIX\0` (`0x41 0x52 0x42 0x4f 0x52 0x49 0x58 0x00`)
+- **Magic**: `ARBORICX` (`0x41 0x52 0x42 0x4f 0x52 0x49 0x43 0x58`)
 - **Header length**: 32 bytes
 - **Major version**: `1` | **Minor version**: `0`
@@ -172,18 +172,18 @@ The manifest describes the bundle's semantics, exports, and schema. Key fields:
 | Field | Value | Description |
 |-------|-------|-------------|
-| `schema` | `"arborix.bundle.manifest.v1"` | Manifest schema version |
+| `schema` | `"arboricx.bundle.manifest.v1"` | Manifest schema version |
 | `bundleType` | `"tree-calculus-executable-object"` | Bundle category |
 | `tree.calculus` | `"tree-calculus.v1"` | Tree calculus version |
 | `tree.nodeHash.algorithm` | `"sha256"` | Hash algorithm |
-| `tree.nodeHash.domain` | `"arborix.merkle.node.v1"` | Hash domain string |
+| `tree.nodeHash.domain` | `"arboricx.merkle.node.v1"` | Hash domain string |
-| `tree.nodePayload` | `"arborix.merkle.payload.v1"` | Payload encoding |
+| `tree.nodePayload` | `"arboricx.merkle.payload.v1"` | Payload encoding |
 | `runtime.semantics` | `"tree-calculus.v1"` | Evaluation semantics |
-| `runtime.abi` | `"arborix.abi.tree.v1"` | Runtime ABI |
+| `runtime.abi` | `"arboricx.abi.tree.v1"` | Runtime ABI |
 | `closure` | `"complete"` | Bundle must be a complete DAG |
 | `roots` | `[{"hash": "...", "role": "..."}]` | Named root hashes |
 | `exports` | `[{"name": "...", "root": "..."}]` | Export aliases for roots |
-| `metadata.createdBy` | `"arborix"` | Originator |
+| `metadata.createdBy` | `"arboricx"` | Originator |
 ### Section 2 — Nodes (Binary)
@@ -252,7 +252,7 @@ tricu/
 │ ├── REPL.hs
 │ ├── Research.hs
 │ ├── ContentStore.hs
-│ └── Wire.hs # Arborix portable wire format
+│ └── Wire.hs # Arboricx portable wire format
 ├── test/
 │ ├── Spec.hs # Tasty + HUnit tests
 │ ├── *.tri # tricu test programs
@@ -270,9 +270,9 @@ tricu/
 └── AGENTS.md # This file
 ```
-## 9. JS Arborix Runtime
+## 9. JS Arboricx Runtime
-A JavaScript implementation of the Arborix portable bundle runtime lives in `ext/js/`.
+A JavaScript implementation of the Arboricx portable bundle runtime lives in `ext/js/`.
 It is a reference implementation — not a tricu source parser. It reads `.tri.bundle` files produced by the Haskell toolchain, verifies Merkle node hashes, reconstructs tree values, and reduces them.
 From project root:


@@ -1,8 +1,8 @@
-# Arborix Portable Bundle v1 (CBOR Manifest Profile)
+# Arboricx Portable Bundle v1 (CBOR Manifest Profile)
 Status: **Draft, implementation-aligned** (derived from `src/Wire.hs` as of 2026-05-07)
-This document specifies the **actual on-wire format and validation behavior** currently implemented by `tricu` for Arborix bundles, with a focus on the newer CBOR manifest path.
+This document specifies the **actual on-wire format and validation behavior** currently implemented by `tricu` for Arboricx bundles, with a focus on the newer CBOR manifest path.
 ---
@@ -38,7 +38,7 @@ A bundle is a byte stream:
 | Field | Size | Encoding | Value / Notes |
 |---|---:|---|---|
-| Magic | 8 | raw bytes | `41 52 42 4f 52 49 58 00` (`"ARBORIX\0"`) |
+| Magic | 8 | raw bytes | `41 52 42 4f 52 49 43 58` (`"ARBORICX"`) |
 | Major | 2 | u16 BE | Must be `1` |
 | Minor | 2 | u16 BE | Currently `0` |
 | SectionCount | 4 | u32 BE | Number of section directory entries |
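The header fields above can be exercised with a small self-contained sketch. The helper names `buildHeader`/`parseHeader` are illustrative, not the project's API, and bytes 16–31 are zeroed as reserved padding — an assumption, since this excerpt only specifies the first 16 bytes of the 32-byte header:

```javascript
const MAGIC = Buffer.from("ARBORICX", "ascii"); // 41 52 42 4f 52 49 43 58
const HEADER_LENGTH = 32;

// Build a v1 header; bytes 16..31 are left zeroed here (assumed reserved).
function buildHeader(sectionCount) {
  const buf = Buffer.alloc(HEADER_LENGTH, 0);
  MAGIC.copy(buf, 0);                  // bytes 0..7:  magic
  buf.writeUInt16BE(1, 8);             // bytes 8..9:  major, must be 1
  buf.writeUInt16BE(0, 10);            // bytes 10..11: minor, currently 0
  buf.writeUInt32BE(sectionCount, 12); // bytes 12..15: section count
  return buf;
}

function parseHeader(buf) {
  if (buf.length < HEADER_LENGTH) throw new Error("truncated header");
  if (!buf.subarray(0, 8).equals(MAGIC)) {
    throw new Error("invalid magic: expected ARBORICX");
  }
  const major = buf.readUInt16BE(8);
  if (major !== 1) throw new Error(`unsupported bundle major version: ${major}`);
  return {
    major,
    minor: buf.readUInt16BE(10),
    sectionCount: buf.readUInt32BE(12),
  };
}
```

Note the big-endian encodings throughout; a reader that defaults to little-endian would misparse every multi-byte field.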
@@ -143,18 +143,18 @@ Unknown metadata keys are ignored.
 Writers in `Wire.hs` currently emit:
-- `schema = "arborix.bundle.manifest.v1"`
+- `schema = "arboricx.bundle.manifest.v1"`
 - `bundleType = "tree-calculus-executable-object"`
 - `tree.calculus = "tree-calculus.v1"`
 - `tree.nodeHash.algorithm = "sha256"`
-- `tree.nodeHash.domain = "arborix.merkle.node.v1"`
+- `tree.nodeHash.domain = "arboricx.merkle.node.v1"`
-- `tree.nodePayload = "arborix.merkle.payload.v1"`
+- `tree.nodePayload = "arboricx.merkle.payload.v1"`
 - `runtime.semantics = "tree-calculus.v1"`
 - `runtime.evaluation = "normal-order"`
-- `runtime.abi = "arborix.abi.tree.v1"`
+- `runtime.abi = "arboricx.abi.tree.v1"`
 - `runtime.capabilities = []`
 - `closure = "complete"`
-- `metadata.createdBy = "arborix"`
+- `metadata.createdBy = "arboricx"`
--- ---
@@ -249,17 +249,17 @@ These are important design gaps visible from current code.
 Status: **resolved in current codebase**.
 What was wrong:
-- Manifest declared `tree.nodeHash.domain = "arborix.merkle.node.v1"`.
+- Manifest declared `tree.nodeHash.domain = "arboricx.merkle.node.v1"`.
 - Hashing implementation previously used `"tricu.merkle.node.v1"`.
 Current state:
-- Haskell hashing now uses `"arborix.merkle.node.v1"`.
+- Haskell hashing now uses `"arboricx.merkle.node.v1"`.
-- JS reference runtime hashing now uses `"arborix.merkle.node.v1"`.
+- JS reference runtime hashing now uses `"arboricx.merkle.node.v1"`.
-- JS manifest validation now requires `"arborix.merkle.node.v1"`.
+- JS manifest validation now requires `"arboricx.merkle.node.v1"`.
 Remaining recommendation:
 - Keep hash-domain constants centralized/shared to prevent future drift.
-- Add explicit test vectors for Leaf/Stem/Fork hashes under the Arborix domain.
+- Add explicit test vectors for Leaf/Stem/Fork hashes under the Arboricx domain.
 ### Gap B: CBOR decode is order-strict, not generic-map tolerant
@@ -334,6 +334,6 @@ A conforming v1 reader/writer for this profile should:
 To stabilize interoperability, add:
-1. `docs/arborix-bundle-test-vectors.md` (golden header/manifest/nodes + expected hashes).
+1. `docs/arboricx-bundle-test-vectors.md` (golden header/manifest/nodes + expected hashes).
-2. `docs/arborix-bundle-errors.md` (normative error codes/strings).
+2. `docs/arboricx-bundle-errors.md` (normative error codes/strings).
-3. `docs/arborix-bundle-evolution.md` (rules for minor/major upgrades, capability negotiation, extra sections).
+3. `docs/arboricx-bundle-evolution.md` (rules for minor/major upgrades, capability negotiation, extra sections).


@@ -1,17 +1,17 @@
 {
-  "name": "arborix-runtime",
+  "name": "arboricx-runtime",
   "version": "0.1.0",
-  "description": "Arborix portable bundle runtime — JavaScript reference implementation",
+  "description": "Arboricx portable bundle runtime — JavaScript reference implementation",
   "type": "module",
   "main": "src/bundle.js",
   "bin": {
-    "arborix-run": "src/cli.js"
+    "arboricx-run": "src/cli.js"
   },
   "scripts": {
     "test": "node --test test/*.test.js",
     "inspect": "node src/cli.js inspect",
     "run": "node src/cli.js run"
   },
-  "keywords": ["arborix", "tree-calculus", "trie", "runtime"],
+  "keywords": ["arboricx", "tree-calculus", "trie", "runtime"],
   "license": "MIT"
 }


@@ -1,9 +1,9 @@
 /**
- * bundle.js — Parse an Arborix portable bundle binary into a JavaScript object.
+ * bundle.js — Parse an Arboricx portable bundle binary into a JavaScript object.
  *
  * Format (v1):
  *   Header (32 bytes):
- *     Magic        8B  "ARBORIX\0"
+ *     Magic        8B  "ARBORICX"
  *     Major        2B  u16 BE (must be 1)
  *     Minor        2B  u16 BE
  *     SectionCount 4B  u32 BE
@@ -27,7 +27,7 @@ import { decodeCbor } from "./cbor.js";
 // ── Constants ───────────────────────────────────────────────────────────────
-const MAGIC = Buffer.from([0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00]); // "ARBORIX\0"
+const MAGIC = Buffer.from([0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58]); // "ARBORICX"
 const HEADER_LENGTH = 32;
 const SECTION_ENTRY_LENGTH = 60;
 const SECTION_MANIFEST = 1;
@@ -69,7 +69,7 @@ export function parseBundle(buffer) {
   // Check magic
   if (!buffer.slice(0, 8).equals(MAGIC)) {
-    throw new Error("invalid magic: expected ARBORIX\\0");
+    throw new Error("invalid magic: expected ARBORICX");
   }
   // Parse header


@@ -1,5 +1,5 @@
 /**
- * cbor.js — Minimal CBOR decoder for the Arborix manifest format.
+ * cbor.js — Minimal CBOR decoder for the Arboricx manifest format.
  *
  * Decodes the canonical CBOR produced by the Haskell cborg library:
  *   - Maps: major type 5 (0xa0 + length)


@@ -1,6 +1,6 @@
 #!/usr/bin/env node
 /**
- * cli.js — Minimal CLI for inspecting and running Arborix bundles.
+ * cli.js — Minimal CLI for inspecting and running Arboricx bundles.
  *
  * Usage:
  *   node cli.js inspect <bundle>
@@ -240,7 +240,7 @@ switch (command) {
     break;
   }
   default:
-    console.log("Arborix JS Runtime");
+    console.log("Arboricx JS Runtime");
     console.log("");
     console.log("Usage:");
     console.log("  node cli.js inspect <bundle>");


@@ -13,7 +13,7 @@
  * Throws on violation.
  */
 export function validateManifest(manifest) {
-  if (manifest.schema !== "arborix.bundle.manifest.v1") {
+  if (manifest.schema !== "arboricx.bundle.manifest.v1") {
     throw new Error(
       `unsupported manifest schema: ${manifest.schema}`
     );
@@ -33,12 +33,12 @@ export function validateManifest(manifest) {
       `unsupported node hash algorithm: ${tree.nodeHash.algorithm}`
     );
   }
-  if (tree.nodeHash.domain !== "arborix.merkle.node.v1") {
+  if (tree.nodeHash.domain !== "arboricx.merkle.node.v1") {
     throw new Error(
       `unsupported node hash domain: ${tree.nodeHash.domain}`
     );
   }
-  if (tree.nodePayload !== "arborix.merkle.payload.v1") {
+  if (tree.nodePayload !== "arboricx.merkle.payload.v1") {
     throw new Error(`unsupported node payload: ${tree.nodePayload}`);
   }
@@ -46,7 +46,7 @@ export function validateManifest(manifest) {
   if (runtime.semantics !== "tree-calculus.v1") {
     throw new Error(`unsupported runtime semantics: ${runtime.semantics}`);
   }
-  if (runtime.abi !== "arborix.abi.tree.v1") {
+  if (runtime.abi !== "arboricx.abi.tree.v1") {
     throw new Error(`unsupported runtime ABI: ${runtime.abi}`);
   }
   if (runtime.capabilities && runtime.capabilities.length > 0) {


@@ -7,14 +7,14 @@
  *   Fork: 0x02 || left_hash (32 bytes raw) || right_hash (32 bytes raw)
  *
  * Hash computation:
- *   hash = SHA256( "arborix.merkle.node.v1" || 0x00 || node_payload )
+ *   hash = SHA256( "arboricx.merkle.node.v1" || 0x00 || node_payload )
  */
 import { createHash } from "node:crypto";
 // ── Constants ───────────────────────────────────────────────────────────────
-const DOMAIN_TAG = "arborix.merkle.node.v1";
+const DOMAIN_TAG = "arboricx.merkle.node.v1";
 const HASH_LENGTH = 32; // raw hash bytes
 const HEX_LENGTH = 64;  // hex-encoded hash length


@@ -19,7 +19,7 @@ const fixtureDir = "../../test/fixtures";
 describe("bundle parsing", () => {
   it("valid bundle parses header and sections", () => {
     const bundle = parseBundle(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     strictEqual(bundle.version, "1.0");
     strictEqual(bundle.sectionCount, 2);
@@ -29,23 +29,23 @@ describe("bundle parsing", () => {
   it("parseManifest returns valid manifest", () => {
     const manifest = parseManifest(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
-    strictEqual(manifest.schema, "arborix.bundle.manifest.v1");
+    strictEqual(manifest.schema, "arboricx.bundle.manifest.v1");
     strictEqual(manifest.bundleType, "tree-calculus-executable-object");
     strictEqual(manifest.closure, "complete");
     strictEqual(manifest.tree.calculus, "tree-calculus.v1");
     strictEqual(manifest.tree.nodeHash.algorithm, "sha256");
-    strictEqual(manifest.tree.nodeHash.domain, "arborix.merkle.node.v1");
+    strictEqual(manifest.tree.nodeHash.domain, "arboricx.merkle.node.v1");
     strictEqual(manifest.runtime.semantics, "tree-calculus.v1");
-    strictEqual(manifest.runtime.abi, "arborix.abi.tree.v1");
+    strictEqual(manifest.runtime.abi, "arboricx.abi.tree.v1");
   });
 });
 describe("hash verification", () => {
   it("valid bundle nodes verify", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const { nodeMap } = parseNodes(data);
     const { verified } = verifyNodeHashes(nodeMap);
@@ -62,20 +62,20 @@ describe("errors", () => {
   it("unsupported version fails", () => {
     const buf = Buffer.alloc(32, 0);
-    buf.write("ARBORIX\0", 0, 8);
+    buf.write("ARBORICX", 0, 8);
     buf.writeUInt16BE(2, 8); // major version 2
     throws(() => parseBundle(buf), /unsupported bundle major version/);
   });
   it("bad section digest fails", () => {
-    const buf = readFileSync(`${fixtureDir}/id.arborix`);
+    const buf = readFileSync(`${fixtureDir}/id.arboricx`);
     // Corrupt one byte in the manifest section
     buf[152] ^= 0x01;
     throws(() => parseBundle(buf), /digest mismatch/);
   });
   it("truncated bundle fails", () => {
-    const buf = readFileSync(`${fixtureDir}/id.arborix`);
+    const buf = readFileSync(`${fixtureDir}/id.arboricx`);
     const truncated = buf.slice(0, 40);
     throws(() => parseBundle(truncated), /truncated/);
   });
@@ -83,33 +83,33 @@ describe("errors", () => {
   it("missing nodes section fails", () => {
     // Build a bundle with only manifest entry in the directory (1 section instead of 2)
     const header = Buffer.alloc(32, 0);
-    header.write("ARBORIX\0", 0, 8);
+    header.write("ARBORICX", 0, 8);
     header.writeUInt16BE(1, 8);  // major version
     header.writeUInt16BE(0, 10); // minor version
     header.writeUInt32BE(1, 12); // 1 section
     // Build a manifest JSON
     const manifestObj = {
-      schema: "arborix.bundle.manifest.v1",
+      schema: "arboricx.bundle.manifest.v1",
       bundleType: "tree-calculus-executable-object",
       tree: {
         calculus: "tree-calculus.v1",
         nodeHash: {
           algorithm: "sha256",
-          domain: "arborix.merkle.node.v1"
+          domain: "arboricx.merkle.node.v1"
         },
-        nodePayload: "arborix.merkle.payload.v1"
+        nodePayload: "arboricx.merkle.payload.v1"
       },
       runtime: {
         semantics: "tree-calculus.v1",
         evaluation: "normal-order",
-        abi: "arborix.abi.tree.v1",
+        abi: "arboricx.abi.tree.v1",
         capabilities: []
       },
       closure: "complete",
       roots: [{ hash: Buffer.alloc(32).toString("hex"), role: "default" }],
-      exports: [{ name: "root", root: Buffer.alloc(32).toString("hex"), kind: "term", abi: "arborix.abi.tree.v1" }],
+      exports: [{ name: "root", root: Buffer.alloc(32).toString("hex"), kind: "term", abi: "arboricx.abi.tree.v1" }],
-      metadata: { createdBy: "arborix" }
+      metadata: { createdBy: "arboricx" }
     };
     const manifestJson = JSON.stringify(manifestObj);
     const manifestBytes = Buffer.from(manifestJson);


@@ -51,35 +51,35 @@ describe("merkle — computeNodeHash", () => {
     strictEqual(hash.length, 64);
   });
-  it("Leaf hash matches expected Arborix domain", () => {
+  it("Leaf hash matches expected Arboricx domain", () => {
     const leaf = { type: "leaf" };
     const hash = computeNodeHash(leaf);
-    strictEqual(hash, "e54db458aa8e94782f7c61ad6c1f19a1c0c6fca7ffe53674f0d2bc5ff7ab02ff");
+    strictEqual(hash, "92b8a9796dbeafbcd36757535876256392170d137bf36b319d77f11a37112158");
   });
 });
 describe("merkle — node section parsing", () => {
   const fixtureDir = "../../test/fixtures";
-  it("parses id.arborix with correct node count", () => {
+  it("parses id.arboricx with correct node count", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     strictEqual(nodeMap.size, 4);
   });
-  it("parses true.arborix with correct node count", () => {
+  it("parses true.arboricx with correct node count", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/true.arborix`)
+      readFileSync(`${fixtureDir}/true.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     strictEqual(nodeMap.size, 2);
   });
-  it("parses false.arborix with correct node count", () => {
+  it("parses false.arboricx with correct node count", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/false.arborix`)
+      readFileSync(`${fixtureDir}/false.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     strictEqual(nodeMap.size, 1);
@@ -89,29 +89,29 @@ describe("merkle — node section parsing", () => {
 describe("merkle — hash verification", () => {
   const fixtureDir = "../../test/fixtures";
-  it("id.arborix nodes all verify", () => {
+  it("id.arboricx nodes all verify", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     const { verified, mismatches } = verifyNodeHashes(nodeMap);
-    ok(verified, "id.arborix node hashes should verify");
+    ok(verified, "id.arboricx node hashes should verify");
     strictEqual(mismatches.length, 0);
   });
-  it("true.arborix nodes all verify", () => {
+  it("true.arboricx nodes all verify", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/true.arborix`)
+      readFileSync(`${fixtureDir}/true.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     const { verified, mismatches } = verifyNodeHashes(nodeMap);
-    ok(verified, "true.arborix node hashes should verify");
+    ok(verified, "true.arboricx node hashes should verify");
     strictEqual(mismatches.length, 0);
   });
   it("corrupted node payload fails hash verification", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     // Find a stem node to corrupt
@@ -137,23 +137,23 @@ describe("merkle — hash verification", () => {
 describe("merkle — closure verification", () => {
   const fixtureDir = "../../test/fixtures";
-  it("id.arborix has complete closure", () => {
+  it("id.arboricx has complete closure", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     const { complete, missing } = verifyClosure(nodeMap);
-    ok(complete, "id.arborix should have complete closure");
+    ok(complete, "id.arboricx should have complete closure");
     strictEqual(missing.length, 0);
   });
   it("verifyRootClosure checks transitive reachability", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     // Use the actual root hash from the fixture's manifest
-    const manifest = parseManifest(readFileSync(`${fixtureDir}/id.arborix`));
+    const manifest = parseManifest(readFileSync(`${fixtureDir}/id.arboricx`));
     const rootHash = manifest.exports[0].root;
     const { complete, missingRoots } = verifyRootClosure(nodeMap, rootHash);
     ok(complete, "root should be reachable");
@@ -162,7 +162,7 @@ describe("merkle — closure verification", () => {
   it("parseNodeSection returns correct node count", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const result = parseNodeSection(data);
     strictEqual(result.count, 4);
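The closure checks these tests exercise amount to a transitive reachability walk over the node map. A sketch under assumed shapes (`{ type, child, left, right }` with hex-string hashes, mirroring the Leaf/Stem/Fork payload tags; the project's own `verifyClosure`/`verifyRootClosure` live in the JS runtime's `merkle.js`):

```javascript
// Walk from a root hash; every referenced child hash must resolve in nodeMap,
// otherwise the bundle's "complete" closure guarantee is violated.
function verifyRootClosure(nodeMap, rootHash) {
  const missing = [];
  const seen = new Set();
  const stack = [rootHash];
  while (stack.length > 0) {
    const h = stack.pop();
    if (seen.has(h)) continue;
    seen.add(h);
    const node = nodeMap.get(h);
    if (node === undefined) {
      missing.push(h); // referenced but absent: closure is incomplete
      continue;
    }
    if (node.type === "stem") stack.push(node.child);
    else if (node.type === "fork") stack.push(node.left, node.right);
    // leaf: no children
  }
  return { complete: missing.length === 0, missing };
}
```

The `seen` set makes the walk terminate even on shared subtrees, which Merkle deduplication produces routinely.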


@@ -9,8 +9,8 @@ import { buildTreeFromNodeMap } from "../src/cli.js";
 const fixtureDir = "../../test/fixtures";
-describe("run bundle — id.arborix", () => {
+describe("run bundle — id.arboricx", () => {
-  const bundle = readFileSync(`${fixtureDir}/id.arborix`);
+  const bundle = readFileSync(`${fixtureDir}/id.arboricx`);
   const manifest = parseManifest(bundle);
   const nodeSectionData = bundleParseNodeSection(bundle);
   const { nodeMap } = parseNodes(nodeSectionData);
@@ -37,8 +37,8 @@ describe("run bundle — id.arborix", () => {
   });
 });
-describe("run bundle — true.arborix", () => {
+describe("run bundle — true.arboricx", () => {
-  const bundle = readFileSync(`${fixtureDir}/true.arborix`);
+  const bundle = readFileSync(`${fixtureDir}/true.arboricx`);
   const manifest = parseManifest(bundle);
   const nodeSectionData = bundleParseNodeSection(bundle);
   const { nodeMap } = parseNodes(nodeSectionData);
@@ -61,8 +61,8 @@ describe("run bundle — true.arborix", () => {
   });
 });
-describe("run bundle — false.arborix", () => {
+describe("run bundle — false.arboricx", () => {
-  const bundle = readFileSync(`${fixtureDir}/false.arborix`);
+  const bundle = readFileSync(`${fixtureDir}/false.arboricx`);
   const manifest = parseManifest(bundle);
   const nodeSectionData = bundleParseNodeSection(bundle);
   const { nodeMap } = parseNodes(nodeSectionData);
@@ -83,8 +83,8 @@ describe("run bundle — false.arborix", () => {
   });
 });
-describe("run bundle — notQ.arborix", () => {
+describe("run bundle — notQ.arboricx", () => {
-  const bundle = readFileSync(`${fixtureDir}/notQ.arborix`);
+  const bundle = readFileSync(`${fixtureDir}/notQ.arboricx`);
   const manifest = parseManifest(bundle);
   const nodeSectionData = bundleParseNodeSection(bundle);
   const { nodeMap } = parseNodes(nodeSectionData);
@@ -100,7 +100,7 @@ describe("run bundle — notQ.arborix", () => {
 });
 describe("run bundle — missing export", () => {
-  const bundle = readFileSync(`${fixtureDir}/id.arborix`);
+  const bundle = readFileSync(`${fixtureDir}/id.arboricx`);
   const manifest = parseManifest(bundle);
   it("nonexistent export fails clearly", () => {
@@ -109,8 +109,8 @@ describe("run bundle — missing export", () => {
 });
 describe("run bundle — auto-select", () => {
-  // true.arborix has only one export, should auto-select
+  // true.arboricx has only one export, should auto-select
-  const bundle = readFileSync(`${fixtureDir}/true.arborix`);
+  const bundle = readFileSync(`${fixtureDir}/true.arboricx`);
   const manifest = parseManifest(bundle);
   it("single export auto-selects", () => {


@@ -36,8 +36,6 @@
       checks.${packageName} = tricuPackageTests;
       checks.default = tricuPackageTests;
-      defaultPackage = self.packages.${system}.default;
       devShells.default = pkgs.mkShell {
         buildInputs = with pkgs; [
           haskellPackages.cabal-install
@@ -51,7 +49,6 @@
tricuPackage tricuPackage
]; ];
}; };
devShell = self.devShells.${system}.default;
packages.${containerPackageName} = pkgs.dockerTools.buildImage { packages.${containerPackageName} = pkgs.dockerTools.buildImage {
name = "tricu"; name = "tricu";


@@ -3,11 +3,11 @@
 !import "bytes.tri" !Local
 !import "binary.tri" !Local
-arborixMagic = [(65) (82) (66) (79) (82) (73) (88) (0)]
-arborixMajorVersion = [(0) (1)]
-arborixMinorVersion = [(0) (0)]
-arborixManifestSectionId = [(0) (0) (0) (1)]
-arborixNodesSectionId = [(0) (0) (0) (2)]
+arboricxMagic = [(65) (82) (66) (79) (82) (73) (67) (88)]
+arboricxMajorVersion = [(0) (1)]
+arboricxMinorVersion = [(0) (0)]
+arboricxManifestSectionId = [(0) (0) (0) (1)]
+arboricxNodesSectionId = [(0) (0) (0) (2)]
 errMissingSection = 4
 errUnsupportedVersion = 5
@@ -20,10 +20,10 @@ nodePayloadLeafTag = 0
 nodePayloadStemTag = 1
 nodePayloadForkTag = 2
-readArborixMagic = (bs : expectBytes arborixMagic bs)
-readArborixHeader = (bs :
-bindResult (readArborixMagic bs)
+readArboricxMagic = (bs : expectBytes arboricxMagic bs)
+readArboricxHeader = (bs :
+bindResult (readArboricxMagic bs)
 (_ afterMagic :
 bindResult (readBytes 2 afterMagic)
 (majorVersion afterMajor :
@@ -296,12 +296,12 @@ beBytesToNat = (bytes :
 u32BEBytesToNat = beBytesToNat
 u64BEBytesToNat = beBytesToNat
-arborixHeaderMajorVersion = (header :
+arboricxHeaderMajorVersion = (header :
 matchPair
 (majorVersion _ : majorVersion)
 header)
-arborixHeaderMinorVersion = (header :
+arboricxHeaderMinorVersion = (header :
 matchPair
 (_ payload :
 matchPair
@@ -309,7 +309,7 @@ arborixHeaderMinorVersion = (header :
 payload)
 header)
-arborixHeaderSectionCount = (header :
+arboricxHeaderSectionCount = (header :
 matchPair
 (_ payload :
 matchPair
@@ -320,7 +320,7 @@ arborixHeaderSectionCount = (header :
 payload)
 header)
-arborixHeaderFlags = (header :
+arboricxHeaderFlags = (header :
 matchPair
 (_ payload :
 matchPair
@@ -334,7 +334,7 @@ arborixHeaderFlags = (header :
 payload)
 header)
-arborixHeaderDirOffset = (header :
+arboricxHeaderDirOffset = (header :
 matchPair
 (_ payload :
 matchPair
@@ -348,22 +348,22 @@ arborixHeaderDirOffset = (header :
 payload)
 header)
-validateArborixHeader = (header rest :
+validateArboricxHeader = (header rest :
 matchBool
 (ok header rest)
 (err errUnsupportedVersion rest)
 (and?
-(bytesEq? arborixMajorVersion (arborixHeaderMajorVersion header))
-(bytesEq? arborixMinorVersion (arborixHeaderMinorVersion header))))
-readArborixContainer = (bs :
-bindResult (readArborixHeader bs)
+(bytesEq? arboricxMajorVersion (arboricxHeaderMajorVersion header))
+(bytesEq? arboricxMinorVersion (arboricxHeaderMinorVersion header))))
+readArboricxContainer = (bs :
+bindResult (readArboricxHeader bs)
 (header afterHeader :
-bindResult (validateArborixHeader header afterHeader)
+bindResult (validateArboricxHeader header afterHeader)
 (validHeader afterValidHeader :
 bindResult (readSectionDirectory
-(u32BEBytesToNat (arborixHeaderSectionCount validHeader))
-(bytesDrop (u64BEBytesToNat (arborixHeaderDirOffset validHeader)) bs))
+(u32BEBytesToNat (arboricxHeaderSectionCount validHeader))
+(bytesDrop (u64BEBytesToNat (arboricxHeaderDirOffset validHeader)) bs))
 (directory afterDirectory :
 bindResult (validateSectionDirectory directory afterDirectory)
 (validDirectory afterValidDirectory :
@@ -403,21 +403,21 @@ sectionBytesOrErr = (sectionId directory containerBytes rest :
 (_ _ : err errMissingSection rest)
 (lookupSectionRecord sectionId directory))
-readArborixSectionBytes = (sectionId bs :
-bindResult (readArborixContainer bs)
+readArboricxSectionBytes = (sectionId bs :
+bindResult (readArboricxContainer bs)
 (container afterContainer :
 matchPair
 (_ directory : sectionBytesOrErr sectionId directory bs afterContainer)
 container))
-readArborixRequiredSections = (bs :
-bindResult (readArborixContainer bs)
+readArboricxRequiredSections = (bs :
+bindResult (readArboricxContainer bs)
 (container afterContainer :
 matchPair
 (_ directory :
-bindResult (sectionBytesOrErr arborixManifestSectionId directory bs afterContainer)
+bindResult (sectionBytesOrErr arboricxManifestSectionId directory bs afterContainer)
 (manifestBytes _ :
-bindResult (sectionBytesOrErr arborixNodesSectionId directory bs afterContainer)
+bindResult (sectionBytesOrErr arboricxNodesSectionId directory bs afterContainer)
 (nodesBytes _ :
 ok (pair manifestBytes nodesBytes) afterContainer)))
 container))
@@ -602,12 +602,12 @@ readNodesSectionComplete = (bs :
 (err errUnexpectedBytes afterNodesSection)
 (bytesNil? afterNodesSection)))
-readArborixNodesSection = (bs :
-bindResult (readArborixContainer bs)
+readArboricxNodesSection = (bs :
+bindResult (readArboricxContainer bs)
 (container afterContainer :
 matchPair
 (_ directory :
-bindResult (sectionBytesOrErr arborixNodesSectionId directory bs afterContainer)
+bindResult (sectionBytesOrErr arboricxNodesSectionId directory bs afterContainer)
 (nodesBytes _ :
 bindResult (readNodesSectionComplete nodesBytes)
 (nodesSection _ : ok nodesSection afterContainer)))
@@ -645,10 +645,10 @@ nodeHashToTree = y (self nodeHash nodeRecords :
 (_ _ : err errMissingNode t)
 (lookupNodeRecord nodeHash nodeRecords))
-readArborixTreeFromHash = (rootHash bs :
-bindResult (readArborixNodesSection bs)
+readArboricxTreeFromHash = (rootHash bs :
+bindResult (readArboricxNodesSection bs)
 (nodesSection afterContainer :
 bindResult (nodeHashToTree rootHash (nodesSectionRecords nodesSection))
 (tree _ : ok tree afterContainer)))
-readArborixExecutableFromHash = readArborixTreeFromHash
+readArboricxExecutableFromHash = readArboricxTreeFromHash


@@ -162,7 +162,7 @@ nsVariable :: String -> String -> String
 nsVariable "" name = name
 nsVariable moduleName name = moduleName ++ "." ++ name
--- | Compile a tricu source file to a standalone Arborix bundle.
+-- | Compile a tricu source file to a standalone Arboricx bundle.
 -- Uses a temp content store so it does not collide with the global one.
 -- Supports multiple named exports; each is stored separately in the
 -- temp store so that resolveExportTarget can look them up by name.


@@ -97,7 +97,7 @@ compileMode = Compile
 , names = def &= help "Definition name(s) to export as bundle roots (comma-separated or repeated -x). Defaults to 'main'."
 &= name "x" &= typ "NAME"
 }
-&= help "Compile a tricu source file into a standalone Arborix portable bundle."
+&= help "Compile a tricu source file into a standalone Arboricx portable bundle."
 &= explicit
 &= name "compile"
@@ -106,7 +106,7 @@ serveMode = Serve
 { host = "127.0.0.1" &= help "Host to bind the server to." &= name "h" &= typ "HOST"
 , port = 8787 &= help "HTTP port to listen on." &= name "p" &= typ "PORT"
 }
-&= help "Start a read-only HTTP server for exporting Arborix bundles."
+&= help "Start a read-only HTTP server for exporting Arboricx bundles."
 &= explicit
 &= name "server"
@@ -182,10 +182,10 @@ main = do
 let exportNames = if null namesArg then [] else map T.pack namesArg
 in compileFile compileInputFile compileOutFile exportNames
 Serve { host = hostStr, port = portNum } -> do
-putStrLn $ "Starting Arborix bundle server on " ++ hostStr ++ ":" ++ show portNum
+putStrLn $ "Starting Arboricx bundle server on " ++ hostStr ++ ":" ++ show portNum
 putStrLn $ " GET /bundle/hash/:hash -- primary endpoint"
 putStrLn $ " GET /bundle/name/:name -- convenience endpoint"
-putStrLn $ " Content-Type: application/vnd.arborix.bundle"
+putStrLn $ " Content-Type: application/vnd.arboricx.bundle"
 runServer hostStr portNum
 runTricu :: String -> String


@@ -85,12 +85,12 @@ serializeNode (NFork l r) = BS.pack [0x02] <> go (decode (encodeUtf8 l)) <> go (
 go (Right bs) = bs
 -- | Hash a node per the Merkle content-addressing spec.
---   hash = SHA256( "arborix.merkle.node.v1" <> 0x00 <> node_payload )
+--   hash = SHA256( "arboricx.merkle.node.v1" <> 0x00 <> node_payload )
 nodeHash :: Node -> MerkleHash
 nodeHash node = decodeUtf8 (encode (sha256WithPrefix (serializeNode node)))
 where sha256WithPrefix payload =
 convert . (hash :: BS.ByteString -> Digest SHA256) $ utf8Tag <> BS.pack [0x00] <> payload
-utf8Tag = BS.pack $ map fromIntegral $ BS.unpack "arborix.merkle.node.v1"
+utf8Tag = BS.pack $ map fromIntegral $ BS.unpack "arboricx.merkle.node.v1"
 -- | Deserialize a Node from canonical bytes.
 deserializeNode :: BS.ByteString -> Node
@@ -138,7 +138,7 @@ toBytes t = case toList t of
 Left err -> Left err
 Right bs -> BS.pack <$> mapM toByte bs
--- | Convert a canonical Arborix node payload (ByteString) to a Tree
+-- | Convert a canonical Arboricx node payload (ByteString) to a Tree
 -- representation (a list of Byte trees).
 nodePayloadToTreeBytes :: BS.ByteString -> T
 nodePayloadToTreeBytes = ofBytes
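Editor's note: the domain-separated hashing scheme in the hunk above (SHA256 over a UTF-8 tag, a 0x00 separator, and the canonical node payload) can be checked independently. The following is an editorial sketch in Python, not part of the commit; it assumes the canonical payloads from the spec (leaf = `0x00`, stem = `0x01` plus the 32-byte child hash) and the hex encoding used by `nodeHash`:

```python
import hashlib

DOMAIN = b"arboricx.merkle.node.v1"

def node_hash(payload: bytes) -> str:
    # hash = SHA256(domain tag || 0x00 || canonical node payload), hex-encoded
    return hashlib.sha256(DOMAIN + b"\x00" + payload).hexdigest()

leaf = node_hash(b"\x00")                        # NLeaf payload
stem = node_hash(b"\x01" + bytes.fromhex(leaf))  # NStem referencing the leaf

# SHA-256 digests are 32 bytes, so hex identifiers are 64 characters.
assert len(leaf) == 64 and len(stem) == 64
assert leaf != stem
```

The leading domain tag plus separator means a node payload can never collide with a hash computed for some other Arboricx structure that uses a different tag.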


@@ -23,7 +23,7 @@ import Data.ByteString.Char8 (unpack)
 import Data.ByteString.Lazy (fromStrict)
 import qualified Data.Text as T
--- | Start an HTTP server that serves Arborix bundles from the
+-- | Start an HTTP server that serves Arboricx bundles from the
 -- local content store.
 --
 -- This is a read-only export surface. Clients fetch bundle bytes
@@ -133,10 +133,10 @@ rootsHandler request respond = do
 (fromStrict bundleData)
 -- | GET /bundle/name/:name
---   Resolve a stored term name, export it as an Arborix bundle,
+--   Resolve a stored term name, export it as an Arboricx bundle,
 -- and return the raw bundle bytes.
 --
--- Sets @Content-Type@ and @X-Arborix-Root-Hash@ headers.
+-- Sets @Content-Type@ and @X-Arboricx-Root-Hash@ headers.
 -- Returns 404 when the name does not resolve to any stored term.
 nameHandler :: Text -> IO Response
 nameHandler nameText = do
@@ -155,7 +155,7 @@ nameHandler nameText = do
 return $ responseLBS status200 (bundleHeaders th cd) (fromStrict bundleData)
 -- | GET /bundle/hash/:hash
---   Resolve a full Merkle hash and export the root as an Arborix
+--   Resolve a full Merkle hash and export the root as an Arboricx
 -- bundle.
 --
 -- - Malformed hash (non-hex or < 16 chars): 400
@@ -207,8 +207,8 @@ textResponse status body =
 bundleHeaders :: Text -> Text -> [Header]
 bundleHeaders root cd =
-[ (hContentType, encodeUtf8 "application/vnd.arborix.bundle")
-, ("X-Arborix-Root-Hash", encodeUtf8 root)
+[ (hContentType, encodeUtf8 "application/vnd.arboricx.bundle")
+, ("X-Arboricx-Root-Hash", encodeUtf8 root)
 , ("Content-Disposition", encodeUtf8 cd)
 ]


@@ -71,7 +71,7 @@ bundleMinorVersion = 0
 -- | Header magic for the portable executable-object container.
 bundleMagic :: ByteString
-bundleMagic = BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00] -- "ARBORIX\0"
+bundleMagic = BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58] -- "ARBORICX"
 headerLength :: Int
 headerLength = 32
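Editor's note: the magic change above keeps the header width at 8 bytes because "ARBORICX" is one character longer than "ARBORIX" but drops the trailing NUL. A quick editorial check (not part of the commit) of both byte lists:

```python
# Old and new header magics, copied from the hunk above.
OLD_MAGIC = bytes([0x41, 0x52, 0x42, 0x4F, 0x52, 0x49, 0x58, 0x00])
NEW_MAGIC = bytes([0x41, 0x52, 0x42, 0x4F, 0x52, 0x49, 0x43, 0x58])

assert OLD_MAGIC == b"ARBORIX\x00"  # seven letters plus a NUL pad
assert NEW_MAGIC == b"ARBORICX"     # eight letters, no pad
assert len(OLD_MAGIC) == len(NEW_MAGIC) == 8  # header layout unchanged
```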
@@ -563,20 +563,20 @@ decodeSectionEntries count bytes = reverse <$> go count bytes []
 defaultManifest :: [(Text, MerkleHash)] -> BundleManifest
 defaultManifest namedRoots = BundleManifest
-{ manifestSchema = "arborix.bundle.manifest.v1"
+{ manifestSchema = "arboricx.bundle.manifest.v1"
 , manifestBundleType = "tree-calculus-executable-object"
 , manifestTree = TreeSpec
 { treeCalculus = "tree-calculus.v1"
 , treeNodeHash = NodeHashSpec
 { nodeHashAlgorithm = "sha256"
-, nodeHashDomain = "arborix.merkle.node.v1"
+, nodeHashDomain = "arboricx.merkle.node.v1"
 }
-, treeNodePayload = "arborix.merkle.payload.v1"
+, treeNodePayload = "arboricx.merkle.payload.v1"
 }
 , manifestRuntime = RuntimeSpec
 { runtimeSemantics = "tree-calculus.v1"
 , runtimeEvaluation = "normal-order"
-, runtimeAbi = "arborix.abi.tree.v1"
+, runtimeAbi = "arboricx.abi.tree.v1"
 , runtimeCapabilities = []
 }
 , manifestClosure = ClosureComplete
@@ -587,7 +587,7 @@ defaultManifest namedRoots = BundleManifest
 , metadataVersion = Nothing
 , metadataDescription = Nothing
 , metadataLicense = Nothing
-, metadataCreatedBy = Just "arborix"
+, metadataCreatedBy = Just "arboricx"
 }
 }
 where
@@ -597,7 +597,7 @@ defaultManifest namedRoots = BundleManifest
 { exportName = name
 , exportRoot = h
 , exportKind = "term"
-, exportAbi = "arborix.abi.tree.v1"
+, exportAbi = "arboricx.abi.tree.v1"
 }
 -- ---------------------------------------------------------------------------
@@ -672,7 +672,7 @@ verifyBundle bundle = do
 verifyManifest :: BundleManifest -> Either String ()
 verifyManifest manifest = do
-when (manifestSchema manifest /= "arborix.bundle.manifest.v1") $
+when (manifestSchema manifest /= "arboricx.bundle.manifest.v1") $
 Left $ "unsupported manifest schema: " ++ unpack (manifestSchema manifest)
 when (manifestBundleType manifest /= "tree-calculus-executable-object") $
 Left $ "unsupported bundle type: " ++ unpack (manifestBundleType manifest)
@@ -683,13 +683,13 @@ verifyManifest manifest = do
 Left $ "unsupported calculus: " ++ unpack (treeCalculus treeSpec)
 when (nodeHashAlgorithm hashSpec /= "sha256") $
 Left $ "unsupported node hash algorithm: " ++ unpack (nodeHashAlgorithm hashSpec)
-when (nodeHashDomain hashSpec /= "arborix.merkle.node.v1") $
+when (nodeHashDomain hashSpec /= "arboricx.merkle.node.v1") $
 Left $ "unsupported node hash domain: " ++ unpack (nodeHashDomain hashSpec)
-when (treeNodePayload treeSpec /= "arborix.merkle.payload.v1") $
+when (treeNodePayload treeSpec /= "arboricx.merkle.payload.v1") $
 Left $ "unsupported node payload: " ++ unpack (treeNodePayload treeSpec)
 when (runtimeSemantics runtimeSpec /= "tree-calculus.v1") $
 Left $ "unsupported runtime semantics: " ++ unpack (runtimeSemantics runtimeSpec)
-when (runtimeAbi runtimeSpec /= "arborix.abi.tree.v1") $
+when (runtimeAbi runtimeSpec /= "arboricx.abi.tree.v1") $
 Left $ "unsupported runtime ABI: " ++ unpack (runtimeAbi runtimeSpec)
 when (not (null (runtimeCapabilities runtimeSpec))) $
 Left "unsupported runtime capabilities"


@@ -786,7 +786,7 @@ wireTests = testGroup "Wire Tests"
 , "main = id t"
 ]
 wireData <- exportBundle srcConn [termHash]
-BS.take 8 wireData @?= BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00]
+BS.take 8 wireData @?= BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58]
 case decodeBundle wireData of
 Left err -> assertFailure $ "decodeBundle failed: " ++ err
 Right bundle -> do
@@ -794,15 +794,15 @@ wireTests = testGroup "Wire Tests"
 tree = manifestTree manifest
 hashSpec = treeNodeHash tree
 runtime = manifestRuntime manifest
-manifestSchema manifest @?= "arborix.bundle.manifest.v1"
+manifestSchema manifest @?= "arboricx.bundle.manifest.v1"
 manifestBundleType manifest @?= "tree-calculus-executable-object"
 manifestClosure manifest @?= ClosureComplete
 treeCalculus tree @?= "tree-calculus.v1"
-treeNodePayload tree @?= "arborix.merkle.payload.v1"
+treeNodePayload tree @?= "arboricx.merkle.payload.v1"
 nodeHashAlgorithm hashSpec @?= "sha256"
-nodeHashDomain hashSpec @?= "arborix.merkle.node.v1"
+nodeHashDomain hashSpec @?= "arboricx.merkle.node.v1"
 runtimeSemantics runtime @?= "tree-calculus.v1"
-runtimeAbi runtime @?= "arborix.abi.tree.v1"
+runtimeAbi runtime @?= "arboricx.abi.tree.v1"
 runtimeCapabilities runtime @?= []
 bundleRoots bundle @?= [termHash]
 map exportRoot (manifestExports manifest) @?= [termHash]
@@ -823,7 +823,7 @@ wireTests = testGroup "Wire Tests"
 exportName exported @?= "validateEmail"
 exportRoot exported @?= termHash
 exportKind exported @?= "term"
-exportAbi exported @?= "arborix.abi.tree.v1"
+exportAbi exported @?= "arboricx.abi.tree.v1"
 exports -> assertFailure $ "Expected one export, got: " ++ show exports
 close srcConn
@@ -1064,9 +1064,9 @@ u32 n = [0,0,0,n]
 u64 :: Integer -> [Integer]
 u64 n = [0,0,0,0,0,0,0,n]
-arborixHeaderBytes :: Integer -> [Integer]
-arborixHeaderBytes sectionCount =
-[65,82,66,79,82,73,88,0]
+arboricxHeaderBytes :: Integer -> [Integer]
+arboricxHeaderBytes sectionCount =
+[65,82,66,79,82,73,67,88]
 ++ u16 1
 ++ u16 0
 ++ u32 sectionCount
@@ -1107,7 +1107,7 @@ simpleContainerBytes :: [Integer] -> [Integer] -> [Integer]
 simpleContainerBytes manifestBytes nodesBytes =
 let manifestOffset = 152
 nodesOffset = manifestOffset + fromIntegral (length manifestBytes)
-in arborixHeaderBytes 2
+in arboricxHeaderBytes 2
 ++ manifestEntryBytes manifestOffset (fromIntegral $ length manifestBytes)
 ++ nodesEntryBytes nodesOffset (fromIntegral $ length nodesBytes)
 ++ manifestBytes
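Editor's note: the `u32`/`u64` test helpers above hard-code leading zero bytes, so they are only valid for values below 256, which is all these fixtures need. A general big-endian encode/decode pair, as a hedged sketch of what the library's `beBytesToNat`-style readers compute (names here are illustrative, not from the commit):

```python
def be_encode(n: int, width: int) -> list[int]:
    # Big-endian byte list of fixed width, most significant byte first.
    return [(n >> (8 * i)) & 0xFF for i in range(width - 1, -1, -1)]

def be_decode(bs: list[int]) -> int:
    # Fold bytes back into a natural number.
    acc = 0
    for b in bs:
        acc = acc * 256 + b
    return acc

assert be_encode(7, 4) == [0, 0, 0, 7]            # matches the u32 helper
assert be_decode(be_encode(0x1234, 8)) == 0x1234  # u64-width round trip
```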
@@ -1115,12 +1115,12 @@ simpleContainerBytes manifestBytes nodesBytes =
 singleSectionContainerBytes :: [Integer] -> [Integer] -> [Integer]
 singleSectionContainerBytes sectionType sectionBytes =
-arborixHeaderBytes 1
+arboricxHeaderBytes 1
 ++ sectionEntryBytes sectionType 92 (fromIntegral $ length sectionBytes)
 ++ sectionBytes
-arborixHeaderT :: Integer -> T
-arborixHeaderT sectionCount =
+arboricxHeaderT :: Integer -> T
+arboricxHeaderT sectionCount =
 pairT (bytesT [0,1])
 (pairT (bytesT [0,0])
 (pairT (bytesT $ u32 sectionCount)
@@ -1615,120 +1615,114 @@ binaryReaderTests = testGroup "Binary Reader Tests"
 result env @?= errT eofT (bytesT [1,2,3])
 -- ------------------------------------------------------------------------
--- Arborix magic recognition
+-- Arboricx magic recognition
 -- ------------------------------------------------------------------------
-, testCase "readArborixMagic: accepts magic and preserves rest" $ do
-let input = "readArborixMagic [(65) (82) (66) (79) (82) (73) (88) (0) (1) (2)]"
-library <- evaluateFile "./lib/arborix.tri"
+, testCase "readArboricxMagic: accepts magic and preserves rest" $ do
+let input = "readArboricxMagic ((append arboricxMagic) [(1) (2)])"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
 result env @?= okT unitT (bytesT [1,2])
-, testCase "readArborixMagic: rejects wrong magic preserving input" $ do
-let input = "readArborixMagic [(65) (82) (66) (79) (82) (73) (88) (1) (9)]"
-library <- evaluateFile "./lib/arborix.tri"
+, testCase "readArboricxMagic: rejects wrong magic preserving input" $ do
+let input = "readArboricxMagic [(65) (83) (66) (79) (82) (73) (67) (88) (1) (9)]"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
-result env @?= errT unexpectedBytesT (bytesT [65,82,66,79,82,73,88,1,9])
+result env @?= errT unexpectedBytesT (bytesT [65,83,66,79,82,73,67,88,1,9])
-, testCase "readArborixMagic: short input returns EOF preserving input" $ do
-let input = "readArborixMagic [(65) (82) (66) (79)]"
-library <- evaluateFile "./lib/arborix.tri"
+, testCase "readArboricxMagic: short input returns EOF preserving input" $ do
+let input = "readArboricxMagic [(65) (82) (66) (79)]"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
 result env @?= errT eofT (bytesT [65,82,66,79])
 -- ------------------------------------------------------------------------
--- Arborix header parsing
+-- Arboricx header parsing
 -- ------------------------------------------------------------------------
-, testCase "readArborixHeader: parses portable header" $ do
-let input = "readArborixHeader " ++ bytesExpr (arborixHeaderBytes 0)
-library <- evaluateFile "./lib/arborix.tri"
+, testCase "readArboricxHeader: parses portable header" $ do
+let input = "readArboricxHeader " ++ bytesExpr (arboricxHeaderBytes 0)
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
-result env @?= okT (arborixHeaderT 0) (bytesT [])
+result env @?= okT (arboricxHeaderT 0) (bytesT [])
-, testCase "readArborixHeader: preserves trailing bytes" $ do
-let input = "readArborixHeader " ++ bytesExpr (arborixHeaderBytes 0 ++ [9,8])
-library <- evaluateFile "./lib/arborix.tri"
+, testCase "readArboricxHeader: preserves trailing bytes" $ do
+let input = "readArboricxHeader " ++ bytesExpr (arboricxHeaderBytes 0 ++ [9,8])
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
-result env @?= okT (arborixHeaderT 0) (bytesT [9,8])
+result env @?= okT (arboricxHeaderT 0) (bytesT [9,8])
-, testCase "readArborixHeader: rejects wrong magic preserving input" $ do
-let input = "readArborixHeader [(65) (82) (66) (79) (82) (73) (88) (1) (0) (1)]"
-library <- evaluateFile "./lib/arborix.tri"
-let env = evalTricu library (parseTricu input)
-result env @?= errT unexpectedBytesT (bytesT [65,82,66,79,82,73,88,1,0,1])
-, testCase "readArborixHeader: short input returns EOF preserving input" $ do
-let input = "readArborixHeader [(65) (82)]"
-library <- evaluateFile "./lib/arborix.tri"
+, testCase "readArboricxHeader: short input returns EOF preserving input" $ do
+let input = "readArboricxHeader [(65) (82)]"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
 result env @?= errT eofT (bytesT [65,82])
 -- ------------------------------------------------------------------------
--- Arborix section directory record parsing
+-- Arboricx section directory record parsing
 -- ------------------------------------------------------------------------
 , testCase "readSectionRecord: parses portable section entry" $ do
 let input = "readSectionRecord " ++ bytesExpr (nodesEntryBytes 16 32)
-library <- evaluateFile "./lib/arborix.tri"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
 result env @?= okT (sectionRecordT nodesSectionIdBytes 16 32) (bytesT [])
 , testCase "readSectionRecord: preserves trailing bytes" $ do
 let input = "readSectionRecord " ++ bytesExpr (nodesEntryBytes 16 32 ++ [9,8])
-library <- evaluateFile "./lib/arborix.tri"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
 result env @?= okT (sectionRecordT nodesSectionIdBytes 16 32) (bytesT [9,8])
 , testCase "readSectionRecord: empty input returns EOF" $ do
 let input = "readSectionRecord []"
-library <- evaluateFile "./lib/arborix.tri"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
 result env @?= errT eofT (bytesT [])
 , testCase "readSectionRecord: short section id returns EOF preserving input" $ do
 let input = "readSectionRecord [(0)]"
-library <- evaluateFile "./lib/arborix.tri"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
 result env @?= errT eofT (bytesT [0])
 , testCase "readSectionRecord: missing section version returns EOF preserving unread bytes" $ do
 let input = "readSectionRecord [(0) (2)]"
-library <- evaluateFile "./lib/arborix.tri"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
 result env @?= errT eofT (bytesT [0,2])
 , testCase "readSectionRecord: short section version returns EOF preserving unread bytes" $ do
 let input = "readSectionRecord [(0) (2) (0) (0) (0)]"
-library <- evaluateFile "./lib/arborix.tri"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
 result env @?= errT eofT (bytesT [0])
 , testCase "readSectionRecord: missing length returns EOF preserving unread length bytes" $ do
 let input = "readSectionRecord [(0) (2) (0) (0) (0) (16)]"
-library <- evaluateFile "./lib/arborix.tri"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
 result env @?= errT eofT (bytesT [])
 , testCase "readSectionRecord: short section flags returns EOF preserving unread bytes" $ do
 let input = "readSectionRecord [(0) (2) (0) (0) (0) (16) (0) (0) (0)]"
-library <- evaluateFile "./lib/arborix.tri"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
 result env @?= errT eofT (bytesT [0])
 -- ------------------------------------------------------------------------
--- Arborix section directory parsing
+-- Arboricx section directory parsing
 -- ------------------------------------------------------------------------
 , testCase "readSectionDirectory: zero records preserves input" $ do
 let input = "readSectionDirectory 0 [(9) (8)]"
-library <- evaluateFile "./lib/arborix.tri"
+library <- evaluateFile "./lib/arboricx.tri"
 let env = evalTricu library (parseTricu input)
 result env @?= okT (ofList []) (bytesT [9,8])
 , testCase "readSectionDirectory: reads requested records and preserves trailing bytes" $ do
let input = "readSectionDirectory 2 " ++ bytesExpr (manifestEntryBytes 10 20 ++ nodesEntryBytes 30 40 ++ [9]) let input = "readSectionDirectory 2 " ++ bytesExpr (manifestEntryBytes 10 20 ++ nodesEntryBytes 30 40 ++ [9])
library <- evaluateFile "./lib/arborix.tri" library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input) let env = evalTricu library (parseTricu input)
result env @?= okT result env @?= okT
(ofList (ofList
@@ -1739,171 +1733,171 @@ binaryReaderTests = testGroup "Binary Reader Tests"
  , testCase "readSectionDirectory: truncated record returns EOF" $ do
      let input = "readSectionDirectory 2 [(0) (1) (0) (0) (0) (10) (0) (0) (0) (20) (0) (2) (0) (0)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT eofT (bytesT [0,0])
  -- ------------------------------------------------------------------------
  -- Arboricx section lookup and raw byte slicing
  -- ------------------------------------------------------------------------
  , testCase "lookupSectionRecord: finds record by raw section id" $ do
      let input = "lookupSectionRecord " ++ bytesExpr nodesSectionIdBytes ++ " [(" ++ "pair " ++ bytesExpr manifestSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 10) ++ " (pair " ++ bytesExpr (u64 20) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ") (" ++ "pair " ++ bytesExpr nodesSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 30) ++ " (pair " ++ bytesExpr (u64 40) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ")]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= justT (sectionRecordT nodesSectionIdBytes 30 40)
  , testCase "lookupSectionRecord: missing section id returns nothing" $ do
      let input = "lookupSectionRecord " ++ bytesExpr [0,0,0,3] ++ " [(" ++ "pair " ++ bytesExpr manifestSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 10) ++ " (pair " ++ bytesExpr (u64 20) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ") (" ++ "pair " ++ bytesExpr nodesSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 30) ++ " (pair " ++ bytesExpr (u64 40) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ")]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= nothingT
  , testCase "byteSlice: extracts requested byte range" $ do
      let input = "byteSlice 2 3 [(10) (11) (12) (13) (14) (15)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= bytesT [12,13,14]
  , testCase "byteSlice: overlong length returns remaining bytes" $ do
      let input = "byteSlice 4 9 [(10) (11) (12) (13) (14) (15)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= bytesT [14,15]
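The behavior the two `byteSlice` tests above pin down — skip a start offset, then take at most the requested length, clamping at end of input — can be modeled in a few lines of plain Haskell. This is an illustrative sketch of the contract the tests check, not the `lib/arboricx.tri` implementation itself:

```haskell
-- Model of byteSlice as exercised by the tests above: drop `off` bytes,
-- then take at most `len`, clamping silently at the end of the input.
-- A sketch of the tested contract, not the tricu library code.
byteSliceModel :: Int -> Int -> [a] -> [a]
byteSliceModel off len = take len . drop off

main :: IO ()
main = do
  print (byteSliceModel 2 3 [10, 11, 12, 13, 14, 15 :: Int])  -- [12,13,14]
  print (byteSliceModel 4 9 [10, 11, 12, 13, 14, 15 :: Int])  -- [14,15]
```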
  -- ------------------------------------------------------------------------
  -- Arboricx minimal container parsing foundation
  -- ------------------------------------------------------------------------
  , testCase "u32BEBytesToNat: decodes zero" $ do
      let input = "u32BEBytesToNat [(0) (0) (0) (0)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= ofNumber 0
  , testCase "u32BEBytesToNat: decodes small section count" $ do
      let input = "u32BEBytesToNat [(0) (0) (0) (2)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= ofNumber 2
  , testCase "u64BEBytesToNat: decodes small node count" $ do
      let input = "u64BEBytesToNat [(0) (0) (0) (0) (0) (0) (0) (2)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= ofNumber 2
  , testCase "u64BEBytesToNat: decodes fixture-scale offset" $ do
      let input = "u64BEBytesToNat [(0) (0) (0) (0) (0) (0) (3) (214)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= ofNumber 982
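Both big-endian decoders above reduce to the same fold — multiply the accumulator by 256 and add each byte — with `u32BEBytesToNat` and `u64BEBytesToNat` differing only in how many bytes the reader hands them. A minimal Haskell model of that fold (not the library implementation), reproducing the fixture value `3 * 256 + 214 = 982`:

```haskell
-- Big-endian byte decoding as a left fold: each byte shifts the
-- accumulator up by one byte position. Models u32BEBytesToNat and
-- u64BEBytesToNat from the tests above; width is just the list length.
beBytesToNat :: [Int] -> Int
beBytesToNat = foldl (\acc b -> acc * 256 + b) 0

main :: IO ()
main = do
  print (beBytesToNat [0, 0, 0, 2])                -- 2
  print (beBytesToNat [0, 0, 0, 0, 0, 0, 3, 214])  -- 3 * 256 + 214 = 982
```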
  , testCase "readArboricxContainer: reads header directory and preserves payload" $ do
      let input = "readArboricxContainer " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204])
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT
        (pairT
          (arboricxHeaderT 2)
          (ofList
            [ sectionRecordT manifestSectionIdBytes 152 3
            , sectionRecordT nodesSectionIdBytes 155 4
            ]))
        (bytesT [101,102,103,201,202,203,204])
  , testCase "readArboricxContainer: truncated directory returns EOF" $ do
      let input = "readArboricxContainer " ++ bytesExpr (arboricxHeaderBytes 1 ++ [0,0])
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT eofT (bytesT [0,0])
  , testCase "readArboricxContainer: rejects unsupported major version" $ do
      let badHeader = [65,82,66,79,82,73,67,88] ++ u16 2 ++ u16 0 ++ u32 0 ++ u64 0 ++ u64 32
          input = "readArboricxContainer " ++ bytesExpr badHeader
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT unsupportedVersionT (bytesT [])
  , testCase "readArboricxContainer: rejects unsupported minor version" $ do
      let badHeader = [65,82,66,79,82,73,67,88] ++ u16 1 ++ u16 1 ++ u32 0 ++ u64 0 ++ u64 32
          input = "readArboricxContainer " ++ bytesExpr badHeader
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT unsupportedVersionT (bytesT [])
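The two `badHeader` fixtures above spell out the 32-byte container header shape: the 8-byte ASCII magic `"ARBORICX"` (`[65,82,66,79,82,73,67,88]`), a big-endian u16 major version, u16 minor version, then u32/u64/u64 tail words. The meanings of those tail words (flags, reserved, header length) are my reading of the fixture values, not something these tests confirm. A self-contained sketch of building such a header:

```haskell
import Data.Bits (shiftR)
import Data.Word (Word8)

-- Fixed-width big-endian encoding, mirroring the u16/u32/u64 fixture helpers.
beBytes :: Int -> Int -> [Word8]
beBytes width n = [fromIntegral (n `shiftR` (8 * i)) | i <- [width - 1, width - 2 .. 0]]

-- 32-byte header as laid out in the badHeader fixtures: "ARBORICX" magic,
-- u16 major, u16 minor, then three tail words. The tail-field names below
-- (flags, reserved, header length) are assumptions, not spec.
headerBytes :: Int -> Int -> [Word8]
headerBytes major minor =
  [65, 82, 66, 79, 82, 73, 67, 88]  -- "ARBORICX"
    ++ beBytes 2 major
    ++ beBytes 2 minor
    ++ beBytes 4 0    -- flags? (assumption; fixtures carry 0)
    ++ beBytes 8 0    -- reserved? (assumption; fixtures carry 0)
    ++ beBytes 8 32   -- header length? (fixtures always carry 32 here)

main :: IO ()
main = print (length (headerBytes 1 0))  -- 32
```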
  , testCase "readArboricxContainer: rejects duplicate section ids" $ do
      let input = "readArboricxContainer " ++ bytesExpr (arboricxHeaderBytes 2 ++ manifestEntryBytes 152 1 ++ manifestEntryBytes 153 1 ++ [9])
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT duplicateSectionT (bytesT [9])
  , testCase "extractSectionBytes: uses raw offset and length fields" $ do
      let input = "extractSectionBytes " ++ sectionRecordExpr nodesSectionIdBytes 3 4 ++ " " ++ bytesExpr [10,11,12,13,14,15,16,17]
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= bytesT [13,14,15,16]
  , testCase "lookupSectionBytes: finds section and extracts raw bytes" $ do
      let input = "lookupSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " [" ++ sectionRecordExpr manifestSectionIdBytes 1 2 ++ " " ++ sectionRecordExpr nodesSectionIdBytes 4 3 ++ "] " ++ bytesExpr [10,11,12,13,14,15,16,17]
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= justT (bytesT [14,15,16])
  , testCase "lookupSectionBytes: missing section returns nothing" $ do
      let input = "lookupSectionBytes " ++ bytesExpr [0,0,0,3] ++ " [" ++ sectionRecordExpr manifestSectionIdBytes 1 2 ++ " " ++ sectionRecordExpr nodesSectionIdBytes 4 3 ++ "] " ++ bytesExpr [10,11,12,13,14,15,16,17]
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= nothingT
  , testCase "extractSectionBytesResult: rejects out-of-bounds section" $ do
      let input = "extractSectionBytesResult " ++ sectionRecordExpr nodesSectionIdBytes 6 4 ++ " " ++ bytesExpr [10,11,12,13,14,15,16,17] ++ " []"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT eofT (bytesT [])
  , testCase "readArboricxSectionBytes: extracts requested section from container" $ do
      let input = "readArboricxSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204])
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT (bytesT [201,202,203,204]) (bytesT [101,102,103,201,202,203,204])
  , testCase "readArboricxSectionBytes: missing section returns missing-section err" $ do
      let input = "readArboricxSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " " ++ bytesExpr (singleSectionContainerBytes manifestSectionIdBytes [101,102,103])
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT missingSectionT (bytesT [101,102,103])
  , testCase "readArboricxRequiredSections: extracts manifest and nodes bytes" $ do
      let input = "readArboricxRequiredSections " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204])
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT
        (pairT (bytesT [101,102,103]) (bytesT [201,202,203,204]))
        (bytesT [101,102,103,201,202,203,204])
  , testCase "readArboricxRequiredSections: missing nodes section returns missing-section err" $ do
      let input = "readArboricxRequiredSections " ++ bytesExpr (singleSectionContainerBytes manifestSectionIdBytes [101,102,103])
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT missingSectionT (bytesT [101,102,103])
  , testCase "readArboricxRequiredSections: out-of-bounds section returns EOF" $ do
      let manifestBytes = [101,102,103]
          nodesBytes = [201,202,203,204]
          badContainer = arboricxHeaderBytes 2 ++ manifestEntryBytes 152 3 ++ nodesEntryBytes 155 9 ++ manifestBytes ++ nodesBytes
          input = "readArboricxRequiredSections " ++ bytesExpr badContainer
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT eofT (bytesT [101,102,103,201,202,203,204])
  -- ------------------------------------------------------------------------
  -- Arboricx raw nodes section parsing
  -- ------------------------------------------------------------------------
  , testCase "readNodeRecord: parses hash length and raw payload" $ do
      let input = "readNodeRecord [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (3) (101) (102) (103) (9)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT
        (pairT (bytesT [1..32])
@@ -1913,13 +1907,13 @@ binaryReaderTests = testGroup "Binary Reader Tests"
  , testCase "readNodeRecord: truncated payload returns EOF preserving unread payload" $ do
      let input = "readNodeRecord [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (3) (101) (102)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT eofT (bytesT [101,102])
  , testCase "readNodesSection: parses node count and records" $ do
      let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (0) (9)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT
        (pairT (bytesT [0,0,0,0,0,0,0,1])
@@ -1932,37 +1926,37 @@ binaryReaderTests = testGroup "Binary Reader Tests"
  , testCase "readNodesSectionComplete: rejects trailing bytes inside nodes section" $ do
      let input = "readNodesSectionComplete [(0) (0) (0) (0) (0) (0) (0) (0) (9)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT unexpectedBytesT (bytesT [9])
  , testCase "readNodesSection: rejects duplicate node hashes" $ do
      let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (0) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (0) (9)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT duplicateNodeT (bytesT [9])
  , testCase "nodePayloadValid?: accepts leaf stem and fork payload shapes" $ do
      let input = "[(nodePayloadValid? [(0)]) (nodePayloadValid? [(1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)]) (nodePayloadValid? [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)])]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= ofList [trueT, trueT, trueT]
  , testCase "nodePayloadValid?: rejects invalid payload shapes" $ do
      let input = "[(nodePayloadValid? []) (nodePayloadValid? [(9)]) (nodePayloadValid? [(1) (1)]) (nodePayloadValid? [(2) (1) (2)])]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= ofList [falseT, falseT, falseT, falseT]
  , testCase "node payload child accessors expose raw hashes" $ do
      let input = "[(nodePayloadStemChildHash [(1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)]) (nodePayloadForkLeftHash [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)]) (nodePayloadForkRightHash [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)])]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= ofList [bytesT [1..32], bytesT [1..32], bytesT [33..64]]
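The `nodePayloadValid?` cases above encode the Merkle node payload shapes from the wire format: tag `0` (leaf) carries no body, tag `1` (stem) carries exactly one 32-byte child hash, and tag `2` (fork) carries exactly two. A compact Haskell model of that shape rule (a sketch of the tested contract, not the tricu implementation):

```haskell
-- Model of the payload-shape rule the nodePayloadValid? tests pin down:
-- tag 0 (leaf) has no body, tag 1 (stem) exactly one 32-byte child hash,
-- tag 2 (fork) exactly two 32-byte child hashes. Anything else is invalid.
nodePayloadValidModel :: [Int] -> Bool
nodePayloadValidModel (0 : rest) = null rest
nodePayloadValidModel (1 : rest) = length rest == 32
nodePayloadValidModel (2 : rest) = length rest == 64
nodePayloadValidModel _          = False

main :: IO ()
main = do
  print (nodePayloadValidModel [0])              -- True  (leaf)
  print (nodePayloadValidModel (1 : [1 .. 32]))  -- True  (stem)
  print (nodePayloadValidModel [2, 1, 2])        -- False (truncated fork)
```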
  , testCase "lookupNodeRecord: finds record by raw node hash" $ do
      let input = "lookupNodeRecord [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] (pair [(0) (0) (0) (1)] [(0)]))]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= justT
        (pairT (bytesT [33..64])
@@ -1971,7 +1965,7 @@ binaryReaderTests = testGroup "Binary Reader Tests"
, testCase "nodeRecordChildHashes: extracts stem and fork references" $ do
let input = "[(nodeRecordChildHashes (pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (33)] [(1) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)]))) (nodeRecordChildHashes (pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (65)] [(2) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64) (65) (66) (67) (68) (69) (70) (71) (72) (73) (74) (75) (76) (77) (78) (79) (80) (81) (82) (83) (84) (85) (86) (87) (88) (89) (90) (91) (92) (93) (94) (95) (96)])))]"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= ofList
[ ofList [bytesT [33..64]]
@@ -1980,20 +1974,20 @@ binaryReaderTests = testGroup "Binary Reader Tests"
, testCase "readNodesSection: rejects invalid node payload shape" $ do
let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (9)]"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= errT invalidNodePayloadT (bytesT [])
, testCase "readNodesSection: rejects missing child node" $ do
let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (33) (1) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64) (9)]"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= errT missingNodeT (bytesT [9])
, testCase "readArboricxNodesSection: extracts and parses raw nodes section" $ do
let nodesBytes = u64 1 ++ [1..32] ++ u32 1 ++ [0]
input = "readArboricxNodesSection " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes)
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= okT
(pairT (bytesT [0,0,0,0,0,0,0,1])
@@ -2005,186 +1999,186 @@ binaryReaderTests = testGroup "Binary Reader Tests"
(bytesT ([101,102,103] ++ nodesBytes))
-- ------------------------------------------------------------------------
-- Arboricx node DAG reconstruction
-- ------------------------------------------------------------------------
, testCase "nodeHashToTree: reconstructs leaf node" $ do
let input = "nodeHashToTree [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)]))]"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= okT Leaf Leaf
, testCase "nodeHashToTree: reconstructs stem node" $ do
let input = "nodeHashToTree [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] (pair [(0) (0) (0) (33)] [(1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)]))]"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= okT (Stem Leaf) Leaf
, testCase "nodeHashToTree: reconstructs fork node" $ do
let input = "nodeHashToTree [(65) (66) (67) (68) (69) (70) (71) (72) (73) (74) (75) (76) (77) (78) (79) (80) (81) (82) (83) (84) (85) (86) (87) (88) (89) (90) (91) (92) (93) (94) (95) (96)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(65) (66) (67) (68) (69) (70) (71) (72) (73) (74) (75) (76) (77) (78) (79) (80) (81) (82) (83) (84) (85) (86) (87) (88) (89) (90) (91) (92) (93) (94) (95) (96)] (pair [(0) (0) (0) (65)] [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)]))]"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= okT (Fork Leaf Leaf) Leaf
, testCase "readArboricxTreeFromHash: reconstructs tree from bundle bytes" $ do
let nodesBytes = u64 1 ++ [1..32] ++ u32 1 ++ [0]
input = "readArboricxTreeFromHash " ++ bytesExpr [1..32] ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes)
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= okT Leaf (bytesT ([101,102,103] ++ nodesBytes))
, testCase "readArboricxExecutableFromHash: alias reconstructs tree" $ do
let nodesBytes = u64 1 ++ [1..32] ++ u32 1 ++ [0]
input = "readArboricxExecutableFromHash " ++ bytesExpr [1..32] ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes)
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= okT Leaf (bytesT ([101,102,103] ++ nodesBytes))
, testCase "readArboricxNodesSection: reads id fixture bundle" $ do
fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
case decodeBundle fixtureBytes of
Left err -> assertFailure $ "decodeBundle failed: " ++ err
Right _ -> do
let input = "matchResult (code rest : code) (nodes rest : 0) (readArboricxNodesSection "
++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
++ ")"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= ofNumber 0
, testCase "readArboricxNodesSection: reads notQ fixture bundle" $ do
fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx"
case decodeBundle fixtureBytes of
Left err -> assertFailure $ "decodeBundle failed: " ++ err
Right _ -> do
let input = "matchResult (code rest : code) (nodes rest : 0) (readArboricxNodesSection "
++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
++ ")"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= ofNumber 0
, testCase "readArboricxNodesSection: reads map fixture bundle" $ do
fixtureBytes <- BS.readFile "test/fixtures/map.arboricx"
case decodeBundle fixtureBytes of
Left err -> assertFailure $ "decodeBundle failed: " ++ err
Right _ -> do
let input = "matchResult (code rest : code) (nodes rest : 0) (readArboricxNodesSection "
++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
++ ")"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= ofNumber 0
, testCase "readArboricxExecutableFromHash: reconstructs id fixture root" $ do
fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
case decodeBundle fixtureBytes of
Left err -> assertFailure $ "decodeBundle failed: " ++ err
Right bundle -> case bundleRoots bundle of
[] -> assertFailure "fixture has no roots"
(rootHash:_) -> do
let input = "matchResult (code rest : code) (tree rest : 0) (readArboricxExecutableFromHash "
++ bytesExpr (hexTextBytes rootHash)
++ " "
++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
++ ")"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= ofNumber 0
, testCase "readArboricxExecutableFromHash: reconstructs notQ fixture root" $ do
fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx"
case decodeBundle fixtureBytes of
Left err -> assertFailure $ "decodeBundle failed: " ++ err
Right bundle -> case bundleRoots bundle of
[] -> assertFailure "fixture has no roots"
(rootHash:_) -> do
let input = "matchResult (code rest : code) (tree rest : 0) (readArboricxExecutableFromHash "
++ bytesExpr (hexTextBytes rootHash)
++ " "
++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
++ ")"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= ofNumber 0
, testCase "readArboricxExecutableFromHash: reconstructs map fixture root" $ do
fixtureBytes <- BS.readFile "test/fixtures/map.arboricx"
case decodeBundle fixtureBytes of
Left err -> assertFailure $ "decodeBundle failed: " ++ err
Right bundle -> case bundleRoots bundle of
[] -> assertFailure "fixture has no roots"
(rootHash:_) -> do
let input = "matchResult (code rest : code) (tree rest : 0) (readArboricxExecutableFromHash "
++ bytesExpr (hexTextBytes rootHash)
++ " "
++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
++ ")"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= ofNumber 0
, testCase "readArboricxExecutableFromHash: executes id fixture root" $ do
fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
case decodeBundle fixtureBytes of
Left err -> assertFailure $ "decodeBundle failed: " ++ err
Right bundle -> case bundleRoots bundle of
[] -> assertFailure "fixture has no roots"
(rootHash:_) -> do
let input = "matchResult (code rest : code) (tree rest : tree 42) (readArboricxExecutableFromHash "
++ bytesExpr (hexTextBytes rootHash)
++ " "
++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
++ ")"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= ofNumber 42
, testCase "readArboricxExecutableFromHash: executes notQ fixture on true" $ do
fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx"
case decodeBundle fixtureBytes of
Left err -> assertFailure $ "decodeBundle failed: " ++ err
Right bundle -> case bundleRoots bundle of
[] -> assertFailure "fixture has no roots"
(rootHash:_) -> do
let input = "matchResult (code rest : code) (tree rest : tree true) (readArboricxExecutableFromHash "
++ bytesExpr (hexTextBytes rootHash)
++ " "
++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
++ ")"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= falseT
, testCase "readArboricxExecutableFromHash: executes notQ fixture on false" $ do
fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx"
case decodeBundle fixtureBytes of
Left err -> assertFailure $ "decodeBundle failed: " ++ err
Right bundle -> case bundleRoots bundle of
[] -> assertFailure "fixture has no roots"
(rootHash:_) -> do
let input = "matchResult (code rest : code) (tree rest : tree false) (readArboricxExecutableFromHash "
++ bytesExpr (hexTextBytes rootHash)
++ " "
++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
++ ")"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= trueT
, testCase "readArboricxExecutableFromHash: executes map fixture root" $ do
fixtureBytes <- BS.readFile "test/fixtures/map.arboricx"
case decodeBundle fixtureBytes of
Left err -> assertFailure $ "decodeBundle failed: " ++ err
Right bundle -> case bundleRoots bundle of
[] -> assertFailure "fixture has no roots"
(rootHash:_) -> do
let input = "matchResult (code rest : code) (tree rest : head (tail (tree (a : (t t t)) [(t) (t) (t)]))) (readArboricxExecutableFromHash "
++ bytesExpr (hexTextBytes rootHash)
++ " "
++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
++ ")"
library <- evaluateFile "./lib/arboricx.tri"
let env = evalTricu library (parseTricu input)
result env @?= Fork Leaf Leaf
]
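The DAG-reconstruction tests above exercise `nodeHashToTree`, which resolves a root hash against a table of node records and rebuilds a `Leaf`/`Stem`/`Fork` tree, failing with `missingNode` or `invalidNodePayload` on bad input. A minimal standalone sketch of that idea (the record layout, error strings, and all names here are assumptions inferred from the tests, not the actual `lib/arboricx.tri` implementation):

```haskell
import Data.Word (Word8)
import qualified Data.Map.Strict as Map

type Hash = [Word8]  -- 32 bytes in the real format

data Tree = Leaf | Stem Tree | Fork Tree Tree deriving (Eq, Show)

-- Payloads follow the wire encoding: 0x00 = leaf,
-- 0x01 || childHash = stem, 0x02 || leftHash || rightHash = fork.
nodeHashToTree :: Hash -> Map.Map Hash [Word8] -> Either String Tree
nodeHashToTree h env = case Map.lookup h env of
  Nothing -> Left "missing-node"
  Just [0] -> Right Leaf
  Just (1 : c) | length c == 32 -> Stem <$> nodeHashToTree c env
  Just (2 : cs) | length cs == 64 ->
    let (l, r) = splitAt 32 cs
    in Fork <$> nodeHashToTree l env <*> nodeHashToTree r env
  Just _ -> Left "invalid-node-payload"

main :: IO ()
main = do
  let hLeaf = replicate 32 1
      hStem = replicate 32 2
      env   = Map.fromList [(hLeaf, [0]), (hStem, 1 : hLeaf)]
  print (nodeHashToTree hStem env)  -- Right (Stem Leaf)
  print (nodeHashToTree (replicate 32 9) env)  -- Left "missing-node"
```

Recursion through the map mirrors how the bundle reader chases child hashes until every subtree bottoms out at a leaf record.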

BIN test/fixtures/false.arboricx vendored Normal file
BIN test/fixtures/id.arboricx vendored Normal file
BIN test/fixtures/map.arboricx vendored Normal file
BIN test/fixtures/notQ.arboricx vendored Normal file
BIN test/fixtures/true.arboricx vendored Normal file
Binary files not shown.
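The fixture bundles above store nodes in the tag-prefixed encoding given in the wire-format spec (`NLeaf → 0x00`, `NStem(h) → 0x01 || h`, `NFork(l,r) → 0x02 || l || r`, then hashed with a domain-separated SHA-256). A sketch of just the serialization step, using plain `[Word8]` lists; the `Node`/`serializeNode` names are illustrative, not taken from `Wire.hs`:

```haskell
import Data.Word (Word8)

type Hash = [Word8]  -- 32-byte SHA-256 digests in the real format

data Node = NLeaf | NStem Hash | NFork Hash Hash

-- Tag byte followed by child hashes; the node's own hash is then
-- SHA256("arboricx.merkle.node.v1" <> 0x00 <> these bytes).
serializeNode :: Node -> [Word8]
serializeNode NLeaf       = [0x00]
serializeNode (NStem h)   = 0x01 : h
serializeNode (NFork l r) = 0x02 : l ++ r

main :: IO ()
main = do
  print (serializeNode NLeaf)                                 -- [0]
  print (length (serializeNode (NStem [1 .. 32])))            -- 33
  print (length (serializeNode (NFork [1 .. 32] [33 .. 64]))) -- 65
```

The fixed 1-, 33-, and 65-byte payload sizes are what lets the reader validate node records by length before chasing child hashes.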