Arborix -> Arboricx rename

AGENTS.md | 26
@@ -40,7 +40,7 @@ nix build .#
 | `REPL.hs` | Interactive Read-Eval-Print Loop (haskeline) |
 | `Research.hs` | Core types, `apply` reduction, booleans, marshalling (`ofString`, `ofNumber`), output formatters (`toAscii`, `toTernaryString`, `decodeResult`) |
 | `ContentStore.hs` | SQLite-backed term persistence |
-| `Wire.hs` | Arborix portable wire format — encode/decode/import/export of Merkle-DAG bundle blobs |
+| `Wire.hs` | Arboricx portable wire format — encode/decode/import/export of Merkle-DAG bundle blobs |
 
 ### File extensions
 
@@ -123,14 +123,14 @@ NLeaf → 0x00
 NStem(h) → 0x01 || h (32 bytes)
 NFork(l,r) → 0x02 || l (32 bytes) || r (32 bytes)
 
-hash = SHA256("arborix.merkle.node.v1" <> 0x00 <> serialized_node)
+hash = SHA256("arboricx.merkle.node.v1" <> 0x00 <> serialized_node)
 ```
 
 This is stored in SQLite via `ContentStore.hs`. Hash suffixes on identifiers (e.g., `foo_abc123...`) are validated: 16–64 hex characters (SHA256).
 
-## 7. Arborix Portable Wire Format
+## 7. Arboricx Portable Wire Format
 
-The **Arborix wire format** (module `Wire.hs`) defines a portable binary bundle for exchanging Tree Calculus terms, their Merkle DAGs, and associated metadata. It is versioned and schema-driven.
+The **Arboricx wire format** (module `Wire.hs`) defines a portable binary bundle for exchanging Tree Calculus terms, their Merkle DAGs, and associated metadata. It is versioned and schema-driven.
 
 ### Header
 
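For reference, the domain-separated node hash introduced by this hunk can be sketched in JavaScript; the standalone helper name is illustrative, not the project's `merkle.js` API:

```javascript
// Sketch: hash = SHA256(domain <> 0x00 <> serialized_node), as specified above.
// DOMAIN and the leaf payload byte (0x00) come from the diff; hashNodePayload
// is a hypothetical helper name.
import { createHash } from "node:crypto";

const DOMAIN = "arboricx.merkle.node.v1";

function hashNodePayload(payload) {
  return createHash("sha256")
    .update(Buffer.from(DOMAIN, "ascii"))
    .update(Buffer.from([0x00])) // separator between domain tag and payload
    .update(payload)
    .digest("hex");
}

// A leaf serializes to the single byte 0x00
const leafHash = hashNodePayload(Buffer.from([0x00]));
console.log(leafHash.length); // 64 hex characters (SHA-256)
```

The fixed domain string is what makes the rename observable on the wire: every node hash changes when the tag changes.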
@@ -143,7 +143,7 @@ The **Arborix wire format** (module `Wire.hs`) defines a portable binary bundle
 +------------------+-----------------+------------------+
 ```
 
-- **Magic**: `ARBORIX\0` (`0x41 0x52 0x42 0x4f 0x52 0x49 0x58 0x00`)
+- **Magic**: `ARBORICX` (`0x41 0x52 0x42 0x4f 0x52 0x49 0x43 0x58`)
 - **Header length**: 32 bytes
 - **Major version**: `1` | **Minor version**: `0`
 
@@ -172,18 +172,18 @@ The manifest describes the bundle's semantics, exports, and schema. Key fields:
 
 | Field | Value | Description |
 |-------|-------|-------------|
-| `schema` | `"arborix.bundle.manifest.v1"` | Manifest schema version |
+| `schema` | `"arboricx.bundle.manifest.v1"` | Manifest schema version |
 | `bundleType` | `"tree-calculus-executable-object"` | Bundle category |
 | `tree.calculus` | `"tree-calculus.v1"` | Tree calculus version |
 | `tree.nodeHash.algorithm` | `"sha256"` | Hash algorithm |
-| `tree.nodeHash.domain` | `"arborix.merkle.node.v1"` | Hash domain string |
-| `tree.nodePayload` | `"arborix.merkle.payload.v1"` | Payload encoding |
+| `tree.nodeHash.domain` | `"arboricx.merkle.node.v1"` | Hash domain string |
+| `tree.nodePayload` | `"arboricx.merkle.payload.v1"` | Payload encoding |
 | `runtime.semantics` | `"tree-calculus.v1"` | Evaluation semantics |
-| `runtime.abi` | `"arborix.abi.tree.v1"` | Runtime ABI |
+| `runtime.abi` | `"arboricx.abi.tree.v1"` | Runtime ABI |
 | `closure` | `"complete"` | Bundle must be a complete DAG |
 | `roots` | `[{"hash": "...", "role": "..."}]` | Named root hashes |
 | `exports` | `[{"name": "...", "root": "..."}]` | Export aliases for roots |
-| `metadata.createdBy` | `"arborix"` | Originator |
+| `metadata.createdBy` | `"arboricx"` | Originator |
 
 ### Section 2 — Nodes (Binary)
 
@@ -252,7 +252,7 @@ tricu/
 │ ├── REPL.hs
 │ ├── Research.hs
 │ ├── ContentStore.hs
-│ └── Wire.hs        # Arborix portable wire format
+│ └── Wire.hs        # Arboricx portable wire format
 ├── test/
 │ ├── Spec.hs        # Tasty + HUnit tests
 │ ├── *.tri          # tricu test programs
@@ -270,9 +270,9 @@ tricu/
 └── AGENTS.md        # This file
 ```
 
-## 9. JS Arborix Runtime
+## 9. JS Arboricx Runtime
 
-A JavaScript implementation of the Arborix portable bundle runtime lives in `ext/js/`.
+A JavaScript implementation of the Arboricx portable bundle runtime lives in `ext/js/`.
 It is a reference implementation — not a tricu source parser. It reads `.tri.bundle` files produced by the Haskell toolchain, verifies Merkle node hashes, reconstructs tree values, and reduces them.
 
 From project root:
 
@@ -1,8 +1,8 @@
-# Arborix Portable Bundle v1 (CBOR Manifest Profile)
+# Arboricx Portable Bundle v1 (CBOR Manifest Profile)
 
 Status: **Draft, implementation-aligned** (derived from `src/Wire.hs` as of 2026-05-07)
 
-This document specifies the **actual on-wire format and validation behavior** currently implemented by `tricu` for Arborix bundles, with a focus on the newer CBOR manifest path.
+This document specifies the **actual on-wire format and validation behavior** currently implemented by `tricu` for Arboricx bundles, with a focus on the newer CBOR manifest path.
 
 ---
 
@@ -38,7 +38,7 @@ A bundle is a byte stream:
 
 | Field | Size | Encoding | Value / Notes |
 |---|---:|---|---|
-| Magic | 8 | raw bytes | `41 52 42 4f 52 49 58 00` (`"ARBORIX\0"`) |
+| Magic | 8 | raw bytes | `41 52 42 4f 52 49 43 58` (`"ARBORICX"`) |
 | Major | 2 | u16 BE | Must be `1` |
 | Minor | 2 | u16 BE | Currently `0` |
 | SectionCount | 4 | u32 BE | Number of section directory entries |
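The renamed header fields above can be checked mechanically. A minimal sketch, assuming the field sizes and order exactly as tabulated (the `readHeader` helper is hypothetical):

```javascript
// Sketch: validate the ARBORICX magic and fixed-width header fields.
// Offsets follow the table above: magic 8B, major u16 BE, minor u16 BE,
// section count u32 BE.
const MAGIC = Buffer.from("ARBORICX", "ascii");

function readHeader(buf) {
  if (buf.length < 16) throw new Error("truncated header");
  if (!buf.subarray(0, 8).equals(MAGIC)) {
    throw new Error("invalid magic: expected ARBORICX");
  }
  const major = buf.readUInt16BE(8);
  const minor = buf.readUInt16BE(10);
  const sectionCount = buf.readUInt32BE(12);
  if (major !== 1) throw new Error("unsupported bundle major version");
  return { major, minor, sectionCount };
}

// Build a minimal valid header for demonstration
const buf = Buffer.alloc(32, 0);
buf.write("ARBORICX", 0, 8, "ascii");
buf.writeUInt16BE(1, 8);  // major
buf.writeUInt16BE(0, 10); // minor
buf.writeUInt32BE(2, 12); // two sections: manifest + nodes
console.log(readHeader(buf)); // { major: 1, minor: 0, sectionCount: 2 }
```

Note the magic is now exactly eight ASCII letters, so the rename also drops the old trailing NUL byte.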
@@ -143,18 +143,18 @@ Unknown metadata keys are ignored.
 
 Writers in `Wire.hs` currently emit:
 
-- `schema = "arborix.bundle.manifest.v1"`
+- `schema = "arboricx.bundle.manifest.v1"`
 - `bundleType = "tree-calculus-executable-object"`
 - `tree.calculus = "tree-calculus.v1"`
 - `tree.nodeHash.algorithm = "sha256"`
-- `tree.nodeHash.domain = "arborix.merkle.node.v1"`
-- `tree.nodePayload = "arborix.merkle.payload.v1"`
+- `tree.nodeHash.domain = "arboricx.merkle.node.v1"`
+- `tree.nodePayload = "arboricx.merkle.payload.v1"`
 - `runtime.semantics = "tree-calculus.v1"`
 - `runtime.evaluation = "normal-order"`
-- `runtime.abi = "arborix.abi.tree.v1"`
+- `runtime.abi = "arboricx.abi.tree.v1"`
 - `runtime.capabilities = []`
 - `closure = "complete"`
-- `metadata.createdBy = "arborix"`
+- `metadata.createdBy = "arboricx"`
 
 ---
 
@@ -249,17 +249,17 @@ These are important design gaps visible from current code.
 Status: **resolved in current codebase**.
 
 What was wrong:
-- Manifest declared `tree.nodeHash.domain = "arborix.merkle.node.v1"`.
+- Manifest declared `tree.nodeHash.domain = "arboricx.merkle.node.v1"`.
 - Hashing implementation previously used `"tricu.merkle.node.v1"`.
 
 Current state:
-- Haskell hashing now uses `"arborix.merkle.node.v1"`.
-- JS reference runtime hashing now uses `"arborix.merkle.node.v1"`.
-- JS manifest validation now requires `"arborix.merkle.node.v1"`.
+- Haskell hashing now uses `"arboricx.merkle.node.v1"`.
+- JS reference runtime hashing now uses `"arboricx.merkle.node.v1"`.
+- JS manifest validation now requires `"arboricx.merkle.node.v1"`.
 
 Remaining recommendation:
 - Keep hash-domain constants centralized/shared to prevent future drift.
-- Add explicit test vectors for Leaf/Stem/Fork hashes under the Arborix domain.
+- Add explicit test vectors for Leaf/Stem/Fork hashes under the Arboricx domain.
 
 ### Gap B: CBOR decode is order-strict, not generic-map tolerant
 
@@ -334,6 +334,6 @@ A conforming v1 reader/writer for this profile should:
 
 To stabilize interoperability, add:
 
-1. `docs/arborix-bundle-test-vectors.md` (golden header/manifest/nodes + expected hashes).
-2. `docs/arborix-bundle-errors.md` (normative error codes/strings).
-3. `docs/arborix-bundle-evolution.md` (rules for minor/major upgrades, capability negotiation, extra sections).
+1. `docs/arboricx-bundle-test-vectors.md` (golden header/manifest/nodes + expected hashes).
+2. `docs/arboricx-bundle-errors.md` (normative error codes/strings).
+3. `docs/arboricx-bundle-evolution.md` (rules for minor/major upgrades, capability negotiation, extra sections).
@@ -1,17 +1,17 @@
 {
-  "name": "arborix-runtime",
+  "name": "arboricx-runtime",
   "version": "0.1.0",
-  "description": "Arborix portable bundle runtime — JavaScript reference implementation",
+  "description": "Arboricx portable bundle runtime — JavaScript reference implementation",
   "type": "module",
   "main": "src/bundle.js",
   "bin": {
-    "arborix-run": "src/cli.js"
+    "arboricx-run": "src/cli.js"
   },
   "scripts": {
     "test": "node --test test/*.test.js",
     "inspect": "node src/cli.js inspect",
     "run": "node src/cli.js run"
   },
-  "keywords": ["arborix", "tree-calculus", "trie", "runtime"],
+  "keywords": ["arboricx", "tree-calculus", "trie", "runtime"],
   "license": "MIT"
 }
@@ -1,9 +1,9 @@
 /**
- * bundle.js — Parse an Arborix portable bundle binary into a JavaScript object.
+ * bundle.js — Parse an Arboricx portable bundle binary into a JavaScript object.
  *
  * Format (v1):
  *   Header (32 bytes):
- *     Magic        8B  "ARBORIX\0"
+ *     Magic        8B  "ARBORICX"
  *     Major        2B  u16 BE (must be 1)
  *     Minor        2B  u16 BE
  *     SectionCount 4B  u32 BE
@@ -27,7 +27,7 @@ import { decodeCbor } from "./cbor.js";
 
 // ── Constants ───────────────────────────────────────────────────────────────
 
-const MAGIC = Buffer.from([0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00]); // "ARBORIX\0"
+const MAGIC = Buffer.from([0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58]); // "ARBORICX"
 const HEADER_LENGTH = 32;
 const SECTION_ENTRY_LENGTH = 60;
 const SECTION_MANIFEST = 1;
@@ -69,7 +69,7 @@ export function parseBundle(buffer) {
 
   // Check magic
   if (!buffer.slice(0, 8).equals(MAGIC)) {
-    throw new Error("invalid magic: expected ARBORIX\\0");
+    throw new Error("invalid magic: expected ARBORICX");
   }
 
   // Parse header
@@ -1,5 +1,5 @@
 /**
- * cbor.js — Minimal CBOR decoder for the Arborix manifest format.
+ * cbor.js — Minimal CBOR decoder for the Arboricx manifest format.
  *
  * Decodes the canonical CBOR produced by the Haskell cborg library:
  * - Maps: major type 5 (0xa0 + length)
@@ -1,6 +1,6 @@
 #!/usr/bin/env node
 /**
- * cli.js — Minimal CLI for inspecting and running Arborix bundles.
+ * cli.js — Minimal CLI for inspecting and running Arboricx bundles.
  *
  * Usage:
  *   node cli.js inspect <bundle>
@@ -240,7 +240,7 @@ switch (command) {
     break;
   }
   default:
-    console.log("Arborix JS Runtime");
+    console.log("Arboricx JS Runtime");
    console.log("");
    console.log("Usage:");
    console.log("  node cli.js inspect <bundle>");
@@ -13,7 +13,7 @@
  * Throws on violation.
  */
 export function validateManifest(manifest) {
-  if (manifest.schema !== "arborix.bundle.manifest.v1") {
+  if (manifest.schema !== "arboricx.bundle.manifest.v1") {
     throw new Error(
       `unsupported manifest schema: ${manifest.schema}`
     );
@@ -33,12 +33,12 @@ export function validateManifest(manifest) {
       `unsupported node hash algorithm: ${tree.nodeHash.algorithm}`
     );
   }
-  if (tree.nodeHash.domain !== "arborix.merkle.node.v1") {
+  if (tree.nodeHash.domain !== "arboricx.merkle.node.v1") {
     throw new Error(
       `unsupported node hash domain: ${tree.nodeHash.domain}`
     );
   }
-  if (tree.nodePayload !== "arborix.merkle.payload.v1") {
+  if (tree.nodePayload !== "arboricx.merkle.payload.v1") {
     throw new Error(`unsupported node payload: ${tree.nodePayload}`);
   }
 
@@ -46,7 +46,7 @@ export function validateManifest(manifest) {
   if (runtime.semantics !== "tree-calculus.v1") {
     throw new Error(`unsupported runtime semantics: ${runtime.semantics}`);
   }
-  if (runtime.abi !== "arborix.abi.tree.v1") {
+  if (runtime.abi !== "arboricx.abi.tree.v1") {
     throw new Error(`unsupported runtime ABI: ${runtime.abi}`);
   }
   if (runtime.capabilities && runtime.capabilities.length > 0) {
@@ -7,14 +7,14 @@
  *   Fork: 0x02 || left_hash (32 bytes raw) || right_hash (32 bytes raw)
  *
  * Hash computation:
- *   hash = SHA256( "arborix.merkle.node.v1" || 0x00 || node_payload )
+ *   hash = SHA256( "arboricx.merkle.node.v1" || 0x00 || node_payload )
  */
 
 import { createHash } from "node:crypto";
 
 // ── Constants ───────────────────────────────────────────────────────────────
 
-const DOMAIN_TAG = "arborix.merkle.node.v1";
+const DOMAIN_TAG = "arboricx.merkle.node.v1";
 const HASH_LENGTH = 32; // raw hash bytes
 const HEX_LENGTH = 64;  // hex-encoded hash length
 
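The payload tags documented in the comment block above can be sketched as a standalone encoder; the `{ type, hash, left, right }` node shape with hex-encoded hashes is an assumption for illustration, not necessarily the module's actual representation:

```javascript
// Sketch: serialize a node to its payload bytes per the documented tags
// (0x00 leaf, 0x01 stem + 32-byte hash, 0x02 fork + two 32-byte hashes).
function encodeNodePayload(node) {
  switch (node.type) {
    case "leaf":
      return Buffer.from([0x00]);
    case "stem":
      return Buffer.concat([Buffer.from([0x01]), Buffer.from(node.hash, "hex")]);
    case "fork":
      return Buffer.concat([
        Buffer.from([0x02]),
        Buffer.from(node.left, "hex"),
        Buffer.from(node.right, "hex"),
      ]);
    default:
      throw new Error(`unknown node type: ${node.type}`);
  }
}

const h = "00".repeat(32); // placeholder 32-byte hash in hex
console.log(encodeNodePayload({ type: "leaf" }).length);                    // 1
console.log(encodeNodePayload({ type: "stem", hash: h }).length);           // 33
console.log(encodeNodePayload({ type: "fork", left: h, right: h }).length); // 65
```

This payload is exactly what gets fed into the `DOMAIN_TAG || 0x00 || payload` hash, so the encoding must be byte-stable across implementations.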
@@ -19,7 +19,7 @@ const fixtureDir = "../../test/fixtures";
 describe("bundle parsing", () => {
   it("valid bundle parses header and sections", () => {
     const bundle = parseBundle(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     strictEqual(bundle.version, "1.0");
     strictEqual(bundle.sectionCount, 2);
@@ -29,23 +29,23 @@ describe("bundle parsing", () => {
 
   it("parseManifest returns valid manifest", () => {
     const manifest = parseManifest(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
-    strictEqual(manifest.schema, "arborix.bundle.manifest.v1");
+    strictEqual(manifest.schema, "arboricx.bundle.manifest.v1");
     strictEqual(manifest.bundleType, "tree-calculus-executable-object");
     strictEqual(manifest.closure, "complete");
     strictEqual(manifest.tree.calculus, "tree-calculus.v1");
     strictEqual(manifest.tree.nodeHash.algorithm, "sha256");
-    strictEqual(manifest.tree.nodeHash.domain, "arborix.merkle.node.v1");
+    strictEqual(manifest.tree.nodeHash.domain, "arboricx.merkle.node.v1");
     strictEqual(manifest.runtime.semantics, "tree-calculus.v1");
-    strictEqual(manifest.runtime.abi, "arborix.abi.tree.v1");
+    strictEqual(manifest.runtime.abi, "arboricx.abi.tree.v1");
   });
 });
 
 describe("hash verification", () => {
   it("valid bundle nodes verify", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const { nodeMap } = parseNodes(data);
     const { verified } = verifyNodeHashes(nodeMap);
@@ -62,20 +62,20 @@ describe("errors", () => {
 
   it("unsupported version fails", () => {
     const buf = Buffer.alloc(32, 0);
-    buf.write("ARBORIX\0", 0, 8);
+    buf.write("ARBORICX", 0, 8);
     buf.writeUInt16BE(2, 8); // major version 2
     throws(() => parseBundle(buf), /unsupported bundle major version/);
   });
 
   it("bad section digest fails", () => {
-    const buf = readFileSync(`${fixtureDir}/id.arborix`);
+    const buf = readFileSync(`${fixtureDir}/id.arboricx`);
     // Corrupt one byte in the manifest section
     buf[152] ^= 0x01;
     throws(() => parseBundle(buf), /digest mismatch/);
   });
 
   it("truncated bundle fails", () => {
-    const buf = readFileSync(`${fixtureDir}/id.arborix`);
+    const buf = readFileSync(`${fixtureDir}/id.arboricx`);
     const truncated = buf.slice(0, 40);
     throws(() => parseBundle(truncated), /truncated/);
   });
@@ -83,33 +83,33 @@ describe("errors", () => {
   it("missing nodes section fails", () => {
     // Build a bundle with only manifest entry in the directory (1 section instead of 2)
     const header = Buffer.alloc(32, 0);
-    header.write("ARBORIX\0", 0, 8);
+    header.write("ARBORICX", 0, 8);
     header.writeUInt16BE(1, 8);  // major version
     header.writeUInt16BE(0, 10); // minor version
     header.writeUInt32BE(1, 12); // 1 section
 
     // Build a manifest JSON
     const manifestObj = {
-      schema: "arborix.bundle.manifest.v1",
+      schema: "arboricx.bundle.manifest.v1",
       bundleType: "tree-calculus-executable-object",
       tree: {
        calculus: "tree-calculus.v1",
        nodeHash: {
          algorithm: "sha256",
-          domain: "arborix.merkle.node.v1"
+          domain: "arboricx.merkle.node.v1"
        },
-        nodePayload: "arborix.merkle.payload.v1"
+        nodePayload: "arboricx.merkle.payload.v1"
       },
       runtime: {
        semantics: "tree-calculus.v1",
        evaluation: "normal-order",
-        abi: "arborix.abi.tree.v1",
+        abi: "arboricx.abi.tree.v1",
        capabilities: []
       },
       closure: "complete",
       roots: [{ hash: Buffer.alloc(32).toString("hex"), role: "default" }],
-      exports: [{ name: "root", root: Buffer.alloc(32).toString("hex"), kind: "term", abi: "arborix.abi.tree.v1" }],
-      metadata: { createdBy: "arborix" }
+      exports: [{ name: "root", root: Buffer.alloc(32).toString("hex"), kind: "term", abi: "arboricx.abi.tree.v1" }],
+      metadata: { createdBy: "arboricx" }
     };
     const manifestJson = JSON.stringify(manifestObj);
     const manifestBytes = Buffer.from(manifestJson);
 
@@ -51,35 +51,35 @@ describe("merkle — computeNodeHash", () => {
     strictEqual(hash.length, 64);
   });
 
-  it("Leaf hash matches expected Arborix domain", () => {
+  it("Leaf hash matches expected Arboricx domain", () => {
     const leaf = { type: "leaf" };
     const hash = computeNodeHash(leaf);
-    strictEqual(hash, "e54db458aa8e94782f7c61ad6c1f19a1c0c6fca7ffe53674f0d2bc5ff7ab02ff");
+    strictEqual(hash, "92b8a9796dbeafbcd36757535876256392170d137bf36b319d77f11a37112158");
   });
 });
 
 describe("merkle — node section parsing", () => {
   const fixtureDir = "../../test/fixtures";
 
-  it("parses id.arborix with correct node count", () => {
+  it("parses id.arboricx with correct node count", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     strictEqual(nodeMap.size, 4);
   });
 
-  it("parses true.arborix with correct node count", () => {
+  it("parses true.arboricx with correct node count", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/true.arborix`)
+      readFileSync(`${fixtureDir}/true.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     strictEqual(nodeMap.size, 2);
   });
 
-  it("parses false.arborix with correct node count", () => {
+  it("parses false.arboricx with correct node count", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/false.arborix`)
+      readFileSync(`${fixtureDir}/false.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     strictEqual(nodeMap.size, 1);
@@ -89,29 +89,29 @@ describe("merkle — node section parsing", () => {
 describe("merkle — hash verification", () => {
   const fixtureDir = "../../test/fixtures";
 
-  it("id.arborix nodes all verify", () => {
+  it("id.arboricx nodes all verify", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     const { verified, mismatches } = verifyNodeHashes(nodeMap);
-    ok(verified, "id.arborix node hashes should verify");
+    ok(verified, "id.arboricx node hashes should verify");
     strictEqual(mismatches.length, 0);
   });
 
-  it("true.arborix nodes all verify", () => {
+  it("true.arboricx nodes all verify", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/true.arborix`)
+      readFileSync(`${fixtureDir}/true.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     const { verified, mismatches } = verifyNodeHashes(nodeMap);
-    ok(verified, "true.arborix node hashes should verify");
+    ok(verified, "true.arboricx node hashes should verify");
     strictEqual(mismatches.length, 0);
   });
 
   it("corrupted node payload fails hash verification", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     // Find a stem node to corrupt
@@ -137,23 +137,23 @@ describe("merkle — hash verification", () => {
 describe("merkle — closure verification", () => {
   const fixtureDir = "../../test/fixtures";
 
-  it("id.arborix has complete closure", () => {
+  it("id.arboricx has complete closure", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     const { complete, missing } = verifyClosure(nodeMap);
-    ok(complete, "id.arborix should have complete closure");
+    ok(complete, "id.arboricx should have complete closure");
     strictEqual(missing.length, 0);
   });
 
   it("verifyRootClosure checks transitive reachability", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const { nodeMap } = parseNodeSection(data);
     // Use the actual root hash from the fixture's manifest
-    const manifest = parseManifest(readFileSync(`${fixtureDir}/id.arborix`));
+    const manifest = parseManifest(readFileSync(`${fixtureDir}/id.arboricx`));
     const rootHash = manifest.exports[0].root;
     const { complete, missingRoots } = verifyRootClosure(nodeMap, rootHash);
     ok(complete, "root should be reachable");
@@ -162,7 +162,7 @@ describe("merkle — closure verification", () => {
 
   it("parseNodeSection returns correct node count", () => {
     const data = bundleParseNodeSection(
-      readFileSync(`${fixtureDir}/id.arborix`)
+      readFileSync(`${fixtureDir}/id.arboricx`)
     );
     const result = parseNodeSection(data);
     strictEqual(result.count, 4);
@@ -9,8 +9,8 @@ import { buildTreeFromNodeMap } from "../src/cli.js";
 
 const fixtureDir = "../../test/fixtures";
 
-describe("run bundle — id.arborix", () => {
-  const bundle = readFileSync(`${fixtureDir}/id.arborix`);
+describe("run bundle — id.arboricx", () => {
+  const bundle = readFileSync(`${fixtureDir}/id.arboricx`);
   const manifest = parseManifest(bundle);
   const nodeSectionData = bundleParseNodeSection(bundle);
   const { nodeMap } = parseNodes(nodeSectionData);
@@ -37,8 +37,8 @@ describe("run bundle — id.arborix", () => {
   });
 });
 
-describe("run bundle — true.arborix", () => {
-  const bundle = readFileSync(`${fixtureDir}/true.arborix`);
+describe("run bundle — true.arboricx", () => {
+  const bundle = readFileSync(`${fixtureDir}/true.arboricx`);
   const manifest = parseManifest(bundle);
   const nodeSectionData = bundleParseNodeSection(bundle);
   const { nodeMap } = parseNodes(nodeSectionData);
@@ -61,8 +61,8 @@ describe("run bundle — true.arborix", () => {
   });
 });
 
-describe("run bundle — false.arborix", () => {
-  const bundle = readFileSync(`${fixtureDir}/false.arborix`);
+describe("run bundle — false.arboricx", () => {
+  const bundle = readFileSync(`${fixtureDir}/false.arboricx`);
   const manifest = parseManifest(bundle);
   const nodeSectionData = bundleParseNodeSection(bundle);
   const { nodeMap } = parseNodes(nodeSectionData);
@@ -83,8 +83,8 @@ describe("run bundle — false.arborix", () => {
   });
 });
 
-describe("run bundle — notQ.arborix", () => {
-  const bundle = readFileSync(`${fixtureDir}/notQ.arborix`);
+describe("run bundle — notQ.arboricx", () => {
+  const bundle = readFileSync(`${fixtureDir}/notQ.arboricx`);
   const manifest = parseManifest(bundle);
   const nodeSectionData = bundleParseNodeSection(bundle);
   const { nodeMap } = parseNodes(nodeSectionData);
@@ -100,7 +100,7 @@ describe("run bundle — notQ.arborix", () => {
 });
 
 describe("run bundle — missing export", () => {
-  const bundle = readFileSync(`${fixtureDir}/id.arborix`);
+  const bundle = readFileSync(`${fixtureDir}/id.arboricx`);
   const manifest = parseManifest(bundle);
 
   it("nonexistent export fails clearly", () => {
@@ -109,8 +109,8 @@ describe("run bundle — missing export", () => {
   });
 
 describe("run bundle — auto-select", () => {
-  // true.arborix has only one export, should auto-select
-  const bundle = readFileSync(`${fixtureDir}/true.arborix`);
+  // true.arboricx has only one export, should auto-select
+  const bundle = readFileSync(`${fixtureDir}/true.arboricx`);
   const manifest = parseManifest(bundle);
 
   it("single export auto-selects", () => {
@@ -36,8 +36,6 @@
       checks.${packageName} = tricuPackageTests;
       checks.default = tricuPackageTests;
 
-      defaultPackage = self.packages.${system}.default;
-
       devShells.default = pkgs.mkShell {
         buildInputs = with pkgs; [
           haskellPackages.cabal-install
@@ -51,7 +49,6 @@
           tricuPackage
         ];
       };
-      devShell = self.devShells.${system}.default;
 
       packages.${containerPackageName} = pkgs.dockerTools.buildImage {
         name = "tricu";
@@ -3,11 +3,11 @@
 !import "bytes.tri" !Local
 !import "binary.tri" !Local
 
-arborixMagic = [(65) (82) (66) (79) (82) (73) (88) (0)]
-arborixMajorVersion = [(0) (1)]
-arborixMinorVersion = [(0) (0)]
-arborixManifestSectionId = [(0) (0) (0) (1)]
-arborixNodesSectionId = [(0) (0) (0) (2)]
+arboricxMagic = [(65) (82) (66) (79) (82) (73) (67) (88)]
+arboricxMajorVersion = [(0) (1)]
+arboricxMinorVersion = [(0) (0)]
+arboricxManifestSectionId = [(0) (0) (0) (1)]
+arboricxNodesSectionId = [(0) (0) (0) (2)]
 
 errMissingSection = 4
 errUnsupportedVersion = 5
@@ -20,10 +20,10 @@ nodePayloadLeafTag = 0
 nodePayloadStemTag = 1
 nodePayloadForkTag = 2
 
-readArborixMagic = (bs : expectBytes arborixMagic bs)
+readArboricxMagic = (bs : expectBytes arboricxMagic bs)
 
-readArborixHeader = (bs :
-  bindResult (readArborixMagic bs)
+readArboricxHeader = (bs :
+  bindResult (readArboricxMagic bs)
     (_ afterMagic :
       bindResult (readBytes 2 afterMagic)
         (majorVersion afterMajor :
@@ -296,12 +296,12 @@ beBytesToNat = (bytes :
 u32BEBytesToNat = beBytesToNat
 u64BEBytesToNat = beBytesToNat
 
-arborixHeaderMajorVersion = (header :
+arboricxHeaderMajorVersion = (header :
   matchPair
     (majorVersion _ : majorVersion)
     header)
 
-arborixHeaderMinorVersion = (header :
+arboricxHeaderMinorVersion = (header :
   matchPair
     (_ payload :
       matchPair
@@ -309,7 +309,7 @@ arborixHeaderMinorVersion = (header :
         payload)
     header)
 
-arborixHeaderSectionCount = (header :
+arboricxHeaderSectionCount = (header :
   matchPair
     (_ payload :
       matchPair
@@ -320,7 +320,7 @@ arborixHeaderSectionCount = (header :
         payload)
     header)
 
-arborixHeaderFlags = (header :
+arboricxHeaderFlags = (header :
   matchPair
     (_ payload :
       matchPair
@@ -334,7 +334,7 @@ arborixHeaderFlags = (header :
         payload)
     header)
 
-arborixHeaderDirOffset = (header :
+arboricxHeaderDirOffset = (header :
   matchPair
     (_ payload :
       matchPair
@@ -348,22 +348,22 @@ arborixHeaderDirOffset = (header :
         payload)
     header)
 
-validateArborixHeader = (header rest :
+validateArboricxHeader = (header rest :
   matchBool
     (ok header rest)
     (err errUnsupportedVersion rest)
     (and?
-      (bytesEq? arborixMajorVersion (arborixHeaderMajorVersion header))
-      (bytesEq? arborixMinorVersion (arborixHeaderMinorVersion header))))
+      (bytesEq? arboricxMajorVersion (arboricxHeaderMajorVersion header))
+      (bytesEq? arboricxMinorVersion (arboricxHeaderMinorVersion header))))
 
-readArborixContainer = (bs :
-  bindResult (readArborixHeader bs)
+readArboricxContainer = (bs :
+  bindResult (readArboricxHeader bs)
     (header afterHeader :
-      bindResult (validateArborixHeader header afterHeader)
+      bindResult (validateArboricxHeader header afterHeader)
        (validHeader afterValidHeader :
          bindResult (readSectionDirectory
-            (u32BEBytesToNat (arborixHeaderSectionCount validHeader))
-            (bytesDrop (u64BEBytesToNat (arborixHeaderDirOffset validHeader)) bs))
+            (u32BEBytesToNat (arboricxHeaderSectionCount validHeader))
+            (bytesDrop (u64BEBytesToNat (arboricxHeaderDirOffset validHeader)) bs))
            (directory afterDirectory :
              bindResult (validateSectionDirectory directory afterDirectory)
                (validDirectory afterValidDirectory :
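The tricu `beBytesToNat` used for the u32/u64 header fields above has a one-line JavaScript analogue; a sketch for intuition only (the function name mirrors the tricu definition, it is not part of the JS runtime):

```javascript
// Big-endian byte list to natural number, as used for section counts
// and directory offsets in the header.
function beBytesToNat(bytes) {
  return bytes.reduce((acc, b) => acc * 256 + b, 0);
}

console.log(beBytesToNat([0, 0, 0, 1])); // 1
console.log(beBytesToNat([0, 1, 0, 0])); // 65536
```

Specializing it as both `u32BEBytesToNat` and `u64BEBytesToNat`, as the tricu code does, works because the fold is width-agnostic.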
@@ -403,21 +403,21 @@ sectionBytesOrErr = (sectionId directory containerBytes rest :
     (_ _ : err errMissingSection rest)
     (lookupSectionRecord sectionId directory))
 
-readArborixSectionBytes = (sectionId bs :
-  bindResult (readArborixContainer bs)
+readArboricxSectionBytes = (sectionId bs :
+  bindResult (readArboricxContainer bs)
     (container afterContainer :
       matchPair
         (_ directory : sectionBytesOrErr sectionId directory bs afterContainer)
         container))
 
-readArborixRequiredSections = (bs :
-  bindResult (readArborixContainer bs)
+readArboricxRequiredSections = (bs :
+  bindResult (readArboricxContainer bs)
     (container afterContainer :
       matchPair
        (_ directory :
-          bindResult (sectionBytesOrErr arborixManifestSectionId directory bs afterContainer)
+          bindResult (sectionBytesOrErr arboricxManifestSectionId directory bs afterContainer)
            (manifestBytes _ :
-              bindResult (sectionBytesOrErr arborixNodesSectionId directory bs afterContainer)
+              bindResult (sectionBytesOrErr arboricxNodesSectionId directory bs afterContainer)
                (nodesBytes _ :
                  ok (pair manifestBytes nodesBytes) afterContainer)))
        container))
@@ -602,12 +602,12 @@ readNodesSectionComplete = (bs :
     (err errUnexpectedBytes afterNodesSection)
     (bytesNil? afterNodesSection)))
 
-readArborixNodesSection = (bs :
-  bindResult (readArborixContainer bs)
+readArboricxNodesSection = (bs :
+  bindResult (readArboricxContainer bs)
     (container afterContainer :
       matchPair
        (_ directory :
-          bindResult (sectionBytesOrErr arborixNodesSectionId directory bs afterContainer)
+          bindResult (sectionBytesOrErr arboricxNodesSectionId directory bs afterContainer)
            (nodesBytes _ :
              bindResult (readNodesSectionComplete nodesBytes)
                (nodesSection _ : ok nodesSection afterContainer)))
@@ -645,10 +645,10 @@ nodeHashToTree = y (self nodeHash nodeRecords :
     (_ _ : err errMissingNode t)
     (lookupNodeRecord nodeHash nodeRecords))
 
-readArborixTreeFromHash = (rootHash bs :
-  bindResult (readArborixNodesSection bs)
+readArboricxTreeFromHash = (rootHash bs :
+  bindResult (readArboricxNodesSection bs)
    (nodesSection afterContainer :
      bindResult (nodeHashToTree rootHash (nodesSectionRecords nodesSection))
        (tree _ : ok tree afterContainer)))
 
-readArborixExecutableFromHash = readArborixTreeFromHash
+readArboricxExecutableFromHash = readArboricxTreeFromHash
@@ -162,7 +162,7 @@ nsVariable :: String -> String -> String
nsVariable "" name = name
nsVariable moduleName name = moduleName ++ "." ++ name

-- | Compile a tricu source file to a standalone Arborix bundle.
-- | Compile a tricu source file to a standalone Arboricx bundle.
-- Uses a temp content store so it does not collide with the global one.
-- Supports multiple named exports; each is stored separately in the
-- temp store so that resolveExportTarget can look them up by name.

@@ -97,7 +97,7 @@ compileMode = Compile
  , names = def &= help "Definition name(s) to export as bundle roots (comma-separated or repeated -x). Defaults to 'main'."
      &= name "x" &= typ "NAME"
  }
  &= help "Compile a tricu source file into a standalone Arborix portable bundle."
  &= help "Compile a tricu source file into a standalone Arboricx portable bundle."
  &= explicit
  &= name "compile"

@@ -106,7 +106,7 @@ serveMode = Serve
  { host = "127.0.0.1" &= help "Host to bind the server to." &= name "h" &= typ "HOST"
  , port = 8787 &= help "HTTP port to listen on." &= name "p" &= typ "PORT"
  }
  &= help "Start a read-only HTTP server for exporting Arborix bundles."
  &= help "Start a read-only HTTP server for exporting Arboricx bundles."
  &= explicit
  &= name "server"

@@ -182,10 +182,10 @@ main = do
      let exportNames = if null namesArg then [] else map T.pack namesArg
      in compileFile compileInputFile compileOutFile exportNames
    Serve { host = hostStr, port = portNum } -> do
      putStrLn $ "Starting Arborix bundle server on " ++ hostStr ++ ":" ++ show portNum
      putStrLn $ "Starting Arboricx bundle server on " ++ hostStr ++ ":" ++ show portNum
      putStrLn $ "  GET /bundle/hash/:hash -- primary endpoint"
      putStrLn $ "  GET /bundle/name/:name -- convenience endpoint"
      putStrLn $ "  Content-Type: application/vnd.arborix.bundle"
      putStrLn $ "  Content-Type: application/vnd.arboricx.bundle"
      runServer hostStr portNum

runTricu :: String -> String

@@ -85,12 +85,12 @@ serializeNode (NFork l r) = BS.pack [0x02] <> go (decode (encodeUtf8 l)) <> go (
    go (Right bs) = bs

-- | Hash a node per the Merkle content-addressing spec.
-- hash = SHA256( "arborix.merkle.node.v1" <> 0x00 <> node_payload )
-- hash = SHA256( "arboricx.merkle.node.v1" <> 0x00 <> node_payload )
nodeHash :: Node -> MerkleHash
nodeHash node = decodeUtf8 (encode (sha256WithPrefix (serializeNode node)))
  where sha256WithPrefix payload =
          convert . (hash :: BS.ByteString -> Digest SHA256) $ utf8Tag <> BS.pack [0x00] <> payload
        utf8Tag = BS.pack $ map fromIntegral $ BS.unpack "arborix.merkle.node.v1"
        utf8Tag = BS.pack $ map fromIntegral $ BS.unpack "arboricx.merkle.node.v1"

-- | Deserialize a Node from canonical bytes.
deserializeNode :: BS.ByteString -> Node
@@ -138,7 +138,7 @@ toBytes t = case toList t of
  Left err -> Left err
  Right bs -> BS.pack <$> mapM toByte bs

-- | Convert a canonical Arborix node payload (ByteString) to a Tree
-- | Convert a canonical Arboricx node payload (ByteString) to a Tree
-- representation (a list of Byte trees).
nodePayloadToTreeBytes :: BS.ByteString -> T
nodePayloadToTreeBytes = ofBytes
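The domain-separated node hash in the doc comment above is easy to cross-check outside the Haskell codebase. A minimal sketch, assuming only what the comment states (the `node_hash` helper name and the sample payload are illustrative, not project code):

```python
import hashlib

def node_hash(payload: bytes) -> str:
    # hash = SHA256("arboricx.merkle.node.v1" <> 0x00 <> node_payload),
    # hex-encoded, mirroring the nodeHash doc comment above.
    tag = b"arboricx.merkle.node.v1"
    return hashlib.sha256(tag + b"\x00" + payload).hexdigest()

# Per the serialization spec, an NLeaf serializes to the single byte 0x00.
print(node_hash(b"\x00"))
```

The `0x00` separator after the tag keeps the domain prefix unambiguous, so hashes of Arboricx nodes can never collide with hashes computed under a different tag.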
@@ -23,7 +23,7 @@ import Data.ByteString.Char8 (unpack)
import Data.ByteString.Lazy (fromStrict)
import qualified Data.Text as T

-- | Start an HTTP server that serves Arborix bundles from the
-- | Start an HTTP server that serves Arboricx bundles from the
-- local content store.
--
-- This is a read-only export surface. Clients fetch bundle bytes
@@ -133,10 +133,10 @@ rootsHandler request respond = do
    (fromStrict bundleData)

-- | GET /bundle/name/:name
-- Resolve a stored term name, export it as an Arborix bundle,
-- Resolve a stored term name, export it as an Arboricx bundle,
-- and return the raw bundle bytes.
--
-- Sets @Content-Type@ and @X-Arborix-Root-Hash@ headers.
-- Sets @Content-Type@ and @X-Arboricx-Root-Hash@ headers.
-- Returns 404 when the name does not resolve to any stored term.
nameHandler :: Text -> IO Response
nameHandler nameText = do
@@ -155,7 +155,7 @@ nameHandler nameText = do
  return $ responseLBS status200 (bundleHeaders th cd) (fromStrict bundleData)

-- | GET /bundle/hash/:hash
-- Resolve a full Merkle hash and export the root as an Arborix
-- Resolve a full Merkle hash and export the root as an Arboricx
-- bundle.
--
-- - Malformed hash (non-hex or < 16 chars): 400
@@ -207,8 +207,8 @@ textResponse status body =

bundleHeaders :: Text -> Text -> [Header]
bundleHeaders root cd =
  [ (hContentType, encodeUtf8 "application/vnd.arborix.bundle")
  , ("X-Arborix-Root-Hash", encodeUtf8 root)
  [ (hContentType, encodeUtf8 "application/vnd.arboricx.bundle")
  , ("X-Arboricx-Root-Hash", encodeUtf8 root)
  , ("Content-Disposition", encodeUtf8 cd)
  ]
22
src/Wire.hs
@@ -71,7 +71,7 @@ bundleMinorVersion = 0

-- | Header magic for the portable executable-object container.
bundleMagic :: ByteString
bundleMagic = BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00] -- "ARBORIX\0"
bundleMagic = BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58] -- "ARBORICX"

headerLength :: Int
headerLength = 32
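The renamed magic plus the version and section-count checks exercised by the Spec.hs readers can be mirrored in a few lines. A sketch of the fixed 16-byte prefix only, assuming big-endian integer fields as the `u16`/`u32` test helpers imply (the `check_header` name and error strings are hypothetical, and the header fields past the section count are not modeled here):

```python
import struct

ARBORICX_MAGIC = b"ARBORICX"  # 0x41 52 42 4f 52 49 43 58, no trailing NUL

def check_header(bs: bytes):
    # Validate magic (8 bytes), major/minor version (u16 each, big-endian),
    # then read the u32 section count.
    if len(bs) < 16:
        return ("err", "eof")
    if bs[:8] != ARBORICX_MAGIC:
        return ("err", "unexpected-bytes")
    major, minor = struct.unpack(">HH", bs[8:12])
    if (major, minor) != (1, 0):
        return ("err", "unsupported-version")
    (section_count,) = struct.unpack(">I", bs[12:16])
    return ("ok", section_count)
```

Note the rename also drops the trailing `\0`: the new magic is exactly eight ASCII bytes, so the header stays the same 32-byte length.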
@@ -563,20 +563,20 @@ decodeSectionEntries count bytes = reverse <$> go count bytes []

defaultManifest :: [(Text, MerkleHash)] -> BundleManifest
defaultManifest namedRoots = BundleManifest
  { manifestSchema = "arborix.bundle.manifest.v1"
  { manifestSchema = "arboricx.bundle.manifest.v1"
  , manifestBundleType = "tree-calculus-executable-object"
  , manifestTree = TreeSpec
    { treeCalculus = "tree-calculus.v1"
    , treeNodeHash = NodeHashSpec
      { nodeHashAlgorithm = "sha256"
      , nodeHashDomain = "arborix.merkle.node.v1"
      , nodeHashDomain = "arboricx.merkle.node.v1"
      }
    , treeNodePayload = "arborix.merkle.payload.v1"
    , treeNodePayload = "arboricx.merkle.payload.v1"
    }
  , manifestRuntime = RuntimeSpec
    { runtimeSemantics = "tree-calculus.v1"
    , runtimeEvaluation = "normal-order"
    , runtimeAbi = "arborix.abi.tree.v1"
    , runtimeAbi = "arboricx.abi.tree.v1"
    , runtimeCapabilities = []
    }
  , manifestClosure = ClosureComplete
@@ -587,7 +587,7 @@ defaultManifest namedRoots = BundleManifest
    , metadataVersion = Nothing
    , metadataDescription = Nothing
    , metadataLicense = Nothing
    , metadataCreatedBy = Just "arborix"
    , metadataCreatedBy = Just "arboricx"
    }
  }
  where
@@ -597,7 +597,7 @@ defaultManifest namedRoots = BundleManifest
      { exportName = name
      , exportRoot = h
      , exportKind = "term"
      , exportAbi = "arborix.abi.tree.v1"
      , exportAbi = "arboricx.abi.tree.v1"
      }

-- ---------------------------------------------------------------------------
@@ -672,7 +672,7 @@ verifyBundle bundle = do

verifyManifest :: BundleManifest -> Either String ()
verifyManifest manifest = do
  when (manifestSchema manifest /= "arborix.bundle.manifest.v1") $
  when (manifestSchema manifest /= "arboricx.bundle.manifest.v1") $
    Left $ "unsupported manifest schema: " ++ unpack (manifestSchema manifest)
  when (manifestBundleType manifest /= "tree-calculus-executable-object") $
    Left $ "unsupported bundle type: " ++ unpack (manifestBundleType manifest)
@@ -683,13 +683,13 @@ verifyManifest manifest = do
    Left $ "unsupported calculus: " ++ unpack (treeCalculus treeSpec)
  when (nodeHashAlgorithm hashSpec /= "sha256") $
    Left $ "unsupported node hash algorithm: " ++ unpack (nodeHashAlgorithm hashSpec)
  when (nodeHashDomain hashSpec /= "arborix.merkle.node.v1") $
  when (nodeHashDomain hashSpec /= "arboricx.merkle.node.v1") $
    Left $ "unsupported node hash domain: " ++ unpack (nodeHashDomain hashSpec)
  when (treeNodePayload treeSpec /= "arborix.merkle.payload.v1") $
  when (treeNodePayload treeSpec /= "arboricx.merkle.payload.v1") $
    Left $ "unsupported node payload: " ++ unpack (treeNodePayload treeSpec)
  when (runtimeSemantics runtimeSpec /= "tree-calculus.v1") $
    Left $ "unsupported runtime semantics: " ++ unpack (runtimeSemantics runtimeSpec)
  when (runtimeAbi runtimeSpec /= "arborix.abi.tree.v1") $
  when (runtimeAbi runtimeSpec /= "arboricx.abi.tree.v1") $
    Left $ "unsupported runtime ABI: " ++ unpack (runtimeAbi runtimeSpec)
  when (not (null (runtimeCapabilities runtimeSpec))) $
    Left "unsupported runtime capabilities"
332
test/Spec.hs
@@ -786,7 +786,7 @@ wireTests = testGroup "Wire Tests"
      , "main = id t"
      ]
    wireData <- exportBundle srcConn [termHash]
    BS.take 8 wireData @?= BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x58, 0x00]
    BS.take 8 wireData @?= BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58]
    case decodeBundle wireData of
      Left err -> assertFailure $ "decodeBundle failed: " ++ err
      Right bundle -> do
@@ -794,15 +794,15 @@ wireTests = testGroup "Wire Tests"
            tree = manifestTree manifest
            hashSpec = treeNodeHash tree
            runtime = manifestRuntime manifest
        manifestSchema manifest @?= "arborix.bundle.manifest.v1"
        manifestSchema manifest @?= "arboricx.bundle.manifest.v1"
        manifestBundleType manifest @?= "tree-calculus-executable-object"
        manifestClosure manifest @?= ClosureComplete
        treeCalculus tree @?= "tree-calculus.v1"
        treeNodePayload tree @?= "arborix.merkle.payload.v1"
        treeNodePayload tree @?= "arboricx.merkle.payload.v1"
        nodeHashAlgorithm hashSpec @?= "sha256"
        nodeHashDomain hashSpec @?= "arborix.merkle.node.v1"
        nodeHashDomain hashSpec @?= "arboricx.merkle.node.v1"
        runtimeSemantics runtime @?= "tree-calculus.v1"
        runtimeAbi runtime @?= "arborix.abi.tree.v1"
        runtimeAbi runtime @?= "arboricx.abi.tree.v1"
        runtimeCapabilities runtime @?= []
        bundleRoots bundle @?= [termHash]
        map exportRoot (manifestExports manifest) @?= [termHash]
@@ -823,7 +823,7 @@ wireTests = testGroup "Wire Tests"
        exportName exported @?= "validateEmail"
        exportRoot exported @?= termHash
        exportKind exported @?= "term"
        exportAbi exported @?= "arborix.abi.tree.v1"
        exportAbi exported @?= "arboricx.abi.tree.v1"
      exports -> assertFailure $ "Expected one export, got: " ++ show exports
    close srcConn

@@ -1064,9 +1064,9 @@ u32 n = [0,0,0,n]
u64 :: Integer -> [Integer]
u64 n = [0,0,0,0,0,0,0,n]

arborixHeaderBytes :: Integer -> [Integer]
arborixHeaderBytes sectionCount =
  [65,82,66,79,82,73,88,0]
arboricxHeaderBytes :: Integer -> [Integer]
arboricxHeaderBytes sectionCount =
  [65,82,66,79,82,73,67,88]
  ++ u16 1
  ++ u16 0
  ++ u32 sectionCount
@@ -1107,7 +1107,7 @@ simpleContainerBytes :: [Integer] -> [Integer] -> [Integer]
simpleContainerBytes manifestBytes nodesBytes =
  let manifestOffset = 152
      nodesOffset = manifestOffset + fromIntegral (length manifestBytes)
  in arborixHeaderBytes 2
  in arboricxHeaderBytes 2
  ++ manifestEntryBytes manifestOffset (fromIntegral $ length manifestBytes)
  ++ nodesEntryBytes nodesOffset (fromIntegral $ length nodesBytes)
  ++ manifestBytes
@@ -1115,12 +1115,12 @@ simpleContainerBytes manifestBytes nodesBytes =

singleSectionContainerBytes :: [Integer] -> [Integer] -> [Integer]
singleSectionContainerBytes sectionType sectionBytes =
  arborixHeaderBytes 1
  arboricxHeaderBytes 1
  ++ sectionEntryBytes sectionType 92 (fromIntegral $ length sectionBytes)
  ++ sectionBytes

arborixHeaderT :: Integer -> T
arborixHeaderT sectionCount =
arboricxHeaderT :: Integer -> T
arboricxHeaderT sectionCount =
  pairT (bytesT [0,1])
    (pairT (bytesT [0,0])
      (pairT (bytesT $ u32 sectionCount)
@@ -1615,120 +1615,114 @@ binaryReaderTests = testGroup "Binary Reader Tests"
    result env @?= errT eofT (bytesT [1,2,3])

-- ------------------------------------------------------------------------
-- Arborix magic recognition
-- Arboricx magic recognition
-- ------------------------------------------------------------------------

, testCase "readArborixMagic: accepts magic and preserves rest" $ do
    let input = "readArborixMagic [(65) (82) (66) (79) (82) (73) (88) (0) (1) (2)]"
    library <- evaluateFile "./lib/arborix.tri"
, testCase "readArboricxMagic: accepts magic and preserves rest" $ do
    let input = "readArboricxMagic ((append arboricxMagic) [(1) (2)])"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= okT unitT (bytesT [1,2])

, testCase "readArborixMagic: rejects wrong magic preserving input" $ do
    let input = "readArborixMagic [(65) (82) (66) (79) (82) (73) (88) (1) (9)]"
    library <- evaluateFile "./lib/arborix.tri"
, testCase "readArboricxMagic: rejects wrong magic preserving input" $ do
    let input = "readArboricxMagic [(65) (83) (66) (79) (82) (73) (67) (88) (1) (9)]"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= errT unexpectedBytesT (bytesT [65,82,66,79,82,73,88,1,9])
    result env @?= errT unexpectedBytesT (bytesT [65,83,66,79,82,73,67,88,1,9])

, testCase "readArborixMagic: short input returns EOF preserving input" $ do
    let input = "readArborixMagic [(65) (82) (66) (79)]"
    library <- evaluateFile "./lib/arborix.tri"
, testCase "readArboricxMagic: short input returns EOF preserving input" $ do
    let input = "readArboricxMagic [(65) (82) (66) (79)]"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= errT eofT (bytesT [65,82,66,79])

-- ------------------------------------------------------------------------
-- Arborix header parsing
-- Arboricx header parsing
-- ------------------------------------------------------------------------

, testCase "readArborixHeader: parses portable header" $ do
    let input = "readArborixHeader " ++ bytesExpr (arborixHeaderBytes 0)
    library <- evaluateFile "./lib/arborix.tri"
, testCase "readArboricxHeader: parses portable header" $ do
    let input = "readArboricxHeader " ++ bytesExpr (arboricxHeaderBytes 0)
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= okT (arborixHeaderT 0) (bytesT [])
    result env @?= okT (arboricxHeaderT 0) (bytesT [])

, testCase "readArborixHeader: preserves trailing bytes" $ do
    let input = "readArborixHeader " ++ bytesExpr (arborixHeaderBytes 0 ++ [9,8])
    library <- evaluateFile "./lib/arborix.tri"
, testCase "readArboricxHeader: preserves trailing bytes" $ do
    let input = "readArboricxHeader " ++ bytesExpr (arboricxHeaderBytes 0 ++ [9,8])
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= okT (arborixHeaderT 0) (bytesT [9,8])
    result env @?= okT (arboricxHeaderT 0) (bytesT [9,8])

, testCase "readArborixHeader: rejects wrong magic preserving input" $ do
    let input = "readArborixHeader [(65) (82) (66) (79) (82) (73) (88) (1) (0) (1)]"
    library <- evaluateFile "./lib/arborix.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= errT unexpectedBytesT (bytesT [65,82,66,79,82,73,88,1,0,1])

, testCase "readArborixHeader: short input returns EOF preserving input" $ do
    let input = "readArborixHeader [(65) (82)]"
    library <- evaluateFile "./lib/arborix.tri"
, testCase "readArboricxHeader: short input returns EOF preserving input" $ do
    let input = "readArboricxHeader [(65) (82)]"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= errT eofT (bytesT [65,82])

-- ------------------------------------------------------------------------
-- Arborix section directory record parsing
-- Arboricx section directory record parsing
-- ------------------------------------------------------------------------

, testCase "readSectionRecord: parses portable section entry" $ do
    let input = "readSectionRecord " ++ bytesExpr (nodesEntryBytes 16 32)
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= okT (sectionRecordT nodesSectionIdBytes 16 32) (bytesT [])

, testCase "readSectionRecord: preserves trailing bytes" $ do
    let input = "readSectionRecord " ++ bytesExpr (nodesEntryBytes 16 32 ++ [9,8])
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= okT (sectionRecordT nodesSectionIdBytes 16 32) (bytesT [9,8])

, testCase "readSectionRecord: empty input returns EOF" $ do
    let input = "readSectionRecord []"
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= errT eofT (bytesT [])

, testCase "readSectionRecord: short section id returns EOF preserving input" $ do
    let input = "readSectionRecord [(0)]"
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= errT eofT (bytesT [0])

, testCase "readSectionRecord: missing section version returns EOF preserving unread bytes" $ do
    let input = "readSectionRecord [(0) (2)]"
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= errT eofT (bytesT [0,2])

, testCase "readSectionRecord: short section version returns EOF preserving unread bytes" $ do
    let input = "readSectionRecord [(0) (2) (0) (0) (0)]"
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= errT eofT (bytesT [0])

, testCase "readSectionRecord: missing length returns EOF preserving unread length bytes" $ do
    let input = "readSectionRecord [(0) (2) (0) (0) (0) (16)]"
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= errT eofT (bytesT [])

, testCase "readSectionRecord: short section flags returns EOF preserving unread bytes" $ do
    let input = "readSectionRecord [(0) (2) (0) (0) (0) (16) (0) (0) (0)]"
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= errT eofT (bytesT [0])

-- ------------------------------------------------------------------------
-- Arborix section directory parsing
-- Arboricx section directory parsing
-- ------------------------------------------------------------------------

, testCase "readSectionDirectory: zero records preserves input" $ do
    let input = "readSectionDirectory 0 [(9) (8)]"
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= okT (ofList []) (bytesT [9,8])

, testCase "readSectionDirectory: reads requested records and preserves trailing bytes" $ do
    let input = "readSectionDirectory 2 " ++ bytesExpr (manifestEntryBytes 10 20 ++ nodesEntryBytes 30 40 ++ [9])
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= okT
      (ofList
@@ -1739,171 +1733,171 @@ binaryReaderTests = testGroup "Binary Reader Tests"

, testCase "readSectionDirectory: truncated record returns EOF" $ do
    let input = "readSectionDirectory 2 [(0) (1) (0) (0) (0) (10) (0) (0) (0) (20) (0) (2) (0) (0)]"
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= errT eofT (bytesT [0,0])

-- ------------------------------------------------------------------------
-- Arborix section lookup and raw byte slicing
-- Arboricx section lookup and raw byte slicing
-- ------------------------------------------------------------------------

, testCase "lookupSectionRecord: finds record by raw section id" $ do
    let input = "lookupSectionRecord " ++ bytesExpr nodesSectionIdBytes ++ " [(" ++ "pair " ++ bytesExpr manifestSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 10) ++ " (pair " ++ bytesExpr (u64 20) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ") (" ++ "pair " ++ bytesExpr nodesSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 30) ++ " (pair " ++ bytesExpr (u64 40) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ")]"
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= justT (sectionRecordT nodesSectionIdBytes 30 40)

, testCase "lookupSectionRecord: missing section id returns nothing" $ do
    let input = "lookupSectionRecord " ++ bytesExpr [0,0,0,3] ++ " [(" ++ "pair " ++ bytesExpr manifestSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 10) ++ " (pair " ++ bytesExpr (u64 20) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ") (" ++ "pair " ++ bytesExpr nodesSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 30) ++ " (pair " ++ bytesExpr (u64 40) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ")]"
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= nothingT

, testCase "byteSlice: extracts requested byte range" $ do
    let input = "byteSlice 2 3 [(10) (11) (12) (13) (14) (15)]"
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= bytesT [12,13,14]

, testCase "byteSlice: overlong length returns remaining bytes" $ do
    let input = "byteSlice 4 9 [(10) (11) (12) (13) (14) (15)]"
    library <- evaluateFile "./lib/arborix.tri"
    library <- evaluateFile "./lib/arboricx.tri"
    let env = evalTricu library (parseTricu input)
    result env @?= bytesT [14,15]
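The `byteSlice` fixtures above pin down its semantics: take `length` bytes starting at `offset`, clamping an overlong length at the end of the input. An illustrative equivalent (not project code, just the same contract restated):

```python
def byte_slice(offset: int, length: int, bs: list) -> list:
    # byteSlice 2 3 [10..15] -> [12, 13, 14]; an overlong length
    # (byteSlice 4 9 ...) just returns the remaining bytes.
    return bs[offset:offset + length]

print(byte_slice(2, 3, [10, 11, 12, 13, 14, 15]))  # [12, 13, 14]
print(byte_slice(4, 9, [10, 11, 12, 13, 14, 15]))  # [14, 15]
```

This clamping behavior is why `extractSectionBytesResult` needs its own explicit out-of-bounds check below: slicing alone silently truncates rather than failing.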
-- ------------------------------------------------------------------------
|
||||
-- Arborix minimal container parsing foundation
|
||||
-- Arboricx minimal container parsing foundation
|
||||
-- ------------------------------------------------------------------------
|
||||
|
||||
, testCase "u32BEBytesToNat: decodes zero" $ do
|
||||
let input = "u32BEBytesToNat [(0) (0) (0) (0)]"
|
||||
library <- evaluateFile "./lib/arborix.tri"
|
||||
library <- evaluateFile "./lib/arboricx.tri"
|
||||
let env = evalTricu library (parseTricu input)
|
||||
result env @?= ofNumber 0
|
||||
|
||||
, testCase "u32BEBytesToNat: decodes small section count" $ do
|
||||
let input = "u32BEBytesToNat [(0) (0) (0) (2)]"
|
||||
library <- evaluateFile "./lib/arborix.tri"
|
||||
library <- evaluateFile "./lib/arboricx.tri"
|
||||
let env = evalTricu library (parseTricu input)
|
||||
result env @?= ofNumber 2
|
||||
|
||||
, testCase "u64BEBytesToNat: decodes small node count" $ do
|
||||
let input = "u64BEBytesToNat [(0) (0) (0) (0) (0) (0) (0) (2)]"
|
||||
library <- evaluateFile "./lib/arborix.tri"
|
||||
library <- evaluateFile "./lib/arboricx.tri"
|
||||
let env = evalTricu library (parseTricu input)
|
||||
result env @?= ofNumber 2
|
||||
|
||||
, testCase "u64BEBytesToNat: decodes fixture-scale offset" $ do
|
||||
let input = "u64BEBytesToNat [(0) (0) (0) (0) (0) (0) (3) (214)]"
|
||||
library <- evaluateFile "./lib/arborix.tri"
|
||||
library <- evaluateFile "./lib/arboricx.tri"
|
||||
let env = evalTricu library (parseTricu input)
|
||||
result env @?= ofNumber 982
|
||||
|
||||
, testCase "readArborixContainer: reads header directory and preserves payload" $ do
|
||||
let input = "readArborixContainer " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204])
|
||||
library <- evaluateFile "./lib/arborix.tri"
|
||||
, testCase "readArboricxContainer: reads header directory and preserves payload" $ do
|
||||
let input = "readArboricxContainer " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204])
|
||||
library <- evaluateFile "./lib/arboricx.tri"
|
||||
let env = evalTricu library (parseTricu input)
|
||||
result env @?= okT
|
||||
(pairT
|
||||
(arborixHeaderT 2)
|
||||
(arboricxHeaderT 2)
|
||||
(ofList
|
||||
[ sectionRecordT manifestSectionIdBytes 152 3
|
||||
, sectionRecordT nodesSectionIdBytes 155 4
|
||||
]))
|
||||
(bytesT [101,102,103,201,202,203,204])
|
||||
|
||||
, testCase "readArborixContainer: truncated directory returns EOF" $ do
|
||||
let input = "readArborixContainer " ++ bytesExpr (arborixHeaderBytes 1 ++ [0,0])
|
||||
library <- evaluateFile "./lib/arborix.tri"
|
||||
, testCase "readArboricxContainer: truncated directory returns EOF" $ do
|
||||
let input = "readArboricxContainer " ++ bytesExpr (arboricxHeaderBytes 1 ++ [0,0])
|
||||
library <- evaluateFile "./lib/arboricx.tri"
|
||||
let env = evalTricu library (parseTricu input)
|
||||
result env @?= errT eofT (bytesT [0,0])
|
||||
|
||||
, testCase "readArborixContainer: rejects unsupported major version" $ do
|
||||
let badHeader = [65,82,66,79,82,73,88,0] ++ u16 2 ++ u16 0 ++ u32 0 ++ u64 0 ++ u64 32
|
||||
input = "readArborixContainer " ++ bytesExpr badHeader
|
||||
library <- evaluateFile "./lib/arborix.tri"
|
||||
, testCase "readArboricxContainer: rejects unsupported major version" $ do
|
||||
let badHeader = [65,82,66,79,82,73,67,88] ++ u16 2 ++ u16 0 ++ u32 0 ++ u64 0 ++ u64 32
|
||||
input = "readArboricxContainer " ++ bytesExpr badHeader
|
||||
library <- evaluateFile "./lib/arboricx.tri"
|
||||
let env = evalTricu library (parseTricu input)
|
||||
result env @?= errT unsupportedVersionT (bytesT [])
|
||||
|
||||
, testCase "readArborixContainer: rejects unsupported minor version" $ do
|
||||
let badHeader = [65,82,66,79,82,73,88,0] ++ u16 1 ++ u16 1 ++ u32 0 ++ u64 0 ++ u64 32
|
||||
input = "readArborixContainer " ++ bytesExpr badHeader
|
||||
library <- evaluateFile "./lib/arborix.tri"
|
||||
, testCase "readArboricxContainer: rejects unsupported minor version" $ do
|
||||
let badHeader = [65,82,66,79,82,73,67,88] ++ u16 1 ++ u16 1 ++ u32 0 ++ u64 0 ++ u64 32
|
||||
input = "readArboricxContainer " ++ bytesExpr badHeader
|
||||
library <- evaluateFile "./lib/arboricx.tri"
|
||||
let env = evalTricu library (parseTricu input)
|
||||
result env @?= errT unsupportedVersionT (bytesT [])
|
||||
|
||||
, testCase "readArborixContainer: rejects duplicate section ids" $ do
|
||||
let input = "readArborixContainer " ++ bytesExpr (arborixHeaderBytes 2 ++ manifestEntryBytes 152 1 ++ manifestEntryBytes 153 1 ++ [9])
|
||||
library <- evaluateFile "./lib/arborix.tri"
|
||||
, testCase "readArboricxContainer: rejects duplicate section ids" $ do
|
||||
let input = "readArboricxContainer " ++ bytesExpr (arboricxHeaderBytes 2 ++ manifestEntryBytes 152 1 ++ manifestEntryBytes 153 1 ++ [9])
|
||||
library <- evaluateFile "./lib/arboricx.tri"
|
||||
let env = evalTricu library (parseTricu input)
|
||||
result env @?= errT duplicateSectionT (bytesT [9])
|
||||
|
||||
, testCase "extractSectionBytes: uses raw offset and length fields" $ do
|
||||
let input = "extractSectionBytes " ++ sectionRecordExpr nodesSectionIdBytes 3 4 ++ " " ++ bytesExpr [10,11,12,13,14,15,16,17]
|
||||
library <- evaluateFile "./lib/arborix.tri"
|
||||
library <- evaluateFile "./lib/arboricx.tri"
|
||||
let env = evalTricu library (parseTricu input)
|
||||
result env @?= bytesT [13,14,15,16]
|
||||
|
||||
, testCase "lookupSectionBytes: finds section and extracts raw bytes" $ do
|
||||
let input = "lookupSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " [" ++ sectionRecordExpr manifestSectionIdBytes 1 2 ++ " " ++ sectionRecordExpr nodesSectionIdBytes 4 3 ++ "] " ++ bytesExpr [10,11,12,13,14,15,16,17]
|
||||
library <- evaluateFile "./lib/arborix.tri"
|
||||
library <- evaluateFile "./lib/arboricx.tri"
|
||||
let env = evalTricu library (parseTricu input)
|
||||
result env @?= justT (bytesT [14,15,16])
|
||||
|
||||
  , testCase "lookupSectionBytes: missing section returns nothing" $ do
      let input = "lookupSectionBytes " ++ bytesExpr [0,0,0,3] ++ " [" ++ sectionRecordExpr manifestSectionIdBytes 1 2 ++ " " ++ sectionRecordExpr nodesSectionIdBytes 4 3 ++ "] " ++ bytesExpr [10,11,12,13,14,15,16,17]
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= nothingT

  , testCase "extractSectionBytesResult: rejects out-of-bounds section" $ do
      let input = "extractSectionBytesResult " ++ sectionRecordExpr nodesSectionIdBytes 6 4 ++ " " ++ bytesExpr [10,11,12,13,14,15,16,17] ++ " []"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT eofT (bytesT [])

  , testCase "readArboricxSectionBytes: extracts requested section from container" $ do
      let input = "readArboricxSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204])
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT (bytesT [201,202,203,204]) (bytesT [101,102,103,201,202,203,204])

  , testCase "readArboricxSectionBytes: missing section returns missing-section err" $ do
      let input = "readArboricxSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " " ++ bytesExpr (singleSectionContainerBytes manifestSectionIdBytes [101,102,103])
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT missingSectionT (bytesT [101,102,103])

  , testCase "readArboricxRequiredSections: extracts manifest and nodes bytes" $ do
      let input = "readArboricxRequiredSections " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204])
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT
        (pairT (bytesT [101,102,103]) (bytesT [201,202,203,204]))
        (bytesT [101,102,103,201,202,203,204])

  , testCase "readArboricxRequiredSections: missing nodes section returns missing-section err" $ do
      let input = "readArboricxRequiredSections " ++ bytesExpr (singleSectionContainerBytes manifestSectionIdBytes [101,102,103])
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT missingSectionT (bytesT [101,102,103])

  , testCase "readArboricxRequiredSections: out-of-bounds section returns EOF" $ do
      let manifestBytes = [101,102,103]
          nodesBytes = [201,202,203,204]
          badContainer = arboricxHeaderBytes 2 ++ manifestEntryBytes 152 3 ++ nodesEntryBytes 155 9 ++ manifestBytes ++ nodesBytes
          input = "readArboricxRequiredSections " ++ bytesExpr badContainer
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT eofT (bytesT [101,102,103,201,202,203,204])

  -- ------------------------------------------------------------------------
  -- Arboricx raw nodes section parsing
  -- ------------------------------------------------------------------------

  , testCase "readNodeRecord: parses hash length and raw payload" $ do
      let input = "readNodeRecord [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (3) (101) (102) (103) (9)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT
        (pairT (bytesT [1..32])
@@ -1913,13 +1907,13 @@ binaryReaderTests = testGroup "Binary Reader Tests"

  , testCase "readNodeRecord: truncated payload returns EOF preserving unread payload" $ do
      let input = "readNodeRecord [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (3) (101) (102)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT eofT (bytesT [101,102])

  , testCase "readNodesSection: parses node count and records" $ do
      let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (0) (9)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT
        (pairT (bytesT [0,0,0,0,0,0,0,1])
@@ -1932,37 +1926,37 @@ binaryReaderTests = testGroup "Binary Reader Tests"

  , testCase "readNodesSectionComplete: rejects trailing bytes inside nodes section" $ do
      let input = "readNodesSectionComplete [(0) (0) (0) (0) (0) (0) (0) (0) (9)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT unexpectedBytesT (bytesT [9])

  , testCase "readNodesSection: rejects duplicate node hashes" $ do
      let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (0) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (0) (9)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT duplicateNodeT (bytesT [9])

  , testCase "nodePayloadValid?: accepts leaf stem and fork payload shapes" $ do
      let input = "[(nodePayloadValid? [(0)]) (nodePayloadValid? [(1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)]) (nodePayloadValid? [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)])]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= ofList [trueT, trueT, trueT]

  , testCase "nodePayloadValid?: rejects invalid payload shapes" $ do
      let input = "[(nodePayloadValid? []) (nodePayloadValid? [(9)]) (nodePayloadValid? [(1) (1)]) (nodePayloadValid? [(2) (1) (2)])]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= ofList [falseT, falseT, falseT, falseT]

  , testCase "node payload child accessors expose raw hashes" $ do
      let input = "[(nodePayloadStemChildHash [(1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)]) (nodePayloadForkLeftHash [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)]) (nodePayloadForkRightHash [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)])]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= ofList [bytesT [1..32], bytesT [1..32], bytesT [33..64]]

  , testCase "lookupNodeRecord: finds record by raw node hash" $ do
      let input = "lookupNodeRecord [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] (pair [(0) (0) (0) (1)] [(0)]))]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= justT
        (pairT (bytesT [33..64])
@@ -1971,7 +1965,7 @@ binaryReaderTests = testGroup "Binary Reader Tests"

  , testCase "nodeRecordChildHashes: extracts stem and fork references" $ do
      let input = "[(nodeRecordChildHashes (pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (33)] [(1) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)]))) (nodeRecordChildHashes (pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (65)] [(2) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64) (65) (66) (67) (68) (69) (70) (71) (72) (73) (74) (75) (76) (77) (78) (79) (80) (81) (82) (83) (84) (85) (86) (87) (88) (89) (90) (91) (92) (93) (94) (95) (96)])))]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= ofList
        [ ofList [bytesT [33..64]]
@@ -1980,20 +1974,20 @@ binaryReaderTests = testGroup "Binary Reader Tests"

  , testCase "readNodesSection: rejects invalid node payload shape" $ do
      let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (9)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT invalidNodePayloadT (bytesT [])

  , testCase "readNodesSection: rejects missing child node" $ do
      let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (33) (1) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64) (9)]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= errT missingNodeT (bytesT [9])

  , testCase "readArboricxNodesSection: extracts and parses raw nodes section" $ do
      let nodesBytes = u64 1 ++ [1..32] ++ u32 1 ++ [0]
          input = "readArboricxNodesSection " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes)
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT
        (pairT (bytesT [0,0,0,0,0,0,0,1])
@@ -2005,186 +1999,186 @@ binaryReaderTests = testGroup "Binary Reader Tests"
        (bytesT ([101,102,103] ++ nodesBytes))

  -- ------------------------------------------------------------------------
  -- Arboricx node DAG reconstruction
  -- ------------------------------------------------------------------------

  , testCase "nodeHashToTree: reconstructs leaf node" $ do
      let input = "nodeHashToTree [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)]))]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT Leaf Leaf

  , testCase "nodeHashToTree: reconstructs stem node" $ do
      let input = "nodeHashToTree [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] (pair [(0) (0) (0) (33)] [(1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)]))]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT (Stem Leaf) Leaf

  , testCase "nodeHashToTree: reconstructs fork node" $ do
      let input = "nodeHashToTree [(65) (66) (67) (68) (69) (70) (71) (72) (73) (74) (75) (76) (77) (78) (79) (80) (81) (82) (83) (84) (85) (86) (87) (88) (89) (90) (91) (92) (93) (94) (95) (96)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(65) (66) (67) (68) (69) (70) (71) (72) (73) (74) (75) (76) (77) (78) (79) (80) (81) (82) (83) (84) (85) (86) (87) (88) (89) (90) (91) (92) (93) (94) (95) (96)] (pair [(0) (0) (0) (65)] [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)]))]"
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT (Fork Leaf Leaf) Leaf

  , testCase "readArboricxTreeFromHash: reconstructs tree from bundle bytes" $ do
      let nodesBytes = u64 1 ++ [1..32] ++ u32 1 ++ [0]
          input = "readArboricxTreeFromHash " ++ bytesExpr [1..32] ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes)
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT Leaf (bytesT ([101,102,103] ++ nodesBytes))

  , testCase "readArboricxExecutableFromHash: alias reconstructs tree" $ do
      let nodesBytes = u64 1 ++ [1..32] ++ u32 1 ++ [0]
          input = "readArboricxExecutableFromHash " ++ bytesExpr [1..32] ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes)
      library <- evaluateFile "./lib/arboricx.tri"
      let env = evalTricu library (parseTricu input)
      result env @?= okT Leaf (bytesT ([101,102,103] ++ nodesBytes))

  , testCase "readArboricxNodesSection: reads id fixture bundle" $ do
      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
      case decodeBundle fixtureBytes of
        Left err -> assertFailure $ "decodeBundle failed: " ++ err
        Right _ -> do
          let input = "matchResult (code rest : code) (nodes rest : 0) (readArboricxNodesSection "
                        ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
                        ++ ")"
          library <- evaluateFile "./lib/arboricx.tri"
          let env = evalTricu library (parseTricu input)
          result env @?= ofNumber 0

  , testCase "readArboricxNodesSection: reads notQ fixture bundle" $ do
      fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx"
      case decodeBundle fixtureBytes of
        Left err -> assertFailure $ "decodeBundle failed: " ++ err
        Right _ -> do
          let input = "matchResult (code rest : code) (nodes rest : 0) (readArboricxNodesSection "
                        ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
                        ++ ")"
          library <- evaluateFile "./lib/arboricx.tri"
          let env = evalTricu library (parseTricu input)
          result env @?= ofNumber 0

  , testCase "readArboricxNodesSection: reads map fixture bundle" $ do
      fixtureBytes <- BS.readFile "test/fixtures/map.arboricx"
      case decodeBundle fixtureBytes of
        Left err -> assertFailure $ "decodeBundle failed: " ++ err
        Right _ -> do
          let input = "matchResult (code rest : code) (nodes rest : 0) (readArboricxNodesSection "
                        ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
                        ++ ")"
          library <- evaluateFile "./lib/arboricx.tri"
          let env = evalTricu library (parseTricu input)
          result env @?= ofNumber 0

  , testCase "readArboricxExecutableFromHash: reconstructs id fixture root" $ do
      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
      case decodeBundle fixtureBytes of
        Left err -> assertFailure $ "decodeBundle failed: " ++ err
        Right bundle -> case bundleRoots bundle of
          [] -> assertFailure "fixture has no roots"
          (rootHash:_) -> do
            let input = "matchResult (code rest : code) (tree rest : 0) (readArboricxExecutableFromHash "
                          ++ bytesExpr (hexTextBytes rootHash)
                          ++ " "
                          ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
                          ++ ")"
            library <- evaluateFile "./lib/arboricx.tri"
            let env = evalTricu library (parseTricu input)
            result env @?= ofNumber 0

  , testCase "readArboricxExecutableFromHash: reconstructs notQ fixture root" $ do
      fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx"
      case decodeBundle fixtureBytes of
        Left err -> assertFailure $ "decodeBundle failed: " ++ err
        Right bundle -> case bundleRoots bundle of
          [] -> assertFailure "fixture has no roots"
          (rootHash:_) -> do
            let input = "matchResult (code rest : code) (tree rest : 0) (readArboricxExecutableFromHash "
                          ++ bytesExpr (hexTextBytes rootHash)
                          ++ " "
                          ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
                          ++ ")"
            library <- evaluateFile "./lib/arboricx.tri"
            let env = evalTricu library (parseTricu input)
            result env @?= ofNumber 0

  , testCase "readArboricxExecutableFromHash: reconstructs map fixture root" $ do
      fixtureBytes <- BS.readFile "test/fixtures/map.arboricx"
      case decodeBundle fixtureBytes of
        Left err -> assertFailure $ "decodeBundle failed: " ++ err
        Right bundle -> case bundleRoots bundle of
          [] -> assertFailure "fixture has no roots"
          (rootHash:_) -> do
            let input = "matchResult (code rest : code) (tree rest : 0) (readArboricxExecutableFromHash "
                          ++ bytesExpr (hexTextBytes rootHash)
                          ++ " "
                          ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
                          ++ ")"
            library <- evaluateFile "./lib/arboricx.tri"
            let env = evalTricu library (parseTricu input)
            result env @?= ofNumber 0

  , testCase "readArboricxExecutableFromHash: executes id fixture root" $ do
      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
      case decodeBundle fixtureBytes of
        Left err -> assertFailure $ "decodeBundle failed: " ++ err
        Right bundle -> case bundleRoots bundle of
          [] -> assertFailure "fixture has no roots"
          (rootHash:_) -> do
            let input = "matchResult (code rest : code) (tree rest : tree 42) (readArboricxExecutableFromHash "
                          ++ bytesExpr (hexTextBytes rootHash)
                          ++ " "
                          ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
                          ++ ")"
            library <- evaluateFile "./lib/arboricx.tri"
            let env = evalTricu library (parseTricu input)
            result env @?= ofNumber 42

  , testCase "readArboricxExecutableFromHash: executes notQ fixture on true" $ do
      fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx"
      case decodeBundle fixtureBytes of
        Left err -> assertFailure $ "decodeBundle failed: " ++ err
        Right bundle -> case bundleRoots bundle of
          [] -> assertFailure "fixture has no roots"
          (rootHash:_) -> do
            let input = "matchResult (code rest : code) (tree rest : tree true) (readArboricxExecutableFromHash "
                          ++ bytesExpr (hexTextBytes rootHash)
                          ++ " "
                          ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
                          ++ ")"
            library <- evaluateFile "./lib/arboricx.tri"
            let env = evalTricu library (parseTricu input)
            result env @?= falseT

  , testCase "readArboricxExecutableFromHash: executes notQ fixture on false" $ do
      fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx"
      case decodeBundle fixtureBytes of
        Left err -> assertFailure $ "decodeBundle failed: " ++ err
        Right bundle -> case bundleRoots bundle of
          [] -> assertFailure "fixture has no roots"
          (rootHash:_) -> do
            let input = "matchResult (code rest : code) (tree rest : tree false) (readArboricxExecutableFromHash "
                          ++ bytesExpr (hexTextBytes rootHash)
                          ++ " "
                          ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
                          ++ ")"
            library <- evaluateFile "./lib/arboricx.tri"
            let env = evalTricu library (parseTricu input)
            result env @?= trueT

  , testCase "readArboricxExecutableFromHash: executes map fixture root" $ do
      fixtureBytes <- BS.readFile "test/fixtures/map.arboricx"
      case decodeBundle fixtureBytes of
        Left err -> assertFailure $ "decodeBundle failed: " ++ err
        Right bundle -> case bundleRoots bundle of
          [] -> assertFailure "fixture has no roots"
          (rootHash:_) -> do
            let input = "matchResult (code rest : code) (tree rest : head (tail (tree (a : (t t t)) [(t) (t) (t)]))) (readArboricxExecutableFromHash "
                          ++ bytesExpr (hexTextBytes rootHash)
                          ++ " "
                          ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes)
                          ++ ")"
            library <- evaluateFile "./lib/arboricx.tri"
            let env = evalTricu library (parseTricu input)
            result env @?= Fork Leaf Leaf
  ]
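Aside (not part of the diff): the node-record layout these tests feed to `readNodeRecord` is a 32-byte node hash, a big-endian u32 payload length, then the payload itself. A minimal Haskell sketch of that framing; the helper names `u32be` and `encodeNodeRecord` are illustrative only, not part of the library:

```haskell
import Data.Bits (shiftR, (.&.))
import Data.Word (Word8)

-- Big-endian 32-bit length prefix, matching the "(0) (0) (0) (3)" bytes above.
u32be :: Int -> [Word8]
u32be n = [fromIntegral (n `shiftR` s .&. 0xff) | s <- [24, 16, 8, 0]]

-- A node record is: hash (32 bytes) ++ u32 payload length ++ payload.
encodeNodeRecord :: [Word8] -> [Word8] -> [Word8]
encodeNodeRecord h payload = h ++ u32be (length payload) ++ payload

main :: IO ()
main = print (encodeNodeRecord [1..32] [0])  -- 32 hash bytes, [0,0,0,1], then [0]
```

For example, `encodeNodeRecord [1..32] [0]` yields the 37-byte leaf record that the "parses hash length and raw payload" test spells out literally.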
BIN test/fixtures/false.arboricx vendored Normal file (Binary file not shown.)
BIN test/fixtures/false.arborix vendored (Binary file not shown.)
BIN test/fixtures/id.arboricx vendored Normal file (Binary file not shown.)
BIN test/fixtures/id.arborix vendored (Binary file not shown.)
BIN test/fixtures/map.arboricx vendored Normal file (Binary file not shown.)
BIN test/fixtures/map.arborix vendored (Binary file not shown.)
BIN test/fixtures/notQ.arboricx vendored Normal file (Binary file not shown.)
BIN test/fixtures/notQ.arborix vendored (Binary file not shown.)
BIN test/fixtures/true.arboricx vendored Normal file (Binary file not shown.)
BIN test/fixtures/true.arborix vendored (Binary file not shown.)