From 31bf7094f4ed1c1023de02e1e0d52a9b5d6b8d7f Mon Sep 17 00:00:00 2001 From: James Eversole Date: Mon, 11 May 2026 19:53:37 -0500 Subject: [PATCH] Arboricx bundle format 1.1 We don't need SHA verification or Merkle dags in our transport bundle. Content stores can handle both bundle and term verification and hashing. --- docs/arboricx-bundle-format.md | 363 +- ext/js/.gitignore | 1 + ext/js/package-lock.json | 29 + ext/js/package.json | 9 +- ext/js/src/bundle.js | 191 - ext/js/src/cli.js | 251 +- ext/js/src/codecs.js | 135 - ext/js/src/lib.js | 224 + ext/js/src/manifest.js | 374 -- ext/js/src/merkle.js | 276 -- ext/js/src/tree.js | 125 - ext/js/test/bundle.test.js | 189 +- ext/js/test/merkle.test.js | 180 - ext/js/test/reduce.test.js | 163 +- ext/js/test/run-bundle.test.js | 207 +- ext/zig/kernel_run_arboricx_typed.dag | 4822 ++++++++++----------- ext/zig/result | 1 + ext/zig/src/bundle.zig | 232 +- ext/zig/src/main.zig | 46 +- ext/zig/src/reduce.zig | 20 +- ext/zig/tests/native_bundle_append_test.c | 2 +- ext/zig/tests/native_bundle_bools_test.c | 8 +- ext/zig/tests/native_bundle_id_test.c | 2 +- ext/zig/tests/python_ffi_test.py | 2 +- flake.nix | 44 + lib/arboricx-common.tri | 22 +- lib/arboricx-dispatch.tri | 19 +- lib/arboricx-manifest.tri | 26 +- lib/arboricx-nodes.tri | 222 +- lib/arboricx.tri | 57 +- lib/list.tri | 12 + src/FileEval.hs | 45 +- src/Main.hs | 38 +- src/REPL.hs | 23 +- src/Server.hs | 23 +- src/Wire.hs | 831 ++-- test/Spec.hs | 1943 +-------- test/fixtures/append.arboricx | Bin 4711 -> 968 bytes test/fixtures/false.arboricx | Bin 572 -> 432 bytes test/fixtures/id.arboricx | Bin 811 -> 460 bytes test/fixtures/map.arboricx | Bin 5593 -> 1079 bytes test/fixtures/notQ.arboricx | Bin 1045 -> 492 bytes test/fixtures/size.arboricx | Bin 0 -> 1300 bytes test/fixtures/true.arboricx | Bin 641 -> 440 bytes tricu.cabal | 2 + 45 files changed, 4032 insertions(+), 7127 deletions(-) create mode 100644 ext/js/.gitignore create mode 100644 
ext/js/package-lock.json delete mode 100644 ext/js/src/bundle.js delete mode 100644 ext/js/src/codecs.js create mode 100644 ext/js/src/lib.js delete mode 100644 ext/js/src/manifest.js delete mode 100644 ext/js/src/merkle.js delete mode 100644 ext/js/src/tree.js delete mode 100644 ext/js/test/merkle.test.js create mode 120000 ext/zig/result create mode 100644 test/fixtures/size.arboricx diff --git a/docs/arboricx-bundle-format.md b/docs/arboricx-bundle-format.md index f567f6a..9b9f26c 100644 --- a/docs/arboricx-bundle-format.md +++ b/docs/arboricx-bundle-format.md @@ -1,117 +1,119 @@ # Arboricx Portable Bundle Format Specification -**Version:** 0.1 -**Status:** Exploratory -**Author:** A range of slopmachines guided by James Eversole -**Human Review Status:** 5 minute scan-through - this is an evolving and malleable document +**Version:** 1.1 (Indexed) -The Arboricx Portable Bundle is a self-contained, content-addressed binary format for distributing Tree Calculus programs and their associated Merkle DAGs. It provides: +**Status:** Stable -- A fixed binary container with header, section directory, and typed sections -- A language-neutral Merkle node layer for content-addressed tree values -- A fixed-order binary manifest for semantic metadata, exports, and optional extensions +**Author:** Slopmachines guided by James Eversole + +The Arboricx Portable Bundle is a self-contained binary format for distributing Tree Calculus programs. It uses topological indexing instead of cryptographic hashing for node identity, making it writable from pure Tree Calculus and verifiable via structural inspection. ## Table of Contents -1. [Top-Level Container Layout](#1-top-level-container-layout) -2. [Header](#2-header) -3. [Section Directory](#3-section-directory) -4. [Section: Manifest (type 1)](#4-section-manifest-type-1) -5. [Section: Nodes (type 2)](#5-section-nodes-type-2) -6. [Merkle Node Payload Format](#6-merkle-node-payload-format) -7. 
[Merkle Hash Computation](#7-merkle-hash-computation) +1. [Design Principles](#1-design-principles) +2. [Top-Level Container Layout](#2-top-level-container-layout) +3. [Header](#3-header) +4. [Section Directory](#4-section-directory) +5. [Section: Manifest (type 1)](#5-section-manifest-type-1) +6. [Section: Nodes (type 2)](#6-section-nodes-type-2) +7. [Node Payload Format](#7-node-payload-format) 8. [Tree Calculus Reduction Semantics](#8-tree-calculus-reduction-semantics) 9. [Binary Primitives](#9-binary-primitives) 10. [Bundle Verification](#10-bundle-verification) -11. [Known Section Types](#11-known-section-types) +11. [Canonicalization](#11-canonicalization) +12. [Known Section Types](#12-known-section-types) --- -## 1. Top-Level Container Layout +## 1. Design Principles -An Arboricx bundle is a flat binary blob with the following layout: +- **No cryptographic primitives required.** Node identity is topological (array index), not a SHA-256 hash. +- **Self-contained.** A bundle includes all nodes reachable from its exports. No external references. +- **Deterministic.** Canonical bundles produce byte-identical output for identical input terms. +- **Small.** ~5 bytes per node entry (length + payload) versus ~36 bytes in hash-based formats. +- **Verifiable via structure.** Bounds checking and acyclicity verification replace hash recomputation. + +Global artifact identity (for registries, lockfiles, or content-addressed caches) is achieved by hashing the complete canonical bundle file externally. The bundle format itself knows nothing about this hash. + +--- + +## 2. 
Top-Level Container Layout ``` +------------------+------------------+------------------+------------------+ | Header | Section Directory| Manifest Section | Nodes Section | -| (32 bytes) | (N × 60 bytes) | (variable) | (variable) | +| (32 bytes) | (N × 32 bytes) | (variable) | (variable) | +------------------+------------------+------------------+------------------+ ``` -The container uses **big-endian** byte order for all multi-byte integers. +Total bundle size = 32 + (sectionCount × 32) + manifestSize + nodesSize -Total bundle size = 32 + (sectionCount × 60) + manifestSize + nodesSize +All multi-byte integers use **big-endian** byte order. --- -## 2. Header +## 3. Header | Offset | Size | Field | Description | |--------|------|-------|-------------| -| 0 | 8 bytes | Magic | ASCII `"ARBORICX"` (`0x41 0x52 0x42 0x4F 0x52 0x49 0x43 0x58`) | +| 0 | 8 bytes | Magic | ASCII `"ARBORICX"` | | 8 | 2 bytes | Major version | `u16` BE. Currently `1` | | 10 | 2 bytes | Minor version | `u16` BE. Currently `0` | | 12 | 4 bytes | Section count | `u32` BE. Number of entries in the section directory | | 16 | 8 bytes | Flags | `u64` BE. Reserved; currently all zeros | -| 24 | 8 bytes | Directory offset | `u64` BE. Byte offset from the start of the bundle to the section directory | - -**Constraints:** -- Major version must be `1`. Bundles with unsupported major versions are rejected. -- The directory offset must point to a valid location within the bundle. -- The directory offset is always `32` for bundles with the current layout (header immediately followed by the directory). +| 24 | 8 bytes | Directory offset | `u64` BE. Byte offset to the section directory (always `32`) | --- -## 3. Section Directory +## 4. Section Directory -The section directory is an array of `N` entries, where `N` is the section count from the header. Each entry is exactly **60 bytes**. +Array of `N` entries, each exactly **32 bytes**. 
| Offset (within entry) | Size | Field | Description | |----------------------|------|-------|-------------| -| 0 | 4 bytes | Type | `u32` BE. Section type identifier (see [Known Section Types](#11-known-section-types)) | +| 0 | 4 bytes | Type | `u32` BE. Section type identifier | | 4 | 2 bytes | Version | `u16` BE. Section-specific version | -| 6 | 2 bytes | Flags | `u16` BE. Bit flags: bit 0 (`0x0001`) = critical section | -| 8 | 2 bytes | Compression | `u16` BE. Compression codec (currently only `0` = none) | -| 10 | 2 bytes | Digest algorithm | `u16` BE. Hash algorithm (currently only `1` = SHA-256) | -| 12 | 8 bytes | Offset | `u64` BE. Byte offset from the start of the bundle to the section data | -| 20 | 8 bytes | Length | `u64` BE. Length of the section data in bytes | -| 28 | 32 bytes | SHA-256 digest | Raw digest of the section data | +| 6 | 2 bytes | Flags | `u16` BE. Bit 0 (`0x0001`) = critical section | +| 8 | 2 bytes | Compression | `u16` BE. `0` = none (currently the only value) | +| 10 | 2 bytes | Reserved | `u16` BE. Padding; must be zero | +| 12 | 8 bytes | Offset | `u64` BE. Byte offset from bundle start to section data | +| 20 | 8 bytes | Length | `u64` BE. Length of section data in bytes | +| 28 | 4 bytes | Reserved | Padding; must be zero | **Verification:** -- Unknown critical sections (flags & `0x0001`) are rejected. +- Unknown critical sections are rejected. - Compression must be `0` (none). -- Digest algorithm must be `1` (SHA-256). -- The SHA-256 digest in the directory entry must match `SHA256(section_data)`. +- Reserved fields must be zero. + +**Note:** No per-section digest is stored. Integrity is verified at the distribution layer (e.g. SHA-256 of the complete bundle file) rather than inside the container. --- -## 4. Section: Manifest (type 1) +## 5. Section: Manifest (type 1) -The manifest is a binary encoding of bundle metadata. It uses a **fixed-order core** layout followed by an optional **TLV tail** for extensibility. 
- -### 4.1 Format +Binary encoding of bundle metadata. Fixed-order core layout followed by optional TLV tail. ``` Manifest = magic 8 bytes "ARBMNFST" major u16 BE Manifest major version (1) - minor u16 BE Manifest minor version (0) + minor u16 BE Manifest minor version (1) - schema string Length-prefixed UTF-8 text - bundleType string Length-prefixed UTF-8 text + schema string "arboricx.bundle.manifest.v1" + bundleType string "tree-calculus-executable-object" - treeCalculus string Length-prefixed UTF-8 text - treeHashAlgorithm string Length-prefixed UTF-8 text - treeHashDomain string Length-prefixed UTF-8 text - treeNodePayload string Length-prefixed UTF-8 text + treeCalculus string "tree-calculus.v1" + treeHashAlgorithm string "indexed" + treeHashDomain string "arboricx.indexed.node.v1" + treeNodePayload string "arboricx.indexed.payload.v1" - runtimeSemantics string Length-prefixed UTF-8 text - runtimeEvaluation string Length-prefixed UTF-8 text - runtimeAbi string Length-prefixed UTF-8 text - capabilityCount u32 BE Number of capability strings - capabilities string[] Array of length-prefixed UTF-8 capability strings + runtimeSemantics string "tree-calculus.v1" + runtimeEvaluation string "normal-order" + runtimeAbi string "arboricx.abi.tree.v1" + capabilityCount u32 BE Number of capability strings (currently 0) + capabilities string[] Array of length-prefixed UTF-8 strings - closure u8 0 = complete, 1 = partial + closure u8 0 = complete rootCount u32 BE Number of root entries roots Root[] Array of root entries exportCount u32 BE Number of export entries @@ -119,93 +121,76 @@ Manifest = metadataFieldCount u32 BE Number of metadata TLV entries metadataFields TLV[] Metadata tag-value entries - extensionFieldCount u32 BE Number of extension TLV entries - extensionFields TLV[] Extension tag-value entries (skipped by parsers) + extensionFieldCount u32 BE Number of extension TLV entries (currently 0) + extensionFields TLV[] Extension entries (skipped by parsers) ``` 
-**Trailing bytes after the manifest must be zero** (no leftover data). - -### 4.2 String Format - -Every `string` field uses the same encoding: +### String Format ``` string = - length u32 BE Number of UTF-8 bytes in the string (not the number of characters) - bytes byte[length] UTF-8 encoded string content + length u32 BE Number of UTF-8 bytes + bytes byte[length] UTF-8 content ``` -The length field carries the byte count, so parsers can skip strings without decoding UTF-8. - -### 4.3 Root Entry +### Root Entry ``` Root = - hash 32 bytes Raw SHA-256 hash of the Merkle node - role string Length-prefixed UTF-8 text ("default" for the first root, "root" for others) + index u32 BE Node index into the nodes section + role string Length-prefixed UTF-8 ("default" for first root, "root" for others) ``` -The hash is stored as **raw bytes** (not hex-encoded). It corresponds to the Merkle hash of the node. - -### 4.4 Export Entry +### Export Entry ``` Export = - name string Length-prefixed UTF-8 text (export identifier) - root 32 bytes Raw SHA-256 hash of the Merkle node - kind string Length-prefixed UTF-8 text (currently "term") - abi string Length-prefixed UTF-8 text (ABI string) + name string Length-prefixed UTF-8 export identifier + root u32 BE Node index into the nodes section + kind string Length-prefixed UTF-8 (currently "term") + abi string Length-prefixed UTF-8 ABI string ``` -### 4.5 TLV Entry +### TLV Entry ``` TLV = - tag u16 BE Tag identifier (type) - length u32 BE Number of bytes in the value - value byte[length] Raw bytes + tag u16 BE Tag identifier + length u32 BE Value length in bytes + value byte[length] ``` -TLV entries support variable-length values and are skippable by parsers that do not recognize a tag: read the `u32` length and advance by `2 + 4 + length` bytes. 
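The length-prefixed `string` and TLV encodings above take only a few lines of buffer arithmetic to read. A minimal Node.js sketch, assuming an already-sliced manifest buffer (`readString` and `readTlvEntries` are illustrative helper names, not part of any reference implementation):

```javascript
// Read a length-prefixed UTF-8 string: u32 BE byte count, then bytes.
// The count is bytes, not characters, so no UTF-8 decoding is needed to skip.
function readString(buf, offset) {
  const length = buf.readUInt32BE(offset);
  const value = buf.toString('utf8', offset + 4, offset + 4 + length);
  return { value, offset: offset + 4 + length };
}

// Read `count` TLV entries. Unknown tags stay raw, so a parser can skip
// any entry by advancing 2 (tag) + 4 (length) + length bytes.
function readTlvEntries(buf, offset, count) {
  const entries = [];
  for (let i = 0; i < count; i++) {
    const tag = buf.readUInt16BE(offset);
    const length = buf.readUInt32BE(offset + 2);
    const value = buf.subarray(offset + 6, offset + 6 + length);
    entries.push({ tag, value });
    offset += 6 + length;
  }
  return { entries, offset };
}
```

Both helpers return the advanced offset, which is what makes the fixed-order core parseable as a straight left-to-right scan.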
- -### 4.6 Metadata Tags +### Metadata Tags | Tag | Name | Value | |-----|------|-------| -| 1 | package | UTF-8 text: package name | -| 2 | version | UTF-8 text: version string | -| 3 | description | UTF-8 text: description | -| 4 | license | UTF-8 text: license identifier or text | -| 5 | createdBy | UTF-8 text: creator identifier | +| 1 | package | UTF-8 text | +| 2 | version | UTF-8 text | +| 3 | description | UTF-8 text | +| 4 | license | UTF-8 text | +| 5 | createdBy | UTF-8 text | Unknown metadata tags are ignored. Unknown extension tags are skipped by length. -### 4.7 Semantic Constraints - -A valid bundle manifest must satisfy: +### Semantic Constraints | Constraint | Value | |-----------|-------| | `schema` | `"arboricx.bundle.manifest.v1"` | | `bundleType` | `"tree-calculus-executable-object"` | | `treeCalculus` | `"tree-calculus.v1"` | -| `treeHashAlgorithm` | `"sha256"` | -| `treeHashDomain` | `"arboricx.merkle.node.v1"` | -| `treeNodePayload` | `"arboricx.merkle.payload.v1"` | +| `treeHashAlgorithm` | `"indexed"` | +| `treeHashDomain` | `"arboricx.indexed.node.v1"` | +| `treeNodePayload` | `"arboricx.indexed.payload.v1"` | | `runtimeSemantics` | `"tree-calculus.v1"` | | `runtimeAbi` | `"arboricx.abi.tree.v1"` | -| `runtimeCapabilities` | Empty array | | `closure` | `0` (complete) | | `rootCount` | At least 1 | | `exportCount` | At least 1 | -| Export names | Non-empty | -| Export roots | Non-empty (32 bytes each) | --- -## 5. Section: Nodes (type 2) - -The nodes section contains all Merkle DAG nodes referenced by the manifest. It is a sequence of node entries preceded by a count. +## 6. 
Section: Nodes (type 2) ``` NodesSection = @@ -213,22 +198,21 @@ NodesSection = entries NodeEntry[] ``` -Each node entry: +### Node Entry ``` NodeEntry = - hash 32 bytes Raw SHA-256 hash of this node - payloadLen u32 BE Length of the payload in bytes - payload byte[payloadLen] Node payload (see Section 6) + payloadLen u32 BE Length of payload in bytes + payload byte[payloadLen] ``` -The node count is `u64` to support large bundles. Entries are stored in the order produced by the exporter (typically sorted by hash for determinism). +There is **no hash field**. The node is identified solely by its position in the array. --- -## 6. Merkle Node Payload Format +## 7. Node Payload Format -Each node in the Merkle DAG is one of three types. The payload is a single byte type tag followed by hash references: +Child references are `u32` big-endian indices into the node array. The array **must** be topologically sorted: every child index must be strictly less than the entry's own position. ### Leaf @@ -236,152 +220,116 @@ Each node in the Merkle DAG is one of three types. The payload is a single byte Payload = 0x00 ``` -A leaf has no children. The payload is exactly 1 byte. +Exactly 1 byte. ### Stem ``` -Payload = 0x01 || child_hash (32 bytes raw) +Payload = 0x01 || child_index (u32 BE) ``` -A stem has exactly one child. The payload is 33 bytes. +Exactly 5 bytes. ### Fork ``` -Payload = 0x02 || left_hash (32 bytes raw) || right_hash (32 bytes raw) +Payload = 0x02 || left_index (u32 BE) || right_index (u32 BE) ``` -A fork has exactly two children. The payload is 65 bytes. - -**Validation:** -- Leaf payloads must be exactly 1 byte (`0x00`). -- Stem payloads must be exactly 33 bytes. -- Fork payloads must be exactly 65 bytes. -- Unknown type bytes are rejected. - ---- - -## 7. 
Merkle Hash Computation - -Each node is identified by a SHA-256 hash of its canonical payload: - -``` -hash = SHA256( domain_tag || 0x00 || payload ) -``` - -Where: - -| Component | Value | -|-----------|-------| -| `domain_tag` | `"arboricx.merkle.node.v1"` as UTF-8 bytes | -| Separator | `0x00` (one zero byte) | -| `payload` | The node's canonical serialization from Section 6 | - -**Examples:** - -- **Leaf:** `SHA256("arboricx.merkle.node.v1" || 0x00 || 0x00)` -- **Stem:** `SHA256("arboricx.merkle.node.v1" || 0x00 || 0x01 || child_hash_bytes)` -- **Fork:** `SHA256("arboricx.merkle.node.v1" || 0x00 || 0x02 || left_hash_bytes || right_hash_bytes)` - -The resulting SHA-256 hash is stored as a hex-encoded string in the manifest (64 hex characters). Within the nodes section, it is stored as raw bytes. +Exactly 9 bytes. --- ## 8. Tree Calculus Reduction Semantics -The bundle represents a **Tree Calculus** term as a Merkle DAG. The reduction rules are: - -### Apply Rules +The bundle represents a **Tree Calculus** term. The reduction rules are: ``` -apply(Fork(Leaf, a), _) = a -apply(Fork(Stem(a), b), c) = apply(apply(a, c), apply(b, c)) -apply(Fork(Fork, _, _), Leaf) = left of inner Fork -apply(Fork(Fork, _, _), Stem) = right of inner Fork -apply(Fork(Fork, _, _), Fork) = apply(apply(c, u), v) where c = Fork(u, v) -apply(Leaf, b) = Stem(b) -apply(Stem(a), b) = Fork(a, b) +The t operator is left associative. +1. t t a b -> a +2. t (t a) b c -> a c (b c) +3a. t (t a b) c t -> a +3b. t (t a b) c (t u) -> b u +3c. t (t a b) c (t u v) -> c u v ``` -### Internal Representation - -In the reduction engine, Fork nodes use a `[right, left]` (stack) ordering: -- `Fork = [right_child, left_child]` -- `Stem = [child]` -- `Leaf = []` - -This ordering supports stack-based reduction: pop two terms, apply, push results back. - -### Closure - -The bundle declares `closure = "complete"`, meaning all nodes reachable from export roots are present in the nodes section. 
No external references exist. +**Closure:** The bundle declares `closure = "complete"`, meaning all nodes reachable from export roots are present in the nodes section. No external references exist. --- ## 9. Binary Primitives -All multi-byte integers use **big-endian** byte order. +### u8 + +Single byte, value `0-255`. ### u16 (2 bytes) ``` -byte[0] | byte[1] value = (byte[0] << 8) | byte[1] ``` ### u32 (4 bytes) ``` -byte[0] | byte[1] | byte[2] | byte[3] value = (byte[0] << 24) | (byte[1] << 16) | (byte[2] << 8) | byte[3] ``` ### u64 (8 bytes) ``` -byte[0] ... byte[7] value = (byte[0] << 56) | ... | byte[7] ``` -### u8 (1 byte) - -A single byte, value `0-255`. - --- ## 10. Bundle Verification -A complete bundle verification proceeds in this order: - 1. **Magic check:** First 8 bytes must be `"ARBORICX"`. 2. **Version check:** Major version must be `1`. -3. **Section directory:** Parse all entries; reject unknown critical sections. -4. **Digest verification:** For each section, compute `SHA256(section_data)` and compare with the digest in the directory entry. -5. **Manifest parsing:** Decode the fixed-order manifest; validate semantic constraints. -6. **Node section:** Parse all node entries; reject duplicates. -7. **Root verification:** All root hashes from the manifest must exist in the node map. -8. **Export verification:** All export root hashes must exist in the node map. -9. **Node hash verification:** For each node, compute `SHA256(domain || 0x00 || payload)` and compare with the stored hash. -10. **Children verification:** For each Stem/Fork node, both child hashes must exist in the node map. -11. **Closure verification:** Starting from each root hash, traverse the DAG and confirm all reachable nodes are present. +3. **Section directory:** Parse all entries; reject unknown critical sections. Verify reserved fields are zero. +4. **Manifest parsing:** Decode fixed-order manifest; validate semantic constraints. +5. **Nodes section:** Parse all entries. +6. 
**Bounds checking:** + - Every root index `< nodeCount` + - Every export index `< nodeCount` + - In every Stem payload, `child_index < entry_position` and `child_index < nodeCount` + - In every Fork payload, both indices `< entry_position` and `< nodeCount` +7. **Acyclicity:** Guaranteed by the `child < parent` rule above. +8. **Closure:** Traverse from all root/export indices; confirm every reached index is valid. + +No hash computation is required. --- -## 11. Known Section Types +## 11. Canonicalization + +A bundle is **canonical** iff: + +1. **Maximal deduplication.** No two entries represent structurally identical subtrees. +2. **Topological order.** Children precede parents. +3. **Deterministic post-order traversal.** Nodes are emitted in the order discovered by a left-to-right recursive post-order walk. +4. **No trailing bytes** in any section. +5. **Reserved fields are zero.** + +Canonical bundles produce deterministic bytes and can be file-level hashed for global identity. + +--- + +## 12. Known Section Types | Type | Name | Required | Version | Description | |------|------|----------|---------|-------------| -| 1 | Manifest | Yes | 1 | Bundle metadata in fixed-order binary format | -| 2 | Nodes | Yes | 1 | Merkle DAG node entries | +| 1 | Manifest | Yes | 1 | Bundle metadata | +| 2 | Nodes | Yes | 1 | Topological DAG node entries | -Unknown section types are permitted if not marked as critical (flags bit 0 is not set). +Unknown section types are permitted if not marked critical. 
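The bounds and acyclicity checks above collapse into a single forward pass, because every child reference must point strictly backwards. A Node.js sketch over an already-parsed node list (the object shapes here are illustrative, not mandated by the format):

```javascript
// Verify the topological-order invariant: each child index must be a
// valid index strictly below the referencing entry's own position.
// Because all edges point backwards, the structure is acyclic by
// construction and no separate cycle detection is required.
function verifyNodes(nodes) {
  nodes.forEach((node, i) => {
    const children =
      node.type === 'stem' ? [node.child] :
      node.type === 'fork' ? [node.left, node.right] : [];
    for (const c of children) {
      if (!Number.isInteger(c) || c < 0 || c >= i) {
        throw new Error(`node ${i}: child index ${c} out of bounds`);
      }
    }
  });
}
```

Closure then follows for free: if every root and export index is below `nodes.length`, everything reachable from them is present, since no entry can reference past the end of the array.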
---

-## Appendix A: Complete Example Layout (id.arboricx)
+## Appendix A: Complete Example Layout

-A minimal `id.arboricx` bundle has:
+A minimal bundle for `Stem(Leaf)` (the Tree Calculus encoding of `t t`):

 ```
 +---------------------------------------------------+
 | Header (32 bytes)                                 |
 | Magic: "ARBORICX"                                 |
 | Major ver: 1                                      |
 | Minor ver: 0                                      |
 | Section count: 2                                  |
 | Flags: 0                                          |
 | Dir offset: 32                                    |
 +---------------------------------------------------+
-| Section Directory (120 bytes = 2 × 60)            |
-| Entry 0: type=1 (manifest), offset=152, len=375   |
-| Entry 1: type=2 (nodes), offset=527, len=284      |
+| Section Directory (64 bytes = 2 × 32)             |
+| Entry 0: type=1 (manifest), offset=96, len=~200   |
+| Entry 1: type=2 (nodes), offset=~296, len=22      |
 +---------------------------------------------------+
-| Manifest Section (375 bytes)                      |
-| Magic: "ARBMNFST"                                 |
-| Version: 1.0                                      |
-| Core strings (schema, bundleType, tree spec,      |
-| runtime spec, capabilities, closure, roots,       |
-| exports, metadata TLVs, extension fields)         |
+| Manifest Section (~200 bytes)                     |
+| Magic: "ARBMNFST", Version: 1.1                   |
+| Schema, bundleType, tree spec, runtime spec       |
+| Closure: 0, Roots: [1], Exports: ["main" -> 1]    |
+| Metadata TLVs, zero extension fields              |
 +---------------------------------------------------+
-| Nodes Section (284 bytes)                         |
+| Nodes Section (22 bytes)                          |
 | Node count: 2                                     |
-| Node entry 1: hash + payload (Leaf)               |
-| Node entry 2: hash + payload (Fork)               |
+| Entry 0: payloadLen=1, payload=[0x00]             |
+| Entry 1: payloadLen=5, payload=[0x01, 0,0,0,0]    |
 +---------------------------------------------------+
 ```

-The manifest section starts at byte 152 (0x98) and the nodes section at byte 527 (0x20F).
-
 ---

 ## Appendix B: File Extension

-Bundles produced by the `tricu` tool use the `.arboricx` file extension. The `.tri` extension is used for plain source files; the `.arboricx` extension identifies the portable binary format.
+Bundles use the `.arboricx` file extension. Plain source files use `.tri`.
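The Appendix A layout can be produced mechanically. A Node.js sketch of just the nodes-section encoding, assuming the `u64` node count field carries over from the 1.0 format (the context around the `NodesSection` definition suggests it does, but this is an assumption; `encodeNodesSection` is an illustrative name, not part of any reference implementation):

```javascript
// Serialize a nodes section: u64 BE count, then for each node a
// u32 BE payload length followed by the raw payload bytes.
function encodeNodesSection(payloads) {
  const count = Buffer.alloc(8);
  count.writeBigUInt64BE(BigInt(payloads.length));
  const chunks = [count];
  for (const p of payloads) {
    const len = Buffer.alloc(4);
    len.writeUInt32BE(p.length);
    chunks.push(len, Buffer.from(p));
  }
  return Buffer.concat(chunks);
}

// Appendix A bundle body for Stem(Leaf):
const leaf = [0x00];                         // entry 0: Leaf, 1 byte
const stem = [0x01, 0x00, 0x00, 0x00, 0x00]; // entry 1: Stem -> index 0
const section = encodeNodesSection([leaf, stem]);
// 8 (count) + 4 + 1 (leaf entry) + 4 + 5 (stem entry) = 22 bytes
```

Note the 22-byte total under this assumption, versus several hundred bytes for the same two nodes in the hash-carrying 1.0 layout.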
diff --git a/ext/js/.gitignore b/ext/js/.gitignore new file mode 100644 index 0000000..3c3629e --- /dev/null +++ b/ext/js/.gitignore @@ -0,0 +1 @@ +node_modules diff --git a/ext/js/package-lock.json b/ext/js/package-lock.json new file mode 100644 index 0000000..fe5163a --- /dev/null +++ b/ext/js/package-lock.json @@ -0,0 +1,29 @@ +{ + "name": "arboricx-runtime", + "version": "0.1.0", + "lockfileVersion": 3, + "requires": true, + "packages": { + "": { + "name": "arboricx-runtime", + "version": "0.1.0", + "license": "MIT", + "dependencies": { + "koffi": "^2.16.2" + }, + "bin": { + "arboricx-run": "src/cli.js" + } + }, + "node_modules/koffi": { + "version": "2.16.2", + "resolved": "https://registry.npmjs.org/koffi/-/koffi-2.16.2.tgz", + "integrity": "sha512-owU0MRwv6xkrVqCd+33uw6BaYppkTRXbO/rVdJNI2dvZG0gzyRhYwW25eWtc5pauwK8TGh3AbkFONSezdykfSA==", + "hasInstallScript": true, + "license": "MIT", + "funding": { + "url": "https://liberapay.com/Koromix" + } + } + } +} diff --git a/ext/js/package.json b/ext/js/package.json index a9ba01a..4dc0ded 100644 --- a/ext/js/package.json +++ b/ext/js/package.json @@ -1,9 +1,9 @@ { "name": "arboricx-runtime", "version": "0.1.0", - "description": "Arboricx portable bundle runtime — JavaScript reference implementation", + "description": "Arboricx portable bundle runtime — JavaScript host via libarboricx FFI", "type": "module", - "main": "src/bundle.js", + "main": "src/lib.js", "bin": { "arboricx-run": "src/cli.js" }, @@ -12,6 +12,9 @@ "inspect": "node src/cli.js inspect", "run": "node src/cli.js run" }, - "keywords": ["arboricx", "tree-calculus", "trie", "runtime"], + "dependencies": { + "koffi": "^2.16.0" + }, + "keywords": ["arboricx", "tree-calculus", "trie", "runtime", "ffi"], "license": "MIT" } diff --git a/ext/js/src/bundle.js b/ext/js/src/bundle.js deleted file mode 100644 index 1179ac7..0000000 --- a/ext/js/src/bundle.js +++ /dev/null @@ -1,191 +0,0 @@ -/** - * bundle.js — Parse an Arboricx portable bundle binary into a 
JavaScript object. - * - * Format (v1): - * Header (32 bytes): - * Magic 8B "ARBORICX" - * Major 2B u16 BE (must be 1) - * Minor 2B u16 BE - * SectionCount 4B u32 BE - * Flags 8B u64 BE - * DirOffset 8B u64 BE - * Section Directory (SectionCount × 60 bytes): - * Type 4B u32 BE - * Version 2B u16 BE - * Flags 2B u16 BE (bit 0 = critical) - * Compression 2B u16 BE - * DigestAlgo 2B u16 BE - * Offset 8B u64 BE - * Length 8B u64 BE - * SHA256Digest 32B raw - * Manifest: fixed-order core + TLV tail (ARBMNFST magic) - * Nodes: binary section - */ - -import { createHash } from "node:crypto"; -import { decodeManifest } from "./manifest.js"; - -// ── Constants ─────────────────────────────────────────────────────────────── - -const MAGIC = Buffer.from([0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58]); // "ARBORICX" -const HEADER_LENGTH = 32; -const SECTION_ENTRY_LENGTH = 60; -const SECTION_MANIFEST = 1; -const SECTION_NODES = 2; -const FLAG_CRITICAL = 0x0001; -const COMPRESSION_NONE = 0; -const DIGEST_SHA256 = 1; -const MAJOR_VERSION = 1; -const MINOR_VERSION = 0; - -// ── Helpers ───────────────────────────────────────────────────────────────── - -function readU16BE(buf, offset) { - return buf.readUint16BE(offset); -} -function readU32BE(buf, offset) { - return buf.readUint32BE(offset); -} -function readU64BE(buf, offset) { - return buf.readBigUInt64BE(offset); -} - -function sha256(data) { - return createHash("sha256").update(data).digest(); -} - -// ── Public API ────────────────────────────────────────────────────────────── - -/** - * Parse a bundle Buffer into a Bundle object. - * - * Returns { version, sectionCount, sections } where sections maps - * section type numbers to parsed section info (offset, length, data). 
- */ -export function parseBundle(buffer) { - if (buffer.length < HEADER_LENGTH) { - throw new Error("bundle too short for header"); - } - - // Check magic - if (!buffer.slice(0, 8).equals(MAGIC)) { - throw new Error("invalid magic: expected ARBORICX"); - } - - // Parse header - const major = readU16BE(buffer, 8); - const minor = readU16BE(buffer, 10); - const sectionCount = readU32BE(buffer, 12); - - if (major !== MAJOR_VERSION) { - throw new Error( - `unsupported bundle major version: ${major} (expected ${MAJOR_VERSION})` - ); - } - - const dirOffset = Number(readU64BE(buffer, 24)); - - // Parse section directory - const dirStart = dirOffset; - const dirEnd = dirStart + sectionCount * SECTION_ENTRY_LENGTH; - - if (buffer.length < dirEnd) { - throw new Error("bundle truncated in section directory"); - } - - const entries = []; - for (let i = 0; i < sectionCount; i++) { - const off = dirStart + i * SECTION_ENTRY_LENGTH; - const entry = { - type: readU32BE(buffer, off), - version: readU16BE(buffer, off + 4), - flags: readU16BE(buffer, off + 6), - compression: readU16BE(buffer, off + 8), - digestAlgorithm: readU16BE(buffer, off + 10), - offset: Number(readU64BE(buffer, off + 12)), - length: Number(readU64BE(buffer, off + 20)), - digest: buffer.slice(off + 28, off + 28 + 32), - }; - entries.push(entry); - } - - // Validate sections - for (const entry of entries) { - const isCritical = (entry.flags & FLAG_CRITICAL) !== 0; - const isKnown = - entry.type === SECTION_MANIFEST || entry.type === SECTION_NODES; - if (isCritical && !isKnown) { - throw new Error(`unknown critical section type: ${entry.type}`); - } - if (entry.compression !== COMPRESSION_NONE) { - throw new Error( - `unsupported compression codec in section ${entry.type}` - ); - } - if (entry.digestAlgorithm !== DIGEST_SHA256) { - throw new Error( - `unsupported digest algorithm in section ${entry.type}` - ); - } - } - - // Verify section digests and extract data - const sections = new Map(); - for (const entry 
of entries) { - if (entry.offset < 0 || entry.length < 0) { - throw new Error(`section ${entry.type} has negative offset/length`); - } - if (buffer.length < entry.offset + entry.length) { - throw new Error( - `section ${entry.type} extends beyond bundle end` - ); - } - - const data = buffer.slice(entry.offset, entry.offset + entry.length); - - // Verify digest - const computed = sha256(data); - if (!computed.equals(entry.digest)) { - throw new Error( - `section digest mismatch for section type ${entry.type}` - ); - } - - sections.set(entry.type, { - ...entry, - data, - }); - } - - // Check required sections - if (!sections.has(SECTION_MANIFEST)) { - throw new Error("missing required section: manifest"); - } - if (!sections.has(SECTION_NODES)) { - throw new Error("missing required section: nodes"); - } - - return { - version: `${major}.${minor}`, - sectionCount, - sections, - }; -} - -/** - * Convenience: parse and return the manifest from the fixed-order binary format. - */ -export function parseManifest(buffer) { - const bundle = parseBundle(buffer); - const manifestEntry = bundle.sections.get(SECTION_MANIFEST); - return decodeManifest(manifestEntry.data); -} - -/** - * Convenience: parse and return the node section binary. - */ -export function parseNodeSection(buffer) { - const bundle = parseBundle(buffer); - const nodesEntry = bundle.sections.get(SECTION_NODES); - return nodesEntry.data; -} diff --git a/ext/js/src/cli.js b/ext/js/src/cli.js index 67e51a0..b57115f 100644 --- a/ext/js/src/cli.js +++ b/ext/js/src/cli.js @@ -1,249 +1,104 @@ #!/usr/bin/env node /** - * cli.js — Minimal CLI for inspecting and running Arboricx bundles. + * cli.js — Arboricx JS host shell via libarboricx C ABI. * * Usage: - * node cli.js inspect - * node cli.js run [exportName] [input] + * node cli.js inspect + * node cli.js run [args...] 
*/ -import { readFileSync } from "node:fs"; -import { parseBundle, parseManifest } from "./bundle.js"; -import { parseNodeSection as parseNodeSectionMerkle } from "./merkle.js"; +import { readFileSync } from 'node:fs'; import { - validateManifest, - selectExport, - printManifestInfo, -} from "./manifest.js"; -import { parseNodeSection as parseNodeSectionBundle } from "./bundle.js"; -import { - verifyNodeHashes, - verifyClosure, - verifyRootClosure, -} from "./merkle.js"; -import { isTree, apply, triage, isFork, isStem } from "./tree.js"; -import { decodeResult, formatTree } from "./codecs.js"; + init, + free, + loadBundleDefault, + reduce, + app, + ofNumber, + ofString, + decode, + decodeType, + findLib, +} from './lib.js'; -// ── Commands ──────────────────────────────────────────────────────────────── +// ── Commands ───────────────────────────────────────────────────────────────── function cmdInspect(bundlePath) { - const buffer = readFileSync(bundlePath); + const ctx = init(); try { - const manifest = parseManifest(buffer); - validateManifest(manifest); - - const nodeSectionBytes = parseNodeSectionBundle(buffer); - const { nodeMap } = parseNodeSectionMerkle(nodeSectionBytes); - + const bundle = readFileSync(bundlePath); console.log(`Bundle: ${bundlePath}`); - console.log(""); + console.log(`Size: ${bundle.length} bytes\n`); - printManifestInfo(manifest, " "); + const term = loadBundleDefault(ctx, bundle); + const result = reduce(ctx, term); - console.log(` Nodes: ${nodeMap.size}`); - - // Verify hashes - const { verified: hashesOk, mismatches } = verifyNodeHashes(nodeMap); - console.log(` Hash verification: ${hashesOk ? "OK" : "FAIL"}`); - for (const m of mismatches) { - console.log(` MISMATCH ${m.type} ${m.hash.substring(0, 16)}... 
expected ${m.expected.substring(0, 16)}...`); + const type = decodeType(ctx, result); + let value; + try { + value = decode(ctx, result); + } catch { + value = '(raw tree)'; } - // Verify closure - const { complete: closureOk, missing } = verifyClosure(nodeMap); - console.log(` Closure verification: ${closureOk ? "OK" : "FAIL"}`); - for (const m of missing) { - console.log(` MISSING ${m.parent.substring(0, 16)}... → ${m.child.substring(0, 16)}...`); - } - - // Verify root closure for each export - for (const exp of manifest.exports || []) { - const { complete, missingRoots } = verifyRootClosure( - nodeMap, - exp.root - ); - if (!complete) { - console.log( - ` Root closure for "${exp.name}": FAIL — missing: ${missingRoots - .map((r) => r.substring(0, 16) + "...") - .join(", ")}` - ); - } - } - - console.log(""); - console.log("Inspection complete."); + console.log(`Type: ${type}`); + console.log(`Value: ${value}`); } catch (e) { console.error(`Error: ${e.message}`); process.exit(1); + } finally { + free(ctx); } } -function cmdRun(bundlePath, exportName, inputArg) { - const buffer = readFileSync(bundlePath); - let result; +function cmdRun(bundlePath, args) { + const ctx = init(); try { - const manifest = parseManifest(buffer); - validateManifest(manifest); + const bundle = readFileSync(bundlePath); + let term = loadBundleDefault(ctx, bundle); - const selectedExport = selectExport(manifest, exportName); - - const nodeSectionBytes = parseNodeSectionBundle(buffer); - const { nodeMap } = parseNodeSectionMerkle(nodeSectionBytes); - - // Verify hashes - const { verified, mismatches } = verifyNodeHashes(nodeMap); - if (!verified) { - console.error( - `Node hash mismatch:\n ${mismatches - .map((m) => ` ${m.type}: ${m.hash} (expected ${m.expected})`) - .join("\n")}` - ); - process.exit(1); + for (const arg of args) { + const argTree = /^\d+$/.test(arg) ? 
ofNumber(ctx, BigInt(arg)) : ofString(ctx, arg); + term = app(ctx, term, argTree); } - // Reconstruct the tree for the selected export - const root = buildTreeFromNodeMap(nodeMap, selectedExport.root); - if (!isTree(root)) { - console.error("Reconstructed root is not a valid tree value"); - process.exit(1); - } - - // Apply input if provided - let term = root; - if (inputArg !== undefined) { - // TODO: parse input (string/number) into a tree - // For now, just run the term as-is - } - - // Reduce with fuel limit - const finalTerm = reduce(term, 1_000_000); - - // Print result as tree calculus form - console.log(formatTree(finalTerm)); + const result = reduce(ctx, term); + console.log(decode(ctx, result)); } catch (e) { console.error(`Error: ${e.message}`); process.exit(1); + } finally { + free(ctx); } } -// ── Tree reconstruction ───────────────────────────────────────────────────── - -/** - * Reconstruct a tree from a node map. - * - * Node map: Map - * - * Returns the tree representation: [] for Leaf, [child] for Stem, [right, left] for Fork. - * Uses memoization to avoid re-processing nodes. - */ -export function buildTreeFromNodeMap(nodeMap, hash, memo = new Map()) { - if (memo.has(hash)) return memo.get(hash); - - const node = nodeMap.get(hash); - if (!node) { - throw new Error(`missing node in bundle: ${hash}`); - } - - let tree; - switch (node.type) { - case "leaf": - tree = []; - break; - case "stem": - tree = [buildTreeFromNodeMap(nodeMap, node.childHash, memo)]; - break; - case "fork": - tree = [ - buildTreeFromNodeMap(nodeMap, node.rightHash, memo), - buildTreeFromNodeMap(nodeMap, node.leftHash, memo), - ]; - break; - default: - throw new Error(`unknown node type: ${node.type}`); - } - - memo.set(hash, tree); - return tree; -} - -// ── Reduction ─────────────────────────────────────────────────────────────── - -/** - * Reduce a term to normal form with a fuel limit. - * Uses the stack-based approach from the TS evaluator. 
- */ -export function reduce(term, fuel) { - const stack = [term]; - let remaining = fuel; - - while (stack.length >= 2 && remaining-- > 0) { - // Pop right (top), then left - const b = stack.pop(); // right - const a = stack.pop(); // left - - if (stack.length >= 2) { - // Push a back for potential further reduction - stack.push(a); - } - - const result = apply(a, b); - - if (isTree(result)) { - // If result is a value, push it. But if it's a Fork/Stem, - // we need to push its components for further reduction. - if (isFork(result)) { - // Push right first (so it's popped second), then left - stack.push(result[1]); // left - stack.push(result[0]); // right - } else if (isStem(result)) { - stack.push(result[0]); // child - } else { - stack.push(result); // Leaf - } - } else { - // Not a tree — push as-is (shouldn't happen after buildTree) - stack.push(result); - } - } - - if (remaining <= 0) { - throw new Error("reduction step limit exceeded"); - } - - if (stack.length === 1) { - return stack[0]; - } - return stack[0]; // fallback -} - -// ── Main ──────────────────────────────────────────────────────────────────── +// ── Main ───────────────────────────────────────────────────────────────────── const args = process.argv.slice(2); const command = args[0]; switch (command) { - case "inspect": { + case 'inspect': { if (args.length < 2) { - console.error("Usage: node cli.js inspect <bundle>"); + console.error('Usage: node cli.js inspect <bundle>'); process.exit(1); } cmdInspect(args[1]); break; } - case "run": { + case 'run': { if (args.length < 2) { - console.error("Usage: node cli.js run <bundle> [exportName] [input]"); + console.error('Usage: node cli.js run <bundle> [args...]'); process.exit(1); } - cmdRun(args[1], args[2], args[3]); + cmdRun(args[1], args.slice(2)); break; } default: - console.log("Arboricx JS Runtime"); - console.log(""); - console.log("Usage:"); - console.log(" node cli.js inspect <bundle>"); - console.log(" node cli.js run <bundle> [exportName] [input]"); + console.log('Arboricx JS Host (via
libarboricx FFI)'); + console.log(''); + console.log('Usage:'); + console.log(' node cli.js inspect '); + console.log(' node cli.js run [args...]'); break; } diff --git a/ext/js/src/codecs.js b/ext/js/src/codecs.js deleted file mode 100644 index a369a45..0000000 --- a/ext/js/src/codecs.js +++ /dev/null @@ -1,135 +0,0 @@ -/** - * codecs.js — Minimal codecs for decoding tree results. - * - * Implements: decodeResult (from Research.hs) - * - Leaf → "t" - * - Numbers: toNumber - * - Strings: toString - * - Lists: toList - * - Fallback: raw tree format - */ - -// ── toNumber ──────────────────────────────────────────────────────────────── - -/** - * Decode a tree as a binary number (big-endian). - * Leaf = 0, Fork(Leaf, rest) = 2*n, Fork(Stem Leaf, rest) = 2*n+1. - */ -export function toNumber(t) { - if (!Array.isArray(t)) return null; - if (t.length === 0) return 0; // Leaf = 0 - if (t.length !== 2) return null; // must be Fork - - const [right, left] = t; - // Fork structure: [right, left] - // left child determines bit: Leaf = 0, Stem(Leaf) = 1 - let bit; - if (Array.isArray(left) && left.length === 0) { - bit = 0; // Leaf - } else if (Array.isArray(left) && left.length === 1) { - const child = left[0]; - if (Array.isArray(child) && child.length === 0) { - bit = 1; // Stem(Leaf) = 1 - } else { - return null; // Stem of something other than Leaf - } - } else { - return null; - } - - const rest = toNumber(right); - if (rest === null) return null; - - return bit + 2 * rest; -} - -// ── toString ──────────────────────────────────────────────────────────────── - -/** - * Decode a tree as a list of numbers (characters). - * Fork(x, rest) = x : list. 
- */ -export function toList(t) { - if (!Array.isArray(t)) return null; - if (t.length === 0) return []; // Leaf = empty list - if (t.length !== 2) return null; // must be Fork - - const [right, left] = t; - const rest = toList(right); - if (rest === null) return null; - - return [left, ...rest]; -} - -/** - * Decode a tree as a string. - */ -export function toString(t) { - const list = toList(t); - if (list === null) return null; - try { - return list.map((ch) => String.fromCharCode(ch)).join(""); - } catch { - return null; - } -} - -// ── decodeResult ──────────────────────────────────────────────────────────── - -/** - * Decode a tree result using multiple strategies: - * 1. Leaf → "t" - * 2. String (if all chars are printable) - * 3. Number - * 4. List - * 5. Raw tree format - */ -export function decodeResult(t) { - if (!Array.isArray(t)) { - return String(t); - } - - // Leaf - if (t.length === 0) { - return "t"; - } - - // Try string first (list of char codes) - const list = toList(t); - if (list !== null && list.length > 0) { - const str = list.map((n) => { - if (n < 32 || n > 126) return null; - return String.fromCharCode(n); - }).join(""); - if (str) return `"${str}"`; - } - - // Try number - const num = toNumber(t); - if (num !== null) { - return String(num); - } - - // Try list (elements are trees) - if (t.length === 2) { - const elements = toList(t); - if (elements !== null) { - const decoded = elements.map((e) => decodeResult(e)); - return `[${decoded.join(", ")}]`; - } - } - - // Raw tree format - return formatTree(t); -} - -/** - * Format a tree as a parenthesized expression. 
- */ -export function formatTree(t) { - if (!Array.isArray(t)) return String(t); - if (t.length === 0) return "Leaf"; - if (t.length === 1) return `Stem(${formatTree(t[0])})`; - if (t.length === 2) return `Fork(${formatTree(t[1])}, ${formatTree(t[0])})`; - return `[${t.map(formatTree).join(", ")}]`; -} diff --git a/ext/js/src/lib.js b/ext/js/src/lib.js new file mode 100644 index 0000000..3496e12 --- /dev/null +++ b/ext/js/src/lib.js @@ -0,0 +1,224 @@ +/** + * lib.js — FFI wrapper around libarboricx.so via koffi. + * + * Exports low-level C ABI bindings and high-level helpers. + */ + +import { existsSync } from 'node:fs'; +import { dirname, join, resolve } from 'node:path'; +import { fileURLToPath } from 'node:url'; +import koffi from 'koffi'; + +const __dirname = dirname(fileURLToPath(import.meta.url)); + +koffi.opaque('arb_ctx_t'); + +// ── Library discovery ─────────────────────────────────────────────────────── + +export function findLib() { + const env = process.env.ARBORICX_LIB; + if (env) { + if (existsSync(env)) return env; + throw new Error(`ARBORICX_LIB set but file not found: ${env}`); + } + + const candidates = [ + resolve(__dirname, 'libarboricx.so'), + 'libarboricx.so', + './libarboricx.so', + '/usr/local/lib/libarboricx.so', + '/usr/lib/libarboricx.so', + ]; + + for (const p of candidates) { + if (existsSync(p)) return p; + } + + throw new Error('libarboricx.so not found. 
Set ARBORICX_LIB to its full path.'); +} + +// ── FFI setup ─────────────────────────────────────────────────────────────── + +let _lib = null; +let _libPath = null; + +function ensureLib() { + if (_lib) return _lib; + const path = findLib(); + _lib = koffi.load(path); + _libPath = path; + return _lib; +} + +export function loadLib(path) { + if (_lib && _libPath === path) return; + _lib = koffi.load(path); + _libPath = path; +} + +function getLib() { + if (_lib) return _lib; + return ensureLib(); +} + +// ── Context lifecycle ─────────────────────────────────────────────────────── + +export function init(libPath) { + if (libPath) loadLib(libPath); + const lib = getLib(); + const ctx = lib.func('arb_ctx_t *arboricx_init(void)')(); + if (!ctx) throw new Error('arboricx_init failed'); + return ctx; +} + +export function free(ctx) { + getLib().func('void arboricx_free(arb_ctx_t *ctx)')(ctx); +} + +// ── Bundle loading ────────────────────────────────────────────────────────── + +export function loadBundle(ctx, bytes, name) { + const result = getLib().func('uint32_t arb_load_bundle(arb_ctx_t *ctx, _In_ uint8_t *bytes, size_t len, const char *name)')(ctx, bytes, bytes.length, name); + if (result === 0) throw new Error(`arb_load_bundle failed for export "${name}"`); + return result; +} + +export function loadBundleDefault(ctx, bytes) { + const result = getLib().func('uint32_t arb_load_bundle_default(arb_ctx_t *ctx, _In_ uint8_t *bytes, size_t len)')(ctx, bytes, bytes.length); + if (result === 0) throw new Error('arb_load_bundle_default failed'); + return result; +} + +// ── Reduction ─────────────────────────────────────────────────────────────── + +export function reduce(ctx, root, fuel = 1_000_000_000n) { + const f = getLib().func('uint32_t arb_reduce(arb_ctx_t *ctx, uint32_t root, uint64_t fuel)'); + return f(ctx, root, typeof fuel === 'bigint' ? 
fuel : BigInt(fuel)); +} + +// ── Tree construction ─────────────────────────────────────────────────────── + +export function leaf(ctx) { + return getLib().func('uint32_t arb_leaf(arb_ctx_t *ctx)')(ctx); +} + +export function stem(ctx, child) { + return getLib().func('uint32_t arb_stem(arb_ctx_t *ctx, uint32_t child)')(ctx, child); +} + +export function fork(ctx, left, right) { + return getLib().func('uint32_t arb_fork(arb_ctx_t *ctx, uint32_t left, uint32_t right)')(ctx, left, right); +} + +export function app(ctx, func, arg) { + return getLib().func('uint32_t arb_app(arb_ctx_t *ctx, uint32_t func, uint32_t arg)')(ctx, func, arg); +} + +// ── Codec constructors ────────────────────────────────────────────────────── + +export function ofNumber(ctx, n) { + const big = typeof n === 'bigint' ? n : BigInt(n); + return getLib().func('uint32_t arb_of_number(arb_ctx_t *ctx, uint64_t n)')(ctx, big); +} + +export function ofString(ctx, s) { + return getLib().func('uint32_t arb_of_string(arb_ctx_t *ctx, const char *s)')(ctx, s); +} + +export function ofBytes(ctx, bytes) { + return getLib().func('uint32_t arb_of_bytes(arb_ctx_t *ctx, _In_ uint8_t *bytes, size_t len)')(ctx, bytes, bytes.length); +} + +export function ofList(ctx, items) { + const arr = new Uint32Array(items); + return getLib().func('uint32_t arb_of_list(arb_ctx_t *ctx, _In_ uint32_t *items, size_t len)')(ctx, arr, arr.length); +} + +// ── Codec destructors ─────────────────────────────────────────────────────── + +export function toNumber(ctx, root) { + const out = [0]; + const ok = getLib().func('int arb_to_number(arb_ctx_t *ctx, uint32_t root, _Out_ uint64_t *out)')(ctx, root, out); + if (!ok) throw new Error('arb_to_number failed'); + return typeof out[0] === 'bigint' ? 
Number(out[0]) : out[0]; +} + +export function toString(ctx, root) { + const ptrOut = [null]; + const lenOut = [0]; + const ok = getLib().func('int arb_to_string(arb_ctx_t *ctx, uint32_t root, _Out_ uint8_t **out_ptr, _Out_ size_t *out_len)')(ctx, root, ptrOut, lenOut); + if (!ok) throw new Error('arb_to_string failed'); + + const bytes = koffi.decode(ptrOut[0], 'uint8_t', lenOut[0]); + const str = Buffer.from(bytes).toString('utf-8'); + getLib().func('void arboricx_free_buf(arb_ctx_t *ctx, uint8_t *ptr, size_t len)')(ctx, ptrOut[0], lenOut[0]); + return str; +} + +export function toBytes(ctx, root) { + const ptrOut = [null]; + const lenOut = [0]; + const ok = getLib().func('int arb_to_bytes(arb_ctx_t *ctx, uint32_t root, _Out_ uint8_t **out_ptr, _Out_ size_t *out_len)')(ctx, root, ptrOut, lenOut); + if (!ok) throw new Error('arb_to_bytes failed'); + + const bytes = Buffer.from(koffi.decode(ptrOut[0], 'uint8_t', lenOut[0])); + getLib().func('void arboricx_free_buf(arb_ctx_t *ctx, uint8_t *ptr, size_t len)')(ctx, ptrOut[0], lenOut[0]); + return bytes; +} + +export function toBool(ctx, root) { + const out = [0]; + const ok = getLib().func('int arb_to_bool(arb_ctx_t *ctx, uint32_t root, _Out_ int *out)')(ctx, root, out); + if (!ok) throw new Error('arb_to_bool failed'); + return out[0] !== 0; +} + +// ── Result unwrapping ─────────────────────────────────────────────────────── + +export function unwrapResult(ctx, root) { + const outOk = [0]; + const outValue = [0]; + const outRest = [0]; + const ok = getLib().func('int arb_unwrap_result(arb_ctx_t *ctx, uint32_t root, _Out_ int *out_ok, _Out_ uint32_t *out_value, _Out_ uint32_t *out_rest)')(ctx, root, outOk, outValue, outRest); + if (!ok) throw new Error('arb_unwrap_result failed'); + return { ok: outOk[0] !== 0, value: outValue[0], rest: outRest[0] }; +} + +export function unwrapHostValue(ctx, root) { + const outTag = [0n]; + const outPayload = [0]; + const ok = getLib().func('int arb_unwrap_host_value(arb_ctx_t *ctx, 
uint32_t root, _Out_ uint64_t *out_tag, _Out_ uint32_t *out_payload)')(ctx, root, outTag, outPayload); + if (!ok) throw new Error('arb_unwrap_host_value failed'); + return { tag: outTag[0], payload: outPayload[0] }; +} + +// ── Kernel ────────────────────────────────────────────────────────────────── + +export function kernelRoot(ctx) { + return getLib().func('uint32_t arb_kernel_root(arb_ctx_t *ctx)')(ctx); +} + +// ── High-level helpers ────────────────────────────────────────────────────── + +export function decode(ctx, root) { + try { + return toBool(ctx, root) ? 'true' : 'false'; + } catch { + try { + return toString(ctx, root); + } catch { + try { + return String(toNumber(ctx, root)); + } catch { + throw new Error('could not decode result'); + } + } + } +} + +export function decodeType(ctx, root) { + try { toBool(ctx, root); return 'bool'; } catch {} + try { toString(ctx, root); return 'string'; } catch {} + try { toNumber(ctx, root); return 'number'; } catch {} + return 'unknown (raw tree)'; +} diff --git a/ext/js/src/manifest.js b/ext/js/src/manifest.js deleted file mode 100644 index 4a55b3b..0000000 --- a/ext/js/src/manifest.js +++ /dev/null @@ -1,374 +0,0 @@ -/** - * manifest.js — Fixed-order manifest parsing and export lookup. - * - * The manifest binary format (ManifestV1): - * magic(8) + major(u16) + minor(u16) - * + schema(string) + bundleType(string) - * + treeCalculus(string) + treeHashAlgorithm(string) + treeHashDomain(string) + treeNodePayload(string) - * + runtimeSemantics(string) + runtimeEvaluation(string) + runtimeAbi(string) - * + capabilityCount(u32) + capabilities(string[]) - * + closure(u8) - * + rootCount(u32) + roots[] - * + exportCount(u32) + exports[] - * + metadataFieldCount(u32) + metadataTLVs[] - * + extensionFieldCount(u32) + extensionTLVs[] - * - * String format: u32 BE length + UTF-8 bytes. - * Root: 32 bytes raw hash + role(string). - * Export: name(string) + 32 bytes raw root hash + kind(string) + abi(string). 
- * TLV: u16 tag + u32 length + value bytes. - */ - -// ── Constants ─────────────────────────────────────────────────────────────── - -const MANIFEST_MAGIC = "ARBMNFST"; -const MANIFEST_MAJOR = 1; -const MANIFEST_MINOR = 0; - -// Metadata TLV tags -const TAG_PACKAGE = 1; -const TAG_VERSION = 2; -const TAG_DESCRIPTION = 3; -const TAG_LICENSE = 4; -const TAG_CREATED_BY = 5; - -// Closure bytes -const CLOSURE_COMPLETE = 0; -const CLOSURE_PARTIAL = 1; - -// ── Binary helpers ────────────────────────────────────────────────────────── - -function u16(buf, off) { - if (off + 2 > buf.length) throw new Error("manifest: not enough bytes for u16"); - return { value: buf.readUint16BE(off), next: off + 2 }; -} - -function u32(buf, off) { - if (off + 4 > buf.length) throw new Error("manifest: not enough bytes for u32"); - return { value: buf.readUint32BE(off), next: off + 4 }; -} - -function u8(buf, off) { - if (off >= buf.length) throw new Error("manifest: not enough bytes for u8"); - return { value: buf.readUint8(off), next: off + 1 }; -} - -/** - * Read a length-prefixed UTF-8 string: u32 BE length + UTF-8 bytes. - * Returns { text, next }. - */ -function readStr(buf, off) { - const { value: len, next: afterLen } = u32(buf, off); - if (afterLen + len > buf.length) throw new Error("manifest: string extends beyond input"); - return { text: buf.toString("utf-8", afterLen, afterLen + len), next: afterLen + len }; -} - -/** - * Read raw bytes of given length. - * Returns { bytes, next }. - */ -function readRaw(buf, off, n) { - if (off + n > buf.length) throw new Error(`manifest: not enough bytes for ${n}-byte read`); - return { value: buf.slice(off, off + n), next: off + n }; -} - -// ── Manifest decoder ──────────────────────────────────────────────────────── - -/** - * Decode the manifest binary from a Buffer. - * - * Returns a normalized manifest object matching the shape expected - * by validateManifest / selectExport. 
- */ -export function decodeManifest(buf) { - let off = 0; - - // Magic (8 bytes) - const magic = buf.toString("utf-8", 0, 8); - if (magic !== MANIFEST_MAGIC) { - throw new Error(`invalid manifest magic: expected ${MANIFEST_MAGIC}, got "${magic}"`); - } - off = 8; - - // Version - const { value: major } = u16(buf, off); - if (major !== MANIFEST_MAJOR) throw new Error(`unsupported manifest major version: ${major}`); - off += 4; // u16 major + u16 minor - - // Helper: read length-prefixed text - const readText = () => { - const { text, next } = readStr(buf, off); - off = next; - return text; - }; - - // Core strings - const schema = readText(); - const bundleType = readText(); - const treeCalculus = readText(); - const treeHashAlgorithm = readText(); - const treeHashDomain = readText(); - const treeNodePayload = readText(); - const runtimeSemantics = readText(); - const runtimeEvaluation = readText(); - const runtimeAbi = readText(); - - // Capabilities (u32 count + string[]) - const { value: capCount } = u32(buf, off); - off += 4; - const capabilities = []; - for (let i = 0; i < capCount; i++) { - capabilities.push(readText()); - } - - // Closure (u8) - const { value: closureByte } = u8(buf, off); - off += 1; - const closure = closureByte === CLOSURE_COMPLETE ? 
"complete" : "partial"; - - // Roots (u32 count + Root[]) - // Root: 32 bytes raw hash + role(string) - const { value: rootCount } = u32(buf, off); - off += 4; - const roots = []; - for (let i = 0; i < rootCount; i++) { - const { value: hashRaw } = readRaw(buf, off, 32); - off += 32; - const { text: role, next: rOff } = readStr(buf, off); - off = rOff; - roots.push({ hash: hashRaw.toString("hex"), role }); - } - - // Exports (u32 count + Export[]) - // Export: name(string) + 32 bytes raw root hash + kind(string) + abi(string) - const { value: exportCount } = u32(buf, off); - off += 4; - const exports = []; - for (let i = 0; i < exportCount; i++) { - const { text: name, next: nOff } = readStr(buf, off); - off = nOff; - const { value: expHashRaw } = readRaw(buf, off, 32); - off += 32; - const { text: kind, next: kOff } = readStr(buf, off); - off = kOff; - const { text: abi, next: aOff } = readStr(buf, off); - off = aOff; - exports.push({ name, root: expHashRaw.toString("hex"), kind, abi }); - } - - // Metadata (u32 count + TLV[]) - // TLV: u16 tag + u32 length + value bytes - const { value: metaCount } = u32(buf, off); - off += 4; - const metadata = {}; - for (let i = 0; i < metaCount; i++) { - const { value: tag } = u16(buf, off); - off += 2; - const { value: tlvLen } = u32(buf, off); - off += 4; - const { value: tlvRaw } = readRaw(buf, off, tlvLen); - off += tlvLen; - const val = tlvRaw.toString("utf-8"); - switch (tag) { - case TAG_PACKAGE: metadata.package = val; break; - case TAG_VERSION: metadata.version = val; break; - case TAG_DESCRIPTION: metadata.description = val; break; - case TAG_LICENSE: metadata.license = val; break; - case TAG_CREATED_BY: metadata.createdBy = val; break; - } - } - - // Extensions (u32 count + TLV[] — skip all) - const { value: extCount } = u32(buf, off); - off += 4; - for (let i = 0; i < extCount; i++) { - const { value: _tag } = u16(buf, off); - off += 2; - const { value: tlvLen } = u32(buf, off); - off += 4; - off += tlvLen; // skip 
value - } - - return { - schema, - bundleType, - tree: { - calculus: treeCalculus, - nodeHash: { - algorithm: treeHashAlgorithm, - domain: treeHashDomain, - }, - nodePayload: treeNodePayload, - }, - runtime: { - semantics: runtimeSemantics, - evaluation: runtimeEvaluation, - abi: runtimeAbi, - capabilities, - }, - closure, - roots, - exports, - metadata: Object.keys(metadata).length > 0 ? metadata : undefined, - }; -} - -// ── Validation ────────────────────────────────────────────────────────────── - -/** - * Validate the manifest against the runtime profile requirements. - * Throws on violation. - */ -export function validateManifest(manifest) { - if (manifest.schema !== "arboricx.bundle.manifest.v1") { - throw new Error( - `unsupported manifest schema: ${manifest.schema}` - ); - } - if (manifest.bundleType !== "tree-calculus-executable-object") { - throw new Error( - `unsupported bundle type: ${manifest.bundleType}` - ); - } - - const tree = manifest.tree; - if (tree.calculus !== "tree-calculus.v1") { - throw new Error(`unsupported calculus: ${tree.calculus}`); - } - if (tree.nodeHash.algorithm !== "sha256") { - throw new Error( - `unsupported node hash algorithm: ${tree.nodeHash.algorithm}` - ); - } - if (tree.nodeHash.domain !== "arboricx.merkle.node.v1") { - throw new Error( - `unsupported node hash domain: ${tree.nodeHash.domain}` - ); - } - if (tree.nodePayload !== "arboricx.merkle.payload.v1") { - throw new Error(`unsupported node payload: ${tree.nodePayload}`); - } - - const runtime = manifest.runtime; - if (runtime.semantics !== "tree-calculus.v1") { - throw new Error(`unsupported runtime semantics: ${runtime.semantics}`); - } - if (runtime.abi !== "arboricx.abi.tree.v1") { - throw new Error(`unsupported runtime ABI: ${runtime.abi}`); - } - if (runtime.capabilities && runtime.capabilities.length > 0) { - throw new Error( - `host/runtime capabilities not supported: ${runtime.capabilities.join(", ")}` - ); - } - - if (manifest.closure !== "complete") { - 
throw new Error("bundle v1 requires closure = complete"); - } - if (manifest.imports && manifest.imports.length > 0) { - throw new Error("bundle v1 requires an empty imports list"); - } - if (!manifest.roots || manifest.roots.length === 0) { - throw new Error("manifest has no roots"); - } - if (!manifest.exports || manifest.exports.length === 0) { - throw new Error("manifest has no exports"); - } - - for (const exp of manifest.exports) { - if (!exp.name) { - throw new Error("manifest export has empty name"); - } - if (!exp.root) { - throw new Error("manifest export has empty root"); - } - } -} - -/** - * Select an export hash given a requested name. - * - * Selection strategy: - * 1. Explicit export name - * 2. Export named "main" - * 3. Single export (auto-select) - * 4. Error if multiple exports and no "main" - */ -export function selectExport(manifest, requestedName) { - const exports = manifest.exports || []; - - // Strategy 1: explicit name - if (requestedName) { - const found = exports.find((e) => e.name === requestedName); - if (found) { - return found; - } - throw new Error( - `requested export "${requestedName}" not found. Available: ${exports.map((e) => e.name).join(", ")}` - ); - } - - // Strategy 2: prefer "main" - const mainExport = exports.find((e) => e.name === "main"); - if (mainExport) { - return mainExport; - } - - // Strategy 3: single export - if (exports.length === 1) { - return exports[0]; - } - - // Strategy 4: multiple exports, require explicit - throw new Error( - `multiple exports available but none named "main": ${exports.map((e) => e.name).join(", ")}. Specify an export name.` - ); -} - -/** - * Get all root hashes from the manifest. - */ -export function getRootHashes(manifest) { - return (manifest.roots || []).map((r) => r.hash); -} - -/** - * Get all export names. - */ -export function getExportNames(manifest) { - return (manifest.exports || []).map((e) => e.name); -} - -/** - * Print manifest summary info. 
- */ -export function printManifestInfo(manifest, indent = "") { - const tree = manifest.tree; - const runtime = manifest.runtime; - - console.log(`${indent}Schema: ${manifest.schema}`); - console.log(`${indent}Bundle type: ${manifest.bundleType}`); - console.log(`${indent}Closure: ${manifest.closure}`); - console.log(`${indent}Tree calculus: ${tree.calculus}`); - console.log(`${indent}Hash algo: ${tree.nodeHash.algorithm}`); - console.log(`${indent}Hash domain: ${tree.nodeHash.domain}`); - console.log(`${indent}Runtime: ${runtime.semantics}`); - console.log(`${indent}ABI: ${runtime.abi}`); - console.log(`${indent}Evaluation: ${runtime.evaluation || "N/A"}`); - console.log(""); - console.log(`${indent}Roots (${getRootHashes(manifest).length}):`); - for (const root of getRootHashes(manifest)) { - console.log(`${indent} ${root.substring(0, 16)}...`); - } - console.log(""); - console.log(`${indent}Exports (${getExportNames(manifest).length}):`); - for (const name of getExportNames(manifest)) { - console.log(`${indent} ${name}`); - } - - const meta = manifest.metadata; - if (meta && meta.createdBy) { - console.log(""); - console.log(`${indent}Created by: ${meta.createdBy}`); - } -} diff --git a/ext/js/src/merkle.js b/ext/js/src/merkle.js deleted file mode 100644 index c218f88..0000000 --- a/ext/js/src/merkle.js +++ /dev/null @@ -1,276 +0,0 @@ -/** - * merkle.js — Node payload decoding and hash verification. 
- * - * Node payload format: - * Leaf: 0x00 - * Stem: 0x01 || child_hash (32 bytes raw) - * Fork: 0x02 || left_hash (32 bytes raw) || right_hash (32 bytes raw) - * - * Hash computation: - * hash = SHA256( "arboricx.merkle.node.v1" || 0x00 || node_payload ) - */ - -import { createHash } from "node:crypto"; - -// ── Constants ─────────────────────────────────────────────────────────────── - -const DOMAIN_TAG = "arboricx.merkle.node.v1"; -const HASH_LENGTH = 32; // raw hash bytes -const HEX_LENGTH = 64; // hex-encoded hash length - -// ── Helpers ───────────────────────────────────────────────────────────────── - -function rawToHex(buf) { - if (buf.length !== HASH_LENGTH) { - throw new Error(`raw hash must be ${HASH_LENGTH} bytes, got ${buf.length}`); - } - return buf.toString("hex"); -} - -function hexToRaw(hex) { - const buf = Buffer.from(hex, "hex"); - if (buf.length !== HASH_LENGTH) { - throw new Error(`hex hash must decode to ${HASH_LENGTH} bytes`); - } - return buf; -} - -function sha256(data) { - return createHash("sha256").update(data).digest(); -} - -function nodeHash(prefix, payload) { - return sha256(Buffer.concat([Buffer.from(prefix), Buffer.from([0x00]), payload])); -} - -// ── Node payload types ────────────────────────────────────────────────────── - -/** - * Deserialize a node payload into { type, childHash, leftHash, rightHash }. 
- * - * type: "leaf" | "stem" | "fork" - * childHash: hex string (for stem) - * leftHash: hex string (for fork) - * rightHash: hex string (for fork) - */ -export function deserializePayload(payload) { - if (payload.length === 0) { - throw new Error("empty payload"); - } - - const type = payload.readUInt8(0); - - switch (type) { - case 0x00: - if (payload.length !== 1) { - throw new Error( - `invalid leaf payload: expected 1 byte, got ${payload.length}` - ); - } - return { type: "leaf" }; - - case 0x01: - if (payload.length !== 1 + HASH_LENGTH) { - throw new Error( - `invalid stem payload: expected ${1 + HASH_LENGTH} bytes, got ${payload.length}` - ); - } - return { - type: "stem", - childHash: rawToHex(payload.slice(1, 1 + HASH_LENGTH)), - }; - - case 0x02: - if (payload.length !== 1 + 2 * HASH_LENGTH) { - throw new Error( - `invalid fork payload: expected ${1 + 2 * HASH_LENGTH} bytes, got ${payload.length}` - ); - } - return { - type: "fork", - leftHash: rawToHex(payload.slice(1, 1 + HASH_LENGTH)), - rightHash: rawToHex(payload.slice(1 + HASH_LENGTH, 1 + 2 * HASH_LENGTH)), - }; - - default: - throw new Error( - `invalid merkle node payload: unknown type 0x${type.toString(16)}` - ); - } -} - -/** - * Compute the canonical payload bytes for a given tree node structure. - */ -export function serializeNode(node) { - switch (node.type) { - case "leaf": - return Buffer.from([0x00]); - case "stem": - return Buffer.concat([Buffer.from([0x01]), hexToRaw(node.childHash)]); - case "fork": - return Buffer.concat([ - Buffer.from([0x02]), - hexToRaw(node.leftHash), - hexToRaw(node.rightHash), - ]); - } -} - -/** - * Compute the Merkle hash of a node from its type and parameters. - */ -export function computeNodeHash(node) { - const payload = serializeNode(node); - const hash = nodeHash(DOMAIN_TAG, payload); - return hash.toString("hex"); -} - -// ── Node section parsing ──────────────────────────────────────────────────── - -/** - * Parse the node section binary into a Map. 
- * - * Node section format: - * nodeCount (8B u64 BE) - * entries[]: - * hash (32B raw) - * payloadLen (4B u32 BE) - * payload (payloadLen bytes) - */ -export function parseNodeSection(data) { - if (data.length < 8) { - throw new Error("node section too short for count"); - } - - const nodeCount = Number(data.readBigUInt64BE(0)); - let offset = 8; - - const nodeMap = new Map(); - const errors = []; - - for (let i = 0; i < nodeCount; i++) { - // Read hash - if (offset + HASH_LENGTH > data.length) { - errors.push(`node ${i}: not enough bytes for hash`); - break; - } - const hash = rawToHex(data.slice(offset, offset + HASH_LENGTH)); - offset += HASH_LENGTH; - - // Read payload length - if (offset + 4 > data.length) { - errors.push(`node ${i} (${hash}): not enough bytes for payload length`); - break; - } - const payloadLen = data.readUint32BE(offset); - offset += 4; - - // Read payload - if (offset + payloadLen > data.length) { - errors.push(`node ${i} (${hash}): payload extends beyond section end`); - break; - } - const payload = data.slice(offset, offset + payloadLen); - offset += payloadLen; - - // Deserialize payload - let node; - try { - node = deserializePayload(payload); - } catch (e) { - errors.push(`node ${i} (${hash}): ${e.message}`); - continue; - } - - nodeMap.set(hash, { - hash, - payload, - ...node, - }); - } - - if (errors.length > 0) { - throw new Error( - `node section parse errors:\n ${errors.join("\n ")}` - ); - } - - return { nodeMap, count: nodeCount }; -} - -// ── Verification ──────────────────────────────────────────────────────────── - -/** - * Verify all node hashes match their payloads. 
- * Returns { verified, mismatches } - */ -export function verifyNodeHashes(nodeMap) { - const mismatches = []; - - for (const [hash, node] of nodeMap) { - const expected = computeNodeHash(node); - if (hash !== expected) { - mismatches.push({ - hash, - expected, - type: node.type, - }); - } - } - - return { verified: mismatches.length === 0, mismatches }; -} - -/** - * Verify that all child references exist in the node map (closure). - * Returns { complete, missing } where missing is an array of { parent, child }. - */ -export function verifyClosure(nodeMap) { - const missing = []; - - for (const [hash, node] of nodeMap) { - if (node.type === "stem") { - if (!nodeMap.has(node.childHash)) { - missing.push({ parent: hash, child: node.childHash }); - } - } else if (node.type === "fork") { - if (!nodeMap.has(node.leftHash)) { - missing.push({ parent: hash, child: node.leftHash }); - } - if (!nodeMap.has(node.rightHash)) { - missing.push({ parent: hash, child: node.rightHash }); - } - } - } - - return { complete: missing.length === 0, missing }; -} - -/** - * Verify closure for a specific root hash (transitive reachability). - * Returns { complete, missingRoots }. - */ -export function verifyRootClosure(nodeMap, rootHash) { - const visited = new Set(); - const missingRoots = []; - - function visit(hash) { - if (visited.has(hash)) return; - if (!nodeMap.has(hash)) { - missingRoots.push(hash); - return; - } - visited.add(hash); - const node = nodeMap.get(hash); - if (node.type === "stem") { - visit(node.childHash); - } else if (node.type === "fork") { - visit(node.leftHash); - visit(node.rightHash); - } - } - - visit(rootHash); - return { complete: missingRoots.length === 0, missingRoots }; -} diff --git a/ext/js/src/tree.js b/ext/js/src/tree.js deleted file mode 100644 index 7829589..0000000 --- a/ext/js/src/tree.js +++ /dev/null @@ -1,125 +0,0 @@ -/** - * tree.js — Runtime tree representation. 
- * - * The JS tree uses a simple array representation matching the - * TypeScript reference evaluator: - * - * Leaf = [] - * Stem = [child] (array length === 1) - * Fork = [right, left] (array length === 2) - * - * This is a "flattened stack" representation: when reduced, terms - * become arrays and the evaluator pops three elements at a time. - */ - -/** - * Check if a value is a Leaf (empty array). - */ -export function isLeaf(t) { - return Array.isArray(t) && t.length === 0; -} - -/** - * Check if a value is a Stem (single element array). - */ -export function isStem(t) { - return Array.isArray(t) && t.length === 1; -} - -/** - * Check if a value is a Fork (two element array). - */ -export function isFork(t) { - return Array.isArray(t) && t.length === 2; -} - -/** - * Check if a value is a valid tree calculus value (Leaf, Stem, or Fork). - */ -export function isTree(t) { - return isLeaf(t) || isStem(t) || isFork(t); -} - -/** - * Triage a tree: classify it as Leaf/Stem/Fork. - * The tree must be in normal form (no reducible redexes). - * - * Returns { kind: "leaf"|"stem"|"fork", ...rest } - */ -export function triage(t) { - if (!Array.isArray(t)) { - throw new Error("not a tree (not an array)"); - } - if (t.length === 0) return { kind: "leaf" }; - if (t.length === 1) return { kind: "stem", child: t[0] }; - if (t.length === 2) return { kind: "fork", right: t[0], left: t[1] }; - throw new Error(`not a value/binary tree: length ${t.length}`); -} - -/** - * Apply the Tree Calculus apply rules. - * - * apply(a, b) computes the application of term a to term b. 
- * - * Rules: - * apply(Fork(Leaf, a), _) = a - * apply(Fork(Stem(a), b), c) = apply(apply(a, c), apply(b, c)) - * apply(Fork(Fork, _, _), Leaf) = left of inner Fork - * apply(Fork(Fork, _, _), Stem) = right of inner Fork - * apply(Fork(Fork, _, _), Fork) = apply(apply(c, u), v) where c=Fork(u,v) - * apply(Leaf, b) = Stem(b) - * apply(Stem(a), b) = Fork(a, b) - * - * For Fork, the inner structure is [right, left], so: - * a = right, b = left - */ -export function apply(a, b) { - // apply(Fork(Leaf, a), _) = a - // Fork = [right, left] = [Leaf, a] → left child is Leaf - if (isFork(a) && isLeaf(a[1])) { - return a[0]; // return right child - } - - // apply(Fork(Stem(a), b), c) - if (isFork(a) && isStem(a[1])) { - const stemChild = a[1][0]; // left child of fork - const right = a[0]; // right child of fork - const innerA = stemChild; - const innerB = right; - const appliedA = apply(innerA, b); - const appliedB = apply(innerB, b); - return apply(appliedA, appliedB); - } - - // apply(Fork(Fork, _, _), Leaf) - if (isFork(a) && isFork(a[1]) && isLeaf(b)) { - return a[1][0]; // right child of inner fork (which is left child) - } - - // apply(Fork(Fork, _, _), Stem) - if (isFork(a) && isFork(a[1]) && isStem(b)) { - return a[1][1]; // left child of inner fork - } - - // apply(Fork(Fork, _, _), Fork) - if (isFork(a) && isFork(a[1]) && isFork(b)) { - // b = Fork(u, v) = [v, u] - const u = b[0]; - const v = b[1]; - // apply(apply(c, u), v) where c = inner fork - const applied = apply(apply(a[1], u), v); - return applied; - } - - // apply(Leaf, b) = Stem(b) - if (isLeaf(a)) { - return [b]; - } - - // apply(Stem(a), b) = Fork(a, b) - if (isStem(a)) { - return [b, a[0]]; // [right, left] - } - - throw new Error("apply: undefined reduction for terms"); -} diff --git a/ext/js/test/bundle.test.js b/ext/js/test/bundle.test.js index c253a25..23a30c1 100644 --- a/ext/js/test/bundle.test.js +++ b/ext/js/test/bundle.test.js @@ -1,134 +1,93 @@ -import { readFileSync } from "node:fs"; 
-import { strictEqual, ok, throws } from "node:assert"; -import { createHash } from "node:crypto"; -import { describe, it } from "node:test"; +import { readFileSync } from 'node:fs'; +import { strictEqual, ok, throws } from 'node:assert'; +import { describe, it } from 'node:test'; import { - parseBundle, - parseManifest, -} from "../src/bundle.js"; -import { - parseNodeSection as bundleParseNodeSection, -} from "../src/bundle.js"; -import { - verifyNodeHashes, - parseNodeSection as parseNodes, -} from "../src/merkle.js"; + findLib, + init, + free, + loadBundle, + loadBundleDefault, + kernelRoot, +} from '../src/lib.js'; -const fixtureDir = "../../test/fixtures"; +const fixtureDir = '../../test/fixtures'; +const libPath = findLib(); -describe("bundle parsing", () => { - it("valid bundle parses header and sections", () => { - const bundle = parseBundle( - readFileSync(`${fixtureDir}/id.arboricx`) - ); - strictEqual(bundle.version, "1.0"); - strictEqual(bundle.sectionCount, 2); - ok(bundle.sections.has(1)); // manifest - ok(bundle.sections.has(2)); // nodes - }); - - it("parseManifest returns valid manifest", () => { - const manifest = parseManifest( - readFileSync(`${fixtureDir}/id.arboricx`) - ); - strictEqual(manifest.schema, "arboricx.bundle.manifest.v1"); - strictEqual(manifest.bundleType, "tree-calculus-executable-object"); - strictEqual(manifest.closure, "complete"); - strictEqual(manifest.tree.calculus, "tree-calculus.v1"); - strictEqual(manifest.tree.nodeHash.algorithm, "sha256"); - strictEqual(manifest.tree.nodeHash.domain, "arboricx.merkle.node.v1"); - strictEqual(manifest.runtime.semantics, "tree-calculus.v1"); - strictEqual(manifest.runtime.abi, "arboricx.abi.tree.v1"); +describe('library discovery', () => { + it('findLib returns an existing .so path', () => { + ok(libPath.endsWith('.so') || libPath.endsWith('.dylib') || libPath.endsWith('.dll')); + ok(readFileSync(libPath)); }); }); -describe("hash verification", () => { - it("valid bundle nodes verify", 
() => { - const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arboricx`) - ); - const { nodeMap } = parseNodes(data); - const { verified } = verifyNodeHashes(nodeMap); - ok(verified, "all node hashes should verify"); +describe('context lifecycle', () => { + it('init creates a valid context', () => { + const ctx = init(libPath); + ok(ctx); + free(ctx); + }); + + it('kernel root is available', () => { + const ctx = init(libPath); + try { + const root = kernelRoot(ctx); + ok(root > 0, 'kernel root should be a positive index'); + } finally { + free(ctx); + } }); }); -describe("errors", () => { - it("bad magic fails", () => { - const buf = Buffer.alloc(32, 0); - buf.write("WRONGMAG", 0, 8); - throws(() => parseBundle(buf), /invalid magic/); +describe('bundle loading', () => { + it('loadBundleDefault loads id.arboricx', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync(`${fixtureDir}/id.arboricx`); + const root = loadBundleDefault(ctx, bundle); + ok(root > 0, 'loaded root should be a positive index'); + } finally { + free(ctx); + } }); - it("unsupported version fails", () => { - const buf = Buffer.alloc(32, 0); - buf.write("ARBORICX", 0, 8); - buf.writeUInt16BE(2, 8); // major version 2 - throws(() => parseBundle(buf), /unsupported bundle major version/); + it('loadBundleDefault loads true.arboricx', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync(`${fixtureDir}/true.arboricx`); + const root = loadBundleDefault(ctx, bundle); + ok(root > 0); + } finally { + free(ctx); + } }); - it("bad section digest fails", () => { - const buf = readFileSync(`${fixtureDir}/id.arboricx`); - // Corrupt one byte in the manifest section - buf[152] ^= 0x01; - throws(() => parseBundle(buf), /digest mismatch/); + it('loadBundle loads named export from id.arboricx', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync(`${fixtureDir}/id.arboricx`); + const root = loadBundle(ctx, bundle, 'id'); + ok(root > 
0); + } finally { + free(ctx); + } }); - it("truncated bundle fails", () => { - const buf = readFileSync(`${fixtureDir}/id.arboricx`); - const truncated = buf.slice(0, 40); - throws(() => parseBundle(truncated), /truncated/); + it('loadBundle fails for missing export name', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync(`${fixtureDir}/id.arboricx`); + throws(() => loadBundle(ctx, bundle, 'nonexistent'), /failed/); + } finally { + free(ctx); + } }); - it("missing nodes section fails", () => { - // Build a bundle with only manifest entry in the directory (1 section instead of 2) - const header = Buffer.alloc(32, 0); - header.write("ARBORICX", 0, 8); - header.writeUInt16BE(1, 8); // major version - header.writeUInt16BE(0, 10); // minor version - header.writeUInt32BE(1, 12); // 1 section - - // Build a manifest JSON - const manifestObj = { - schema: "arboricx.bundle.manifest.v1", - bundleType: "tree-calculus-executable-object", - tree: { - calculus: "tree-calculus.v1", - nodeHash: { - algorithm: "sha256", - domain: "arboricx.merkle.node.v1" - }, - nodePayload: "arboricx.merkle.payload.v1" - }, - runtime: { - semantics: "tree-calculus.v1", - evaluation: "normal-order", - abi: "arboricx.abi.tree.v1", - capabilities: [] - }, - closure: "complete", - roots: [{ hash: Buffer.alloc(32).toString("hex"), role: "default" }], - exports: [{ name: "root", root: Buffer.alloc(32).toString("hex"), kind: "term", abi: "arboricx.abi.tree.v1" }], - metadata: { createdBy: "arboricx" } - }; - const manifestJson = JSON.stringify(manifestObj); - const manifestBytes = Buffer.from(manifestJson); - - // Section directory entry (60 bytes, all fields are u64 after the u16s) - const entry = Buffer.alloc(60, 0); - entry.writeUInt32BE(1, 0); // type: manifest - entry.writeUInt16BE(1, 4); // version - entry.writeUInt16BE(1, 6); // flags: critical - entry.writeUInt16BE(0, 8); // compression: none - entry.writeUInt16BE(1, 10); // digest algorithm: sha256 - 
entry.writeBigUInt64BE(BigInt(32 + 60), 12); // offset (u64) - entry.writeBigUInt64BE(BigInt(manifestBytes.length), 20); // length (u64) - entry.set(createHash("sha256").update(manifestBytes).digest(), 28); // digest (32 bytes) - - // Set dirOffset to 32 so parseBundle reads directory from after header - header.writeBigUInt64BE(BigInt(32), 24); - - const bundleBuf = Buffer.concat([header, entry, manifestBytes]); - throws(() => parseBundle(bundleBuf), /missing required section/); + it('loadBundleDefault fails for invalid bytes', () => { + const ctx = init(libPath); + try { + throws(() => loadBundleDefault(ctx, Buffer.from('not a bundle')), /failed/); + } finally { + free(ctx); + } }); }); diff --git a/ext/js/test/merkle.test.js b/ext/js/test/merkle.test.js deleted file mode 100644 index bbafe10..0000000 --- a/ext/js/test/merkle.test.js +++ /dev/null @@ -1,180 +0,0 @@ -import { readFileSync } from "node:fs"; -import { strictEqual, ok } from "node:assert"; -import { describe, it } from "node:test"; -import { parseNodeSection as bundleParseNodeSection, parseBundle, parseManifest } from "../src/bundle.js"; -import { - verifyNodeHashes, - verifyClosure, - verifyRootClosure, - deserializePayload, - computeNodeHash, - parseNodeSection, -} from "../src/merkle.js"; - -describe("merkle — deserializePayload", () => { - it("Leaf (0x00)", () => { - const result = deserializePayload(Buffer.from([0x00])); - strictEqual(result.type, "leaf"); - }); - - it("Stem (0x01 + 32 bytes)", () => { - const childHash = Buffer.alloc(32, 0xab); - const payload = Buffer.concat([Buffer.from([0x01]), childHash]); - const result = deserializePayload(payload); - strictEqual(result.type, "stem"); - strictEqual(result.childHash, "ab".repeat(32)); - }); - - it("Fork (0x02 + 64 bytes)", () => { - const left = Buffer.alloc(32, 0x01); - const right = Buffer.alloc(32, 0x02); - const payload = Buffer.concat([Buffer.from([0x02]), left, right]); - const result = deserializePayload(payload); - 
strictEqual(result.type, "fork"); - strictEqual(result.leftHash, "01".repeat(32)); - strictEqual(result.rightHash, "02".repeat(32)); - }); - - it("Leaf with extra bytes fails", () => { - throws(() => deserializePayload(Buffer.from([0x00, 0x00])), /invalid leaf/); - }); - - it("Unknown type fails", () => { - throws(() => deserializePayload(Buffer.from([0xff])), /unknown type/); - }); -}); - -describe("merkle — computeNodeHash", () => { - it("Leaf hash is correct length", () => { - const leaf = { type: "leaf" }; - const hash = computeNodeHash(leaf); - strictEqual(hash.length, 64); - }); - - it("Leaf hash matches expected Arboricx domain", () => { - const leaf = { type: "leaf" }; - const hash = computeNodeHash(leaf); - strictEqual(hash, "92b8a9796dbeafbcd36757535876256392170d137bf36b319d77f11a37112158"); - }); -}); - -describe("merkle — node section parsing", () => { - const fixtureDir = "../../test/fixtures"; - - it("parses id.arboricx with correct node count", () => { - const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arboricx`) - ); - const { nodeMap } = parseNodeSection(data); - strictEqual(nodeMap.size, 4); - }); - - it("parses true.arboricx with correct node count", () => { - const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/true.arboricx`) - ); - const { nodeMap } = parseNodeSection(data); - strictEqual(nodeMap.size, 2); - }); - - it("parses false.arboricx with correct node count", () => { - const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/false.arboricx`) - ); - const { nodeMap } = parseNodeSection(data); - strictEqual(nodeMap.size, 1); - }); -}); - -describe("merkle — hash verification", () => { - const fixtureDir = "../../test/fixtures"; - - it("id.arboricx nodes all verify", () => { - const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arboricx`) - ); - const { nodeMap } = parseNodeSection(data); - const { verified, mismatches } = verifyNodeHashes(nodeMap); - ok(verified, "id.arboricx 
node hashes should verify"); - strictEqual(mismatches.length, 0); - }); - - it("true.arboricx nodes all verify", () => { - const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/true.arboricx`) - ); - const { nodeMap } = parseNodeSection(data); - const { verified, mismatches } = verifyNodeHashes(nodeMap); - ok(verified, "true.arboricx node hashes should verify"); - strictEqual(mismatches.length, 0); - }); - - it("corrupted node payload fails hash verification", () => { - const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arboricx`) - ); - const { nodeMap } = parseNodeSection(data); - // Find a stem node to corrupt - let stemKey = null; - for (const [key, node] of nodeMap) { - if (node.type === "stem") { stemKey = key; break; } - } - ok(stemKey, "should find a stem node to corrupt"); - const stem = nodeMap.get(stemKey); - // Corrupt the child hash so serializeNode produces a different payload - const corrupted = { - ...stem, - childHash: "00".repeat(32), - payload: Buffer.concat([Buffer.from([0x01]), Buffer.alloc(32, 0x00)]), - }; - nodeMap.set(stemKey, corrupted); - const { verified, mismatches } = verifyNodeHashes(nodeMap); - ok(!verified, "corrupted stem should fail hash verification"); - ok(mismatches.length > 0, "should have mismatches"); - }); -}); - -describe("merkle — closure verification", () => { - const fixtureDir = "../../test/fixtures"; - - it("id.arboricx has complete closure", () => { - const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arboricx`) - ); - const { nodeMap } = parseNodeSection(data); - const { complete, missing } = verifyClosure(nodeMap); - ok(complete, "id.arboricx should have complete closure"); - strictEqual(missing.length, 0); - }); - - it("verifyRootClosure checks transitive reachability", () => { - const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arboricx`) - ); - const { nodeMap } = parseNodeSection(data); - // Use the actual root hash from the fixture's 
manifest - const manifest = parseManifest(readFileSync(`${fixtureDir}/id.arboricx`)); - const rootHash = manifest.exports[0].root; - const { complete, missingRoots } = verifyRootClosure(nodeMap, rootHash); - ok(complete, "root should be reachable"); - strictEqual(missingRoots.length, 0); - }); - - it("parseNodeSection returns correct node count", () => { - const data = bundleParseNodeSection( - readFileSync(`${fixtureDir}/id.arboricx`) - ); - const result = parseNodeSection(data); - strictEqual(result.count, 4); - }); -}); - -// Helper for throws -function throws(fn, expected) { - try { - fn(); - return false; - } catch (e) { - return expected.test(e.message); - } -} diff --git a/ext/js/test/reduce.test.js b/ext/js/test/reduce.test.js index f5887c4..f3528da 100644 --- a/ext/js/test/reduce.test.js +++ b/ext/js/test/reduce.test.js @@ -1,80 +1,113 @@ -import { strictEqual, ok } from "node:assert"; -import { describe, it } from "node:test"; -import { apply, isLeaf, isStem, isFork } from "../src/tree.js"; -import { reduce } from "../src/cli.js"; +import { readFileSync } from 'node:fs'; +import { strictEqual, ok } from 'node:assert'; +import { describe, it } from 'node:test'; +import { + findLib, + init, + free, + leaf, + stem, + fork, + app, + reduce, + toBool, + toString, + toNumber, + loadBundleDefault, + ofString, + ofNumber, +} from '../src/lib.js'; -describe("tree — basic types", () => { - it("Leaf is empty array", () => { - ok(isLeaf([])); - ok(!isStem([])); - ok(!isFork([])); +const libPath = findLib(); + +describe('tree construction', () => { + it('leaf returns a positive index', () => { + const ctx = init(libPath); + try { + const idx = leaf(ctx); + ok(idx > 0); + } finally { + free(ctx); + } }); - it("Stem is single-element array", () => { - ok(isStem([[]])); - ok(!isLeaf([[]])); + it('stem wraps a child', () => { + const ctx = init(libPath); + try { + const l = leaf(ctx); + const s = stem(ctx, l); + ok(s > 0); + ok(s !== l); + } finally { + free(ctx); + } }); 
- it("Fork is two-element array", () => { - ok(isFork([[], []])); - ok(!isLeaf([[], []])); + it('fork combines left and right', () => { + const ctx = init(libPath); + try { + const a = leaf(ctx); + const b = leaf(ctx); + const f = fork(ctx, a, b); + ok(f > 0); + ok(f !== a && f !== b); + } finally { + free(ctx); + } }); }); -describe("tree — apply rules", () => { - // Leaf = [], Stem = [child], Fork = [right, left] - - it("apply(Leaf, b) = Stem(b)", () => { - const b = []; // Leaf - const result = apply([], b); - ok(isStem(result), "Stem(b) should be a Stem"); - strictEqual(result[0], b); +describe('reduction — booleans', () => { + it('true.arboricx reduces to boolean true', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync('../../test/fixtures/true.arboricx'); + const root = loadBundleDefault(ctx, bundle); + const result = reduce(ctx, root, 1_000_000n); + strictEqual(toBool(ctx, result), true); + } finally { + free(ctx); + } }); - it("apply(Stem(a), b) = Fork(a, b)", () => { - const a = []; // Leaf - const b = []; // Leaf - const result = apply([a], b); - ok(isFork(result), "Fork(a, b) should be a Fork"); - // Fork = [right, left] = [b, a] - strictEqual(result[0], b); - strictEqual(result[1], a); - }); - - it("apply(Fork(Leaf, a), _) = a", () => { - // Fork(Leaf, a) = [a, Leaf] - const a = []; // Leaf - const result = apply([a, []], []); - strictEqual(result, a); - ok(isLeaf(result)); + it('false.arboricx reduces to boolean false', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync('../../test/fixtures/false.arboricx'); + const root = loadBundleDefault(ctx, bundle); + const result = reduce(ctx, root, 1_000_000n); + strictEqual(toBool(ctx, result), false); + } finally { + free(ctx); + } }); }); -describe("tree — reduction", () => { - it("reduces Leaf to Leaf", () => { - const result = reduce([], 100); - ok(isLeaf(result)); - }); - - it("reduces Stem Leaf to Stem Leaf", () => { - const result = reduce([[]], 100); - 
ok(isStem(result)); - ok(isLeaf(result[0])); - }); - - it("reduces Fork Leaf Leaf to Fork Leaf Leaf", () => { - const result = reduce([[], []], 100); - ok(isFork(result)); - ok(isLeaf(result[0])); - ok(isLeaf(result[1])); - }); - - it("S combinator applied to Leaf reduces", () => { - // S = t (t (t t)) t = Fork (Fork (Fork Leaf Leaf) Leaf) Leaf - // In array form: [[[], []], [], []] - const s = [[], [[[], []], []]]; - const leaf = []; - const result = reduce([s, leaf], 100); - ok(Array.isArray(result), "S Leaf should reduce to an array"); +describe('reduction — id', () => { + it('id applied to string returns the string', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync('../../test/fixtures/id.arboricx'); + const idRoot = loadBundleDefault(ctx, bundle); + const arg = ofString(ctx, 'hello'); + const applied = app(ctx, idRoot, arg); + const result = reduce(ctx, applied, 1_000_000n); + strictEqual(toString(ctx, result), 'hello'); + } finally { + free(ctx); + } }); }); + +describe('reduction — numbers', () => { + it('ofNumber round-trips through toNumber', () => { + const ctx = init(libPath); + try { + const num = ofNumber(ctx, 42); + strictEqual(toNumber(ctx, num), 42); + } finally { + free(ctx); + } + }); +}); + diff --git a/ext/js/test/run-bundle.test.js b/ext/js/test/run-bundle.test.js index d50d315..e0e4e7c 100644 --- a/ext/js/test/run-bundle.test.js +++ b/ext/js/test/run-bundle.test.js @@ -1,120 +1,125 @@ -import { readFileSync } from "node:fs"; -import { strictEqual, ok, throws } from "node:assert"; -import { describe, it } from "node:test"; -import { parseManifest } from "../src/bundle.js"; -import { parseNodeSection as bundleParseNodeSection } from "../src/bundle.js"; -import { validateManifest, selectExport } from "../src/manifest.js"; -import { verifyNodeHashes, parseNodeSection as parseNodes } from "../src/merkle.js"; -import { buildTreeFromNodeMap } from "../src/cli.js"; +import { readFileSync } from 'node:fs'; +import { 
strictEqual, ok, throws } from 'node:assert'; +import { describe, it } from 'node:test'; +import { + findLib, + init, + free, + loadBundleDefault, + loadBundle, + reduce, + app, + ofString, + ofNumber, + toBool, + toString, + decode, + decodeType, +} from '../src/lib.js'; -const fixtureDir = "../../test/fixtures"; +const fixtureDir = '../../test/fixtures'; +const libPath = findLib(); -describe("run bundle — id.arboricx", () => { - const bundle = readFileSync(`${fixtureDir}/id.arboricx`); - const manifest = parseManifest(bundle); - const nodeSectionData = bundleParseNodeSection(bundle); - const { nodeMap } = parseNodes(nodeSectionData); - - it("manifest validates", () => { - validateManifest(manifest); +describe('run bundle — booleans', () => { + it('true.arboricx evaluates to true', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync(`${fixtureDir}/true.arboricx`); + const root = loadBundleDefault(ctx, bundle); + const result = reduce(ctx, root); + strictEqual(toBool(ctx, result), true); + strictEqual(decodeType(ctx, result), 'bool'); + strictEqual(decode(ctx, result), 'true'); + } finally { + free(ctx); + } }); - it("node hashes verify", () => { - const { verified } = verifyNodeHashes(nodeMap); - ok(verified); - }); - - it("export 'root' is selectable", () => { - const exp = selectExport(manifest, "root"); - strictEqual(exp.name, "root"); - }); - - it("tree reconstructs as a Fork", () => { - const exp = selectExport(manifest, "root"); - const tree = buildTreeFromNodeMap(nodeMap, exp.root); - ok(Array.isArray(tree)); - ok(tree.length >= 2, "tree should be a Fork (length >= 2)"); + it('false.arboricx evaluates to false', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync(`${fixtureDir}/false.arboricx`); + const root = loadBundleDefault(ctx, bundle); + const result = reduce(ctx, root); + strictEqual(toBool(ctx, result), false); + strictEqual(decodeType(ctx, result), 'bool'); + strictEqual(decode(ctx, result), 'false'); + 
} finally { + free(ctx); + } }); }); -describe("run bundle — true.arboricx", () => { - const bundle = readFileSync(`${fixtureDir}/true.arboricx`); - const manifest = parseManifest(bundle); - const nodeSectionData = bundleParseNodeSection(bundle); - const { nodeMap } = parseNodes(nodeSectionData); - - it("manifest validates", () => { - validateManifest(manifest); - }); - - it("export 'root' is selectable", () => { - const exp = selectExport(manifest, "root"); - strictEqual(exp.name, "root"); - }); - - it("tree reconstructs as Stem Leaf", () => { - const exp = selectExport(manifest, "root"); - const tree = buildTreeFromNodeMap(nodeMap, exp.root); - ok(Array.isArray(tree)); - strictEqual(tree.length, 1, "true should be a Stem (single child)"); - strictEqual(tree[0].length, 0, "child should be Leaf"); +describe('run bundle — id', () => { + it('id applied to string returns the string', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync(`${fixtureDir}/id.arboricx`); + const idRoot = loadBundleDefault(ctx, bundle); + const arg = ofString(ctx, 'hello'); + const applied = app(ctx, idRoot, arg); + const result = reduce(ctx, applied); + strictEqual(toString(ctx, result), 'hello'); + strictEqual(decodeType(ctx, result), 'string'); + } finally { + free(ctx); + } }); }); -describe("run bundle — false.arboricx", () => { - const bundle = readFileSync(`${fixtureDir}/false.arboricx`); - const manifest = parseManifest(bundle); - const nodeSectionData = bundleParseNodeSection(bundle); - const { nodeMap } = parseNodes(nodeSectionData); - - it("manifest validates", () => { - validateManifest(manifest); - }); - - it("export 'root' is selectable", () => { - const exp = selectExport(manifest, "root"); - strictEqual(exp.name, "root"); - }); - - it("tree reconstructs as Leaf", () => { - const exp = selectExport(manifest, "root"); - const tree = buildTreeFromNodeMap(nodeMap, exp.root); - strictEqual(tree.length, 0, "false should be Leaf (empty array)"); +describe('run 
bundle — append', () => { + it('append "hello " "world" = "hello world"', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync(`${fixtureDir}/append.arboricx`); + let term = loadBundleDefault(ctx, bundle); + term = app(ctx, term, ofString(ctx, 'hello ')); + term = app(ctx, term, ofString(ctx, 'world')); + const result = reduce(ctx, term); + strictEqual(toString(ctx, result), 'hello world'); + } finally { + free(ctx); + } }); }); -describe("run bundle — notQ.arboricx", () => { - const bundle = readFileSync(`${fixtureDir}/notQ.arboricx`); - const manifest = parseManifest(bundle); - const nodeSectionData = bundleParseNodeSection(bundle); - const { nodeMap } = parseNodes(nodeSectionData); - - it("manifest validates", () => { - validateManifest(manifest); - }); - - it("node hashes verify", () => { - const { verified } = verifyNodeHashes(nodeMap); - ok(verified); +describe('run bundle — notQ', () => { + it('notQ loads and reduces without error', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync(`${fixtureDir}/notQ.arboricx`); + const root = loadBundleDefault(ctx, bundle); + const result = reduce(ctx, root); + ok(result > 0); + } finally { + free(ctx); + } }); }); -describe("run bundle — missing export", () => { - const bundle = readFileSync(`${fixtureDir}/id.arboricx`); - const manifest = parseManifest(bundle); +describe('run bundle — named export', () => { + it('loadBundle selects named export', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync(`${fixtureDir}/id.arboricx`); + const root = loadBundle(ctx, bundle, 'id'); + ok(root > 0); + // id is a function; apply it before reducing + const applied = app(ctx, root, ofString(ctx, 'test')); + const result = reduce(ctx, applied); + strictEqual(toString(ctx, result), 'test'); + } finally { + free(ctx); + } + }); - it("nonexistent export fails clearly", () => { - throws(() => selectExport(manifest, "nonexistent"), /not found/); - }); -}); - -describe("run 
bundle — auto-select", () => { - // true.arboricx has only one export, should auto-select - const bundle = readFileSync(`${fixtureDir}/true.arboricx`); - const manifest = parseManifest(bundle); - - it("single export auto-selects", () => { - const exp = selectExport(manifest, undefined); - ok(exp, "should auto-select the only export"); + it('missing export throws', () => { + const ctx = init(libPath); + try { + const bundle = readFileSync(`${fixtureDir}/id.arboricx`); + throws(() => loadBundle(ctx, bundle, 'nonexistent'), /failed/); + } finally { + free(ctx); + } }); }); diff --git a/ext/zig/kernel_run_arboricx_typed.dag b/ext/zig/kernel_run_arboricx_typed.dag index 22b0a64..bcf5920 100644 --- a/ext/zig/kernel_run_arboricx_typed.dag +++ b/ext/zig/kernel_run_arboricx_typed.dag @@ -1,4 +1,4 @@ -2692 +2576 leaf fork 0 0 stem 1 @@ -9,2686 +9,2570 @@ stem 0 fork 5 6 fork 0 7 stem 8 -stem 6 -fork 10 0 -stem 11 -fork 12 11 -stem 13 -stem 14 -fork 0 15 -stem 16 -fork 17 6 -stem 18 -fork 0 1 -stem 20 -fork 0 21 -stem 22 -stem 23 -fork 0 24 -stem 25 fork 0 6 -stem 27 -fork 28 3 -fork 5 29 -stem 30 -fork 31 27 -stem 32 -stem 33 -fork 0 34 -stem 35 -fork 36 6 -fork 0 37 -fork 33 38 -fork 0 39 -stem 40 -stem 38 -stem 42 -fork 0 43 -stem 44 +stem 10 +fork 11 3 +fork 5 12 +stem 13 +fork 14 10 +stem 15 +stem 16 +fork 0 17 +stem 18 +fork 19 6 +fork 0 20 +stem 21 +stem 6 +fork 23 0 +stem 24 +fork 25 24 stem 26 -fork 0 46 -stem 47 -stem 12 -fork 0 49 -stem 50 -fork 51 6 -fork 42 52 -stem 53 -stem 9 -fork 0 55 -stem 56 -fork 9 37 -stem 58 -fork 33 1 -fork 0 60 -stem 61 -stem 2 -fork 0 63 -stem 64 -fork 42 6 -stem 66 -fork 67 1 -fork 65 68 -fork 62 69 -fork 0 70 -fork 59 71 -fork 0 72 -stem 73 -stem 74 -fork 0 75 -stem 76 -fork 77 52 -fork 57 78 -fork 54 79 -fork 42 80 -stem 81 -stem 82 -fork 0 83 -stem 84 -stem 57 -fork 0 86 -stem 87 -stem 45 -fork 88 89 -fork 85 90 -fork 48 91 -fork 45 92 -fork 41 93 -fork 26 94 -fork 0 95 -stem 96 -stem 37 -fork 0 18 -fork 98 99 -fork 97 100 -fork 0 
101 -fork 19 102 -fork 9 103 -stem 104 -fork 1 0 -stem 106 -fork 0 107 -stem 108 -fork 65 52 -fork 5 110 -fork 9 111 -fork 0 112 -fork 98 113 -fork 109 114 -fork 0 115 -stem 116 -fork 117 100 -fork 0 118 -fork 19 119 -stem 120 -fork 1 6 -fork 2 122 -stem 123 -fork 0 122 -stem 125 -fork 0 11 -fork 1 127 -fork 126 128 -fork 124 129 -fork 0 130 -fork 121 131 -fork 0 132 -fork 105 133 -fork 9 134 -fork 9 135 -stem 136 +stem 27 +fork 0 28 +stem 29 +fork 30 6 +stem 31 fork 6 1 -fork 138 20 -stem 139 +fork 0 1 +fork 33 34 +stem 35 +fork 2 36 +fork 0 37 +stem 38 +fork 0 34 +fork 16 40 +fork 0 41 +stem 42 +fork 2 6 +fork 0 44 +stem 45 +stem 46 +fork 43 47 +fork 39 48 +stem 49 +stem 34 +fork 0 51 +stem 52 +stem 53 +fork 0 54 +stem 55 +stem 56 +fork 0 57 +stem 58 +stem 22 +fork 22 60 +stem 61 +fork 0 60 +stem 63 +stem 9 +fork 0 65 +stem 66 +stem 67 +fork 0 68 +stem 69 +fork 0 24 +fork 1 71 +fork 0 71 +fork 72 73 +fork 0 74 +stem 75 +stem 76 +fork 0 77 +stem 78 +stem 79 +fork 70 80 +fork 64 81 +fork 62 82 +fork 59 83 +fork 50 84 +fork 0 85 +stem 86 +stem 20 +fork 0 31 +fork 88 89 +fork 87 90 +fork 0 91 +fork 32 92 +stem 93 +fork 94 1 +fork 22 95 +fork 9 96 +stem 97 +fork 11 6 +fork 22 99 +stem 100 +fork 22 6 +stem 102 +stem 2 +fork 0 104 +stem 105 +fork 106 0 +fork 0 107 +fork 103 108 +fork 101 109 +fork 0 110 +stem 111 +stem 112 +fork 0 113 +stem 114 +fork 0 23 +stem 116 +stem 117 +fork 0 118 +stem 119 +fork 120 0 +fork 0 121 +stem 122 +fork 123 6 +fork 115 124 +fork 5 125 +stem 126 +stem 127 +fork 0 128 +stem 129 +fork 6 0 +fork 0 131 +fork 94 132 +fork 22 133 +fork 9 134 +stem 135 +stem 131 +fork 0 137 +stem 138 +stem 33 fork 0 140 stem 141 -stem 142 -fork 0 143 -stem 144 -stem 28 -fork 0 146 -stem 147 +fork 6 10 fork 0 10 -stem 149 -stem 150 +fork 143 144 +fork 112 145 +stem 146 +fork 144 144 +fork 147 148 +fork 0 149 +fork 88 150 fork 0 151 stem 152 -fork 153 0 -fork 0 154 -stem 155 -fork 156 6 -fork 0 157 -fork 33 158 +fork 153 95 +stem 154 +fork 94 10 +fork 155 156 
+fork 76 157 +fork 9 158 fork 0 159 -stem 160 -fork 0 2 -stem 162 +fork 88 160 +fork 142 161 +fork 0 162 stem 163 -fork 0 164 -stem 165 -stem 5 -fork 0 167 -stem 168 -fork 28 6 -fork 42 170 -stem 171 -fork 65 0 -fork 0 173 -fork 67 174 -fork 172 175 -fork 0 176 -stem 177 -stem 178 -fork 0 179 -stem 180 -stem 181 -fork 169 182 -fork 5 183 -stem 184 -fork 148 0 -fork 0 186 -fork 185 187 -fork 42 188 -stem 189 -fork 0 58 -fork 190 191 -fork 166 192 -fork 163 193 -fork 0 194 -fork 98 195 -fork 0 196 -stem 197 -stem 198 -fork 0 199 -stem 200 -fork 6 0 -fork 0 202 -fork 0 203 -fork 0 204 -fork 0 205 -fork 0 206 -fork 6 207 -fork 6 203 -fork 0 209 -fork 0 210 -fork 6 211 -fork 0 212 -fork 6 206 -fork 0 214 -fork 6 204 -fork 6 216 -fork 6 217 -fork 6 218 -fork 0 216 -fork 0 220 -fork 6 221 -fork 6 214 -fork 6 209 -fork 0 224 -fork 0 225 -fork 0 226 -fork 227 0 -fork 223 228 -fork 222 229 -fork 213 230 -fork 219 231 -fork 215 232 -fork 213 233 -fork 208 234 -fork 0 235 -fork 33 236 -fork 0 237 -stem 238 +fork 164 90 +fork 0 165 +fork 32 166 +fork 76 167 +fork 9 168 +fork 0 169 +fork 88 170 +fork 139 171 +fork 0 172 stem 173 -fork 240 1 +fork 174 90 +fork 0 175 +fork 32 176 +fork 22 177 +stem 178 +fork 6 131 +fork 6 180 +fork 0 181 +stem 182 +fork 11 183 +fork 22 184 +stem 185 +fork 123 137 +fork 115 187 +fork 186 188 +fork 179 189 +fork 115 190 +fork 5 191 +stem 192 +stem 193 +fork 0 194 +stem 195 +fork 0 132 +fork 94 197 +fork 22 198 +fork 9 199 +stem 200 +fork 22 167 +stem 202 +stem 132 +fork 123 204 +fork 115 205 +fork 186 206 +fork 203 207 +fork 115 208 +fork 5 209 +stem 210 +stem 211 +fork 0 212 +stem 213 +fork 0 180 +fork 94 215 +fork 22 216 +fork 9 217 +stem 218 +fork 22 157 +stem 220 +stem 180 +fork 123 222 +fork 115 223 +fork 186 224 +fork 221 225 +fork 115 226 +fork 5 227 +stem 228 +stem 229 +fork 0 230 +stem 231 +fork 0 197 +fork 94 233 +fork 22 234 +fork 9 235 +stem 236 +fork 142 6 +fork 0 238 +stem 239 +fork 240 90 fork 0 241 -stem 242 -fork 150 6 -fork 243 244 
-fork 9 245 -stem 246 -fork 202 0 -fork 0 248 -fork 249 0 -fork 250 154 -fork 42 251 -fork 9 252 -fork 9 253 -stem 254 -stem 166 -fork 0 256 +fork 32 242 +fork 22 243 +stem 244 +stem 197 +fork 123 246 +fork 115 247 +fork 186 248 +fork 245 249 +fork 115 250 +fork 5 251 +stem 252 +stem 253 +fork 0 254 +stem 255 +fork 6 132 +fork 0 257 +fork 94 258 +fork 22 259 +fork 9 260 +stem 261 stem 257 -fork 33 191 -fork 0 259 -stem 260 -fork 0 184 -fork 98 262 -fork 45 263 -fork 261 264 -fork 258 265 -fork 166 266 -fork 0 267 -stem 268 -fork 98 187 -fork 148 270 -fork 269 271 -fork 9 272 -fork 9 273 -stem 274 -stem 203 -fork 28 276 -fork 42 277 -fork 9 278 -stem 279 -fork 0 182 -stem 281 -stem 247 -fork 0 283 +fork 123 263 +fork 115 264 +fork 186 265 +fork 179 266 +fork 115 267 +fork 186 268 +fork 0 269 +fork 262 270 +fork 256 271 +fork 237 272 +fork 232 273 +fork 219 274 +fork 214 275 +fork 201 276 +fork 196 277 +fork 136 278 +fork 130 279 +fork 98 280 +fork 22 281 +fork 9 282 +fork 9 283 stem 284 -stem 275 -fork 0 286 +fork 0 2 +stem 286 stem 287 -fork 2 140 -fork 0 289 -stem 290 -fork 0 20 -fork 33 292 -fork 0 293 -stem 294 -fork 2 6 -fork 0 296 -stem 297 -stem 298 -fork 295 299 -fork 291 300 -stem 301 -fork 42 43 -stem 303 -fork 0 127 -fork 128 305 +fork 0 288 +stem 289 +stem 5 +fork 0 291 +stem 292 +stem 115 +fork 293 294 +fork 5 295 +stem 296 +stem 11 +fork 0 298 +stem 299 +fork 300 0 +fork 0 301 +fork 297 302 +fork 22 303 +stem 304 +fork 9 20 fork 0 306 -stem 307 -stem 308 -fork 0 309 -stem 310 -stem 311 -fork 88 312 -fork 45 313 -fork 304 314 -fork 48 315 -fork 302 316 -fork 0 317 -stem 318 -fork 319 100 -fork 0 320 -fork 19 321 -fork 0 322 -fork 98 323 -fork 45 324 -fork 42 325 -fork 0 326 -fork 33 327 +fork 305 307 +fork 290 308 +fork 287 309 +fork 0 310 +fork 88 311 +fork 0 312 +stem 313 +stem 314 +fork 0 315 +stem 316 +stem 317 +fork 0 318 +stem 319 +stem 123 +fork 0 321 +stem 322 +fork 24 0 +stem 324 +fork 0 325 +stem 326 +stem 327 fork 0 328 stem 329 -fork 0 89 
-stem 331 -stem 280 -fork 0 333 +fork 88 21 +fork 70 331 +fork 64 332 +fork 5 333 stem 334 -stem 282 +fork 64 331 fork 0 336 -stem 337 -fork 98 191 -fork 338 339 -fork 335 340 -fork 332 341 -fork 45 342 -fork 330 343 -fork 288 344 -fork 42 345 -stem 346 -fork 0 254 -fork 347 348 -fork 285 349 -fork 42 350 -stem 351 -fork 352 191 -fork 0 353 -stem 354 -fork 355 100 +fork 335 337 +fork 330 338 +fork 22 339 +stem 340 +fork 341 21 +fork 0 342 +stem 343 +fork 344 90 +fork 0 345 +fork 32 346 +stem 347 +fork 348 71 +fork 0 349 +fork 88 350 +fork 0 351 +fork 88 352 +fork 323 353 +fork 0 354 +fork 16 355 fork 0 356 -fork 19 357 -fork 0 358 -fork 59 359 -fork 282 360 -fork 280 361 -fork 45 362 -fork 42 363 -stem 364 -fork 365 327 -fork 275 366 +stem 357 +fork 0 233 +fork 0 359 +fork 6 360 +fork 0 258 +fork 6 362 +fork 0 363 +fork 6 359 +fork 0 365 +fork 6 197 +fork 6 367 +fork 6 368 +fork 6 369 fork 0 367 -fork 255 368 -fork 247 369 -fork 239 370 -stem 371 -fork 372 11 -fork 198 373 -stem 374 -fork 9 58 -fork 9 376 -stem 377 -stem 378 -fork 0 379 -stem 380 -stem 169 -fork 0 382 -stem 383 -stem 384 -fork 0 385 -stem 386 -stem 243 +fork 0 371 +fork 6 372 +fork 6 365 +fork 6 257 +fork 0 375 +fork 0 376 +fork 0 377 +fork 378 0 +fork 374 379 +fork 373 380 +fork 364 381 +fork 370 382 +fork 366 383 +fork 364 384 +fork 361 385 +fork 0 386 +fork 16 387 fork 0 388 stem 389 -stem 390 -fork 0 391 -stem 392 +stem 107 +fork 391 1 +fork 0 392 stem 393 -fork 0 394 -stem 395 -stem 396 -fork 0 397 -stem 398 -fork 11 0 -stem 400 -fork 0 401 -stem 402 -fork 9 0 +fork 117 6 +fork 394 395 +fork 9 396 +stem 397 +fork 131 0 +fork 0 399 +fork 400 0 +fork 401 121 +fork 22 402 +fork 9 403 fork 9 404 -fork 0 405 -fork 98 406 -fork 403 407 -fork 0 408 -stem 409 -fork 410 100 -fork 0 411 -fork 19 412 -fork 0 413 -stem 414 -stem 415 -fork 42 416 -stem 417 -fork 42 138 -fork 0 419 -fork 418 420 -fork 163 421 -fork 0 422 -stem 423 -fork 424 100 -fork 0 425 -fork 19 426 -fork 156 427 -fork 0 428 -fork 98 429 
-fork 181 430 -fork 42 431 -stem 432 -stem 202 -fork 28 434 -fork 42 435 -fork 0 436 -fork 433 437 -fork 42 438 -fork 9 439 -stem 440 -fork 57 325 -fork 0 442 -fork 441 443 -fork 399 444 -fork 387 445 -fork 384 446 -fork 169 447 -fork 5 448 +stem 405 +stem 290 +fork 0 407 +stem 408 +fork 16 307 +fork 0 410 +stem 411 +fork 0 296 +fork 88 413 +fork 64 414 +fork 412 415 +fork 409 416 +fork 290 417 +fork 0 418 +stem 419 +fork 88 302 +fork 300 421 +fork 420 422 +fork 9 423 +fork 9 424 +stem 425 +fork 11 204 +fork 22 427 +fork 9 428 +stem 429 +fork 0 294 +stem 431 +stem 306 +stem 398 +fork 0 434 +stem 435 +stem 426 +fork 0 437 +stem 438 +fork 0 93 +fork 88 440 +fork 64 441 +fork 22 442 +fork 0 443 +fork 16 444 +fork 0 445 +stem 446 +stem 64 +fork 0 448 stem 449 -stem 450 +stem 430 fork 0 451 stem 452 -fork 57 442 -fork 57 454 -fork 5 455 -stem 456 -stem 457 -fork 0 458 -stem 459 -stem 460 -fork 0 461 -stem 462 -fork 98 1 -fork 45 464 -fork 0 465 -fork 33 466 -fork 0 467 -stem 468 +stem 432 +fork 0 454 +stem 455 +fork 88 307 +fork 456 457 +fork 453 458 +fork 450 459 +fork 64 460 +fork 447 461 +fork 439 462 +fork 22 463 +stem 464 +fork 0 405 +fork 465 466 +fork 436 467 +fork 22 468 stem 469 -fork 0 470 -stem 471 +fork 470 307 +fork 0 471 stem 472 -fork 0 473 -stem 474 -stem 88 +fork 473 90 +fork 0 474 +fork 32 475 fork 0 476 -stem 477 -stem 332 -fork 0 479 -stem 480 -stem 41 -fork 0 482 -stem 483 -stem 248 +fork 433 477 +fork 432 478 +fork 430 479 +fork 64 480 +fork 22 481 +stem 482 +fork 483 444 +fork 426 484 fork 0 485 -stem 486 -stem 10 -fork 2 488 -fork 0 489 -stem 490 -fork 28 146 -fork 491 492 -stem 493 -fork 494 1 -fork 487 495 -fork 0 496 -stem 497 -fork 498 100 -fork 0 499 -fork 19 500 -fork 42 501 -fork 0 502 -fork 33 503 -fork 0 504 -stem 505 -fork 98 38 -fork 45 507 -fork 506 508 -fork 332 509 -fork 484 510 -fork 481 511 -fork 478 512 -fork 475 513 -fork 42 514 -stem 515 -fork 57 431 -fork 57 517 -fork 5 518 -fork 9 519 -fork 9 520 -fork 9 521 +fork 406 486 
+fork 398 487 +fork 390 488 +stem 489 +fork 490 24 +fork 314 491 +stem 492 +fork 9 306 +fork 9 494 +stem 495 +stem 496 +fork 0 497 +stem 498 +stem 293 +fork 0 500 +stem 501 +stem 502 +fork 0 503 +stem 504 +stem 394 +fork 0 506 +stem 507 +stem 508 +fork 0 509 +stem 510 +stem 511 +fork 0 512 +stem 513 +stem 514 +fork 0 515 +stem 516 +fork 9 0 +fork 9 518 +fork 0 519 +fork 88 520 +fork 327 521 fork 0 522 -fork 516 523 -fork 463 524 -fork 453 525 -fork 381 526 +stem 523 +fork 524 90 +fork 0 525 +fork 32 526 fork 0 527 stem 528 -fork 529 100 -fork 0 530 -fork 19 531 -stem 532 -fork 533 204 -stem 534 -fork 535 1 -stem 536 -fork 537 11 -stem 538 -fork 539 1 -fork 198 540 -stem 541 -fork 533 205 -stem 543 -fork 544 1 -stem 545 -fork 546 11 -stem 547 -fork 548 1 -fork 198 549 -fork 0 550 -fork 33 551 -fork 0 552 +stem 529 +fork 22 530 +stem 531 +fork 22 33 +fork 0 533 +fork 532 534 +fork 287 535 +fork 0 536 +stem 537 +fork 538 90 +fork 0 539 +fork 32 540 +fork 123 541 +fork 0 542 +fork 88 543 +fork 115 544 +fork 22 545 +stem 546 +fork 11 137 +fork 22 548 +fork 0 549 +fork 547 550 +fork 22 551 +fork 9 552 stem 553 -fork 533 206 -stem 555 -fork 556 1 -stem 557 -fork 558 11 -stem 559 -fork 560 1 -fork 198 561 -fork 0 562 -fork 33 563 +fork 67 442 +fork 0 555 +fork 554 556 +fork 517 557 +fork 505 558 +fork 502 559 +fork 293 560 +fork 5 561 +stem 562 +stem 563 fork 0 564 stem 565 -stem 566 -fork 0 567 -stem 568 +fork 67 555 +fork 67 567 +fork 5 568 stem 569 -fork 0 570 -stem 571 -stem 156 -fork 0 573 -stem 574 -stem 575 -fork 0 576 -stem 577 -stem 578 -fork 0 579 -stem 580 -stem 581 -fork 0 582 -stem 583 -fork 9 405 -fork 9 585 -stem 586 -stem 585 -stem 405 -fork 589 1 -fork 0 590 -fork 588 591 -fork 0 592 -fork 587 593 -fork 584 594 -fork 481 595 -fork 572 596 -fork 332 597 -fork 569 598 -fork 45 599 -fork 554 600 -fork 42 601 -stem 602 -fork 0 541 -fork 603 604 -fork 0 605 -fork 542 606 -fork 0 607 -fork 0 608 -fork 375 609 -fork 198 610 -stem 611 -fork 5 325 -stem 613 -stem 
614 -fork 0 615 -stem 616 -stem 617 -fork 0 618 -stem 619 -fork 5 431 -fork 9 621 -fork 9 622 -stem 623 -stem 624 -fork 0 625 -stem 626 -fork 88 510 -fork 472 628 -fork 42 629 -stem 630 -stem 550 -fork 33 604 -fork 0 633 -stem 634 -stem 635 -fork 0 636 -stem 637 -stem 638 -fork 0 639 -stem 640 +stem 570 +fork 0 571 stem 572 -fork 0 642 -stem 643 -stem 481 -fork 0 645 -stem 646 -stem 644 -fork 0 648 -stem 649 -stem 647 -fork 0 651 -stem 652 -fork 0 207 -fork 533 654 -stem 655 -fork 656 1 -stem 657 -fork 658 11 -stem 659 -fork 660 1 -fork 198 661 -fork 0 662 -fork 33 663 -fork 0 664 -stem 665 -stem 666 -fork 0 667 -stem 668 -stem 669 -fork 0 670 -stem 671 -stem 672 -fork 0 673 -stem 674 -stem 675 -fork 0 676 -stem 677 -stem 678 -fork 0 679 -stem 680 -stem 653 -fork 0 682 -stem 683 +stem 573 +fork 0 574 +stem 575 +fork 88 1 +fork 64 577 +fork 0 578 +fork 16 579 +fork 0 580 +stem 581 +stem 582 +fork 0 583 stem 584 -fork 0 685 -stem 686 -stem 687 -fork 0 688 +stem 585 +fork 0 586 +stem 587 +stem 70 +fork 0 589 +stem 590 +stem 450 +fork 0 592 +stem 593 +fork 16 21 +fork 0 595 +stem 596 +stem 597 +fork 0 598 +stem 599 +stem 399 +fork 0 601 +stem 602 +stem 23 +fork 2 604 +fork 0 605 +stem 606 +fork 11 298 +fork 607 608 +stem 609 +fork 610 1 +fork 603 611 +fork 0 612 +stem 613 +fork 614 90 +fork 0 615 +fork 32 616 +fork 22 617 +fork 0 618 +fork 16 619 +fork 0 620 +stem 621 +fork 622 336 +fork 450 623 +fork 600 624 +fork 594 625 +fork 591 626 +fork 588 627 +fork 22 628 +stem 629 +fork 67 545 +fork 67 631 +fork 5 632 +fork 9 633 +fork 9 634 +fork 9 635 +fork 0 636 +fork 630 637 +fork 576 638 +fork 566 639 +fork 499 640 +fork 0 641 +stem 642 +fork 643 90 +fork 0 644 +fork 32 645 +stem 646 +fork 647 197 +stem 648 +fork 649 1 +stem 650 +fork 651 24 +stem 652 +fork 653 1 +fork 314 654 +stem 655 +fork 647 233 +stem 657 +fork 658 1 +stem 659 +fork 660 24 +stem 661 +fork 662 1 +fork 314 663 +fork 0 664 +fork 16 665 +fork 0 666 +stem 667 +fork 647 359 +stem 669 +fork 670 1 +stem 671 
+fork 672 24 +stem 673 +fork 674 1 +fork 314 675 +fork 0 676 +fork 16 677 +fork 0 678 +stem 679 +stem 680 +fork 0 681 +stem 682 +stem 683 +fork 0 684 +stem 685 +stem 323 +fork 0 687 +stem 688 stem 689 -stem 690 -fork 0 691 +fork 0 690 +stem 691 stem 692 -fork 9 586 -fork 9 694 -fork 9 695 -stem 696 -stem 695 +fork 0 693 stem 694 -fork 0 594 -fork 699 700 +fork 9 519 +fork 9 696 +stem 697 +stem 696 +stem 519 +fork 700 1 fork 0 701 -fork 698 702 +fork 699 702 fork 0 703 -fork 697 704 -fork 693 705 -fork 684 706 -fork 681 707 -fork 653 708 -fork 650 709 -fork 647 710 -fork 644 711 -fork 481 712 -fork 641 713 -fork 332 714 -fork 638 715 -fork 45 716 -fork 635 717 -fork 42 718 -stem 719 -fork 720 604 -fork 0 721 -fork 632 722 -fork 198 723 -fork 9 724 -fork 9 725 -fork 9 726 -fork 0 727 -fork 631 728 -fork 627 729 -fork 620 730 -fork 0 731 -stem 732 -fork 733 100 -fork 0 734 -fork 19 735 -fork 0 736 -fork 98 737 -fork 62 738 -fork 62 739 -fork 0 740 -stem 741 -stem 403 -fork 0 743 +fork 698 704 +fork 695 705 +fork 594 706 +fork 686 707 +fork 450 708 +fork 683 709 +fork 64 710 +fork 668 711 +fork 22 712 +stem 713 +fork 0 655 +fork 714 715 +fork 0 716 +fork 656 717 +fork 0 718 +fork 0 719 +fork 493 720 +fork 314 721 +stem 722 +fork 16 1 +fork 0 724 +stem 725 +fork 5 442 +stem 727 +stem 728 +fork 0 729 +stem 730 +stem 731 +fork 0 732 +stem 733 +fork 5 545 +fork 9 735 +fork 9 736 +stem 737 +stem 738 +fork 0 739 +stem 740 +fork 70 624 +fork 585 742 +fork 22 743 stem 744 -fork 88 507 -fork 45 746 -fork 5 747 +stem 664 +fork 16 715 +fork 0 747 stem 748 -fork 0 508 -fork 749 750 -fork 745 751 -fork 42 752 -stem 753 -fork 754 38 -fork 0 755 -stem 756 -fork 757 100 -fork 0 758 -fork 19 759 +stem 749 +fork 0 750 +stem 751 +stem 752 +fork 0 753 +stem 754 +stem 686 +fork 0 756 +stem 757 +stem 594 +fork 0 759 stem 760 -stem 322 -fork 762 206 -fork 0 763 -fork 33 764 +stem 758 +fork 0 762 +stem 763 +stem 761 fork 0 765 stem 766 -stem 767 +stem 668 fork 0 768 stem 769 -fork 9 176 -stem 
771 +stem 770 +fork 0 771 stem 772 -fork 0 773 -stem 774 +stem 773 +fork 0 774 stem 775 -fork 0 776 -stem 777 -stem 67 -fork 0 779 -stem 780 -stem 174 -stem 782 +stem 776 +fork 0 777 +stem 778 +stem 779 +fork 0 780 +stem 781 +stem 767 fork 0 783 stem 784 -stem 785 +stem 695 fork 0 786 stem 787 -fork 240 20 -stem 789 -fork 790 292 -fork 0 791 -stem 792 -fork 793 6 -stem 794 -stem 127 -fork 762 1 -fork 796 797 -fork 795 798 -fork 0 799 -stem 800 -stem 801 -fork 0 802 -stem 803 -stem 804 -fork 0 805 -stem 806 -fork 0 501 -fork 33 808 -fork 0 809 -stem 810 -fork 12 1 -fork 57 812 -fork 811 813 -fork 807 814 -fork 788 815 -fork 781 816 -fork 169 817 -fork 5 818 -stem 819 +stem 788 +fork 0 789 +stem 790 +stem 791 +fork 0 792 stem 793 -fork 0 821 -stem 822 -fork 823 0 -fork 5 824 -stem 825 -fork 308 797 +fork 9 697 +fork 9 795 +fork 9 796 +stem 797 +stem 796 +stem 795 +fork 0 705 +fork 800 801 +fork 0 802 +fork 799 803 +fork 0 804 +fork 798 805 +fork 794 806 +fork 785 807 +fork 782 808 +fork 767 809 +fork 764 810 +fork 761 811 +fork 758 812 +fork 594 813 +fork 755 814 +fork 450 815 +fork 752 816 +fork 64 817 +fork 749 818 +fork 22 819 +stem 820 +fork 821 715 +fork 0 822 +fork 746 823 +fork 314 824 +fork 9 825 +fork 9 826 fork 9 827 -stem 828 -fork 0 797 -fork 829 830 -fork 826 831 -fork 9 832 -fork 0 833 -fork 33 834 +fork 0 828 +fork 745 829 +fork 741 830 +fork 734 831 +fork 0 832 +stem 833 +fork 834 90 fork 0 835 -stem 836 -stem 837 -fork 0 838 -stem 839 -fork 840 510 -fork 820 841 -fork 42 842 -stem 843 -fork 844 191 -fork 778 845 -fork 384 846 -fork 770 847 -fork 0 848 -stem 849 -fork 850 100 -fork 0 851 -fork 19 852 -fork 62 853 +fork 32 836 +fork 0 837 +fork 88 838 +fork 726 839 +fork 726 840 +fork 0 841 +stem 842 +fork 94 359 +fork 0 844 +fork 16 845 +fork 0 846 +stem 847 +stem 848 +fork 0 849 +stem 850 +fork 9 110 +stem 852 +stem 853 fork 0 854 -fork 98 855 -fork 0 856 -fork 761 857 +stem 855 +stem 856 +fork 0 857 stem 858 -fork 859 1 +stem 103 fork 0 860 stem 861 
-fork 1 125 -fork 0 863 -fork 1 864 -fork 862 865 -fork 742 866 -stem 867 +stem 108 +stem 863 +fork 0 864 +stem 865 +stem 866 +fork 0 867 stem 868 -fork 0 869 +fork 391 34 stem 870 -stem 59 +fork 871 40 fork 0 872 stem 873 -stem 258 -fork 0 875 -stem 876 -stem 148 -fork 0 878 -stem 879 +fork 874 6 +stem 875 +stem 71 +fork 877 95 +fork 876 878 +fork 0 879 stem 880 -fork 0 881 -stem 882 -fork 42 509 +stem 881 +fork 0 882 +stem 883 stem 884 -fork 9 771 -fork 9 886 -fork 0 887 -fork 885 888 -fork 620 889 -fork 883 890 -fork 877 891 -fork 874 892 -fork 0 893 -stem 894 -fork 895 100 -fork 0 896 -fork 19 897 -fork 0 898 -fork 98 899 -fork 62 900 +fork 0 885 +stem 886 +fork 0 617 +fork 16 888 +fork 0 889 +stem 890 +fork 25 1 +fork 67 892 +fork 891 893 +fork 887 894 +fork 869 895 +fork 862 896 +fork 293 897 +fork 5 898 +stem 899 +stem 874 fork 0 901 stem 902 -fork 0 128 -fork 1 904 -fork 0 905 -fork 1 906 -fork 0 907 -fork 1 908 -fork 862 909 -fork 903 910 -fork 0 911 -fork 98 912 -fork 871 913 -fork 201 914 -fork 5 915 +fork 903 0 +fork 5 904 +stem 905 +fork 76 95 +fork 9 907 +stem 908 +fork 0 95 +fork 909 910 +fork 906 911 +fork 9 912 +fork 0 913 +fork 16 914 +fork 0 915 stem 916 -fork 575 0 -fork 42 918 -fork 9 919 -stem 920 -fork 6 27 -fork 0 27 -fork 922 923 -fork 178 924 -stem 925 -fork 923 923 -fork 926 927 +stem 917 +fork 0 918 +stem 919 +fork 920 624 +fork 900 921 +fork 22 922 +stem 923 +fork 924 307 +fork 859 925 +fork 502 926 +fork 851 927 fork 0 928 -fork 98 929 -fork 0 930 -stem 931 -stem 932 -fork 0 933 -stem 934 -fork 42 122 -stem 936 -fork 5 508 +stem 929 +fork 930 90 +fork 0 931 +fork 32 932 +fork 726 933 +fork 0 934 +fork 88 935 +fork 0 936 +fork 348 937 stem 938 -fork 202 20 -fork 243 940 +fork 939 1 +fork 0 940 stem 941 -stem 942 +fork 1 6 fork 0 943 -stem 944 -fork 311 322 -fork 57 946 -fork 42 947 -fork 0 948 -fork 304 949 -fork 258 950 -fork 42 951 +fork 1 944 +fork 0 945 +fork 1 946 +fork 942 947 +fork 843 948 +stem 949 +stem 950 +fork 0 951 stem 952 
-fork 953 191 -fork 945 954 -fork 42 955 -stem 956 -fork 957 38 -fork 0 958 -stem 959 -fork 960 100 -fork 0 961 -fork 19 962 -fork 9 963 +stem 433 +fork 0 954 +stem 955 +stem 409 +fork 0 957 +stem 958 +stem 300 +fork 0 960 +stem 961 +stem 962 +fork 0 963 stem 964 -fork 965 125 -fork 935 966 -fork 57 967 -fork 0 968 -fork 939 969 -fork 166 970 -fork 42 971 -stem 972 -fork 973 38 -fork 0 974 -stem 975 -fork 976 100 -fork 0 977 -fork 19 978 -fork 0 979 -fork 937 980 -fork 935 981 -fork 5 982 +fork 22 623 +stem 966 +fork 9 852 +fork 9 968 +fork 0 969 +fork 967 970 +fork 734 971 +fork 965 972 +fork 959 973 +fork 956 974 +fork 0 975 +stem 976 +fork 977 90 +fork 0 978 +fork 32 979 +fork 0 980 +fork 88 981 +fork 726 982 fork 0 983 -fork 98 984 -fork 163 985 -fork 0 986 -stem 987 -fork 988 100 +stem 984 +fork 0 72 +fork 1 986 +fork 0 987 +fork 1 988 fork 0 989 -fork 19 990 -fork 42 991 -stem 992 -fork 6 202 -fork 0 994 -stem 995 -fork 28 996 -fork 178 997 +fork 1 990 +fork 942 991 +fork 985 992 +fork 0 993 +fork 88 994 +fork 953 995 +fork 317 996 +fork 5 997 stem 998 -stem 999 -fork 0 1000 -stem 1001 -fork 1002 154 -fork 993 1003 -fork 201 1004 -fork 0 1005 -fork 921 1006 -fork 0 1007 -fork 917 1008 -fork 148 1009 -fork 42 1010 -fork 9 1011 +fork 323 0 +fork 22 1000 +fork 9 1001 +stem 1002 +stem 153 +fork 0 1004 +stem 1005 +fork 22 943 +stem 1007 +fork 5 336 +stem 1009 +fork 131 34 +fork 394 1011 stem 1012 -stem 948 -fork 42 963 -fork 0 1015 -fork 1014 1016 -fork 166 1017 -fork 0 1018 -fork 59 1019 -fork 942 1020 -stem 1021 -fork 0 249 -fork 1022 1023 -fork 0 1024 -stem 1025 -fork 1026 122 -fork 308 1027 -stem 1028 -fork 1022 292 -fork 0 1030 -stem 1031 -fork 1032 863 -fork 1029 1033 -fork 42 1034 +stem 1013 +fork 0 1014 +stem 1015 +fork 79 93 +fork 67 1017 +fork 22 1018 +fork 0 1019 +fork 62 1020 +fork 409 1021 +fork 22 1022 +stem 1023 +fork 1024 307 +fork 1016 1025 +fork 22 1026 +stem 1027 +fork 1028 21 +fork 0 1029 +stem 1030 +fork 1031 90 +fork 0 1032 +fork 32 1033 
+fork 9 1034 stem 1035 -fork 181 154 -fork 5 1037 -stem 1038 -stem 209 -fork 28 1040 -fork 0 1041 -fork 1039 1042 -fork 1036 1043 -fork 201 1044 +fork 1036 944 +fork 1006 1037 +fork 67 1038 +fork 0 1039 +fork 1010 1040 +fork 290 1041 +fork 22 1042 +stem 1043 +fork 1044 21 fork 0 1045 -fork 1013 1046 -fork 612 1047 -fork 198 1048 -stem 1049 -stem 201 -fork 0 1051 -stem 1052 -fork 45 966 -fork 5 1054 -stem 1055 -stem 1056 +stem 1046 +fork 1047 90 +fork 0 1048 +fork 32 1049 +fork 0 1050 +fork 1008 1051 +fork 1006 1052 +fork 5 1053 +fork 0 1054 +fork 88 1055 +fork 287 1056 fork 0 1057 stem 1058 -fork 178 0 -fork 9 1060 -fork 0 1061 -fork 33 1062 -fork 0 1063 -stem 1064 -fork 1065 508 -fork 1059 1066 -fork 166 1067 -fork 42 1068 -stem 1069 -fork 1070 38 -fork 0 1071 -stem 1072 -fork 1073 100 +fork 1059 90 +fork 0 1060 +fork 32 1061 +fork 22 1062 +stem 1063 +stem 215 +fork 11 1065 +fork 112 1066 +stem 1067 +stem 1068 +fork 0 1069 +stem 1070 +fork 1071 121 +fork 1064 1072 +fork 317 1073 fork 0 1074 -fork 19 1075 -stem 1076 -fork 0 1023 -fork 0 1078 -fork 1077 1079 -fork 42 1080 -fork 9 1081 -stem 1082 -stem 204 -fork 28 1084 -fork 782 1085 -stem 1086 -stem 1087 -fork 0 1088 -stem 1089 -fork 0 865 -fork 1 1091 -fork 0 1092 -fork 1 1093 -fork 0 1094 -fork 1 1095 -fork 0 1096 -fork 1 1097 -fork 862 1098 -fork 42 1099 +fork 1003 1075 +fork 0 1076 +fork 999 1077 +fork 300 1078 +fork 22 1079 +fork 9 1080 +stem 1081 +stem 1019 +fork 22 1034 +fork 0 1084 +fork 1083 1085 +fork 290 1086 +fork 0 1087 +fork 433 1088 +fork 1013 1089 +stem 1090 +fork 0 400 +fork 1091 1092 +fork 0 1093 +stem 1094 +fork 1095 943 +fork 76 1096 +stem 1097 +fork 1091 40 +fork 0 1099 stem 1100 -stem 323 -stem 808 -stem 1103 -fork 28 1104 -fork 163 1105 -fork 0 1106 +fork 1101 945 +fork 1098 1102 +fork 22 1103 +stem 1104 +fork 115 121 +fork 5 1106 stem 1107 -fork 1108 100 +fork 11 263 fork 0 1109 -fork 19 1110 -fork 1102 1111 -fork 0 1112 -fork 1101 1113 -fork 42 1114 -stem 1115 -stem 436 -fork 98 155 -fork 
181 1118 -fork 1117 1119 -fork 5 1120 -fork 0 1121 -fork 1116 1122 -fork 42 1123 -stem 1124 -fork 42 901 -stem 1126 -fork 5 442 -stem 1128 +fork 1108 1110 +fork 1105 1111 +fork 317 1112 +fork 0 1113 +fork 1082 1114 +fork 723 1115 +fork 314 1116 +stem 1117 +fork 64 1037 +fork 5 1119 +stem 1120 +stem 1121 +fork 0 1122 +stem 1123 +fork 112 0 +fork 9 1125 +fork 0 1126 +fork 16 1127 +fork 0 1128 stem 1129 -fork 0 1130 -stem 1131 -stem 823 -fork 0 1133 +fork 1130 336 +fork 1124 1131 +fork 290 1132 +fork 22 1133 stem 1134 -stem 1135 +fork 1135 21 fork 0 1136 stem 1137 -stem 1138 +fork 1138 90 fork 0 1139 -stem 1140 -fork 0 404 -fork 33 1142 +fork 32 1140 +stem 1141 +fork 0 1092 fork 0 1143 -stem 1144 -stem 1145 -fork 0 1146 +fork 1142 1144 +fork 22 1145 +fork 9 1146 stem 1147 -fork 1148 510 -fork 1141 1149 -fork 1132 1150 -fork 258 1151 -fork 42 1152 +fork 11 246 +fork 863 1149 +stem 1150 +stem 1151 +fork 0 1152 stem 1153 -fork 1154 191 -fork 0 1155 -stem 1156 -fork 1157 100 +fork 0 947 +fork 1 1155 +fork 0 1156 +fork 1 1157 fork 0 1158 -fork 19 1159 +fork 1 1159 fork 0 1160 -fork 98 1161 -fork 62 1162 -fork 9 1163 -fork 0 1164 -fork 1127 1165 -fork 0 1166 +fork 1 1161 +fork 942 1162 +fork 22 1163 +stem 1164 +stem 440 +stem 888 stem 1167 -fork 862 1096 -fork 1168 1169 -stem 1170 -fork 1171 1099 -fork 45 1172 -fork 1125 1173 -fork 0 1174 -fork 98 1175 -fork 42 1176 -stem 1177 -fork 1178 38 -fork 1090 1179 -fork 5 1180 -stem 1181 -fork 28 1085 -fork 28 1183 -fork 0 1184 -fork 1182 1185 -fork 0 1186 -fork 1083 1187 -fork 0 1188 -fork 98 1189 -fork 42 1190 -stem 1191 -fork 1192 38 -fork 1053 1193 -fork 169 1194 -fork 5 1195 -stem 1196 -fork 0 918 -fork 59 1198 -fork 880 1199 -fork 42 1200 +fork 11 1168 +fork 287 1169 +fork 0 1170 +stem 1171 +fork 1172 90 +fork 0 1173 +fork 32 1174 +fork 1166 1175 +fork 0 1176 +fork 1165 1177 +fork 22 1178 +stem 1179 +stem 549 +fork 88 122 +fork 115 1182 +fork 1181 1183 +fork 5 1184 +fork 0 1185 +fork 1180 1186 +fork 22 1187 +stem 1188 +fork 
22 983 +stem 1190 +fork 5 555 +stem 1192 +stem 1193 +fork 0 1194 +stem 1195 +stem 903 +fork 0 1197 +stem 1198 +stem 1199 +fork 0 1200 stem 1201 stem 1202 fork 0 1203 stem 1204 -fork 203 0 -fork 0 1206 +fork 0 518 +fork 16 1206 fork 0 1207 -fork 0 1208 -fork 0 1209 -fork 1077 1210 -fork 42 1211 -fork 9 1212 -stem 1213 -fork 1214 1187 -fork 0 1215 -fork 98 1216 -fork 42 1217 -stem 1218 -fork 1219 38 -fork 1053 1220 -fork 88 1221 -fork 1205 1222 -fork 883 1223 -fork 1197 1224 -fork 148 1225 -fork 166 1226 -fork 42 1227 -stem 1228 -fork 1229 38 -fork 1050 1230 -fork 198 1231 -stem 1232 -fork 575 464 -fork 0 1234 -fork 59 1235 -fork 880 1236 -fork 42 1237 -fork 9 1238 -stem 1239 -fork 0 963 +stem 1208 +stem 1209 +fork 0 1210 +stem 1211 +fork 1212 624 +fork 1205 1213 +fork 1196 1214 +fork 409 1215 +fork 22 1216 +stem 1217 +fork 1218 307 +fork 0 1219 +stem 1220 +fork 1221 90 +fork 0 1222 +fork 32 1223 +fork 0 1224 +fork 88 1225 +fork 726 1226 +fork 9 1227 +fork 0 1228 +fork 1191 1229 +fork 0 1230 +stem 1231 +fork 942 1160 +fork 1232 1233 +stem 1234 +fork 1235 1163 +fork 64 1236 +fork 1189 1237 +fork 0 1238 +fork 88 1239 +fork 22 1240 stem 1241 -fork 1242 122 -stem 1243 -fork 0 995 -fork 0 1245 -fork 0 1246 -fork 6 1247 -fork 6 994 -fork 0 1249 +fork 1242 21 +fork 1154 1243 +fork 5 1244 +stem 1245 +fork 11 1149 +fork 11 1247 +fork 0 1248 +fork 1246 1249 fork 0 1250 -fork 6 1251 +fork 1148 1251 fork 0 1252 -fork 6 1246 -fork 0 1254 -fork 6 995 -fork 6 1256 -fork 6 1257 -fork 6 1258 -fork 0 1256 -fork 0 1260 -fork 6 1261 -fork 6 1254 -fork 6 1249 -fork 0 1264 -fork 0 1265 -fork 0 1266 -fork 6 224 -fork 0 1268 -fork 6 1250 +fork 88 1253 +fork 22 1254 +stem 1255 +fork 1256 21 +fork 320 1257 +fork 293 1258 +fork 5 1259 +stem 1260 +fork 0 1000 +fork 433 1262 +fork 962 1263 +fork 22 1264 +stem 1265 +stem 1266 +fork 0 1267 +stem 1268 +fork 132 0 fork 0 1270 -fork 6 1271 -fork 0 1258 -fork 6 1245 -fork 0 1274 -fork 0 1275 -fork 0 1257 -fork 0 1277 -fork 6 1275 -fork 6 1277 -fork 6 
1274 -fork 0 1281 -fork 6 1252 fork 0 1271 -fork 6 1270 -fork 0 1285 -fork 1254 0 -fork 1286 1287 -fork 1269 1288 -fork 1284 1289 -fork 1283 1290 -fork 1279 1291 -fork 1282 1292 -fork 1262 1293 -fork 1273 1294 -fork 1248 1295 -fork 1280 1296 -fork 1269 1297 -fork 1279 1298 -fork 1278 1299 -fork 1276 1300 -fork 1273 1301 -fork 1272 1302 -fork 1255 1303 -fork 1269 1304 -fork 1267 1305 -fork 1263 1306 -fork 1262 1307 -fork 1253 1308 -fork 1259 1309 -fork 1255 1310 -fork 1253 1311 -fork 1248 1312 -fork 0 1313 -fork 1244 1314 -fork 308 1315 -stem 1316 -fork 1242 129 -stem 1318 -fork 6 225 -fork 6 1260 -fork 0 1321 -fork 1284 0 -fork 1263 1323 -fork 1279 1324 -fork 1322 1325 -fork 1255 1326 -fork 1259 1327 -fork 1320 1328 -fork 1279 1329 -fork 1278 1330 -fork 1255 1331 -fork 1248 1332 -fork 1284 1333 -fork 1272 1334 -fork 1263 1335 -fork 1279 1336 -fork 1267 1337 -fork 1279 1338 -fork 1320 1339 -fork 1283 1340 -fork 1272 1341 -fork 1278 1342 -fork 1272 1343 -fork 1263 1344 -fork 1278 1345 -fork 1248 1346 -fork 1263 1347 -fork 1320 1348 -fork 1279 1349 -fork 1279 1350 -fork 1253 1351 -fork 1284 1352 -fork 0 1353 -fork 1319 1354 -fork 308 1355 -stem 1356 -stem 904 -fork 1358 128 -fork 126 1359 -fork 1242 1360 -stem 1361 -fork 1283 1289 -fork 1272 1363 -fork 1278 1364 -fork 1272 1365 -fork 1263 1366 -fork 1278 1367 -fork 1248 1368 -fork 1263 1369 -fork 1320 1370 -fork 1279 1371 -fork 1279 1372 -fork 1253 1373 -fork 1284 1374 +fork 0 1272 +fork 0 1273 +fork 1142 1274 +fork 22 1275 +fork 9 1276 +stem 1277 +fork 1278 1251 +fork 0 1279 +fork 88 1280 +fork 22 1281 +stem 1282 +fork 1283 21 +fork 320 1284 +fork 70 1285 +fork 1269 1286 +fork 965 1287 +fork 1261 1288 +fork 300 1289 +fork 290 1290 +fork 22 1291 +stem 1292 +fork 1293 21 +fork 1118 1294 +fork 314 1295 +stem 1296 +fork 323 577 +fork 0 1298 +fork 433 1299 +fork 962 1300 +fork 22 1301 +fork 9 1302 +stem 1303 +fork 0 1034 +stem 1305 +fork 1306 943 +stem 1307 +fork 0 215 +fork 0 1309 +fork 0 1310 +fork 6 1311 +fork 0 182 
+fork 6 1313 +fork 0 1314 +fork 6 1310 +fork 0 1316 +fork 6 215 +fork 6 1318 +fork 6 1319 +fork 6 1320 +fork 0 1318 +fork 0 1322 +fork 6 1323 +fork 6 1316 +fork 6 181 +fork 0 1326 +fork 0 1327 +fork 0 1328 +fork 6 375 +fork 0 1330 +fork 6 182 +fork 0 1332 +fork 6 1333 +fork 0 1320 +fork 6 1309 +fork 0 1336 +fork 0 1337 +fork 0 1319 +fork 0 1339 +fork 6 1337 +fork 6 1339 +fork 6 1336 +fork 0 1343 +fork 6 1314 +fork 0 1333 +fork 6 1332 +fork 0 1347 +fork 1316 0 +fork 1348 1349 +fork 1331 1350 +fork 1346 1351 +fork 1345 1352 +fork 1341 1353 +fork 1344 1354 +fork 1324 1355 +fork 1335 1356 +fork 1312 1357 +fork 1342 1358 +fork 1331 1359 +fork 1341 1360 +fork 1340 1361 +fork 1338 1362 +fork 1335 1363 +fork 1334 1364 +fork 1317 1365 +fork 1331 1366 +fork 1329 1367 +fork 1325 1368 +fork 1324 1369 +fork 1315 1370 +fork 1321 1371 +fork 1317 1372 +fork 1315 1373 +fork 1312 1374 fork 0 1375 -fork 1362 1376 -fork 308 1377 +fork 1308 1376 +fork 76 1377 stem 1378 -fork 1358 1359 -fork 126 1380 -fork 1242 1381 +stem 944 +fork 1380 72 +fork 1306 1381 stem 1382 -fork 0 1261 -fork 1277 0 -fork 1321 1385 -fork 1275 1386 -fork 1248 1387 -fork 1384 1388 -fork 1283 1389 -fork 0 1390 -fork 1383 1391 -fork 308 1392 -stem 1393 -fork 1358 1380 -fork 126 1395 -fork 1242 1396 -stem 1397 -fork 6 1321 -fork 1279 1289 -fork 1276 1400 -fork 1259 1401 -fork 1273 1402 -fork 1269 1403 -fork 1279 1404 -fork 1278 1405 -fork 1399 1406 -fork 1253 1407 -fork 1279 1408 -fork 1280 1409 -fork 1269 1410 -fork 1267 1411 -fork 1263 1412 -fork 1262 1413 -fork 1253 1414 -fork 1259 1415 -fork 1255 1416 -fork 1253 1417 -fork 1248 1418 -fork 0 1419 -fork 1398 1420 -fork 308 1421 -stem 1422 -fork 1358 1395 -fork 126 1424 -fork 1242 1425 -stem 1426 -fork 0 1251 -fork 0 1428 -fork 6 1266 -fork 1276 1289 -fork 1248 1431 -fork 1259 1432 -fork 1278 1433 -fork 1430 1434 -fork 1248 1435 -fork 1429 1436 -fork 1269 1437 -fork 1279 1438 -fork 1278 1439 -fork 1399 1440 -fork 1253 1441 -fork 1279 1442 -fork 1280 1443 -fork 1269 
1444 -fork 1267 1445 -fork 1263 1446 -fork 1262 1447 -fork 1253 1448 -fork 1259 1449 -fork 1255 1450 -fork 1253 1451 -fork 1248 1452 -fork 0 1453 -fork 1427 1454 -fork 308 1455 -stem 1456 -fork 1358 1424 -fork 126 1458 -fork 1242 1459 -stem 1460 -fork 1461 1376 -fork 308 1462 -stem 1463 -fork 1358 1458 -fork 126 1465 -fork 1242 1466 -stem 1467 -fork 1253 0 -fork 1279 1469 -fork 1276 1470 -fork 1253 1471 -fork 1259 1472 -fork 1320 1473 -fork 1278 1474 -fork 1248 1475 -fork 1280 1476 -fork 1253 1477 -fork 1259 1478 -fork 1273 1479 -fork 0 1480 -fork 1468 1481 -fork 308 1482 -stem 1483 -fork 1358 1465 -fork 126 1485 -fork 1242 1486 -stem 1487 -fork 1279 1400 -fork 1253 1489 -fork 1284 1490 -fork 1269 1491 -fork 1262 1492 -fork 1255 1493 -fork 1248 1494 -fork 1269 1495 -fork 1267 1496 -fork 1263 1497 -fork 1262 1498 -fork 1253 1499 -fork 1259 1500 -fork 1255 1501 -fork 1253 1502 -fork 1248 1503 -fork 0 1504 -fork 1488 1505 -fork 308 1506 -stem 1507 -fork 1358 1485 -fork 1358 1509 -fork 126 1510 -fork 1242 1511 -stem 1512 -fork 1513 20 -fork 308 1514 -stem 1515 -fork 0 139 -stem 1517 -fork 0 940 -stem 1519 -fork 1358 1510 -fork 126 1521 -fork 1520 1522 -fork 1518 1523 -fork 308 1524 +fork 6 376 +fork 6 1322 +fork 0 1385 +fork 1346 0 +fork 1325 1387 +fork 1341 1388 +fork 1386 1389 +fork 1317 1390 +fork 1321 1391 +fork 1384 1392 +fork 1341 1393 +fork 1340 1394 +fork 1317 1395 +fork 1312 1396 +fork 1346 1397 +fork 1334 1398 +fork 1325 1399 +fork 1341 1400 +fork 1329 1401 +fork 1341 1402 +fork 1384 1403 +fork 1345 1404 +fork 1334 1405 +fork 1340 1406 +fork 1334 1407 +fork 1325 1408 +fork 1340 1409 +fork 1312 1410 +fork 1325 1411 +fork 1384 1412 +fork 1341 1413 +fork 1341 1414 +fork 1315 1415 +fork 1346 1416 +fork 0 1417 +fork 1383 1418 +fork 76 1419 +stem 1420 +stem 986 +fork 1422 72 +fork 1380 1423 +fork 1306 1424 +stem 1425 +fork 1345 1351 +fork 1334 1427 +fork 1340 1428 +fork 1334 1429 +fork 1325 1430 +fork 1340 1431 +fork 1312 1432 +fork 1325 1433 +fork 1384 1434 +fork 
1341 1435 +fork 1341 1436 +fork 1315 1437 +fork 1346 1438 +fork 0 1439 +fork 1426 1440 +fork 76 1441 +stem 1442 +fork 1422 1423 +fork 1380 1444 +fork 1306 1445 +stem 1446 +fork 1338 0 +fork 1341 1448 +fork 1329 1449 +fork 1341 1450 +fork 1338 1451 +fork 1335 1452 +fork 1324 1453 +fork 0 1454 +fork 1447 1455 +fork 76 1456 +stem 1457 +fork 1422 1444 +fork 1380 1459 +fork 1306 1460 +stem 1461 +fork 1341 1351 +fork 1338 1463 +fork 1321 1464 +fork 1335 1465 +fork 1331 1466 +fork 1338 1467 +fork 1341 1468 +fork 1329 1469 +fork 1341 1470 +fork 1338 1471 +fork 1335 1472 +fork 1324 1473 +fork 1331 1474 +fork 1329 1475 +fork 1325 1476 +fork 1324 1477 +fork 1315 1478 +fork 1321 1479 +fork 1317 1480 +fork 1315 1481 +fork 1312 1482 +fork 0 1483 +fork 1462 1484 +fork 76 1485 +stem 1486 +fork 1422 1459 +fork 1380 1488 +fork 1306 1489 +stem 1490 +fork 0 1313 +fork 0 1492 +fork 6 1328 +fork 1338 1351 +fork 1312 1495 +fork 1321 1496 +fork 1340 1497 +fork 1494 1498 +fork 1312 1499 +fork 1493 1500 +fork 1331 1501 +fork 1338 1502 +fork 1341 1503 +fork 1329 1504 +fork 1341 1505 +fork 1338 1506 +fork 1335 1507 +fork 1324 1508 +fork 1331 1509 +fork 1329 1510 +fork 1325 1511 +fork 1324 1512 +fork 1315 1513 +fork 1321 1514 +fork 1317 1515 +fork 1315 1516 +fork 1312 1517 +fork 0 1518 +fork 1491 1519 +fork 76 1520 +stem 1521 +fork 1422 1488 +fork 1380 1523 +fork 1306 1524 stem 1525 -fork 1358 1521 -fork 1520 1527 -fork 1518 1528 -fork 1526 1529 -fork 1516 1530 -fork 1508 1531 -fork 1484 1532 -fork 1464 1533 -fork 1457 1534 -fork 1423 1535 -fork 1394 1536 -fork 1379 1537 -fork 1357 1538 -fork 1317 1539 -fork 42 1540 -stem 1541 -stem 1256 -fork 28 1543 -fork 0 1544 -fork 1039 1545 -fork 1542 1546 -fork 0 1547 -fork 98 1548 -fork 201 1549 -fork 57 1550 -fork 0 1551 -fork 1240 1552 -fork 166 1553 -fork 42 1554 -stem 1555 -fork 1556 38 -fork 42 1557 -stem 1558 -fork 0 217 -fork 6 1560 -fork 0 218 -fork 6 205 -fork 6 1563 -fork 0 1564 -fork 6 212 -fork 6 210 -fork 0 1567 -fork 0 1568 -fork 1569 0 
-fork 1566 1570 -fork 1565 1571 -fork 1562 1572 -fork 1561 1573 -fork 215 1574 -fork 213 1575 -fork 208 1576 -fork 0 1577 -fork 33 1578 -fork 0 1579 +fork 1526 1440 +fork 76 1527 +stem 1528 +fork 1422 1523 +fork 1380 1530 +fork 1306 1531 +stem 1532 +fork 1315 0 +fork 1341 1534 +fork 1338 1535 +fork 1315 1536 +fork 1321 1537 +fork 1384 1538 +fork 1340 1539 +fork 1312 1540 +fork 1342 1541 +fork 1315 1542 +fork 1321 1543 +fork 1335 1544 +fork 0 1545 +fork 1533 1546 +fork 76 1547 +stem 1548 +fork 1422 1530 +fork 1380 1550 +fork 1306 1551 +stem 1552 +fork 1341 1463 +fork 1315 1554 +fork 1346 1555 +fork 1331 1556 +fork 1324 1557 +fork 1317 1558 +fork 1312 1559 +fork 1331 1560 +fork 1329 1561 +fork 1325 1562 +fork 1324 1563 +fork 1315 1564 +fork 1321 1565 +fork 1317 1566 +fork 1315 1567 +fork 1312 1568 +fork 0 1569 +fork 1553 1570 +fork 76 1571 +stem 1572 +fork 1422 1550 +fork 1422 1574 +fork 1380 1575 +fork 1306 1576 +stem 1577 +fork 1578 34 +fork 76 1579 stem 1580 -fork 1581 370 +fork 0 35 stem 1582 -fork 1583 11 -fork 198 1584 -stem 1585 -fork 33 155 -fork 0 1587 -stem 1588 -fork 0 532 -fork 98 1590 -fork 62 1591 -fork 5 1592 -stem 1593 -fork 1594 127 -fork 62 1595 -fork 0 1596 -stem 1597 -fork 1598 860 -fork 201 1599 -fork 1589 1600 -fork 0 1601 -fork 632 1602 -fork 198 1603 -stem 1604 -fork 0 1604 -fork 33 1606 -fork 0 1607 -stem 1608 -stem 1609 -fork 0 1610 -stem 1611 -stem 1612 -fork 0 1613 -stem 1614 -stem 1615 +fork 0 1011 +stem 1584 +fork 1422 1575 +fork 1380 1586 +fork 1585 1587 +fork 1583 1588 +fork 76 1589 +stem 1590 +fork 1422 1586 +fork 1585 1592 +fork 1583 1593 +fork 1591 1594 +fork 1581 1595 +fork 1573 1596 +fork 1549 1597 +fork 1529 1598 +fork 1522 1599 +fork 1487 1600 +fork 1458 1601 +fork 1443 1602 +fork 1421 1603 +fork 1379 1604 +fork 22 1605 +stem 1606 +stem 1318 +fork 11 1608 +fork 0 1609 +fork 1108 1610 +fork 1607 1611 +fork 0 1612 +fork 88 1613 +fork 317 1614 +fork 67 1615 fork 0 1616 -stem 1617 -stem 1618 -fork 0 1619 +fork 1304 1617 +fork 290 
1618 +fork 22 1619 stem 1620 -stem 1621 -fork 0 1622 +fork 1621 21 +fork 22 1622 stem 1623 -stem 1624 -fork 0 1625 -stem 1626 -stem 684 -fork 0 1628 -stem 1629 -stem 554 -fork 0 1631 -stem 1632 -stem 1633 -fork 0 1634 -stem 1635 -stem 1636 -fork 0 1637 -stem 1638 -stem 1639 -fork 0 1640 -stem 1641 -stem 1642 -fork 0 1643 -stem 1644 +fork 0 368 +fork 6 1625 +fork 0 369 +fork 6 233 +fork 6 1628 +fork 0 1629 +fork 6 363 +fork 6 258 +fork 0 1632 +fork 0 1633 +fork 1634 0 +fork 1631 1635 +fork 1630 1636 +fork 1627 1637 +fork 1626 1638 +fork 366 1639 +fork 364 1640 +fork 361 1641 +fork 0 1642 +fork 16 1643 +fork 0 1644 stem 1645 -fork 0 1646 +fork 1646 488 stem 1647 -stem 1648 -fork 0 1649 +fork 1648 24 +fork 314 1649 stem 1650 -stem 1630 +fork 16 122 fork 0 1652 stem 1653 -fork 9 1604 -fork 9 1655 -fork 9 1656 -fork 0 1657 -fork 631 1658 -fork 627 1659 -fork 620 1660 +fork 0 646 +fork 88 1655 +fork 726 1656 +fork 5 1657 +stem 1658 +fork 1659 71 +fork 726 1660 fork 0 1661 stem 1662 -fork 1663 100 -fork 0 1664 -fork 19 1665 +fork 1663 940 +fork 317 1664 +fork 1654 1665 fork 0 1666 -fork 98 1667 -fork 62 1668 -fork 62 1669 -fork 0 1670 -stem 1671 -fork 1672 860 -fork 201 1673 -fork 0 1674 -fork 33 1675 -fork 0 1676 +fork 746 1667 +fork 314 1668 +stem 1669 +fork 0 1669 +fork 16 1671 +fork 0 1672 +stem 1673 +stem 1674 +fork 0 1675 +stem 1676 stem 1677 -stem 1678 -fork 0 1679 +fork 0 1678 +stem 1679 stem 1680 -stem 1681 -fork 0 1682 +fork 0 1681 +stem 1682 stem 1683 -stem 1684 -fork 0 1685 +fork 0 1684 +stem 1685 stem 1686 -stem 1687 -fork 0 1688 +fork 0 1687 +stem 1688 stem 1689 -stem 1690 -fork 0 1691 -stem 1692 -stem 1693 -fork 0 1694 -stem 1695 -stem 1696 -fork 0 1697 +fork 0 1690 +stem 1691 +stem 785 +fork 0 1693 +stem 1694 +stem 782 +fork 0 1696 +stem 1697 stem 1698 -stem 478 -fork 0 1700 -stem 1701 -stem 1702 -fork 0 1703 -stem 1704 -stem 1705 -fork 0 1706 -stem 1707 -stem 1708 -fork 0 1709 -stem 1710 -stem 1711 -fork 0 1712 -stem 1713 -fork 533 203 -stem 1715 -fork 
1716 1 -stem 1717 -fork 1718 11 -stem 1719 -fork 1720 1 -fork 198 1721 -fork 0 1722 -fork 33 1723 +fork 0 1699 +stem 1700 +stem 1695 +fork 0 1702 +stem 1703 +fork 9 1669 +fork 9 1705 +fork 9 1706 +fork 0 1707 +fork 745 1708 +fork 741 1709 +fork 734 1710 +fork 0 1711 +stem 1712 +fork 1713 90 +fork 0 1714 +fork 32 1715 +fork 0 1716 +fork 88 1717 +fork 726 1718 +fork 726 1719 +fork 0 1720 +stem 1721 +fork 1722 940 +fork 317 1723 fork 0 1724 -stem 1725 -stem 1726 -fork 0 1727 +fork 16 1725 +fork 0 1726 +stem 1727 stem 1728 -stem 1729 -fork 0 1730 +fork 0 1729 +stem 1730 stem 1731 -stem 1732 -fork 0 1733 +fork 0 1732 +stem 1733 stem 1734 -stem 1735 -fork 0 1736 +fork 0 1735 +stem 1736 stem 1737 -stem 1738 -fork 0 1739 +fork 0 1738 +stem 1739 stem 1740 -stem 1741 -fork 0 1742 +fork 0 1741 +stem 1742 stem 1743 -stem 1744 -fork 0 1745 +fork 0 1744 +stem 1745 stem 1746 -stem 1747 -fork 0 1748 -stem 1749 -stem 1654 -fork 0 1751 +fork 0 1747 +stem 1748 +stem 591 +fork 0 1750 +stem 1751 stem 1752 -stem 1651 -fork 0 1754 +fork 0 1753 +stem 1754 stem 1755 -stem 1756 -fork 0 1757 +fork 0 1756 +stem 1757 stem 1758 -stem 1753 -fork 0 1760 +fork 0 1759 +stem 1760 stem 1761 -stem 662 -stem 919 -fork 1764 1606 -fork 0 1765 -fork 1763 1766 -fork 198 1767 -fork 9 1768 -fork 9 1769 -fork 9 1770 -fork 0 1771 -fork 631 1772 -fork 627 1773 -fork 620 1774 -fork 0 1775 +fork 0 1762 +stem 1763 +fork 647 132 +stem 1765 +fork 1766 1 +stem 1767 +fork 1768 24 +stem 1769 +fork 1770 1 +fork 314 1771 +fork 0 1772 +fork 16 1773 +fork 0 1774 +stem 1775 stem 1776 -fork 1777 100 -fork 0 1778 -fork 19 1779 +fork 0 1777 +stem 1778 +stem 1779 fork 0 1780 -fork 98 1781 -fork 62 1782 -fork 62 1783 -fork 0 1784 +stem 1781 +stem 1782 +fork 0 1783 +stem 1784 stem 1785 -fork 1786 860 -fork 201 1787 -fork 0 1788 -fork 33 1789 -fork 0 1790 +fork 0 1786 +stem 1787 +stem 1788 +fork 0 1789 +stem 1790 stem 1791 -stem 1792 -fork 0 1793 +fork 0 1792 +stem 1793 stem 1794 -stem 1795 -fork 0 1796 +fork 0 1795 +stem 1796 
stem 1797 -stem 1798 -fork 0 1799 -stem 1800 -stem 1801 -fork 0 1802 -stem 1803 -stem 1804 -fork 0 1805 +fork 0 1798 +stem 1799 +stem 1704 +fork 0 1801 +stem 1802 +stem 1701 +fork 0 1804 +stem 1805 stem 1806 -stem 1807 -fork 0 1808 -stem 1809 -stem 1810 -fork 0 1811 -stem 1812 -stem 1813 +fork 0 1807 +stem 1808 +stem 1803 +fork 0 1810 +stem 1811 +stem 1001 +fork 1813 1671 fork 0 1814 -stem 1815 -stem 1816 -fork 0 1817 -stem 1818 -stem 1714 +fork 746 1815 +fork 314 1816 +fork 9 1817 +fork 9 1818 +fork 9 1819 fork 0 1820 -stem 1821 -stem 1822 -fork 0 1823 -stem 1824 -stem 1759 -fork 0 1826 -stem 1827 -stem 1762 -fork 0 1829 -stem 1830 -fork 581 592 -fork 332 1832 -fork 1612 1833 -fork 45 1834 -fork 1609 1835 -fork 42 1836 -stem 1837 -fork 1838 663 -fork 0 1839 -fork 1605 1840 -fork 198 1841 -fork 9 1842 -fork 9 1843 -fork 9 1844 -fork 0 1845 -fork 631 1846 -fork 627 1847 -fork 620 1848 -fork 0 1849 -stem 1850 -fork 1851 100 -fork 0 1852 -fork 19 1853 -fork 0 1854 -fork 98 1855 -fork 62 1856 -fork 62 1857 -fork 0 1858 -stem 1859 -fork 1860 860 -fork 201 1861 -fork 0 1862 -fork 33 1863 -fork 0 1864 -stem 1865 -stem 1866 -fork 0 1867 -stem 1868 -stem 1869 -fork 0 1870 -stem 1871 -stem 1872 -fork 0 1873 -stem 1874 -stem 1875 -fork 0 1876 -stem 1877 -stem 1878 -fork 0 1879 -stem 1880 -stem 1881 -fork 0 1882 -stem 1883 -stem 1884 -fork 0 1885 -stem 1886 -stem 1887 -fork 0 1888 -stem 1889 -stem 1890 -fork 0 1891 -stem 1892 -stem 1893 -fork 0 1894 -stem 1895 +fork 745 1821 +fork 741 1822 +fork 734 1823 +fork 0 1824 stem 1825 -fork 0 1897 -stem 1898 -stem 693 -fork 0 1900 -stem 1901 -stem 1902 +fork 1826 90 +fork 0 1827 +fork 32 1828 +fork 0 1829 +fork 88 1830 +fork 726 1831 +fork 726 1832 +fork 0 1833 +stem 1834 +fork 1835 940 +fork 317 1836 +fork 0 1837 +fork 16 1838 +fork 0 1839 +stem 1840 +stem 1841 +fork 0 1842 +stem 1843 +stem 1844 +fork 0 1845 +stem 1846 +stem 1847 +fork 0 1848 +stem 1849 +stem 1850 +fork 0 1851 +stem 1852 +stem 1853 +fork 0 1854 +stem 1855 +stem 1856 
+fork 0 1857 +stem 1858 +stem 1859 +fork 0 1860 +stem 1861 +stem 1862 +fork 0 1863 +stem 1864 +stem 1865 +fork 0 1866 +stem 1867 +stem 1764 +fork 0 1869 +stem 1870 +stem 1871 +fork 0 1872 +stem 1873 +stem 1809 +fork 0 1875 +stem 1876 +stem 1812 +fork 0 1878 +stem 1879 +fork 692 703 +fork 450 1881 +fork 1677 1882 +fork 64 1883 +fork 1674 1884 +fork 22 1885 +stem 1886 +fork 1887 665 +fork 0 1888 +fork 1670 1889 +fork 314 1890 +fork 9 1891 +fork 9 1892 +fork 9 1893 +fork 0 1894 +fork 745 1895 +fork 741 1896 +fork 734 1897 +fork 0 1898 +stem 1899 +fork 1900 90 +fork 0 1901 +fork 32 1902 fork 0 1903 -stem 1904 -stem 1905 -fork 0 1906 -stem 1907 +fork 88 1904 +fork 726 1905 +fork 726 1906 +fork 0 1907 stem 1908 -fork 0 1909 -stem 1910 -stem 1911 -fork 0 1912 -stem 1913 -fork 9 696 -fork 9 1915 -fork 9 1916 -fork 9 1917 -fork 9 1918 -stem 1919 -stem 1918 -stem 1917 -stem 1916 +fork 1909 940 +fork 317 1910 +fork 0 1911 +fork 16 1912 +fork 0 1913 +stem 1914 stem 1915 -fork 0 705 -fork 1924 1925 -fork 0 1926 -fork 1923 1927 +fork 0 1916 +stem 1917 +stem 1918 +fork 0 1919 +stem 1920 +stem 1921 +fork 0 1922 +stem 1923 +stem 1924 +fork 0 1925 +stem 1926 +stem 1927 fork 0 1928 -fork 1922 1929 -fork 0 1930 -fork 1921 1931 -fork 0 1932 -fork 1920 1933 -fork 1914 1934 -fork 1831 1935 -fork 1899 1936 -fork 1896 1937 -fork 1831 1938 -fork 1828 1939 -fork 1762 1940 -fork 1825 1941 -fork 1819 1942 -fork 1762 1943 -fork 1759 1944 -fork 1753 1945 -fork 1750 1946 -fork 1654 1947 -fork 1714 1948 -fork 1699 1949 -fork 1654 1950 -fork 1651 1951 -fork 1630 1952 -fork 1627 1953 -fork 684 1954 -fork 1624 1955 -fork 653 1956 -fork 1621 1957 -fork 647 1958 -fork 1618 1959 -fork 481 1960 -fork 1615 1961 -fork 332 1962 -fork 1612 1963 -fork 45 1964 -fork 1609 1965 -fork 42 1966 +stem 1929 +stem 1930 +fork 0 1931 +stem 1932 +stem 1933 +fork 0 1934 +stem 1935 +stem 1936 +fork 0 1937 +stem 1938 +stem 1939 +fork 0 1940 +stem 1941 +stem 1942 +fork 0 1943 +stem 1944 +stem 1874 +fork 0 1946 +stem 1947 
+stem 794 +fork 0 1949 +stem 1950 +stem 1951 +fork 0 1952 +stem 1953 +stem 1954 +fork 0 1955 +stem 1956 +stem 1957 +fork 0 1958 +stem 1959 +stem 1960 +fork 0 1961 +stem 1962 +fork 9 797 +fork 9 1964 +fork 9 1965 +fork 9 1966 +fork 9 1967 +stem 1968 stem 1967 -fork 1968 1606 -fork 0 1969 -fork 1605 1970 -fork 0 1971 -fork 0 1972 -fork 542 1973 -fork 0 1974 +stem 1966 +stem 1965 +stem 1964 +fork 0 806 +fork 1973 1974 fork 0 1975 -fork 542 1976 +fork 1972 1976 fork 0 1977 -fork 0 1978 -fork 1586 1979 -fork 198 1980 -stem 1981 -fork 42 860 -stem 1983 -fork 0 324 -fork 33 1985 -fork 0 1986 -stem 1987 -fork 178 244 -fork 9 1989 -fork 9 1990 -stem 1991 -stem 1992 -fork 0 1993 -stem 1994 -fork 880 509 -fork 42 1996 -stem 1997 -fork 0 1600 -fork 921 1999 -fork 42 2000 -stem 2001 -fork 2002 551 -fork 0 2003 -fork 542 2004 -fork 198 2005 -fork 9 2006 -fork 9 2007 -fork 0 2008 -fork 1998 2009 -fork 1995 2010 -fork 384 2011 -fork 169 2012 -fork 1988 2013 -fork 0 2014 -stem 2015 -fork 2016 100 -fork 0 2017 -fork 19 2018 -fork 0 2019 -fork 1984 2020 -fork 62 2021 -fork 201 2022 +fork 1971 1978 +fork 0 1979 +fork 1970 1980 +fork 0 1981 +fork 1969 1982 +fork 1963 1983 +fork 1880 1984 +fork 1948 1985 +fork 1945 1986 +fork 1880 1987 +fork 1877 1988 +fork 1812 1989 +fork 1874 1990 +fork 1868 1991 +fork 1812 1992 +fork 1809 1993 +fork 1803 1994 +fork 1800 1995 +fork 1704 1996 +fork 1764 1997 +fork 1749 1998 +fork 1704 1999 +fork 1701 2000 +fork 1695 2001 +fork 1692 2002 +fork 785 2003 +fork 1689 2004 +fork 767 2005 +fork 1686 2006 +fork 761 2007 +fork 1683 2008 +fork 594 2009 +fork 1680 2010 +fork 450 2011 +fork 1677 2012 +fork 64 2013 +fork 1674 2014 +fork 22 2015 +stem 2016 +fork 2017 1671 +fork 0 2018 +fork 1670 2019 +fork 0 2020 +fork 0 2021 +fork 656 2022 fork 0 2023 -fork 33 2024 -fork 0 2025 -stem 2026 -fork 33 127 -fork 0 2028 -stem 2029 -fork 578 590 -fork 169 2031 -fork 2030 2032 -fork 880 2033 -fork 45 2034 -fork 57 2035 -fork 2027 2036 -fork 45 2037 -fork 554 2038 -fork 42 
2039 -fork 9 2040 +fork 0 2024 +fork 656 2025 +fork 0 2026 +fork 0 2027 +fork 1651 2028 +fork 314 2029 +stem 2030 +fork 22 940 +stem 2032 +fork 0 441 +fork 16 2034 +fork 0 2035 +stem 2036 +fork 112 395 +fork 9 2038 +fork 9 2039 +stem 2040 stem 2041 -fork 9 2008 -fork 0 2043 -fork 631 2044 -fork 627 2045 -fork 620 2046 -fork 0 2047 -stem 2048 -fork 2049 100 -fork 0 2050 -fork 19 2051 +fork 0 2042 +stem 2043 +fork 962 623 +fork 22 2045 +stem 2046 +fork 0 1665 +fork 1003 2048 +fork 22 2049 +stem 2050 +fork 2051 665 fork 0 2052 -fork 98 2053 -fork 62 2054 -fork 62 2055 -fork 0 2056 -stem 2057 -fork 2058 860 -fork 201 2059 -fork 0 2060 -fork 2042 2061 -fork 42 2062 -stem 2063 -fork 2064 551 -fork 0 2065 -fork 1982 2066 -fork 198 2067 +fork 656 2053 +fork 314 2054 +fork 9 2055 +fork 9 2056 +fork 0 2057 +fork 2047 2058 +fork 2044 2059 +fork 502 2060 +fork 293 2061 +fork 2037 2062 +fork 0 2063 +stem 2064 +fork 2065 90 +fork 0 2066 +fork 32 2067 fork 0 2068 -fork 1559 2069 -fork 148 2070 -fork 163 2071 +fork 2033 2069 +fork 726 2070 +fork 317 2071 fork 0 2072 -fork 98 2073 +fork 16 2073 fork 0 2074 -fork 1233 2075 -fork 198 2076 -stem 2077 -fork 148 1118 -fork 0 2079 -fork 33 2080 -fork 0 2081 -stem 2082 -fork 216 0 -fork 0 2084 -fork 0 2085 -fork 0 2086 -fork 0 2087 -fork 33 2088 -fork 0 2089 +stem 2075 +fork 16 71 +fork 0 2077 +stem 2078 +fork 689 701 +fork 293 2080 +fork 2079 2081 +fork 962 2082 +fork 64 2083 +fork 67 2084 +fork 2076 2085 +fork 64 2086 +fork 668 2087 +fork 22 2088 +fork 9 2089 stem 2090 -stem 2085 -fork 2 2092 -fork 0 2093 -stem 2094 -stem 2095 +fork 9 2057 +fork 0 2092 +fork 745 2093 +fork 741 2094 +fork 734 2095 fork 0 2096 stem 2097 -fork 33 906 +fork 2098 90 fork 0 2099 -stem 2100 -fork 1022 20 -fork 0 2102 -fork 33 2103 -fork 0 2104 -stem 2105 -fork 0 138 -fork 0 2107 -fork 240 2108 -stem 2109 -fork 0 2108 -fork 2110 2111 -fork 0 2112 -stem 2113 -stem 2114 -fork 0 2115 -stem 2116 -stem 2117 -fork 0 2118 -stem 2119 -fork 762 203 +fork 32 2100 +fork 0 
2101 +fork 88 2102 +fork 726 2103 +fork 726 2104 +fork 0 2105 +stem 2106 +fork 2107 940 +fork 317 2108 +fork 0 2109 +fork 2091 2110 +fork 22 2111 +stem 2112 +fork 2113 665 +fork 0 2114 +fork 2031 2115 +fork 314 2116 +fork 0 2117 +fork 1624 2118 +fork 300 2119 +fork 287 2120 fork 0 2121 -fork 2122 20 +fork 88 2122 fork 0 2123 -stem 2124 -fork 28 0 -fork 1 2126 -fork 2125 2127 -fork 308 2128 +fork 1297 2124 +fork 314 2125 +stem 2126 +stem 2127 +fork 0 2128 stem 2129 -stem 1112 -fork 2131 215 -fork 2130 2132 -fork 0 2133 -fork 33 2134 -fork 0 2135 +fork 197 0 +fork 0 2131 +fork 2132 0 +stem 121 +fork 2134 1 +fork 112 2135 stem 2136 -fork 156 0 -stem 2138 -fork 2139 1 -fork 28 2140 -fork 0 2141 -fork 33 2142 -fork 0 2143 +fork 0 2132 +fork 2137 2138 +fork 9 2139 +stem 2140 +fork 2141 1584 +fork 2133 2142 +fork 863 2143 stem 2144 -stem 2145 -fork 0 2146 +fork 0 2135 +fork 2145 2146 stem 2147 -stem 7 -stem 1160 -fork 2150 654 +fork 0 2138 +fork 0 2149 +fork 2148 2150 stem 2151 -fork 2152 1 -fork 0 2153 -stem 2154 -stem 898 -fork 2156 203 -stem 2157 -fork 2158 1 -fork 2155 2159 -fork 0 2160 -fork 2149 2161 -fork 42 2162 -stem 2163 -fork 2164 38 -fork 1053 2165 -fork 2148 2166 -fork 282 2167 -fork 169 2168 -fork 5 2169 -stem 2170 -fork 169 2166 -fork 5 2172 -stem 2173 -fork 62 918 -fork 148 2175 -fork 0 2176 -fork 33 2177 -fork 0 2178 -stem 2179 -stem 2180 -fork 0 2181 -stem 2182 -fork 2156 215 -stem 2184 -fork 2185 1 -fork 2155 2186 -fork 0 2187 -fork 2149 2188 -fork 42 2189 -stem 2190 -fork 2191 38 -fork 1053 2192 -fork 88 2193 -fork 2183 2194 -fork 883 2195 -fork 2174 2196 -fork 2171 2197 -fork 169 2198 -fork 2137 2199 -fork 2120 2200 -fork 169 2201 -fork 2106 2202 -fork 57 2203 -fork 2101 2204 -fork 2098 2205 -fork 2091 2206 -fork 5 2207 -fork 9 2208 +fork 1335 0 +fork 1324 2153 +fork 1312 2154 +fork 1342 2155 +fork 0 2156 +fork 1142 2157 +fork 2152 2158 +fork 112 2159 +fork 9 2160 +stem 2161 +fork 2132 2135 +fork 2163 2149 +fork 0 2164 +stem 2165 +stem 2166 +fork 0 
2167 +stem 2168 +fork 2169 1141 +fork 2162 2170 +fork 5 2171 +stem 2172 +fork 2173 1584 +fork 0 2174 +stem 2175 +fork 2176 1592 +fork 0 2177 +fork 88 2178 +fork 317 2179 +fork 22 2180 +stem 2181 +fork 300 1182 +fork 0 2183 +fork 16 2184 +fork 0 2185 +stem 2186 +fork 1583 95 +fork 0 2188 +fork 16 2189 +fork 0 2190 +stem 2191 +fork 367 0 +fork 0 2193 +fork 0 2194 +fork 16 2195 +fork 0 2196 +stem 2197 +stem 2198 +fork 0 2199 +stem 2200 +fork 1091 34 +fork 0 2202 +fork 16 2203 +fork 0 2204 +stem 2205 +fork 0 33 +fork 0 2207 +fork 391 2208 stem 2209 -fork 0 1076 -fork 98 2211 +fork 0 2208 +fork 2210 2211 fork 0 2212 -fork 2210 2213 -fork 0 2214 -stem 2215 -fork 2216 100 -fork 0 2217 -fork 19 2218 -fork 9 2219 -stem 2220 -fork 2221 904 -fork 201 2222 -fork 57 2223 -fork 2083 2224 -fork 42 2225 -stem 2226 -fork 42 2079 +stem 2213 +stem 2214 +fork 0 2215 +stem 2216 +stem 2217 +fork 0 2218 +stem 2219 +fork 0 133 +fork 2221 34 +fork 0 2222 +stem 2223 +fork 11 0 +fork 1 2225 +fork 2224 2226 +fork 76 2227 stem 2228 -stem 562 -fork 932 2102 -stem 2231 -fork 932 2133 -stem 2233 -fork 762 204 -fork 0 2235 -fork 2236 20 -fork 0 2237 -stem 2238 -fork 2239 2127 -fork 308 2240 -stem 2241 -fork 0 208 -fork 2131 2243 -fork 2242 2244 -fork 2234 2245 -fork 2232 2246 -fork 0 2247 -stem 2248 -fork 2249 905 -fork 1518 2250 -fork 932 2251 -fork 9 2252 -fork 0 2253 -fork 98 2254 -fork 163 2255 -fork 0 2256 -stem 2257 -fork 2258 100 -fork 0 2259 -fork 19 2260 -fork 42 2261 -stem 2262 -stem 205 -fork 28 2264 -fork 178 2265 -stem 2266 -stem 2267 -fork 0 2268 -stem 2269 -stem 1249 -fork 28 2271 -fork 178 2272 +stem 1176 +fork 2230 258 +fork 2229 2231 +fork 0 2232 +fork 16 2233 +fork 0 2234 +stem 2235 +fork 123 0 +stem 2237 +fork 2238 1 +fork 11 2239 +fork 0 2240 +fork 16 2241 +fork 0 2242 +stem 2243 +stem 2244 +fork 0 2245 +stem 2246 +stem 7 +stem 1224 +fork 2249 233 +stem 2250 +fork 2251 1 +fork 0 2252 +stem 2253 +stem 980 +fork 2255 132 +stem 2256 +fork 2257 1 +fork 2254 2258 +fork 942 2259 
+fork 0 2260 +fork 2248 2261 +fork 22 2262 +stem 2263 +fork 2264 21 +fork 320 2265 +fork 2247 2266 +fork 432 2267 +fork 293 2268 +fork 5 2269 +stem 2270 +fork 293 2266 +fork 5 2272 stem 2273 -stem 2274 -fork 0 2275 -stem 2276 -fork 0 434 -stem 2278 +fork 726 1000 +fork 300 2275 +fork 0 2276 +fork 16 2277 +fork 0 2278 stem 2279 -fork 0 2280 -stem 2281 -fork 98 980 -fork 0 2283 -fork 98 2284 -fork 311 2285 -fork 57 2286 -fork 0 2287 -fork 939 2288 -fork 2282 2289 -fork 42 2290 +stem 2280 +fork 0 2281 +stem 2282 +fork 2255 258 +stem 2284 +fork 2285 1 +fork 2254 2286 +fork 942 2287 +fork 0 2288 +fork 2248 2289 +fork 22 2290 stem 2291 -fork 2292 38 -fork 0 2293 -stem 2294 -fork 2295 100 -fork 0 2296 -fork 19 2297 -fork 0 2298 -stem 2299 -fork 2 2160 -stem 2301 -fork 2302 1 -fork 178 2303 -stem 2304 -fork 2 2187 -stem 2306 -fork 2307 1 -fork 2302 2308 -fork 2305 2309 +fork 2292 21 +fork 320 2293 +fork 70 2294 +fork 2283 2295 +fork 965 2296 +fork 2274 2297 +fork 2271 2298 +fork 293 2299 +fork 2236 2300 +fork 2220 2301 +fork 293 2302 +fork 2206 2303 +fork 432 2304 +fork 2201 2305 +fork 293 2306 +fork 2192 2307 +fork 5 2308 +fork 9 2309 stem 2310 -fork 2311 2133 -fork 793 2312 -stem 2313 -fork 2314 2102 -fork 0 2315 -stem 2316 -fork 2317 905 -fork 2300 2318 -fork 0 2319 -fork 98 2320 -fork 311 2321 -fork 57 2322 -fork 0 2323 -fork 939 2324 -fork 2282 2325 -fork 42 2326 -stem 2327 -fork 2328 38 -fork 0 2329 -stem 2330 -fork 2331 100 -fork 0 2332 -fork 19 2333 +fork 0 852 +fork 16 2312 +fork 0 2313 +stem 2314 +stem 2315 +fork 0 2316 +stem 2317 +fork 622 60 +fork 450 2319 +fork 2318 2320 +fork 1196 2321 +fork 409 2322 +fork 64 2323 +fork 597 2324 +fork 0 2325 +stem 2326 +fork 2327 90 +fork 0 2328 +fork 32 2329 +fork 726 2330 +fork 0 2331 +fork 2311 2332 +fork 0 2333 stem 2334 -fork 2335 11 -fork 42 2336 -stem 2337 -stem 216 -fork 28 2339 -fork 0 2340 -fork 1039 2341 -fork 2338 2342 -fork 2277 2343 -fork 993 2344 -fork 2270 2345 -fork 2263 2346 -fork 201 2347 -fork 0 2348 -fork 
921 2349 -fork 42 2350 -stem 2351 -fork 45 2031 -fork 5 2353 -stem 2354 -fork 2355 1999 -fork 42 2356 -stem 2357 -fork 2358 551 -fork 0 2359 -fork 1763 2360 -fork 198 2361 -fork 9 2362 -fork 9 2363 -fork 9 2364 -fork 0 2365 -fork 631 2366 -fork 627 2367 -fork 620 2368 -fork 0 2369 -stem 2370 -fork 2371 100 -fork 0 2372 -fork 19 2373 -fork 0 2374 -fork 98 2375 -fork 62 2376 -fork 62 2377 -fork 0 2378 +fork 2335 90 +fork 0 2336 +fork 32 2337 +fork 0 2338 +stem 2339 +fork 2340 940 +fork 9 2341 +stem 2342 +fork 2343 986 +fork 317 2344 +fork 67 2345 +fork 2187 2346 +fork 22 2347 +stem 2348 +fork 22 2183 +stem 2350 +stem 676 +fork 153 2202 +stem 2353 +fork 153 2232 +stem 2355 +fork 0 198 +fork 2357 34 +fork 0 2358 +stem 2359 +fork 2360 2226 +fork 76 2361 +stem 2362 +fork 2230 371 +fork 2363 2364 +fork 2356 2365 +fork 2354 2366 +fork 1583 2367 +fork 153 2368 +fork 9 2369 +fork 0 2370 +fork 88 2371 +fork 287 2372 +fork 0 2373 +stem 2374 +fork 2375 90 +fork 0 2376 +fork 32 2377 +fork 22 2378 stem 2379 -fork 2380 860 -fork 201 2381 -fork 2352 2382 -fork 0 2383 -fork 2230 2384 -fork 198 2385 +stem 233 +fork 11 2381 +fork 112 2382 +stem 2383 +stem 2384 +fork 0 2385 stem 2386 -fork 0 277 -fork 1039 2388 -fork 5 2389 +stem 139 +fork 0 2388 +stem 2389 stem 2390 -fork 2391 1519 -fork 0 2392 -fork 2387 2393 -fork 198 2394 -fork 0 2395 -fork 2229 2396 -fork 148 2397 -fork 42 2398 -stem 2399 -stem 2400 -fork 0 2401 -stem 2402 -fork 2403 1221 -fork 148 2404 -fork 166 2405 -fork 42 2406 -stem 2407 -fork 2408 38 -fork 1050 2409 -fork 198 2410 -fork 0 2411 -fork 2227 2412 +fork 0 2391 +stem 2392 +stem 726 +fork 0 2394 +stem 2395 +stem 2396 +fork 0 2397 +stem 2398 +fork 456 2319 +fork 2399 2400 +fork 450 2401 +fork 293 2402 +fork 5 2403 +stem 2404 +fork 597 336 +fork 456 2406 +fork 2399 2407 +fork 450 2408 +fork 293 2409 +fork 5 2410 +stem 2411 +fork 16 440 fork 0 2413 stem 2414 -fork 2415 863 +stem 2415 fork 0 2416 -fork 98 2417 -fork 148 2418 -fork 42 2419 -stem 2420 -fork 204 0 +stem 
2417 +fork 391 144 +stem 2419 +fork 0 144 +fork 2420 2421 fork 0 2422 -fork 2423 0 -stem 154 -fork 2425 1 -fork 178 2426 +stem 2423 +stem 2424 +fork 0 2425 +stem 2426 stem 2427 -fork 0 2423 -fork 2428 2429 -fork 9 2430 -stem 2431 -fork 2432 1519 -fork 2424 2433 -fork 782 2434 +fork 0 2428 +stem 2429 +stem 2430 +fork 0 2431 +stem 2432 +stem 2433 +fork 0 2434 stem 2435 -fork 0 2426 +fork 600 2320 fork 2436 2437 -stem 2438 -fork 0 2429 -fork 0 2440 -fork 2439 2441 -stem 2442 -fork 1273 0 -fork 1262 2444 -fork 1248 2445 -fork 1280 2446 +fork 1196 2438 +fork 965 2439 +fork 959 2440 +fork 64 2441 +fork 412 2442 +fork 1205 2443 +fork 505 2444 +fork 502 2445 +fork 2418 2446 fork 0 2447 -fork 1077 2448 -fork 2443 2449 -fork 178 2450 -fork 9 2451 -stem 2452 -fork 2423 2426 -fork 2454 2440 -fork 0 2455 +stem 2448 +fork 2449 90 +fork 0 2450 +fork 32 2451 +fork 0 2452 +fork 88 2453 +fork 726 2454 +fork 22 2455 stem 2456 -stem 2457 -fork 0 2458 -stem 2459 -fork 2460 1076 -fork 2453 2461 -fork 5 2462 +fork 2457 21 +fork 450 2458 +fork 0 2459 +fork 2412 2460 +fork 2393 2461 +fork 22 2462 stem 2463 -fork 2464 1519 +fork 2464 307 fork 0 2465 stem 2466 -fork 2467 1527 -stem 2468 -fork 2469 1 -fork 198 2470 -fork 0 2471 -fork 2421 2472 -fork 148 2473 -fork 163 2474 -fork 148 2475 -fork 2078 2476 -fork 198 2477 -fork 9 2478 -stem 2479 -fork 761 127 -fork 0 2481 -fork 98 2482 -fork 0 2483 -fork 98 2484 -fork 575 2485 -fork 0 2486 -fork 2480 2487 -fork 201 2488 -fork 161 2489 -fork 148 2490 -fork 65 2491 -fork 62 2492 -fork 145 2493 -fork 65 2494 -fork 5 2495 -stem 2496 -fork 1517 20 -fork 308 2498 -fork 9 2499 -stem 2500 -fork 2501 1517 -fork 20 2502 -stem 2503 +fork 2467 90 +fork 0 2468 +fork 32 2469 +fork 0 2470 +stem 2471 +fork 2 2260 +stem 2473 +fork 2474 1 +fork 112 2475 +stem 2476 +fork 2 2288 +stem 2478 +fork 2479 1 +fork 2474 2480 +fork 2477 2481 +stem 2482 +fork 2483 2232 +fork 874 2484 +stem 2485 +fork 2486 2202 +fork 2472 2487 +fork 22 2488 +stem 2489 +fork 2490 21 +fork 0 
2491 +fork 88 2492 +fork 0 2493 +fork 433 2494 +fork 450 2495 +fork 0 2496 +fork 2405 2497 +fork 2393 2498 +fork 64 2499 +fork 597 2500 +fork 0 2501 +stem 2502 +fork 2503 90 fork 0 2504 -stem 2505 +fork 32 2505 stem 2506 -fork 0 2507 +fork 2507 24 stem 2508 -stem 138 -fork 0 2510 +fork 2509 1 +fork 22 2510 stem 2511 -fork 932 797 -stem 2513 -fork 762 27 -fork 2514 2515 -fork 308 2516 -fork 9 2517 -fork 0 2518 -fork 98 2519 -fork 2512 2520 -fork 0 2521 -stem 2522 -fork 2523 100 -fork 0 2524 -fork 19 2525 -fork 308 2526 -fork 9 2527 -fork 0 2528 -fork 98 2529 -fork 2279 2530 -fork 0 2531 +stem 367 +fork 11 2513 +fork 0 2514 +fork 1108 2515 +fork 2512 2516 +fork 2387 2517 +fork 2380 2518 +fork 317 2519 +fork 0 2520 +fork 1003 2521 +fork 22 2522 +stem 2523 +fork 2524 1724 +fork 0 2525 +fork 2352 2526 +fork 314 2527 +stem 2528 +fork 0 427 +fork 1108 2530 +fork 5 2531 stem 2532 -fork 2533 100 +fork 2533 1584 fork 0 2534 -fork 19 2535 -fork 42 2536 -stem 2537 -stem 1250 -fork 28 2539 -fork 42 2540 +fork 2529 2535 +fork 314 2536 +fork 0 2537 +fork 2351 2538 +fork 300 2539 +fork 22 2540 stem 2541 -fork 156 434 -fork 181 2543 -fork 2542 2544 -fork 2538 2545 -fork 0 2546 -fork 33 2547 -fork 0 2548 +stem 2542 +fork 0 2543 +stem 2544 +fork 2545 1285 +fork 300 2546 +fork 290 2547 +fork 22 2548 stem 2549 -fork 2550 2489 -fork 148 2551 -fork 65 2552 -fork 62 2553 -fork 2509 2554 -fork 65 2555 -fork 5 2556 -stem 2557 -fork 308 139 -fork 9 2559 -stem 2560 -fork 0 2503 -fork 2561 2562 -fork 20 2563 -stem 2564 -fork 0 2565 -stem 2566 -stem 2567 -fork 0 2568 -stem 2569 -fork 42 2526 -stem 2571 -fork 156 276 -fork 181 2573 -fork 2542 2574 -fork 2572 2575 -fork 0 2576 -fork 33 2577 -fork 0 2578 -stem 2579 -fork 2580 2489 -fork 148 2581 -fork 65 2582 -fork 62 2583 -fork 2570 2584 -fork 65 2585 -fork 5 2586 -stem 2587 -fork 2501 2562 -fork 20 2589 -stem 2590 -fork 0 2591 -stem 2592 -stem 2593 -fork 0 2594 -stem 2595 -fork 42 2516 -stem 2597 -stem 994 -fork 156 2599 -fork 181 2600 -fork 
2542 2601 -fork 2598 2602 -fork 0 2603 -fork 33 2604 -fork 0 2605 -stem 2606 -fork 2607 2489 -fork 148 2608 -fork 65 2609 -fork 62 2610 -fork 2596 2611 -fork 65 2612 -fork 5 2613 -stem 2614 -fork 0 2564 -fork 2561 2616 -fork 20 2617 -stem 2618 -fork 0 2619 -stem 2620 -stem 2621 -fork 0 2622 -stem 2623 -fork 2512 6 -fork 0 2625 -stem 2626 -fork 2627 100 -fork 0 2628 -fork 19 2629 -fork 42 2630 -stem 2631 -fork 156 1084 -fork 181 2633 -fork 2542 2634 -fork 2632 2635 -fork 0 2636 -fork 33 2637 -fork 0 2638 -stem 2639 -fork 2640 2489 -fork 148 2641 -fork 65 2642 -fork 62 2643 -fork 2624 2644 -fork 65 2645 -fork 5 2646 -stem 2647 -fork 0 28 -stem 2649 -stem 1263 -fork 28 2651 -fork 28 2652 -fork 2 2653 -stem 2654 -fork 2655 1 -fork 2650 2656 -fork 2 2657 -stem 2658 -fork 2659 1 -fork 42 2660 -stem 2661 -fork 2501 2616 -fork 20 2663 -stem 2664 -fork 0 2665 -stem 2666 -stem 2667 -fork 0 2668 -stem 2669 -fork 156 1040 -fork 181 2671 -fork 2542 2672 -fork 2538 2673 -fork 0 2674 -fork 33 2675 -fork 0 2676 -stem 2677 -fork 2678 2489 -fork 148 2679 -fork 65 2680 -fork 62 2681 -fork 2670 2682 -fork 65 2683 -fork 2662 2684 -fork 2648 2685 -fork 2615 2686 -fork 2588 2687 -fork 2558 2688 -fork 2497 2689 -fork 0 2690 -fork 137 2691 +fork 2550 21 +fork 1118 2551 +fork 314 2552 +fork 0 2553 +fork 2349 2554 +fork 0 2555 +stem 2556 +fork 2557 945 +fork 0 2558 +fork 88 2559 +fork 300 2560 +fork 22 2561 +fork 0 2562 +fork 2182 2563 +fork 962 2564 +fork 290 2565 +fork 962 2566 +fork 2130 2567 +fork 317 2568 +fork 67 2569 +fork 358 2570 +fork 320 2571 +fork 0 2572 +fork 285 2573 +stem 2574 +fork 2575 1 diff --git a/ext/zig/result b/ext/zig/result new file mode 120000 index 0000000..edbdfb4 --- /dev/null +++ b/ext/zig/result @@ -0,0 +1 @@ +/nix/store/2sg31y0vamz5bz19aakxagi702glwh24-tricu-zig-0.1.0 \ No newline at end of file diff --git a/ext/zig/src/bundle.zig b/ext/zig/src/bundle.zig index 399997b..49f8294 100644 --- a/ext/zig/src/bundle.zig +++ b/ext/zig/src/bundle.zig @@ -2,19 +2,15 @@ 
 const std = @import("std");
 const tree = @import("tree.zig");
 const Arena = @import("arena.zig").Arena;

-pub const Hash = [32]u8;
-
 pub const Error = error{
     InvalidMagic,
     InvalidVersion,
     Truncated,
     InvalidManifest,
     InvalidNodePayload,
-    HashMismatch,
     ExportNotFound,
     MissingChild,
     UnexpectedFormat,
-    DigestMismatch,
     OutOfMemory,
 };
@@ -57,13 +53,6 @@ const Parser = struct {
         return std.mem.readInt(u64, b[0..8], .big);
     }

-    fn readHash(self: *Parser) Error!Hash {
-        const b = try self.expect(32);
-        var h: Hash = undefined;
-        @memcpy(&h, b);
-        return h;
-    }
-
     fn readLengthPrefixedBytes(self: *Parser, allocator: std.mem.Allocator) Error![]const u8 {
         const len = try self.readU32();
         const bytes = try self.expect(len);
@@ -77,7 +66,6 @@ const SectionEntry = struct {
     section_type: u32,
     offset: u64,
     length: u64,
-    digest: Hash,
 };

 fn parseHeader(p: *Parser) Error!struct { major: u16, minor: u16, section_count: u32, dir_offset: u64 } {
@@ -104,25 +92,16 @@ fn parseSectionEntries(p: *Parser, count: u32, allocator: std.mem.Allocator) Err
         _ = try p.readU16(); // section_version
         _ = try p.readU16(); // section_flags
         const compression = try p.readU16();
-        const digest_alg = try p.readU16();
+        _ = try p.readU16(); // reserved (was digest_alg)
         entry.offset = try p.readU64();
         entry.length = try p.readU64();
-        entry.digest = try p.readHash();
+        _ = try p.readU32(); // reserved padding

         if (compression != 0) return error.UnexpectedFormat;
-        if (digest_alg != 1) return error.UnexpectedFormat;
     }
     return entries;
 }

-fn sha256Digest(data: []const u8) Hash {
-    var h = std.crypto.hash.sha2.Sha256.init(.{});
-    h.update(data);
-    var out: Hash = undefined;
-    h.final(&out);
-    return out;
-}
-
 fn parseManifest(p: *Parser, allocator: std.mem.Allocator) Error!struct { exports: []Export, roots: []Root } {
     const magic = try p.expect(8);
     if (!std.mem.eql(u8, magic, "ARBMNFST")) return error.InvalidManifest;
@@ -145,15 +124,15 @@ fn parseManifest(p: *Parser, allocator: std.mem.Allocator) Error!struct { export
     const hash_alg = try p.readLengthPrefixedBytes(allocator);
     defer allocator.free(hash_alg);
-    if (!std.mem.eql(u8, hash_alg, "sha256")) return error.UnexpectedFormat;
+    if (!std.mem.eql(u8, hash_alg, "indexed")) return error.UnexpectedFormat;

     const hash_domain = try p.readLengthPrefixedBytes(allocator);
     defer allocator.free(hash_domain);
-    if (!std.mem.eql(u8, hash_domain, "arboricx.merkle.node.v1")) return error.UnexpectedFormat;
+    if (!std.mem.eql(u8, hash_domain, "arboricx.indexed.node.v1")) return error.UnexpectedFormat;

     const payload_type = try p.readLengthPrefixedBytes(allocator);
     defer allocator.free(payload_type);
-    if (!std.mem.eql(u8, payload_type, "arboricx.merkle.payload.v1")) return error.UnexpectedFormat;
+    if (!std.mem.eql(u8, payload_type, "arboricx.indexed.payload.v1")) return error.UnexpectedFormat;

     const sem = try p.readLengthPrefixedBytes(allocator);
     defer allocator.free(sem);
@@ -182,7 +161,7 @@ fn parseManifest(p: *Parser, allocator: std.mem.Allocator) Error!struct { export
     const roots = try allocator.alloc(Root, root_count);
     errdefer allocator.free(roots);
     for (roots) |*r| {
-        r.hash = try p.readHash();
+        r.index = try p.readU32();
         r.role = try p.readLengthPrefixedBytes(allocator);
     }
@@ -198,7 +177,7 @@ fn parseManifest(p: *Parser, allocator: std.mem.Allocator) Error!struct { export
     }
     for (exports) |*e| {
         e.name = try p.readLengthPrefixedBytes(allocator);
-        e.root = try p.readHash();
+        e.root = try p.readU32();
         e.kind = try p.readLengthPrefixedBytes(allocator);
         e.abi = try p.readLengthPrefixedBytes(allocator);
         if (!std.mem.eql(u8, e.abi, "arboricx.abi.tree.v1")) return error.UnexpectedFormat;
@@ -225,135 +204,62 @@ fn parseManifest(p: *Parser, allocator: std.mem.Allocator) Error!struct { export
 const Export = struct {
     name: []const u8,
-    root: Hash,
+    root: u32,
     kind: []const u8,
     abi: []const u8,
 };

 const Root = struct {
-    hash: Hash,
+    index: u32,
     role: []const u8,
 };

-fn parseNodeSection(p: *Parser, allocator: std.mem.Allocator) Error!std.AutoHashMap(Hash, []const u8) {
+/// Parse the node section and build nodes directly into the arena.
+/// Returns a slice mapping node-section index -> arena index.
+/// The caller owns the returned slice and must free it with the arena's allocator.
+fn parseNodeSection(p: *Parser, arena: *Arena) Error![]u32 {
     const node_count = try p.readU64();
-    var map = std.AutoHashMap(Hash, []const u8).init(allocator);
-    errdefer map.deinit();
+    const indices = try arena.allocator.alloc(u32, node_count);
+    errdefer arena.allocator.free(indices);

     var i: u64 = 0;
     while (i < node_count) : (i += 1) {
-        const hash = try p.readHash();
         const plen = try p.readU32();
         const payload = try p.expect(plen);

-        const expected_hash = blk: {
-            var h = std.crypto.hash.sha2.Sha256.init(.{});
-            h.update("arboricx.merkle.node.v1");
-            h.update(&[_]u8{0});
-            h.update(payload);
-            var out: Hash = undefined;
-            h.final(&out);
-            break :blk out;
-        };
-        if (!std.mem.eql(u8, &hash, &expected_hash)) return error.HashMismatch;
+        if (payload.len == 0) return error.InvalidNodePayload;

-        try map.put(hash, payload);
+        const idx: u32 = switch (payload[0]) {
+            0x00 => blk: {
+                if (plen != 1) return error.InvalidNodePayload;
+                break :blk try arena.alloc(.leaf);
+            },
+            0x01 => blk: {
+                if (plen != 5) return error.InvalidNodePayload;
+                const child_idx = std.mem.readInt(u32, payload[1..5], .big);
+                if (child_idx >= i) return error.InvalidNodePayload;
+                break :blk try arena.alloc(.{ .stem = .{ .child = indices[child_idx] } });
+            },
+            0x02 => blk: {
+                if (plen != 9) return error.InvalidNodePayload;
+                const left_idx = std.mem.readInt(u32, payload[1..5], .big);
+                const right_idx = std.mem.readInt(u32, payload[5..9], .big);
+                if (left_idx >= i or right_idx >= i) return error.InvalidNodePayload;
+                break :blk try arena.alloc(.{ .fork = .{ .left = indices[left_idx], .right = indices[right_idx] } });
+            },
+            else => return error.InvalidNodePayload,
+        };
+        indices[i] = idx;
     }
-    return map;
+    return indices;
 }

-fn
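As a cross-check of the record layout `parseNodeSection` consumes, here is a minimal Python reader. This is an illustration only, not part of the patch: the layout (u64 BE node count, then per record a u32 BE payload length followed by `0x00`, `0x01 || child:u32be`, or `0x02 || left:u32be || right:u32be`, children strictly earlier) is read off the Zig code above, and the helper name is hypothetical.

```python
import struct

def parse_nodes(buf):
    # Decode an indexed node section into nested tuples:
    # ("leaf",), ("stem", t), or ("fork", l, r).
    (count,) = struct.unpack_from(">Q", buf, 0)
    offset, nodes = 8, []
    for i in range(count):
        (plen,) = struct.unpack_from(">I", buf, offset)
        offset += 4
        payload = buf[offset:offset + plen]
        offset += plen
        tag = payload[0]
        if tag == 0x00:
            nodes.append(("leaf",))
        elif tag == 0x01:
            (child,) = struct.unpack_from(">I", payload, 1)
            assert child < i  # children must point strictly backward
            nodes.append(("stem", nodes[child]))
        elif tag == 0x02:
            left, right = struct.unpack_from(">II", payload, 1)
            assert left < i and right < i
            nodes.append(("fork", nodes[left], nodes[right]))
        else:
            raise ValueError("invalid node payload tag")
    return nodes
```

Because every child index points strictly backward, a single forward pass materializes the whole structure with no recursion and no hash verification, which is the point of the 1.1 format.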
loadNode( - arena: *Arena, - payloads: std.AutoHashMap(Hash, []const u8), - cache: *std.AutoHashMap(Hash, u32), - root_hash: Hash, -) Error!u32 { - const Frame = struct { - hash: Hash, - state: u2, - }; - - const max_stack = payloads.count() * 2; - var stack = try arena.allocator.alloc(Frame, max_stack); - defer arena.allocator.free(stack); - var sp: usize = 0; - - stack[sp] = .{ .hash = root_hash, .state = 0 }; - sp += 1; - - while (sp > 0) { - const frame = &stack[sp - 1]; - - if (cache.get(frame.hash)) |_| { - sp -= 1; - continue; - } - - if (frame.state == 0) { - frame.state = 1; - const payload = payloads.get(frame.hash) orelse return error.MissingChild; - if (payload.len == 0) return error.InvalidNodePayload; - - switch (payload[0]) { - 0x00 => { - if (payload.len != 1) return error.InvalidNodePayload; - }, - 0x01 => { - if (payload.len != 33) return error.InvalidNodePayload; - var child_hash: Hash = undefined; - @memcpy(&child_hash, payload[1..33]); - if (cache.get(child_hash) == null) { - stack[sp] = .{ .hash = child_hash, .state = 0 }; - sp += 1; - } - }, - 0x02 => { - if (payload.len != 65) return error.InvalidNodePayload; - var left_hash: Hash = undefined; - var right_hash: Hash = undefined; - @memcpy(&left_hash, payload[1..33]); - @memcpy(&right_hash, payload[33..65]); - const need_right = cache.get(right_hash) == null; - const need_left = cache.get(left_hash) == null; - if (need_right) { - stack[sp] = .{ .hash = right_hash, .state = 0 }; - sp += 1; - } - if (need_left) { - stack[sp] = .{ .hash = left_hash, .state = 0 }; - sp += 1; - } - }, - else => return error.InvalidNodePayload, - } - } else { - const payload = payloads.get(frame.hash).?; - const idx: u32 = switch (payload[0]) { - 0x00 => try arena.alloc(.leaf), - 0x01 => blk: { - var child_hash: Hash = undefined; - @memcpy(&child_hash, payload[1..33]); - const child_idx = cache.get(child_hash).?; - break :blk try arena.alloc(.{ .stem = .{ .child = child_idx } }); - }, - 0x02 => blk: { - var 
left_hash: Hash = undefined; - var right_hash: Hash = undefined; - @memcpy(&left_hash, payload[1..33]); - @memcpy(&right_hash, payload[33..65]); - const left_idx = cache.get(left_hash).?; - const right_idx = cache.get(right_hash).?; - break :blk try arena.alloc(.{ .fork = .{ .left = left_idx, .right = right_idx } }); - }, - else => unreachable, - }; - try cache.put(frame.hash, idx); - sp -= 1; - } +fn findSection(entries: []SectionEntry, section_type: u32) ?SectionEntry { + for (entries) |entry| { + if (entry.section_type == section_type) return entry; } - - return cache.get(root_hash) orelse return error.MissingChild; + return null; } /// Parse an Arboricx bundle and load the named export into the arena. @@ -372,20 +278,11 @@ pub fn loadBundleExport( const entries = try parseSectionEntries(&p, header.section_count, allocator); defer allocator.free(entries); - var manifest_entry: ?SectionEntry = null; - var nodes_entry: ?SectionEntry = null; - for (entries) |entry| { - if (entry.section_type == 1) manifest_entry = entry; - if (entry.section_type == 2) nodes_entry = entry; - } - const manifest_section = manifest_entry orelse return error.InvalidManifest; - const nodes_section = nodes_entry orelse return error.InvalidNodePayload; + const manifest_section = findSection(entries, 1) orelse return error.InvalidManifest; + const nodes_section = findSection(entries, 2) orelse return error.InvalidNodePayload; const manifest_bytes = bundle_bytes[@intCast(manifest_section.offset)..@intCast(manifest_section.offset + manifest_section.length)]; - if (!std.mem.eql(u8, &sha256Digest(manifest_bytes), &manifest_section.digest)) return error.DigestMismatch; - const nodes_bytes = bundle_bytes[@intCast(nodes_section.offset)..@intCast(nodes_section.offset + nodes_section.length)]; - if (!std.mem.eql(u8, &sha256Digest(nodes_bytes), &nodes_section.digest)) return error.DigestMismatch; var mp = Parser.init(manifest_bytes); const manifest = try parseManifest(&mp, allocator); @@ -402,23 
+299,21 @@ pub fn loadBundleExport( allocator.free(manifest.roots); } - var export_hash: ?Hash = null; + var export_root: ?u32 = null; for (manifest.exports) |e| { if (std.mem.eql(u8, e.name, export_name)) { - export_hash = e.root; + export_root = e.root; break; } } - const root_hash = export_hash orelse return error.ExportNotFound; + const root_index = export_root orelse return error.ExportNotFound; var np = Parser.init(nodes_bytes); - var payloads = try parseNodeSection(&np, allocator); - defer payloads.deinit(); + const node_indices = try parseNodeSection(&np, arena); + defer allocator.free(node_indices); - var cache = std.AutoHashMap(Hash, u32).init(allocator); - defer cache.deinit(); - - return try loadNode(arena, payloads, &cache, root_hash); + if (root_index >= node_indices.len) return error.InvalidNodePayload; + return node_indices[root_index]; } /// Parse an Arboricx bundle and load the default (first) root into the arena. @@ -435,20 +330,11 @@ pub fn loadBundleDefaultRoot( const entries = try parseSectionEntries(&p, header.section_count, allocator); defer allocator.free(entries); - var manifest_entry: ?SectionEntry = null; - var nodes_entry: ?SectionEntry = null; - for (entries) |entry| { - if (entry.section_type == 1) manifest_entry = entry; - if (entry.section_type == 2) nodes_entry = entry; - } - const manifest_section = manifest_entry orelse return error.InvalidManifest; - const nodes_section = nodes_entry orelse return error.InvalidNodePayload; + const manifest_section = findSection(entries, 1) orelse return error.InvalidManifest; + const nodes_section = findSection(entries, 2) orelse return error.InvalidNodePayload; const manifest_bytes = bundle_bytes[@intCast(manifest_section.offset)..@intCast(manifest_section.offset + manifest_section.length)]; - if (!std.mem.eql(u8, &sha256Digest(manifest_bytes), &manifest_section.digest)) return error.DigestMismatch; - const nodes_bytes = bundle_bytes[@intCast(nodes_section.offset)..@intCast(nodes_section.offset 
+ nodes_section.length)]; - if (!std.mem.eql(u8, &sha256Digest(nodes_bytes), &nodes_section.digest)) return error.DigestMismatch; var mp = Parser.init(manifest_bytes); const manifest = try parseManifest(&mp, allocator); @@ -466,14 +352,12 @@ pub fn loadBundleDefaultRoot( } if (manifest.roots.len == 0) return error.ExportNotFound; - const root_hash = manifest.roots[0].hash; + const root_index = manifest.roots[0].index; var np = Parser.init(nodes_bytes); - var payloads = try parseNodeSection(&np, allocator); - defer payloads.deinit(); + const node_indices = try parseNodeSection(&np, arena); + defer allocator.free(node_indices); - var cache = std.AutoHashMap(Hash, u32).init(allocator); - defer cache.deinit(); - - return try loadNode(arena, payloads, &cache, root_hash); + if (root_index >= node_indices.len) return error.InvalidNodePayload; + return node_indices[root_index]; } diff --git a/ext/zig/src/main.zig b/ext/zig/src/main.zig index 6a9d0ba..fc4b451 100644 --- a/ext/zig/src/main.zig +++ b/ext/zig/src/main.zig @@ -6,16 +6,16 @@ const codecs = @import("codecs.zig"); const kernel = @import("kernel.zig"); const bundle = @import("bundle.zig"); -fn runNative(arena: *Arena, tag: u64, bundle_bytes: []const u8, args_raw: []const []const u8, io: std.Io) !void { +fn runNative(arena: *Arena, tag: u64, bundle_bytes: []const u8, args_raw: []const []const u8, fuel: u64, io: std.Io) !void { const term = try bundle.loadBundleDefaultRoot(arena, bundle_bytes); var current = term; for (args_raw) |arg| { - const arg_tree = try parseArg(arena, arg); + const arg_tree = try parseArg(arena, io, arg); current = try arena.alloc(.{ .app = .{ .func = current, .arg = arg_tree } }); } - const result = try reduce.reduce(current, arena, 1_000_000_000); + const result = try reduce.reduce(current, arena, fuel); var stdout_buf: [4096]u8 = undefined; var stdout = std.Io.File.stdout().writer(io, &stdout_buf); @@ -56,7 +56,7 @@ fn runNative(arena: *Arena, tag: u64, bundle_bytes: []const u8, args_raw: 
[]cons
     try stdout.flush();
 }

-fn runBundle(arena: *Arena, tag: u64, bundle_bytes: []const u8, args_raw: []const []const u8, io: std.Io) !void {
+fn runBundle(arena: *Arena, tag: u64, bundle_bytes: []const u8, args_raw: []const []const u8, fuel: u64, io: std.Io) !void {
     const kernel_root = try kernel.loadKernel(arena);

     const tag_tree = try codecs.ofNumber(arena, tag);
@@ -65,7 +65,7 @@ fn runBundle(arena: *Arena, tag: u64, bundle_bytes: []const u8, args_raw: []cons
     var arg_items = try arena.allocator.alloc(u32, args_raw.len);
     defer arena.allocator.free(arg_items);
     for (args_raw, 0..) |arg, i| {
-        arg_items[i] = try parseArg(arena, arg);
+        arg_items[i] = try parseArg(arena, io, arg);
     }
     const args_tree = try codecs.ofList(arena, arg_items);
@@ -74,7 +74,7 @@ fn runBundle(arena: *Arena, tag: u64, bundle_bytes: []const u8, args_raw: []cons
     const app1 = try arena.alloc(.{ .app = .{ .func = app0, .arg = bundle_tree } });
     const app2 = try arena.alloc(.{ .app = .{ .func = app1, .arg = args_tree } });

-    const result = try reduce.reduce(app2, arena, 1_000_000_000);
+    const result = try reduce.reduce(app2, arena, fuel);

     const unwrapped = try codecs.unwrapResult(arena, result) orelse {
         var stderr = std.Io.File.stderr().writer(io, &[_]u8{});
@@ -137,7 +137,13 @@ fn runBundle(arena: *Arena, tag: u64, bundle_bytes: []const u8, args_raw: []cons
     try stdout.flush();
 }

-fn parseArg(arena: *Arena, s: []const u8) !u32 {
+fn parseArg(arena: *Arena, io: std.Io, s: []const u8) !u32 {
+    if (std.mem.endsWith(u8, s, ".arboricx")) {
+        const bundle_bytes = try std.Io.Dir.cwd().readFileAlloc(io, s, arena.allocator, .limited(10 * 1024 * 1024));
+        defer arena.allocator.free(bundle_bytes);
+        return try bundle.loadBundleDefaultRoot(arena, bundle_bytes);
+    }
+
     if (std.fmt.parseInt(u64, s, 10)) |n| {
         return try codecs.ofNumber(arena, n);
     } else |_| {}
@@ -156,7 +162,7 @@ pub fn main(init: std.process.Init) !void {
     const args = try init.minimal.args.toSlice(init.arena.allocator());

     if (args.len < 2) {
         var stderr = std.Io.File.stderr().writer(io, &[_]u8{});
-        try stderr.interface.writeAll("Usage: tricu-zig <bundle> [--type TYPE] [--kernel] [arg1 arg2 ...]\n");
+        try stderr.interface.writeAll("Usage: tricu-zig <bundle> [--type TYPE] [--kernel] [--fuel N] [arg1 arg2 ...]\n");
         try stderr.flush();
         std.process.exit(1);
     }
@@ -167,13 +173,14 @@ pub fn main(init: std.process.Init) !void {

     var arg_start: usize = 2;
     var use_kernel = false;
+    var fuel: u64 = std.math.maxInt(u64);

     var i: usize = 1;
     while (i < args.len) : (i += 1) {
         if (std.mem.eql(u8, args[i], "--type")) {
             if (i + 1 >= args.len) {
                 var stderr = std.Io.File.stderr().writer(io, &[_]u8{});
-                try stderr.interface.writeAll("Usage: tricu-zig --type <TYPE> [args...]\n");
+                try stderr.interface.writeAll("Usage: tricu-zig --type <TYPE> [--fuel N] [args...]\n");
                 try stderr.flush();
                 std.process.exit(1);
             }
@@ -194,6 +201,21 @@ pub fn main(init: std.process.Init) !void {
             i += 1;
         } else if (std.mem.eql(u8, args[i], "--kernel")) {
             use_kernel = true;
+        } else if (std.mem.eql(u8, args[i], "--fuel")) {
+            if (i + 1 >= args.len) {
+                var stderr = std.Io.File.stderr().writer(io, &[_]u8{});
+                try stderr.interface.writeAll("Usage: tricu-zig --fuel <N> [args...]\n");
+                try stderr.flush();
+                std.process.exit(1);
+            }
+            const n = std.fmt.parseInt(u64, args[i + 1], 10) catch {
+                var stderr = std.Io.File.stderr().writer(io, &[_]u8{});
+                try stderr.interface.print("Invalid fuel: {s}\n", .{args[i + 1]});
+                try stderr.flush();
+                std.process.exit(1);
+            };
+            fuel = std.math.mul(u64, n, 1_000_000) catch std.math.maxInt(u64);
+            i += 1;
         } else {
             bundle_idx = i;
             arg_start = i + 1;
@@ -203,7 +225,7 @@ pub fn main(init: std.process.Init) !void {

     if (bundle_idx >= args.len) {
         var stderr = std.Io.File.stderr().writer(io, &[_]u8{});
-        try stderr.interface.writeAll("Usage: tricu-zig <bundle> [--type TYPE] [--kernel] [arg1 arg2 ...]\n");
+        try stderr.interface.writeAll("Usage: tricu-zig <bundle> [--type TYPE] [--kernel] [--fuel N] [arg1 arg2 ...]\n");
         try stderr.flush();
         std.process.exit(1);
     }
@@ -218,14
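The `--fuel` flag above scales its argument by one million with saturation to the u64 maximum, mirroring `std.math.mul(u64, n, 1_000_000) catch std.math.maxInt(u64)`. A tiny Python sketch of that conversion (function name hypothetical):

```python
U64_MAX = 2**64 - 1

def fuel_from_flag(n, unit=1_000_000):
    # Saturating multiply: a huge --fuel value means "effectively unlimited"
    # rather than wrapping or erroring, matching the Zig catch clause.
    product = n * unit
    return product if product <= U64_MAX else U64_MAX
```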
+240,14 @@ pub fn main(init: std.process.Init) !void {
     const call_args = if (arg_start < args.len) args[arg_start..] else &[_][]const u8{};

     if (use_kernel) {
-        runBundle(&arena, tag, bundle_bytes, call_args, io) catch |err| {
+        runBundle(&arena, tag, bundle_bytes, call_args, fuel, io) catch |err| {
             var stderr = std.Io.File.stderr().writer(io, &[_]u8{});
             try stderr.interface.print("Execution failed: {s}\n", .{@errorName(err)});
             try stderr.flush();
             std.process.exit(1);
         };
     } else {
-        runNative(&arena, tag, bundle_bytes, call_args, io) catch |err| {
+        runNative(&arena, tag, bundle_bytes, call_args, fuel, io) catch |err| {
             var stderr = std.Io.File.stderr().writer(io, &[_]u8{});
             try stderr.interface.print("Execution failed: {s}\n", .{@errorName(err)});
             try stderr.flush();
diff --git a/ext/zig/src/reduce.zig b/ext/zig/src/reduce.zig
index 9626587..cd7f99f 100644
--- a/ext/zig/src/reduce.zig
+++ b/ext/zig/src/reduce.zig
@@ -15,21 +15,21 @@ pub fn reduce(root: u32, arena: *Arena, fuel: u64) ReduceError!u32 {
 }

 fn whnf(term: u32, arena: *Arena, fuel: *u64) ReduceError!u32 {
-    if (fuel.* == 0) return error.FuelExhausted;
     var current = term;
     while (true) {
         switch (arena.get(current).*) {
             .leaf, .stem, .fork => return current,
             .app => |app| {
+                if (fuel.* == 0) return error.FuelExhausted;
+                fuel.* -= 1;
+
                 const orig = current;
                 const func_idx = app.func;
                 const arg_idx = app.arg;

                 // Reduce function to WHNF
                 const f = try whnf(func_idx, arena, fuel);
-                if (fuel.* == 0) return error.FuelExhausted;
-                fuel.* -= 1;

                 switch (arena.get(f).*) {
                     // apply Leaf b = Stem b
@@ -49,15 +49,11 @@ fn whnf(term: u32, arena: *Arena, fuel: *u64) ReduceError!u32 {

                         // Reduce left child of Fork
                         const left = try whnf(left_idx, arena, fuel);
-                        if (fuel.* == 0) return error.FuelExhausted;
-                        fuel.* -= 1;

                         switch (arena.get(left).*) {
                             // apply (Fork Leaf a) _ = a
                             .leaf => {
                                 const result = try whnf(right_idx, arena, fuel);
-                                if (fuel.* == 0) return error.FuelExhausted;
-                                fuel.* -= 1;
                                 if (orig != result) {
                                     arena.get(orig).* = arena.get(result).*;
                                 }
@@ -70,23 +66,17 @@ fn whnf(term: u32, arena: *Arena, fuel: *u64) ReduceError!u32 {
                                 const inner2 = try arena.alloc(.{ .app = .{ .func = right_idx, .arg = arg_idx } });
                                 arena.get(orig).* = .{ .app = .{ .func = inner1, .arg = inner2 } };
                                 current = orig;
-                                if (fuel.* == 0) return error.FuelExhausted;
-                                fuel.* -= 1;
                                 continue;
                             },
                             .fork => {
                                 // Reduce argument
                                 const arg = try whnf(arg_idx, arena, fuel);
-                                if (fuel.* == 0) return error.FuelExhausted;
-                                fuel.* -= 1;

                                 switch (arena.get(arg).*) {
                                     // apply (Fork (Fork a b) c) Leaf = a
                                     .leaf => {
                                         const a_idx = arena.get(left).fork.left;
                                         const result = try whnf(a_idx, arena, fuel);
-                                        if (fuel.* == 0) return error.FuelExhausted;
-                                        fuel.* -= 1;
                                         if (orig != result) {
                                             arena.get(orig).* = arena.get(result).*;
                                         }
@@ -98,8 +88,6 @@ fn whnf(term: u32, arena: *Arena, fuel: *u64) ReduceError!u32 {
                                         const u = s.child;
                                         arena.get(orig).* = .{ .app = .{ .func = b_idx, .arg = u } };
                                         current = orig;
-                                        if (fuel.* == 0) return error.FuelExhausted;
-                                        fuel.* -= 1;
                                         continue;
                                     },
                                     // apply (Fork (Fork a b) c) (Fork u v) = (c u) v
@@ -110,8 +98,6 @@ fn whnf(term: u32, arena: *Arena, fuel: *u64) ReduceError!u32 {
                                         const inner = try arena.alloc(.{ .app = .{ .func = c_idx, .arg = u } });
                                         arena.get(orig).* = .{ .app = .{ .func = inner, .arg = v } };
                                         current = orig;
-                                        if (fuel.* == 0) return error.FuelExhausted;
-                                        fuel.* -= 1;
                                         continue;
                                     },
                                     .app => return error.InvalidApply,
diff --git a/ext/zig/tests/native_bundle_append_test.c b/ext/zig/tests/native_bundle_append_test.c
index fec94c1..b4296c1 100644
--- a/ext/zig/tests/native_bundle_append_test.c
+++ b/ext/zig/tests/native_bundle_append_test.c
@@ -27,7 +27,7 @@ int main() {
     printf("bundle size=%zu\n", bundle_len);

     clock_t t0 = clock();
-    uint32_t term = arb_load_bundle(ctx, bundle, bundle_len, "root");
+    uint32_t term = arb_load_bundle(ctx, bundle, bundle_len, "append");
     clock_t t1 = clock();
     printf("load_bundle took %.3f ms, term=%u\n", (double)(t1 - t0) * 1000.0 /
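For readers unfamiliar with the reduction rules named in the comments in reduce.zig (`apply Leaf b = Stem b` and the `Fork` triage cases), here is an eager Python sketch of the same rules under assumed standard tree-calculus semantics. The patch's `whnf` applies these lazily with in-arena rewriting; the helper names here are hypothetical.

```python
# Tree-calculus values as tagged tuples.
LEAF = ("leaf",)
def stem(a): return ("stem", a)
def fork(a, b): return ("fork", a, b)

def apply_(f, x, fuel):
    # Apply f to x, spending one unit of fuel per application,
    # mirroring the per-application accounting in the patch.
    if fuel[0] == 0:
        raise RuntimeError("fuel exhausted")
    fuel[0] -= 1
    if f[0] == "leaf":                      # apply Leaf b = Stem b
        return stem(x)
    if f[0] == "stem":                      # apply (Stem a) b = Fork a b
        return fork(f[1], x)
    left, right = f[1], f[2]
    if left[0] == "leaf":                   # apply (Fork Leaf a) _ = a
        return right
    if left[0] == "stem":                   # apply (Fork (Stem u) v) x = (u x) (v x)
        return apply_(apply_(left[1], x, fuel), apply_(right, x, fuel), fuel)
    a, b, c = left[1], left[2], right       # left is a Fork: triage on x
    if x[0] == "leaf":                      # apply (Fork (Fork a b) c) Leaf = a
        return a
    if x[0] == "stem":                      # apply (Fork (Fork a b) c) (Stem u) = b u
        return apply_(b, x[1], fuel)
    return apply_(apply_(c, x[1], fuel), x[2], fuel)  # (Fork u v) case: (c u) v
```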
CLOCKS_PER_SEC, term); if (term == 0) { diff --git a/ext/zig/tests/native_bundle_bools_test.c b/ext/zig/tests/native_bundle_bools_test.c index d6834bf..f2c5d66 100644 --- a/ext/zig/tests/native_bundle_bools_test.c +++ b/ext/zig/tests/native_bundle_bools_test.c @@ -16,12 +16,12 @@ static uint8_t *read_file(const char *path, size_t *out_len) { return buf; } -int test_bundle(arb_ctx_t *ctx, const char *path, int expect_val) { +int test_bundle(arb_ctx_t *ctx, const char *path, const char *name, int expect_val) { size_t bundle_len; uint8_t *bundle = read_file(path, &bundle_len); if (!bundle) { printf("bundle not found: %s\n", path); return 1; } - uint32_t term = arb_load_bundle(ctx, bundle, bundle_len, "root"); + uint32_t term = arb_load_bundle(ctx, bundle, bundle_len, name); if (term == 0) { printf("load_bundle failed for %s\n", path); free(bundle); @@ -51,8 +51,8 @@ int main() { arb_ctx_t *ctx = arboricx_init(); if (!ctx) { printf("init failed\n"); return 1; } - if (test_bundle(ctx, "../../test/fixtures/true.arboricx", 1) != 0) return 1; - if (test_bundle(ctx, "../../test/fixtures/false.arboricx", 0) != 0) return 1; + if (test_bundle(ctx, "../../test/fixtures/true.arboricx", "true", 1) != 0) return 1; + if (test_bundle(ctx, "../../test/fixtures/false.arboricx", "false", 0) != 0) return 1; arboricx_free(ctx); printf("All bool tests passed.\n"); diff --git a/ext/zig/tests/native_bundle_id_test.c b/ext/zig/tests/native_bundle_id_test.c index c71eb5c..15b7d49 100644 --- a/ext/zig/tests/native_bundle_id_test.c +++ b/ext/zig/tests/native_bundle_id_test.c @@ -26,7 +26,7 @@ int main() { printf("bundle size=%zu\n", bundle_len); clock_t t0 = clock(); - uint32_t term = arb_load_bundle(ctx, bundle, bundle_len, "root"); + uint32_t term = arb_load_bundle(ctx, bundle, bundle_len, "id"); clock_t t1 = clock(); printf("load_bundle took %.3f ms, term=%u\n", (double)(t1 - t0) * 1000.0 / CLOCKS_PER_SEC, term); if (term == 0) { diff --git a/ext/zig/tests/python_ffi_test.py 
b/ext/zig/tests/python_ffi_test.py index a5bfdfe..a525528 100644 --- a/ext/zig/tests/python_ffi_test.py +++ b/ext/zig/tests/python_ffi_test.py @@ -217,7 +217,7 @@ print(f" time: {(t1 - t0) * 1000:.1f} ms") # Test 5: append via native named export print("\n--- Test 5: append via named export 'root' ---") t0 = time.time() -result = native_run_named(bundle, "root", ["Hello, ", "world!"]) +result = native_run_named(bundle, "append", ["Hello, ", "world!"]) t1 = time.time() check("append named", result, "Hello, world!") print(f" time: {(t1 - t0) * 1000:.1f} ms") diff --git a/flake.nix b/flake.nix index 367a7e4..db89d69 100644 --- a/flake.nix +++ b/flake.nix @@ -122,6 +122,48 @@ ''; }; + # ------------------------------------------------------------------ + # JS FFI host + # ------------------------------------------------------------------ + tricuJs = pkgs.buildNpmPackage { + pname = "tricu-js"; + version = "0.1.0"; + src = ./ext/js; + npmDepsHash = "sha256-81C7tsNcbyZVhm3uqiWdDQxp5LAXXO9aueHdMDztCfM="; + nativeBuildInputs = [ pkgs.nodejs tricuZig ]; + dontNpmBuild = true; + installPhase = '' + mkdir -p $out/lib/ + cp -r . $out/lib/ + cp ${tricuZig}/lib/libarboricx.so $out/lib/src + ''; + }; + + # ------------------------------------------------------------------ + # JS FFI host tests (separate target) + # ------------------------------------------------------------------ + tricuJsTests = pkgs.stdenv.mkDerivation { + pname = "tricu-js-tests"; + version = "0.1.0"; + src = ./.; + nativeBuildInputs = [ pkgs.nodejs tricuZig ]; + buildPhase = "true"; + doCheck = true; + checkPhase = '' + export ARBORICX_LIB=${tricuZig}/lib/libarboricx.so + export LD_LIBRARY_PATH=${tricuZig}/lib:$LD_LIBRARY_PATH + ulimit -s 32768 + + cd ext/js + # node_modules are pre-fetched by buildNpmPackage; copy them in + cp -r ${tricuJs}/lib/tricu-js/node_modules . 
+ npm test + + mkdir -p $out + echo "All JS tests passed" > $out/result + ''; + }; + # ------------------------------------------------------------------ # PHP FFI tests (separate target) # ------------------------------------------------------------------ @@ -157,6 +199,8 @@ packages.tricu-zig-tests = tricuZigTests; packages.tricu-php = tricuPhp; packages.tricu-php-tests = tricuPhpTests; + packages.tricu-js = tricuJs; + packages.tricu-js-tests = tricuJsTests; checks.${packageName} = tricuPackageTests; checks.default = tricuPackageTests; diff --git a/lib/arboricx-common.tri b/lib/arboricx-common.tri index 90dc91d..ad8f923 100644 --- a/lib/arboricx-common.tri +++ b/lib/arboricx-common.tri @@ -61,22 +61,22 @@ readSectionRecord = (bs : bindResult (readBytes 2 afterSectionFlags) (compression afterCompression : bindResult (readBytes 2 afterCompression) - (digestAlgorithm afterDigestAlgorithm : - bindResult (readBytes 8 afterDigestAlgorithm) + (reserved1 afterReserved1 : + bindResult (readBytes 8 afterReserved1) (offset afterOffset : bindResult (readBytes 8 afterOffset) (length afterLength : - bindResult (readBytes 32 afterLength) - (digest afterDigest : + bindResult (readBytes 4 afterLength) + (reserved2 afterReserved2 : ok (pair sectionId (pair sectionVersion (pair sectionFlags (pair compression - (pair digestAlgorithm + (pair reserved1 (pair offset - (pair length digest))))))) - afterDigest))))))))) + (pair length reserved2))))))) + afterReserved2))))))))) readSectionDirectory_ = y (self bs sectionCount i acc : matchBool @@ -126,7 +126,7 @@ sectionRecordCompression = (sectionRecord : payload) sectionRecord) -sectionRecordDigestAlgorithm = (sectionRecord : +sectionRecordReserved1 = (sectionRecord : matchPair (_ payload : matchPair @@ -136,7 +136,7 @@ sectionRecordDigestAlgorithm = (sectionRecord : matchPair (_ payload4 : matchPair - (digestAlgorithm _ : digestAlgorithm) + (reserved1 _ : reserved1) payload4) payload3) payload2) @@ -186,7 +186,7 @@ sectionRecordLength = 
(sectionRecord : payload) sectionRecord) -sectionRecordDigest = (sectionRecord : +sectionRecordReserved2 = (sectionRecord : matchPair (_ payload : matchPair @@ -200,7 +200,7 @@ sectionRecordDigest = (sectionRecord : matchPair (_ payload6 : matchPair - (_ digest : digest) + (_ reserved2 : reserved2) payload6) payload5) payload4) diff --git a/lib/arboricx-dispatch.tri b/lib/arboricx-dispatch.tri index bddc1cb..0eea7a5 100644 --- a/lib/arboricx-dispatch.tri +++ b/lib/arboricx-dispatch.tri @@ -1,23 +1,6 @@ !import "arboricx.tri" !Local -!import "patterns.tri" !Local -- Multi-purpose kernel dispatch. --- -- runArboricxTyped tag bundleBytes args --- tag 0 → hostTree (runArboricxToTree) --- tag 1 → hostString (runArboricxToString) --- tag 2 → hostNumber (runArboricxToNumber) --- tag 3 → hostBool (runArboricxToBool) --- tag 4 → hostList (runArboricxToList) --- tag 5 → hostBytes (runArboricxToBytes) --- otherwise → err 99 bundleBytes - runArboricxTyped = (tag bs args : - match tag - [[(equal? hostTreeTag) (_ : runArboricxToTree bs args)] - [(equal? hostStringTag) (_ : runArboricxToString bs args)] - [(equal? hostNumberTag) (_ : runArboricxToNumber bs args)] - [(equal? hostBoolTag) (_ : runArboricxToBool bs args)] - [(equal? hostListTag) (_ : runArboricxToList bs args)] - [(equal? 
hostBytesTag) (_ : runArboricxToBytes bs args)] - [otherwise (_ : err 99 bs)]]) + runArboricxByNameToTyped tag [] bs args) diff --git a/lib/arboricx-manifest.tri b/lib/arboricx-manifest.tri index 5fd30e5..f148af4 100644 --- a/lib/arboricx-manifest.tri +++ b/lib/arboricx-manifest.tri @@ -29,13 +29,13 @@ readCapabilities_ = y (self bs count i acc : readCapabilities = (count bs : readCapabilities_ bs count 0 t) --- Helper: read a single root entry (32-byte raw hash + length-prefixed role) +-- Helper: read a single root entry (4-byte u32 BE index + length-prefixed role) readRootEntry = (bs : - bindResult (readBytes 32 bs) - (hashRaw afterHash : - bindResult (readLengthPrefixedString afterHash) + bindResult (readBytes 4 bs) + (indexRaw afterIndex : + bindResult (readLengthPrefixedString afterIndex) (role afterRole : - ok (pair hashRaw role) afterRole))) + ok (pair indexRaw role) afterRole))) -- Helper worker: read N root entries (counts up from 0) readRoots_ = y (self bs count i acc : @@ -54,13 +54,13 @@ readRoots = (count bs : readExportEntry = (bs : bindResult (readLengthPrefixedString bs) (name afterName : - bindResult (readBytes 32 afterName) - (rootHashRaw afterRootHash : - bindResult (readLengthPrefixedString afterRootHash) + bindResult (readBytes 4 afterName) + (rootIndexRaw afterRootIndex : + bindResult (readLengthPrefixedString afterRootIndex) (kind afterKind : bindResult (readLengthPrefixedString afterKind) (abi afterAbi : - ok (pair name (pair rootHashRaw (pair kind abi))) afterAbi))))) + ok (pair name (pair rootIndexRaw (pair kind abi))) afterAbi))))) -- Helper worker: read N export entries (counts up from 0) readExports_ = y (self bs count i acc : @@ -200,7 +200,7 @@ lookupMetadata_ = y (self tlvs tag : lookupMetadata = (tlvs tag : lookupMetadata_ tlvs tag) --- Get export name from an export entry (pair name (pair rootHash (pair kind abi))) +-- Get export name from an export entry (pair name (pair rootIndex (pair kind abi))) exportName = (exp : matchPair 
(name _ : name) @@ -284,9 +284,9 @@ selectExportOpt = (exports optNameBytes : expectedSchema = "arboricx.bundle.manifest.v1" expectedBundleType = "tree-calculus-executable-object" expectedTreeCalculus = "tree-calculus.v1" -expectedTreeHashAlgorithm = "sha256" -expectedTreeHashDomain = "arboricx.merkle.node.v1" -expectedTreeNodePayload = "arboricx.merkle.payload.v1" +expectedTreeHashAlgorithm = "indexed" +expectedTreeHashDomain = "arboricx.indexed.node.v1" +expectedTreeNodePayload = "arboricx.indexed.payload.v1" expectedRuntimeSemantics = "tree-calculus.v1" expectedRuntimeEvaluation = "normal-order" expectedRuntimeAbi = "arboricx.abi.tree.v1" diff --git a/lib/arboricx-nodes.tri b/lib/arboricx-nodes.tri index abda776..0ad5b67 100644 --- a/lib/arboricx-nodes.tri +++ b/lib/arboricx-nodes.tri @@ -1,37 +1,21 @@ !import "arboricx-common.tri" !Local +-- Indexed Arboricx node section reader. +-- +-- Node records in the indexed format are just length-prefixed payloads: +-- u32 payloadLength || payload +-- A payload is one of: +-- 0x00 +-- 0x01 || childIndex:u32be +-- 0x02 || leftIndex:u32be || rightIndex:u32be +-- Child indices must point strictly backward in the node array. 
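The record grammar described in the comment above can be exercised with a small Python encoder. This is illustrative only: the names are hypothetical, and unlike a real bundle writer this sketch does not deduplicate shared subtrees.

```python
import struct

def encode_nodes(tree):
    # Serialize a ("leaf",) / ("stem", t) / ("fork", l, r) tree into an
    # indexed node-record stream: u64 BE count, then per record
    # u32 BE payload length || payload, children strictly earlier.
    records = []
    def emit(t):
        if t[0] == "leaf":
            payload = b"\x00"
        elif t[0] == "stem":
            payload = b"\x01" + struct.pack(">I", emit(t[1]))
        else:
            payload = (b"\x02" + struct.pack(">I", emit(t[1]))
                               + struct.pack(">I", emit(t[2])))
        records.append(struct.pack(">I", len(payload)) + payload)
        return len(records) - 1
    root = emit(tree)
    return struct.pack(">Q", len(records)) + b"".join(records), root
```

Post-order emission guarantees the "strictly backward" child-index invariant by construction, so the output passes the prefix validation performed by `nodeRecordsValidIndices?`.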
+ readNodeRecord = (bs : - bindResult (readBytes 32 bs) - (nodeHash afterNodeHash : - bindResult (readBytes 4 afterNodeHash) - (payloadLength afterPayloadLength : - bindResult (readBytes (u32BEBytesToNat payloadLength) afterPayloadLength) - (payload afterPayload : - ok - (pair nodeHash - (pair payloadLength payload)) - afterPayload)))) - -nodeRecordHash = (nodeRecord : - matchPair - (nodeHash _ : nodeHash) - nodeRecord) - -nodeRecordPayloadLength = (nodeRecord : - matchPair - (_ payload : - matchPair - (payloadLength _ : payloadLength) - payload) - nodeRecord) - -nodeRecordPayload = (nodeRecord : - matchPair - (_ payload : - matchPair - (_ nodePayload : nodePayload) - payload) - nodeRecord) + bindResult (readBytes 4 bs) + (payloadLength afterPayloadLength : + bindResult (readBytes (u32BEBytesToNat payloadLength) afterPayloadLength) + (payload afterPayload : + ok payload afterPayload))) nodePayloadKind = (nodePayload : bytesHead nodePayload) @@ -42,17 +26,18 @@ nodePayloadHasTag? = (tag nodePayload : (_ _ : false) (nodePayloadKind nodePayload)) -nodePayloadLeaf? = (nodePayload : bytesEq? [(0)] nodePayload) +nodePayloadLeaf? = (nodePayload : + bytesEq? [(0)] nodePayload) nodePayloadStem? = (nodePayload : and? (nodePayloadHasTag? nodePayloadStemTag nodePayload) - (equal? (bytesLength nodePayload) 33)) + (equal? (bytesLength nodePayload) 5)) nodePayloadFork? = (nodePayload : and? (nodePayloadHasTag? nodePayloadForkTag nodePayload) - (equal? (bytesLength nodePayload) 65)) + (equal? (bytesLength nodePayload) 9)) nodePayloadValid? = (nodePayload : or? @@ -61,96 +46,87 @@ nodePayloadValid? = (nodePayload : (nodePayloadStem? nodePayload) (nodePayloadFork? 
nodePayload))) -nodePayloadStemChildHash = (nodePayload : bytesTake 32 (bytesDrop 1 nodePayload)) -nodePayloadForkLeftHash = (nodePayload : bytesTake 32 (bytesDrop 1 nodePayload)) -nodePayloadForkRightHash = (nodePayload : bytesTake 32 (bytesDrop 33 nodePayload)) +nodePayloadStemChildIndex = (nodePayload : + u32BEBytesToNat (bytesTake 4 (bytesDrop 1 nodePayload))) -nodeRecordPayloadValid? = (nodeRecord : nodePayloadValid? (nodeRecordPayload nodeRecord)) +nodePayloadForkLeftIndex = (nodePayload : + u32BEBytesToNat (bytesTake 4 (bytesDrop 1 nodePayload))) + +nodePayloadForkRightIndex = (nodePayload : + u32BEBytesToNat (bytesTake 4 (bytesDrop 5 nodePayload))) nodeRecordsHaveInvalidPayload? = y (self nodeRecords : matchList false - (nodeRecord rest : + (nodePayload rest : or? - (not? (nodeRecordPayloadValid? nodeRecord)) + (not? (nodePayloadValid? nodePayload)) (self rest)) nodeRecords) -nodeRecordsHaveHash? = y (self nodeRecords nodeHash : - matchList - false - (nodeRecord rest : - or? - (bytesEq? nodeHash (nodeRecordHash nodeRecord)) - (self rest nodeHash)) - nodeRecords) +nodePayloadChildIndices = (nodePayload : + matchBool + t + (matchBool + (pair (nodePayloadStemChildIndex nodePayload) t) + (pair (nodePayloadForkLeftIndex nodePayload) + (pair (nodePayloadForkRightIndex nodePayload) t)) + (nodePayloadStem? nodePayload)) + (nodePayloadLeaf? nodePayload)) -nodeRecordsHaveDuplicateHashes? = y (self nodeRecords : - matchList +-- True iff index n names an element before limit in records. +-- For topologically sorted indexed bundles, every child of record i must +-- satisfy childIndex < i, so searching only the prefix [0, i) validates both +-- bounds and acyclicity. +nodeIndexInPrefix? = y (self n records i limit : + matchBool false - (nodeRecord rest : - or? - (nodeRecordsHaveHash? rest (nodeRecordHash nodeRecord)) - (self rest)) - nodeRecords) + (matchList + false + (_ rest : + matchBool + true + (self n rest (succ i) limit) + (equal? i n)) + records) + (equal? 
i limit)) -lookupNodeRecord_ = y (self nodeRecords nodeHash : +nodeChildIndicesInPrefix? = y (self childIndices records limit : matchList - nothing - (nodeRecord rest : + true + (childIndex rest : matchBool - (just nodeRecord) - (self rest nodeHash) - (bytesEq? nodeHash (nodeRecordHash nodeRecord))) - nodeRecords) + (self rest records limit) + false + (nodeIndexInPrefix? childIndex records 0 limit)) + childIndices) -lookupNodeRecord = (nodeHash nodeRecords : lookupNodeRecord_ nodeRecords nodeHash) +nodePayloadIndicesValid? = (nodePayload i records : + nodeChildIndicesInPrefix? + (nodePayloadChildIndices nodePayload) + records + i) -nodeRecordChildHashes = (nodeRecord : - (nodePayload : - matchBool - t - (matchBool - (pair (nodePayloadStemChildHash nodePayload) t) - (pair (nodePayloadForkLeftHash nodePayload) - (pair (nodePayloadForkRightHash nodePayload) t)) - (nodePayloadStem? nodePayload)) - (nodePayloadLeaf? nodePayload)) - (nodeRecordPayload nodeRecord)) - -nodeHashPresent? = (nodeHash nodeRecords : nodeRecordsHaveHash? nodeRecords nodeHash) - -nodeChildHashesPresent? = y (self childHashes nodeRecords : +nodeRecordsValidIndicesFrom? = y (self allRecords remainingRecords i : matchList true - (childHash rest : - and? - (nodeHashPresent? childHash nodeRecords) - (self rest nodeRecords)) - childHashes) + (nodePayload rest : + matchBool + (self allRecords rest (succ i)) + false + (nodePayloadIndicesValid? nodePayload i allRecords)) + remainingRecords) -nodeRecordChildrenPresent? = (nodeRecord nodeRecords : - nodeChildHashesPresent? (nodeRecordChildHashes nodeRecord) nodeRecords) - -nodeRecordsClosed? = y (self nodeRecords allNodeRecords : - matchList - true - (nodeRecord rest : - and? - (nodeRecordChildrenPresent? nodeRecord allNodeRecords) - (self rest allNodeRecords)) - nodeRecords) +nodeRecordsValidIndices? = (nodeRecords i : + nodeRecordsValidIndicesFrom? 
nodeRecords nodeRecords i) validateNodeRecords = (nodeRecords rest : matchBool (err errInvalidNodePayload rest) (matchBool - (err errDuplicateNode rest) - (matchBool - (ok nodeRecords rest) - (err errMissingNode rest) - (nodeRecordsClosed? nodeRecords nodeRecords)) - (nodeRecordsHaveDuplicateHashes? nodeRecords)) + (ok nodeRecords rest) + (err errMissingNode rest) + (nodeRecordsValidIndices? nodeRecords 0)) (nodeRecordsHaveInvalidPayload? nodeRecords)) readNodeRecords_ = y (self bs nodeCount i acc : @@ -161,7 +137,8 @@ readNodeRecords_ = y (self bs nodeCount i acc : self afterNodeRecord nodeCount (succ i) (pair nodeRecord acc))) (equal? i nodeCount)) -readNodeRecords = (nodeCount bs : readNodeRecords_ bs nodeCount 0 t) +readNodeRecords = (nodeCount bs : + readNodeRecords_ bs nodeCount 0 t) readNodesSection = (bs : bindResult (readBytes 8 bs) @@ -201,32 +178,31 @@ nodesSectionRecords = (nodesSection : (_ nodeRecords : nodeRecords) nodesSection) -nodeRecordToTreeWith = (self nodeRecords nodeRecord : +nodePayloadToTreeWith = (self nodeRecords nodePayload : + matchBool + (ok t t) + (matchBool + (bindResult (self (nodePayloadStemChildIndex nodePayload) nodeRecords) + (child _ : ok (t child) t)) + (bindResult (self (nodePayloadForkLeftIndex nodePayload) nodeRecords) + (left _ : + bindResult (self (nodePayloadForkRightIndex nodePayload) nodeRecords) + (right _ : ok (pair left right) t))) + (nodePayloadStem? nodePayload)) + (nodePayloadLeaf? nodePayload)) + +nodeIndexToTree = y (self nodeIndex nodeRecords : (nodePayload : matchBool - (ok t t) - (matchBool - (bindResult (self (nodePayloadStemChildHash nodePayload) nodeRecords) - (child _ : ok (t child) t)) - (bindResult (self (nodePayloadForkLeftHash nodePayload) nodeRecords) - (left _ : - bindResult (self (nodePayloadForkRightHash nodePayload) nodeRecords) - (right _ : ok (pair left right) t))) - (nodePayloadStem? nodePayload)) - (nodePayloadLeaf? 
nodePayload)) - (nodeRecordPayload nodeRecord)) + (nodePayloadToTreeWith self nodeRecords nodePayload) + (err errMissingNode t) + (not? (equal? nodePayload t))) + (nth nodeIndex nodeRecords)) -nodeHashToTree = y (self nodeHash nodeRecords : - triage - (err errMissingNode t) - (nodeRecord : nodeRecordToTreeWith self nodeRecords nodeRecord) - (_ _ : err errMissingNode t) - (lookupNodeRecord nodeHash nodeRecords)) - -readArboricxTreeFromHash = (rootHash bs : +readArboricxTreeFromIndex = (rootIndexBytes bs : bindResult (readArboricxNodesSection bs) (nodesSection afterContainer : - bindResult (nodeHashToTree rootHash (nodesSectionRecords nodesSection)) + bindResult (nodeIndexToTree (u32BEBytesToNat rootIndexBytes) (nodesSectionRecords nodesSection)) (tree _ : ok tree afterContainer))) -readArboricxExecutableFromHash = readArboricxTreeFromHash +readArboricxExecutableFromIndex = readArboricxTreeFromIndex diff --git a/lib/arboricx.tri b/lib/arboricx.tri index 4de151d..546f428 100644 --- a/lib/arboricx.tri +++ b/lib/arboricx.tri @@ -26,7 +26,7 @@ readArboricxExecutableByName = (nameBytes bs : (validCore _ : bindResult (selectExport (manifestExports validCore) nameBytes) (selectedExport _ : - readArboricxTreeFromHash (exportRoot selectedExport) bs)) + readArboricxTreeFromIndex (exportRoot selectedExport) bs)) bundleResult)) readArboricxExecutable = (bs : @@ -104,33 +104,52 @@ wrapHostValue = (validator wrapper resultValue rest : (err errHostCodecFailed resultValue) (validator resultValue)) -runArboricxByNameToTree = (nameBytes bs args : +wrapHostValueByTag = (tag value rest : + matchBool + (ok (hostTree value) rest) + (matchBool + (wrapHostValue hostString? hostString value rest) + (matchBool + (wrapHostValue hostNumber? hostNumber value rest) + (matchBool + (wrapHostValue hostBool? hostBool value rest) + (matchBool + (wrapHostValue hostList? hostList value rest) + (matchBool + (wrapHostValue hostBytes? hostBytes value rest) + (err errHostCodecFailed value) + (equal? 
tag hostBytesTag)) + (equal? tag hostListTag)) + (equal? tag hostBoolTag)) + (equal? tag hostNumberTag)) + (equal? tag hostStringTag)) + (equal? tag hostTreeTag)) + +runArboricxByNameToTyped = (tag nameBytes bs args : bindResult (runArboricxArgsByName nameBytes bs args) - (value rest : ok (hostTree value) rest)) + (value rest : wrapHostValueByTag tag value rest)) + +runArboricxByNameToTree = (nameBytes bs args : + runArboricxByNameToTyped hostTreeTag nameBytes bs args) runArboricxByNameToString = (nameBytes bs args : - bindResult (runArboricxArgsByName nameBytes bs args) - (value rest : wrapHostValue hostString? hostString value rest)) + runArboricxByNameToTyped hostStringTag nameBytes bs args) runArboricxByNameToNumber = (nameBytes bs args : - bindResult (runArboricxArgsByName nameBytes bs args) - (value rest : wrapHostValue hostNumber? hostNumber value rest)) + runArboricxByNameToTyped hostNumberTag nameBytes bs args) runArboricxByNameToBool = (nameBytes bs args : - bindResult (runArboricxArgsByName nameBytes bs args) - (value rest : wrapHostValue hostBool? hostBool value rest)) + runArboricxByNameToTyped hostBoolTag nameBytes bs args) runArboricxByNameToList = (nameBytes bs args : - bindResult (runArboricxArgsByName nameBytes bs args) - (value rest : wrapHostValue hostList? hostList value rest)) + runArboricxByNameToTyped hostListTag nameBytes bs args) runArboricxByNameToBytes = (nameBytes bs args : - bindResult (runArboricxArgsByName nameBytes bs args) - (value rest : wrapHostValue hostBytes? 
hostBytes value rest)) + runArboricxByNameToTyped hostBytesTag nameBytes bs args) -runArboricxToTree = (bs args : runArboricxByNameToTree [] bs args) -runArboricxToString = (bs args : runArboricxByNameToString [] bs args) -runArboricxToNumber = (bs args : runArboricxByNameToNumber [] bs args) -runArboricxToBool = (bs args : runArboricxByNameToBool [] bs args) -runArboricxToList = (bs args : runArboricxByNameToList [] bs args) -runArboricxToBytes = (bs args : runArboricxByNameToBytes [] bs args) +runArboricxToTree = (bs args : runArboricxByNameToTyped hostTreeTag [] bs args) +runArboricxToString = (bs args : runArboricxByNameToTyped hostStringTag [] bs args) +runArboricxToNumber = (bs args : runArboricxByNameToTyped hostNumberTag [] bs args) +runArboricxToBool = (bs args : runArboricxByNameToTyped hostBoolTag [] bs args) +runArboricxToList = (bs args : runArboricxByNameToTyped hostListTag [] bs args) +runArboricxToBytes = (bs args : runArboricxByNameToTyped hostBytesTag [] bs args) diff --git a/lib/list.tri b/lib/list.tri index c2c8276..04a5976 100644 --- a/lib/list.tri +++ b/lib/list.tri @@ -68,3 +68,15 @@ any? = y (self pred : matchList (h z : or? (pred h) (self pred z))) intersect = xs ys : filter (x : lExist? x ys) xs + +nth_ = y (self n xs i : + matchList + t + (h r : + matchBool + h + (self n r (succ i)) + (equal? 
i n)) + xs) + +nth = n xs : nth_ n xs 0 diff --git a/src/FileEval.hs b/src/FileEval.hs index 6d4c49c..550b9f6 100644 --- a/src/FileEval.hs +++ b/src/FileEval.hs @@ -11,20 +11,19 @@ import Eval (evalTricu, evalTricuWithStore) import Lexer import Parser import Research -import ContentStore (newContentStore, storeTerm, hashTerm) +import Wire (buildBundle, encodeBundle, decodeBundle, verifyBundle, Bundle(..)) import Database.SQLite.Simple (Connection) -import Wire (exportNamedBundle, defaultExportNames) -import Control.Monad (forM_) import Data.List (partition) import Data.Maybe (mapMaybe) import System.FilePath (takeDirectory, normalise, ()) import System.Exit (die) -import Database.SQLite.Simple (close) +import qualified Data.ByteString as BS import qualified Data.ByteString.Lazy as BL import qualified Data.Map as Map import qualified Data.Set as Set +import qualified Data.Sequence as Seq import qualified Data.Text as T extractMain :: Env -> Either String T @@ -176,37 +175,27 @@ nsVariable "" name = name nsVariable moduleName name = moduleName ++ "." ++ name -- | Compile a tricu source file to a standalone Arboricx bundle. --- Uses a temp content store so it does not collide with the global one. --- Supports multiple named exports; each is stored separately in the --- temp store so that resolveExportTarget can look them up by name. +-- Emits a canonical indexed bundle with no SHA-256 hashing. 
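The indexed format the compiler now emits leans on the invariant documented in the node-validation code above: every child index of record i must be strictly less than i, so a single prefix scan gives both bounds checking and acyclicity. A minimal Python sketch of that check, assuming only the payload layout shown in this patch (0x00 for a leaf; 0x01 plus a u32 BE child index for a stem; 0x02 plus two u32 BE indices for a fork) — the function names are illustrative, not part of the codebase:

```python
# Sketch of the indexed-bundle node invariant. A node record is a raw
# payload: tag byte 0x00 (leaf), 0x01 + u32 BE child index (stem), or
# 0x02 + two u32 BE indices (fork). A record list is well-formed when
# every child index of record i is < i.
import struct

def child_indices(payload: bytes):
    tag = payload[0]
    if tag == 0x00:
        return []
    if tag == 0x01:
        return [struct.unpack(">I", payload[1:5])[0]]
    if tag == 0x02:
        left, right = struct.unpack(">II", payload[1:9])
        return [left, right]
    raise ValueError("invalid node tag")

def records_valid(records) -> bool:
    # One pass; no hashing, no visited set, no cycle detection needed.
    return all(c < i
               for i, p in enumerate(records)
               for c in child_indices(p))
```

For example, `[leaf, stem(0), fork(0, 1)]` validates, while a stem at index 0 referencing itself does not.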
compileFile :: FilePath -> FilePath -> [T.Text] -> IO () compileFile inputPath outputPath maybeNames = do - -- Evaluate the file to get the full environment env <- evaluateFile inputPath - -- Look up each requested definition name let defaultNames = ["main"] wantedNames = if null maybeNames then defaultNames else maybeNames wantedNamesUnpacked = map T.unpack wantedNames compiledTerms <- mapM (\n -> case Map.lookup n env of Nothing -> die $ "No definition '" ++ n ++ "' found in " ++ inputPath - Just t -> return (n, t)) wantedNamesUnpacked - let compiledMap :: Map.Map T.Text T = Map.fromList - $ map (\(n,t) -> (T.pack n, t)) compiledTerms - compiledNames :: [T.Text] = Map.keys compiledMap - compiledTermsList :: [T] = Map.elems compiledMap - -- Create a temp in-memory content store - conn <- newContentStore - -- Store each term in the temp store under its requested name - forM_ (zip compiledNames compiledTermsList) $ \(n, t) -> - storeTerm conn [T.unpack n] t - -- Generate default export names when none were supplied - let expNames = if null maybeNames - then defaultExportNames (length compiledNames) - else compiledNames - exports :: [(T.Text, MerkleHash)] = zip expNames (map hashTerm compiledTermsList) - -- Export the bundle (exportNamedBundle returns already-encoded bytes) - bundleData <- exportNamedBundle conn exports + Just t -> return (T.pack n, t)) wantedNamesUnpacked + let bundle = buildBundle compiledTerms + bundleData = encodeBundle bundle + nodeCount = Seq.length (bundleNodes bundle) + bundleSize = BS.length bundleData BL.writeFile outputPath (BL.fromStrict bundleData) - close conn putStrLn $ "Compiled " ++ inputPath ++ " -> " ++ outputPath - putStrLn $ " exports: " ++ T.unpack (T.intercalate ", " expNames) + putStrLn $ " exports: " ++ T.unpack (T.intercalate ", " (map fst compiledTerms)) + putStrLn $ " nodes: " ++ show nodeCount + putStrLn $ " size: " ++ show bundleSize ++ " bytes" + case decodeBundle bundleData of + Left err -> putStrLn $ " round-trip 
decode failed: " ++ err + Right decoded -> case verifyBundle decoded of + Left err -> putStrLn $ " round-trip verify failed: " ++ err + Right () -> putStrLn $ " round-trip: OK" diff --git a/src/Main.hs b/src/Main.hs index ea6cc96..deeb768 100644 --- a/src/Main.hs +++ b/src/Main.hs @@ -1,6 +1,6 @@ module Main where -import ContentStore (initContentStoreWithPath, loadEnvironment, loadTerm, resolveExportTarget) +import ContentStore (initContentStoreWithPath, loadEnvironment, loadTerm, loadTree, resolveExportTarget) import System.Exit (die) import Server (runServerWithPath) import Eval (evalTricu, evalTricuWithStore, mainResult, result) @@ -8,7 +8,7 @@ import FileEval (evaluateFileWithContext, evaluateFileWithStore, compileFile) import Parser (parseTricu) import REPL (repl) import Research (T, EvaluatedForm(..), Env, formatT, exportDag) -import Wire (exportNamedBundle, defaultExportNames, importBundle) +import Wire (buildBundle, encodeBundle, importBundle, defaultExportNames, Bundle(..)) import Control.Monad (foldM, unless, when) import Data.Text (unpack, pack) @@ -17,7 +17,9 @@ import Data.Version (showVersion) import Paths_tricu (version) import Options.Applicative +import qualified Data.ByteString as BS import qualified Data.ByteString.Lazy as BL +import qualified Data.Sequence as Seq import Database.SQLite.Simple (Connection, close) import qualified Data.Map as Map @@ -36,10 +38,10 @@ data TricuArgs , evalDb :: Maybe FilePath } | ArboricxCompile - { compileInput :: FilePath - , compileOutput :: FilePath - , compileNames :: [String] - , compileDb :: Maybe FilePath + { compileInput :: FilePath + , compileOutput :: FilePath + , compileNames :: [String] + , compileDb :: Maybe FilePath } | ArboricxImport { importFile :: FilePath @@ -292,9 +294,9 @@ runImport opts = do when (null file) $ die "tricu arboricx import: input file is required" withContentStore (importDb opts) $ \conn -> do bundleData <- BL.readFile file - roots <- importBundle conn (BL.toStrict bundleData) + 
roots <- map T.unpack <$> importBundle conn (BL.toStrict bundleData) putStrLn $ "Imported " ++ show (length roots) ++ " root(s):" - mapM_ (\r -> putStrLn $ " " ++ unpack r) roots + mapM_ (\r -> putStrLn $ " " ++ r) roots runExport :: TricuArgs -> IO () runExport opts = @@ -310,18 +312,24 @@ runExportBundle opts = do when (null out) $ die "tricu arboricx export: --output is required" when (null targets) $ die "tricu arboricx export: at least one --target is required" withContentStore (exportDb opts) $ \conn -> do - hashes <- mapM (\t -> do + terms <- mapM (\t -> do (h, _) <- resolveExportTarget conn t - return h) targets + maybeTree <- loadTree conn h + case maybeTree of + Nothing -> die $ "Term not found in store: " ++ t + Just tree -> return tree) targets let expNames = if null names - then defaultExportNames (length hashes) + then defaultExportNames (length terms) else map T.pack names - when (length expNames /= length hashes) $ + when (length expNames /= length terms) $ die "tricu arboricx export: number of --name values must match number of TARGETs" - let exports = zip expNames hashes - bundleData <- exportNamedBundle conn exports + let namedTerms = zip expNames terms + bundle = buildBundle namedTerms + bundleData = encodeBundle bundle BL.writeFile out (BL.fromStrict bundleData) - putStrLn $ "Exported bundle with " ++ show (length exports) ++ " export(s) to " ++ out + putStrLn $ "Exported bundle with " ++ show (length namedTerms) ++ " export(s) to " ++ out + putStrLn $ " nodes: " ++ show (Seq.length (bundleNodes bundle)) + putStrLn $ " size: " ++ show (BS.length bundleData) ++ " bytes" runExportDag :: TricuArgs -> IO () runExportDag opts = do diff --git a/src/REPL.hs b/src/REPL.hs index f35c067..871d1f2 100644 --- a/src/REPL.hs +++ b/src/REPL.hs @@ -6,7 +6,7 @@ import FileEval import Lexer () import Parser import Research -import Wire +import Wire (buildBundle, encodeBundle, importBundle) import Control.Concurrent (forkIO, threadDelay, killThread, ThreadId) 
import Control.Exception (SomeException, catch, displayException) @@ -483,13 +483,20 @@ repl = do _ -> do printError $ "Ambiguous match for: " ++ cleanHash return h - bundleData <- liftIO $ exportBundle conn [hash] - liftIO $ BL.writeFile outFile (BL.fromStrict bundleData) - liftIO $ do - printSuccess $ "Exported bundle with root " - displayColoredHash hash - putStrLn $ " to " ++ outFile - loop state + maybeTree <- liftIO $ loadTree conn hash + case maybeTree of + Nothing -> do + liftIO $ printError $ "Term not found in store: " ++ T.unpack hash + loop state + Just tree -> do + let bundle = buildBundle [(T.pack "root", tree)] + bundleData = encodeBundle bundle + liftIO $ BL.writeFile outFile (BL.fromStrict bundleData) + liftIO $ do + printSuccess $ "Exported bundle with root " + displayColoredHash hash + putStrLn $ " to " ++ outFile + loop state handleBundleImport :: REPLState -> InputT IO () handleBundleImport state = do diff --git a/src/Server.hs b/src/Server.hs index b85c60c..5302cdc 100644 --- a/src/Server.hs +++ b/src/Server.hs @@ -4,9 +4,9 @@ module Server ) where import ContentStore (initContentStore, initContentStoreWithPath, nameToTerm, hashToTerm, listStoredTerms, - parseNameList, StoredTerm(..), termHash) + parseNameList, StoredTerm(..), termHash, loadTree) import Database.SQLite.Simple (Connection, close) -import Wire (exportNamedBundle) +import Wire (buildBundle, encodeBundle) import Control.Monad (when, void) import Data.Maybe (catMaybes) @@ -19,6 +19,7 @@ import Data.String (fromString) import Data.Text (Text) import Data.Text.Encoding (encodeUtf8, decodeUtf8) import Data.Char (isHexDigit, toLower) +import Data.ByteString (ByteString) import Data.ByteString.Char8 (unpack) import Data.ByteString.Lazy (fromStrict) import qualified Data.Text as T @@ -103,7 +104,7 @@ rootsHandler mkConn request respond = do close conn void $ respond resp -- Build and return the bundle - bundleData <- exportNamedBundle conn allNamedHashes + bundleData <- 
buildAndEncodeBundle conn allNamedHashes let firstHash = snd (head allNamedHashes) cd = T.pack "attachment; filename=roots.bundle" close conn @@ -123,7 +124,7 @@ nameHandler mkConn nameText = do Just term' -> do let th = termHash term' namedHashes = [(firstOrRoot (termNames term'), th)] - bundleData <- exportNamedBundle conn namedHashes + bundleData <- buildAndEncodeBundle conn namedHashes let cd = T.pack $ "attachment; filename=" ++ safeFileName (T.unpack nameText) ++ ".bundle" close conn return $ responseLBS status200 (bundleHeaders th cd) (fromStrict bundleData) @@ -144,12 +145,24 @@ hashHandler mkConn hashText = Just term' -> do let th = termHash term' namedHashes' = [(firstOrRoot (termNames term'), th)] - bundleData <- exportNamedBundle conn namedHashes' + bundleData <- buildAndEncodeBundle conn namedHashes' close conn return $ responseLBS status200 (bundleHeaders th "attachment; filename=hash.bundle") (fromStrict bundleData) +-- | Helper: load terms by hash and build an indexed bundle. +buildAndEncodeBundle :: Connection -> [(Text, Text)] -> IO ByteString +buildAndEncodeBundle conn namedHashes = do + terms <- mapM (\(_, h) -> do + maybeTree <- loadTree conn h + case maybeTree of + Nothing -> error $ "Server: hash not found in store: " ++ T.unpack h + Just tree -> return tree) namedHashes + let namedTerms = zip (map fst namedHashes) terms + bundle = buildBundle namedTerms + return $ encodeBundle bundle + -- | GET /terms termsResponse :: IO Connection -> IO Response termsResponse mkConn = do diff --git a/src/Wire.hs b/src/Wire.hs index 1211a33..82c77d5 100644 --- a/src/Wire.hs +++ b/src/Wire.hs @@ -11,55 +11,60 @@ module Wire , BundleExport (..) , BundleMetadata , ClosureMode (..) + , BundleNode (..) 
, encodeBundle , decodeBundle , verifyBundle - , collectReachableNodes - , exportBundle - , exportNamedBundle + , buildBundle , importBundle , defaultExportNames ) where -import ContentStore (getNodeMerkle, loadTree, putMerkleNode, storeTerm) -import Research +import ContentStore (storeTerm) +import Research hiding (Node) -import Control.Exception (SomeException, evaluate, try) -import Control.Monad (foldM, unless, when) -import Crypto.Hash (Digest, SHA256, hash) -import Data.Bits ((.|.), (.&.), shiftL, shiftR) -import Data.ByteArray (convert) +import Control.Monad (foldM, forM_, unless, when) +import Data.Bits (shiftL, shiftR, (.|.), (.&.)) import Data.ByteString (ByteString) import Data.Foldable (traverse_) +import qualified Data.Foldable as Foldable +import Data.List (mapAccumL) import Data.Map (Map) +import qualified Data.Map as Map +import Data.Sequence (Seq, (|>)) +import qualified Data.Sequence as Seq +import Data.Set (Set) +import qualified Data.Set as Set import Data.Text (Text, unpack) -import Data.Text.Encoding (decodeUtf8, decodeUtf8', encodeUtf8) +import Data.Text.Encoding (decodeUtf8', encodeUtf8) +import Data.Vector (Vector) +import qualified Data.Vector as V +import qualified Data.Vector.Mutable as MV import Data.Word (Word16, Word32, Word64, Word8) import Database.SQLite.Simple (Connection) import GHC.Generics (Generic) import qualified Data.ByteString as BS -import qualified Data.ByteString.Base16 as Base16 -import qualified Data.Map as Map -import qualified Data.Set as Set import qualified Data.Text as T --- | Portable bundle major/minor version supported by this module. +-- --------------------------------------------------------------------------- +-- Container constants +-- --------------------------------------------------------------------------- + bundleMajorVersion :: Word16 bundleMajorVersion = 1 bundleMinorVersion :: Word16 bundleMinorVersion = 0 --- | Header magic for the portable executable-object container. 
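The container and manifest magics above can be checked without decoding anything else. A small Python sketch mirroring the constants in src/Wire.hs; the exact header layout beyond the magic is not shown in this hunk, but the manifest decoder below does read two big-endian u16 version fields immediately after its magic, which is all this sketch assumes:

```python
# Constants mirroring src/Wire.hs: an 8-byte container magic, a fixed
# 32-byte header, and a separate magic for the manifest section.
import struct

BUNDLE_MAGIC = b"ARBORICX"
MANIFEST_MAGIC = b"ARBMNFST"
HEADER_LENGTH = 32

def looks_like_bundle(bs: bytes) -> bool:
    # Structural sniff only: enough bytes for a header, correct magic.
    return len(bs) >= HEADER_LENGTH and bs[:8] == BUNDLE_MAGIC

def manifest_version(bs: bytes):
    # u16 BE major, u16 BE minor follow the manifest magic.
    if bs[:8] != MANIFEST_MAGIC:
        raise ValueError("invalid manifest magic")
    return struct.unpack(">HH", bs[8:12])
```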
bundleMagic :: ByteString -bundleMagic = BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58] -- "ARBORICX" +bundleMagic = BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58] headerLength :: Int headerLength = 32 sectionEntryLength :: Int -sectionEntryLength = 60 +sectionEntryLength = 32 sectionManifest, sectionNodes :: Word32 sectionManifest = 1 @@ -68,27 +73,22 @@ sectionNodes = 2 flagCritical :: Word16 flagCritical = 0x0001 -compressionNone, digestSha256 :: Word16 +compressionNone :: Word16 compressionNone = 0 -digestSha256 = 1 -- --------------------------------------------------------------------------- --- Manifest binary constants +-- Manifest constants -- --------------------------------------------------------------------------- --- | Magic prefix identifying the fixed-order manifest v1 format. manifestMagic :: ByteString manifestMagic = "ARBMNFST" --- | Manifest major version. manifestMajorVersion :: Word16 manifestMajorVersion = 1 --- | Manifest minor version. manifestMinorVersion :: Word16 -manifestMinorVersion = 0 +manifestMinorVersion = 1 --- | Closure mode to byte. closureToByte :: ClosureMode -> Word8 closureToByte = \case ClosureComplete -> 0 @@ -100,7 +100,6 @@ closureFromByte = \case 1 -> Right ClosurePartial n -> Left $ "unsupported closure byte: " ++ show n --- | Metadata tag constants. tagPackage, tagVersion, tagDescription, tagLicense, tagCreatedBy :: Word16 tagPackage = 1 tagVersion = 2 @@ -109,38 +108,31 @@ tagLicense = 4 tagCreatedBy = 5 -- --------------------------------------------------------------------------- --- Fixed-order manifest binary helpers +-- Text encoding helpers -- --------------------------------------------------------------------------- --- | Encode a UTF-8 text string as: u32 length + UTF-8 bytes. encodeLengthPrefixedText :: Text -> ByteString encodeLengthPrefixedText t = encode32 (fromIntegral $ BS.length bs) <> bs where bs = encodeUtf8 t --- | Decode a length-prefixed UTF-8 text string. 
--- Returns the decoded Text and the remaining ByteString. decodeLengthPrefixedText :: ByteString -> Either String (Text, ByteString) -decodeLengthPrefixedText bs = - case decode32be "text_length" bs of - Left err -> Left $ "decodeLengthPrefixedText: " ++ err - Right (len, rest) -> do - let payloadLen = fromIntegral len - when (BS.length rest < payloadLen) $ - Left "decodeLengthPrefixedText: string extends beyond input" - let (textBytes, after) = BS.splitAt payloadLen rest - case decodeUtf8' textBytes of - Right txt -> Right (txt, after) - Left _ -> Left "decodeLengthPrefixedText: invalid UTF-8" +decodeLengthPrefixedText bs = do + (len, rest) <- decode32be "text_length" bs + let payloadLen = fromIntegral len + when (BS.length rest < payloadLen) $ + Left "decodeLengthPrefixedText: string extends beyond input" + let (textBytes, after) = BS.splitAt payloadLen rest + case decodeUtf8' textBytes of + Right txt -> Right (txt, after) + Left _ -> Left "decodeLengthPrefixedText: invalid UTF-8" --- | Encode a metadata value as a TLV entry: u16 tag + u32 length + raw bytes. encodeMetadataTLV :: Word16 -> ByteString -> ByteString encodeMetadataTLV tag val = encode16 tag <> encode32 (fromIntegral $ BS.length val) <> val -- --------------------------------------------------------------------------- --- Fixed-order manifest encoders +-- Manifest encoders -- --------------------------------------------------------------------------- --- | Encode the entire manifest in fixed-order core + TLV tail layout. encodeManifest :: BundleManifest -> ByteString encodeManifest m = manifestMagic @@ -163,18 +155,16 @@ encodeManifest m = <> encode32 (fromIntegral $ length (manifestExports m)) <> encodeExports (manifestExports m) <> encodeMetadataTLVs (manifestMetadata m) - <> encode32 0 -- zero extension fields + <> encode32 0 encodeCapabilities :: [Text] -> ByteString -encodeCapabilities caps = mconcat (map encodeLengthPrefixedText caps) +encodeCapabilities = mconcat . 
map encodeLengthPrefixedText encodeRoots :: [BundleRoot] -> ByteString encodeRoots = mconcat . map encodeRoot encodeRoot :: BundleRoot -> ByteString -encodeRoot root = - merkleHashToRaw (rootHash root) - <> encodeLengthPrefixedText (rootRole root) +encodeRoot root = encode32 (rootIndex root) <> encodeLengthPrefixedText (rootRole root) encodeExports :: [BundleExport] -> ByteString encodeExports = mconcat . map encodeExport @@ -182,12 +172,10 @@ encodeExports = mconcat . map encodeExport encodeExport :: BundleExport -> ByteString encodeExport exp = encodeLengthPrefixedText (exportName exp) - <> merkleHashToRaw (exportRoot exp) + <> encode32 (exportRoot exp) <> encodeLengthPrefixedText (exportKind exp) <> encodeLengthPrefixedText (exportAbi exp) --- | Encode metadata as: u32 field count + TLV entries for present fields. --- Metadata TLV values are raw UTF-8 bytes; the TLV length already carries size. encodeMetadataTLVs :: BundleMetadata -> ByteString encodeMetadataTLVs m = let entries = metadataTLVEntries m @@ -205,60 +193,48 @@ metadataTLVEntries m = maybeEntry tag (Just value) = [(tag, encodeUtf8 value)] encodeTLVs :: [(Word16, ByteString)] -> ByteString -encodeTLVs tlvs = mconcat (map (uncurry encodeMetadataTLV) tlvs) +encodeTLVs = mconcat . map (uncurry encodeMetadataTLV) -- --------------------------------------------------------------------------- --- Fixed-order manifest decoders +-- Manifest decoders -- --------------------------------------------------------------------------- --- | Decode the manifest from fixed-order core + TLV tail bytes. --- All remaining bytes after the core fields are treated as the TLV tail. 
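Every core field the decoder below reads is the same length-prefixed shape: a u32 big-endian byte count followed by UTF-8 bytes. A Python sketch of that codec, matching `encodeLengthPrefixedText` / `decodeLengthPrefixedText` above:

```python
# Length-prefixed text codec used throughout the manifest:
# u32 BE byte length, then that many UTF-8 bytes.
import struct

def encode_text(s: str) -> bytes:
    raw = s.encode("utf-8")
    return struct.pack(">I", len(raw)) + raw

def decode_text(bs: bytes):
    # Returns the decoded string and the unconsumed remainder.
    (n,) = struct.unpack(">I", bs[:4])
    if len(bs) < 4 + n:
        raise ValueError("string extends beyond input")
    return bs[4:4 + n].decode("utf-8"), bs[4 + n:]
```

Decoding returns the remainder so field reads can be chained, just as the Haskell decoders thread their `rest` values.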
decodeManifest :: ByteString -> Either String BundleManifest decodeManifest bs = do - -- Header when (BS.length bs < 8) $ Left "manifest too short for magic" when (BS.take 8 bs /= manifestMagic) $ Left "invalid manifest magic" let rest = BS.drop 8 bs (major, rest') <- decode16be "major" rest - when (major /= manifestMajorVersion) $ Left $ "unsupported manifest major version: " ++ show major - (_minor, rest'') <- decode16be "minor" rest' + (minor, rest'') <- decode16be "minor" rest' + when (major /= manifestMajorVersion) $ + Left $ "unsupported manifest major version: " ++ show major + when (minor /= manifestMinorVersion) $ + Left $ "unsupported manifest minor version: " ++ show minor - -- Core strings - (schema, rest''') <- decodeLengthPrefixedText rest'' - (bundleType, rest'''') <- decodeLengthPrefixedText rest''' + (schema, r1) <- decodeLengthPrefixedText rest'' + (bundleType, r2) <- decodeLengthPrefixedText r1 + (calc, r3) <- decodeLengthPrefixedText r2 + (alg, r4) <- decodeLengthPrefixedText r3 + (domain, r5) <- decodeLengthPrefixedText r4 + (payload, r6) <- decodeLengthPrefixedText r5 + (sem, r7) <- decodeLengthPrefixedText r6 + (eval, r8) <- decodeLengthPrefixedText r7 + (abi, r9) <- decodeLengthPrefixedText r8 - -- Tree spec fields (flat) - (calc, rest1) <- decodeLengthPrefixedText rest'''' - (alg, rest2) <- decodeLengthPrefixedText rest1 - (domain, rest3) <- decodeLengthPrefixedText rest2 - (payload, rest4) <- decodeLengthPrefixedText rest3 + (capCount, r10) <- decode32be "capability_count" r9 + (caps, r11) <- decodeCapabilities (fromIntegral capCount) r10 - -- Runtime spec fields (flat) - (sem, restR1) <- decodeLengthPrefixedText rest4 - (eval, restR2) <- decodeLengthPrefixedText restR1 - (abi, restR3) <- decodeLengthPrefixedText restR2 - - (capCount, restR4) <- decode32be "capability_count" restR3 - let capLen = fromIntegral capCount - (caps, restR5) <- decodeCapabilities capLen restR4 - - -- Closure - when (BS.length restR5 < 1) $ Left "manifest 
truncated: missing closure byte" - let (closureByte, restR6) = BS.splitAt 1 restR5 + when (BS.length r11 < 1) $ Left "manifest truncated: missing closure byte" + let (closureByte, r12) = BS.splitAt 1 r11 closure <- closureFromByte (head $ BS.unpack closureByte) - -- Roots - (rootCount, restR7) <- decode32be "root_count" restR6 - let rootCountInt = fromIntegral rootCount - (roots, restR8) <- decodeRoots rootCountInt restR7 + (rootCount, r13) <- decode32be "root_count" r12 + (roots, r14) <- decodeRoots (fromIntegral rootCount) r13 - -- Exports - (exportCount, restR9) <- decode32be "export_count" restR8 - let exportCountInt = fromIntegral exportCount - (exports, restR10) <- decodeExports exportCountInt restR9 + (exportCount, r15) <- decode32be "export_count" r14 + (exports, r16) <- decodeExports (fromIntegral exportCount) r15 - -- TLV tail - (metadata, _ext) <- decodeMetadataAndExtensions restR10 + (metadata, _ext) <- decodeMetadataAndExtensions r16 pure BundleManifest { manifestSchema = schema @@ -283,7 +259,6 @@ decodeManifest bs = do , manifestMetadata = metadata } --- | Decode length-prefixed capability strings. decodeCapabilities :: Int -> ByteString -> Either String ([Text], ByteString) decodeCapabilities 0 bs = Right ([], bs) decodeCapabilities n bs = do @@ -291,31 +266,24 @@ decodeCapabilities n bs = do (restTxts, restFinal) <- decodeCapabilities (n - 1) rest Right (txt : restTxts, restFinal) --- | Decode root entries. 
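In format 1.1 a root entry is just a u32 BE node index followed by a length-prefixed role string, as `encodeRoot` / `decodeRoots` show. A self-contained Python sketch of the pair (helper names are illustrative):

```python
# Root entry codec for manifest v1.1: u32 BE node index, then a
# length-prefixed UTF-8 role string.
import struct

def encode_root(index: int, role: str) -> bytes:
    raw = role.encode("utf-8")
    return struct.pack(">I", index) + struct.pack(">I", len(raw)) + raw

def decode_roots(count: int, bs: bytes):
    roots = []
    for _ in range(count):
        idx, n = struct.unpack(">II", bs[:8])
        roots.append((idx, bs[8:8 + n].decode("utf-8")))
        bs = bs[8 + n:]
    return roots, bs
```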
decodeRoots :: Int -> ByteString -> Either String ([BundleRoot], ByteString) decodeRoots 0 bs = Right ([], bs) decodeRoots n bs = do - when (BS.length bs < 32) $ Left "decodeRoots: truncated root hash" - let (hashBytes, rest) = BS.splitAt 32 bs - role <- decodeLengthPrefixedText rest - (restRoots, restFinal) <- decodeRoots (n - 1) (snd role) - Right (BundleRoot (rawToMerkleHash hashBytes) (fst role) : restRoots, restFinal) + (idx, rest1) <- decode32be "root_index" bs + (role, rest2) <- decodeLengthPrefixedText rest1 + (restRoots, restFinal) <- decodeRoots (n - 1) rest2 + Right (BundleRoot idx role : restRoots, restFinal) --- | Decode export entries. decodeExports :: Int -> ByteString -> Either String ([BundleExport], ByteString) decodeExports 0 bs = Right ([], bs) decodeExports n bs = do - name <- decodeLengthPrefixedText bs - when (BS.length (snd name) < 32) $ Left "decodeExports: truncated export root hash" - let (hashBytes, rest) = BS.splitAt 32 (snd name) - kind <- decodeLengthPrefixedText rest - abi <- decodeLengthPrefixedText (snd kind) - (restExports, restFinal) <- decodeExports (n - 1) (snd abi) - Right (BundleExport (fst name) (rawToMerkleHash hashBytes) (fst kind) (fst abi) : restExports, restFinal) + (name, r1) <- decodeLengthPrefixedText bs + (idx, r2) <- decode32be "export_root" r1 + (kind, r3) <- decodeLengthPrefixedText r2 + (abi, r4) <- decodeLengthPrefixedText r3 + (restExports, restFinal) <- decodeExports (n - 1) r4 + Right (BundleExport name idx kind abi : restExports, restFinal) --- | Decode TLV tail into metadata and extensions. --- Layout: u32 metadata-count, metadata TLVs, u32 extension-count, extension TLVs. --- For now, known metadata tags are decoded and extension TLVs are skipped. 
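The metadata tail uses plain TLV entries: u16 BE tag, u32 BE length, raw value bytes, exactly as `encodeMetadataTLV` lays them out. A Python sketch of that entry codec:

```python
# Metadata TLV codec: u16 BE tag, u32 BE length, raw value bytes.
# Tags 1..5 carry package, version, description, license, created-by;
# unknown tags are simply carried through and ignored by the decoder.
import struct

def encode_tlv(tag: int, value: bytes) -> bytes:
    return struct.pack(">HI", tag, len(value)) + value

def decode_tlvs(count: int, bs: bytes):
    entries = []
    for _ in range(count):
        tag, n = struct.unpack(">HI", bs[:6])
        entries.append((tag, bs[6:6 + n]))
        bs = bs[6 + n:]
    return entries, bs
```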
decodeMetadataAndExtensions :: ByteString -> Either String (BundleMetadata, ByteString) decodeMetadataAndExtensions bs = do (metadataCount, rest1) <- decode32be "metadata_field_count" bs @@ -326,33 +294,30 @@ decodeMetadataAndExtensions bs = do unless (BS.null rest4) $ Left "trailing bytes after manifest TLV tail" Right (metadata, rest4) --- | Decode a fixed number of TLV entries. decodeTLVs :: Int -> ByteString -> Either String ([TLVEntry], ByteString) decodeTLVs 0 bs = Right ([], bs) decodeTLVs n bs = do - (tag, rest1) <- decode16be "tlv_tag" bs - (len, rest2) <- decode32be "tlv_length" rest1 + (tag, r1) <- decode16be "tlv_tag" bs + (len, r2) <- decode32be "tlv_length" r1 let payloadLen = fromIntegral len - when (BS.length rest2 < payloadLen) $ Left "TLV value extends beyond input" - let (value, after) = BS.splitAt payloadLen rest2 + when (BS.length r2 < payloadLen) $ Left "TLV value extends beyond input" + let (value, after) = BS.splitAt payloadLen r2 (restTlvs, restFinal) <- decodeTLVs (n - 1) after Right ((tag, value) : restTlvs, restFinal) --- | Decode known metadata TLV entries into BundleMetadata. --- Unknown tags are ignored. 
decodeMetadataTLVs :: [(Word16, ByteString)] -> Either String BundleMetadata decodeMetadataTLVs tlvs = do - pkg <- decodeOptionalMetadataText tagPackage - ver <- decodeOptionalMetadataText tagVersion - desc <- decodeOptionalMetadataText tagDescription - lic <- decodeOptionalMetadataText tagLicense - by <- decodeOptionalMetadataText tagCreatedBy + pkg <- lookupText tagPackage + ver <- lookupText tagVersion + desc <- lookupText tagDescription + lic <- lookupText tagLicense + by <- lookupText tagCreatedBy pure BundleMetadata - { metadataPackage = pkg - , metadataVersion = ver + { metadataPackage = pkg + , metadataVersion = ver , metadataDescription = desc - , metadataLicense = lic - , metadataCreatedBy = by + , metadataLicense = lic + , metadataCreatedBy = by } where lookupTag t = go t tlvs @@ -360,12 +325,12 @@ decodeMetadataTLVs tlvs = do go t ((tag, val):rest) | tag == t = Just val | otherwise = go t rest - decodeOptionalMetadataText tag = + lookupText tag = case lookupTag tag of Nothing -> Right Nothing Just raw -> case decodeUtf8' raw of Right txt -> Right (Just txt) - Left _ -> Left $ "metadata TLV has invalid UTF-8 for tag " ++ show tag + Left _ -> Left $ "metadata TLV has invalid UTF-8 for tag " ++ show tag type TLVEntry = (Word16, ByteString) @@ -373,24 +338,20 @@ type TLVEntry = (Word16, ByteString) -- Data types -- --------------------------------------------------------------------------- --- | Closure declaration. data ClosureMode = ClosureComplete | ClosurePartial deriving (Show, Eq, Ord, Generic) --- | Hash specification (algorithm + domain strings). data NodeHashSpec = NodeHashSpec { nodeHashAlgorithm :: Text , nodeHashDomain :: Text } deriving (Show, Eq, Ord, Generic) --- | Tree specification. data TreeSpec = TreeSpec { treeCalculus :: Text , treeNodeHash :: NodeHashSpec , treeNodePayload :: Text } deriving (Show, Eq, Ord, Generic) --- | Runtime specification. 
data RuntimeSpec = RuntimeSpec { runtimeSemantics :: Text , runtimeEvaluation :: Text @@ -398,21 +359,18 @@ data RuntimeSpec = RuntimeSpec , runtimeCapabilities :: [Text] } deriving (Show, Eq, Ord, Generic) --- | A root hash reference. data BundleRoot = BundleRoot - { rootHash :: MerkleHash + { rootIndex :: Word32 , rootRole :: Text } deriving (Show, Eq, Ord, Generic) --- | An export entry. data BundleExport = BundleExport { exportName :: Text - , exportRoot :: MerkleHash + , exportRoot :: Word32 , exportKind :: Text , exportAbi :: Text } deriving (Show, Eq, Ord, Generic) --- | Optional package metadata. data BundleMetadata = BundleMetadata { metadataPackage :: Maybe Text , metadataVersion :: Maybe Text @@ -421,7 +379,6 @@ data BundleMetadata = BundleMetadata , metadataCreatedBy :: Maybe Text } deriving (Show, Eq, Ord, Generic) --- | The manifest: top-level bundle metadata. data BundleManifest = BundleManifest { manifestSchema :: Text , manifestBundleType :: Text @@ -433,52 +390,157 @@ data BundleManifest = BundleManifest , manifestMetadata :: BundleMetadata } deriving (Show, Eq, Generic) --- | Portable executable-object bundle. --- --- Merkle node payloads remain the language-neutral executable core: --- Leaf = 0x00; Stem = 0x01 || child_hash; Fork = 0x02 || left_hash || right_hash. --- Names, exports, runtime metadata, and package metadata live in the manifest layer. 
+data BundleNode + = BNLeaf + | BNStem !Word32 + | BNFork !Word32 !Word32 + deriving (Show, Eq) + data Bundle = Bundle { bundleVersion :: Word16 - , bundleRoots :: [MerkleHash] - , bundleNodes :: Map MerkleHash ByteString + , bundleRoots :: [Word32] + , bundleNodes :: Seq BundleNode , bundleManifest :: BundleManifest , bundleManifestBytes :: ByteString } deriving (Show, Eq) -- --------------------------------------------------------------------------- --- Bundle encoding +-- Bundle construction +-- --------------------------------------------------------------------------- + +data NodeKey = KeyLeaf | KeyStem !Word32 | KeyFork !Word32 !Word32 + deriving (Eq, Ord, Show) + +buildBundle :: [(Text, T)] -> Bundle +buildBundle namedTerms = + let go :: T -> (Seq BundleNode, Map NodeKey Word32) -> (Word32, (Seq BundleNode, Map NodeKey Word32)) + go Leaf (nodes, seen) = + case Map.lookup KeyLeaf seen of + Just idx -> (idx, (nodes, seen)) + Nothing -> + let idx = fromIntegral (Seq.length nodes) + in (idx, (nodes |> BNLeaf, Map.insert KeyLeaf idx seen)) + go (Stem child) (nodes, seen) = + let (childIdx, state1) = go child (nodes, seen) + (nodes1, seen1) = state1 + in case Map.lookup (KeyStem childIdx) seen1 of + Just idx -> (idx, state1) + Nothing -> + let idx = fromIntegral (Seq.length nodes1) + in (idx, (nodes1 |> BNStem childIdx, Map.insert (KeyStem childIdx) idx seen1)) + go (Fork left right) (nodes, seen) = + let (leftIdx, state1) = go left (nodes, seen) + (rightIdx, state2) = go right state1 + (nodes2, seen2) = state2 + in case Map.lookup (KeyFork leftIdx rightIdx) seen2 of + Just idx -> (idx, state2) + Nothing -> + let idx = fromIntegral (Seq.length nodes2) + in (idx, (nodes2 |> BNFork leftIdx rightIdx, Map.insert (KeyFork leftIdx rightIdx) idx seen2)) + + processExport state (_, t) = let (idx, newState) = go t state in (newState, idx) + ((finalNodes, _), rootIndices) = mapAccumL processExport (Seq.empty, Map.empty) namedTerms + + roots = zipWith mkRoot [0 :: Int ..] 
rootIndices + exports = zipWith mkExport namedTerms rootIndices + manifest = makeManifest roots exports + manifestBytes = encodeManifest manifest + in Bundle + { bundleVersion = bundleMajorVersion * 1000 + bundleMinorVersion + , bundleRoots = rootIndices + , bundleNodes = finalNodes + , bundleManifest = manifest + , bundleManifestBytes = manifestBytes + } + where + mkRoot 0 idx = BundleRoot idx "default" + mkRoot _ idx = BundleRoot idx "root" + mkExport (name, _) idx = BundleExport name idx "term" "arboricx.abi.tree.v1" + +makeManifest :: [BundleRoot] -> [BundleExport] -> BundleManifest +makeManifest roots exports = BundleManifest + { manifestSchema = "arboricx.bundle.manifest.v1" + , manifestBundleType = "tree-calculus-executable-object" + , manifestTree = TreeSpec + { treeCalculus = "tree-calculus.v1" + , treeNodeHash = NodeHashSpec + { nodeHashAlgorithm = "indexed" + , nodeHashDomain = "arboricx.indexed.node.v1" + } + , treeNodePayload = "arboricx.indexed.payload.v1" + } + , manifestRuntime = RuntimeSpec + { runtimeSemantics = "tree-calculus.v1" + , runtimeEvaluation = "normal-order" + , runtimeAbi = "arboricx.abi.tree.v1" + , runtimeCapabilities = [] + } + , manifestClosure = ClosureComplete + , manifestRoots = roots + , manifestExports = exports + , manifestMetadata = BundleMetadata + { metadataPackage = Nothing + , metadataVersion = Nothing + , metadataDescription = Nothing + , metadataLicense = Nothing + , metadataCreatedBy = Just "arboricx" + } + } + +-- --------------------------------------------------------------------------- +-- Bundle encoding / decoding -- --------------------------------------------------------------------------- --- | Encode a Bundle to portable Bundle v1 bytes. --- The manifest is serialized using the fixed-order core + TLV tail format. 
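`buildBundle` above assigns every distinct subtree a topological index: each child is interned before its parent, so a node may only reference strictly lower indices, and structurally equal subtrees share one entry. A small Python sketch of the same interning scheme (the representation and names here are illustrative, not part of the patch):

```python
def intern(term, nodes, seen):
    """Post-order interning. `term` is None (leaf), ("stem", t), or
    ("fork", l, r). Returns the node's index; children always get
    lower indices than parents, and duplicate subtrees are shared."""
    if term is None:
        key = ("leaf",)
    elif term[0] == "stem":
        key = ("stem", intern(term[1], nodes, seen))
    else:
        key = ("fork", intern(term[1], nodes, seen),
                       intern(term[2], nodes, seen))
    if key not in seen:
        seen[key] = len(nodes)
        nodes.append(key)
    return seen[key]

nodes, seen = [], {}
# fork(stem(leaf), stem(leaf)): the repeated stem subtree is shared.
root = intern(("fork", ("stem", None), ("stem", None)), nodes, seen)
```

This ordering is what lets `reconstructTerms` later rebuild every term in one forward pass, and what `verifyBundle`'s `child >= i` checks enforce.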
encodeBundle :: Bundle -> ByteString encodeBundle bundle = let nodeSection = encodeNodeSection (bundleNodes bundle) - manifestBytes = if BS.null (bundleManifestBytes bundle) - then encodeManifest (bundleManifest bundle) - else bundleManifestBytes bundle + manifestBytes = bundleManifestBytes bundle sectionCount = 2 dirOffset = fromIntegral headerLength sectionDirLength = sectionCount * sectionEntryLength manifestOffset = fromIntegral (headerLength + sectionDirLength) nodesOffset = manifestOffset + fromIntegral (BS.length manifestBytes) manifestEntry = encodeSectionEntry sectionManifest 1 flagCritical compressionNone - manifestOffset (fromIntegral $ BS.length manifestBytes) manifestBytes + manifestOffset (fromIntegral $ BS.length manifestBytes) nodesEntry = encodeSectionEntry sectionNodes 1 flagCritical compressionNone - nodesOffset (fromIntegral $ BS.length nodeSection) nodeSection + nodesOffset (fromIntegral $ BS.length nodeSection) header = encodeHeader bundleMajorVersion bundleMinorVersion (fromIntegral sectionCount) 0 dirOffset in header <> manifestEntry <> nodesEntry <> manifestBytes <> nodeSection --- | Decode portable Bundle v1 bytes. 
decodeBundle :: ByteString -> Either String Bundle decodeBundle bs - | BS.take (BS.length bundleMagic) bs == bundleMagic = decodePortableBundle bs - | otherwise = Left "invalid magic" + | BS.take (BS.length bundleMagic) bs /= bundleMagic = Left "invalid magic" + | otherwise = do + (major, minor, sectionCount, _flags, dirOffset) <- decodePortableHeader bs + when (major /= bundleMajorVersion) $ + Left $ "unsupported bundle major version: " ++ show major + let dirStart = fromIntegral dirOffset + dirBytes = fromIntegral sectionCount * sectionEntryLength + when (BS.length bs < dirStart + dirBytes) $ + Left "bundle truncated in section directory" + let dirRaw = BS.take dirBytes $ BS.drop dirStart bs + entries <- decodeSectionEntries sectionCount dirRaw + traverse_ rejectUnknownCritical entries + manifestEntry <- requireSection sectionManifest entries + nodesEntry <- requireSection sectionNodes entries + manifestBytes <- readAndVerifySection bs manifestEntry + nodesBytes <- readAndVerifySection bs nodesEntry + manifest <- decodeManifest manifestBytes + when (treeNodePayload (manifestTree manifest) /= "arboricx.indexed.payload.v1") $ + Left "manifest does not use indexed payload" + nodes <- decodeNodeSection nodesBytes + let rootIndices = map rootIndex (manifestRoots manifest) + return Bundle + { bundleVersion = major * 1000 + minor + , bundleRoots = rootIndices + , bundleNodes = nodes + , bundleManifest = manifest + , bundleManifestBytes = manifestBytes + } -- --------------------------------------------------------------------------- --- Portable container encoding / decoding +-- Container encoding / decoding -- --------------------------------------------------------------------------- data SectionEntry = SectionEntry @@ -486,10 +548,8 @@ data SectionEntry = SectionEntry , seVersion :: Word16 , seFlags :: Word16 , seCompression :: Word16 - , seDigestAlgorithm :: Word16 , seOffset :: Word64 , seLength :: Word64 - , seDigest :: ByteString } deriving (Show, Eq) 
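With the SHA-256 digest fields removed, each section directory entry is a fixed 32-byte big-endian record: u32 type, u16 version, u16 flags, u16 compression, u16 reserved, u64 offset, u64 length, u32 reserved padding. A Python round-trip sketch of that layout (helper names are illustrative; the flags value is an arbitrary sample):

```python
import struct

# Big-endian layout mirroring encodeSectionEntry/decodeSectionEntries:
# u32 type, u16 version, u16 flags, u16 compression, u16 reserved,
# u64 offset, u64 length, u32 reserved padding = 32 bytes total.
FMT = ">IHHHHQQI"

def encode_entry(stype, version, flags, compression, offset, length):
    # Reserved fields are written as zero, as in the Haskell encoder.
    return struct.pack(FMT, stype, version, flags, compression, 0,
                       offset, length, 0)

def decode_entry(raw):
    stype, version, flags, compression, _r1, offset, length, _r2 = \
        struct.unpack(FMT, raw)
    return (stype, version, flags, compression, offset, length)

raw = encode_entry(1, 1, 0x0001, 0, 64, 512)
```

Decoders skip the reserved fields rather than validating them, so they remain available for future use without a major version bump.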
encodeHeader :: Word16 -> Word16 -> Word32 -> Word64 -> Word64 -> ByteString @@ -501,72 +561,16 @@ encodeHeader major minor sectionCount flags dirOffset = <> encode64 flags <> encode64 dirOffset -encodeSectionEntry :: Word32 -> Word16 -> Word16 -> Word16 -> Word64 -> Word64 -> ByteString -> ByteString -encodeSectionEntry sectionType sectionVersion sectionFlags compression offset lengthBytes sectionBytes = +encodeSectionEntry :: Word32 -> Word16 -> Word16 -> Word16 -> Word64 -> Word64 -> ByteString +encodeSectionEntry sectionType sectionVersion sectionFlags compression offset lengthBytes = encode32 sectionType <> encode16 sectionVersion <> encode16 sectionFlags <> encode16 compression - <> encode16 digestSha256 + <> encode16 0 -- reserved <> encode64 offset <> encode64 lengthBytes - <> sha256 sectionBytes - -decodePortableBundle :: ByteString -> Either String Bundle -decodePortableBundle bs = do - (major, minor, sectionCount, _flags, dirOffset) <- decodePortableHeader bs - when (major /= bundleMajorVersion) $ - Left $ "unsupported bundle major version: " ++ show major - let dirStart = fromIntegral dirOffset - dirBytes = fromIntegral sectionCount * sectionEntryLength - when (BS.length bs < dirStart + dirBytes) $ - Left "bundle truncated in section directory" - let dirRaw = BS.take dirBytes $ BS.drop dirStart bs - entries <- decodeSectionEntries sectionCount dirRaw - traverse_ rejectUnknownCritical entries - manifestEntry <- requireSection sectionManifest entries - nodesEntry <- requireSection sectionNodes entries - manifestBytes <- readAndVerifySection bs manifestEntry - nodesBytes <- readAndVerifySection bs nodesEntry - manifest <- decodeManifest manifestBytes - nodes <- decodeNodeSection nodesBytes - let roots = map rootHash (manifestRoots manifest) - return Bundle - { bundleVersion = major * 1000 + minor - , bundleRoots = roots - , bundleNodes = nodes - , bundleManifest = manifest - , bundleManifestBytes = manifestBytes - } - -rejectUnknownCritical :: SectionEntry 
-> Either String () -rejectUnknownCritical entry = - let known = seType entry `elem` [sectionManifest, sectionNodes] - critical = seFlags entry .&. flagCritical /= 0 - in when (critical && not known) $ - Left $ "unknown critical section type: " ++ show (seType entry) - -requireSection :: Word32 -> [SectionEntry] -> Either String SectionEntry -requireSection sectionType entries = - case filter ((== sectionType) . seType) entries of - [entry] -> Right entry - [] -> Left $ "missing required section type: " ++ show sectionType - _ -> Left $ "duplicate section type: " ++ show sectionType - -readAndVerifySection :: ByteString -> SectionEntry -> Either String ByteString -readAndVerifySection bs entry = do - when (seCompression entry /= compressionNone) $ - Left $ "unsupported compression codec in section " ++ show (seType entry) - when (seDigestAlgorithm entry /= digestSha256) $ - Left $ "unsupported digest algorithm in section " ++ show (seType entry) - let offset = fromIntegral (seOffset entry) - len = fromIntegral (seLength entry) - when (offset < 0 || len < 0 || BS.length bs < offset + len) $ - Left $ "section extends beyond bundle end: " ++ show (seType entry) - let sectionBytes = BS.take len $ BS.drop offset bs - when (sha256 sectionBytes /= seDigest entry) $ - Left $ "section digest mismatch: " ++ show (seType entry) - Right sectionBytes + <> encode32 0 -- reserved padding decodePortableHeader :: ByteString -> Either String (Word16, Word16, Word32, Word64, Word64) decodePortableHeader bs @@ -591,100 +595,102 @@ decodeSectionEntries count bytes = reverse <$> go count bytes [] (sectionVersion, r2) <- decode16be "section_version" r1 (sectionFlags, r3) <- decode16be "section_flags" r2 (compression, r4) <- decode16be "compression_codec" r3 - (digAlg, r5) <- decode16be "digest_algorithm" r4 + (_reserved, r5) <- decode16be "reserved" r4 (offset, r6) <- decode64be "section_offset" r5 (len, r7) <- decode64be "section_length" r6 - let (dig, rest) = BS.splitAt 32 r7 - when 
(BS.length dig /= 32) $ Left "section digest truncated" - let entry = SectionEntry sectionType sectionVersion sectionFlags compression digAlg offset len dig + (_reserved2, rest) <- decode32be "reserved" r7 + let entry = SectionEntry sectionType sectionVersion sectionFlags compression offset len go (n - 1) rest (entry : acc) --- --------------------------------------------------------------------------- --- Manifest construction --- --------------------------------------------------------------------------- +rejectUnknownCritical :: SectionEntry -> Either String () +rejectUnknownCritical entry = + let known = seType entry `elem` [sectionManifest, sectionNodes] + critical = seFlags entry .&. flagCritical /= 0 + in when (critical && not known) $ + Left $ "unknown critical section type: " ++ show (seType entry) -defaultManifest :: [(Text, MerkleHash)] -> BundleManifest -defaultManifest namedRoots = BundleManifest - { manifestSchema = "arboricx.bundle.manifest.v1" - , manifestBundleType = "tree-calculus-executable-object" - , manifestTree = TreeSpec - { treeCalculus = "tree-calculus.v1" - , treeNodeHash = NodeHashSpec - { nodeHashAlgorithm = "sha256" - , nodeHashDomain = "arboricx.merkle.node.v1" - } - , treeNodePayload = "arboricx.merkle.payload.v1" - } - , manifestRuntime = RuntimeSpec - { runtimeSemantics = "tree-calculus.v1" - , runtimeEvaluation = "normal-order" - , runtimeAbi = "arboricx.abi.tree.v1" - , runtimeCapabilities = [] - } - , manifestClosure = ClosureComplete - , manifestRoots = zipWith mkRoot [0 :: Int ..] 
(map snd namedRoots) - , manifestExports = map mkExport namedRoots - , manifestMetadata = BundleMetadata - { metadataPackage = Nothing - , metadataVersion = Nothing - , metadataDescription = Nothing - , metadataLicense = Nothing - , metadataCreatedBy = Just "arboricx" - } - } - where - mkRoot 0 h = BundleRoot h "default" - mkRoot _ h = BundleRoot h "root" - mkExport (name, h) = BundleExport - { exportName = name - , exportRoot = h - , exportKind = "term" - , exportAbi = "arboricx.abi.tree.v1" - } +requireSection :: Word32 -> [SectionEntry] -> Either String SectionEntry +requireSection sectionType entries = + case filter ((== sectionType) . seType) entries of + [entry] -> Right entry + [] -> Left $ "missing required section type: " ++ show sectionType + _ -> Left $ "duplicate section type: " ++ show sectionType + +readAndVerifySection :: ByteString -> SectionEntry -> Either String ByteString +readAndVerifySection bs entry = do + when (seCompression entry /= compressionNone) $ + Left $ "unsupported compression codec in section " ++ show (seType entry) + let offset = fromIntegral (seOffset entry) + len = fromIntegral (seLength entry) + when (offset < 0 || len < 0 || BS.length bs < offset + len) $ + Left $ "section extends beyond bundle end: " ++ show (seType entry) + Right $ BS.take len $ BS.drop offset bs -- --------------------------------------------------------------------------- -- Node section encoding / decoding -- --------------------------------------------------------------------------- -encodeNodeSection :: Map MerkleHash ByteString -> ByteString +serializeBundleNode :: BundleNode -> ByteString +serializeBundleNode BNLeaf = BS.pack [0x00] +serializeBundleNode (BNStem child) = BS.pack [0x01] <> encode32 child +serializeBundleNode (BNFork left right) = BS.pack [0x02] <> encode32 left <> encode32 right + +encodeNodeSection :: Seq BundleNode -> ByteString encodeNodeSection nodes = - encode64 (fromIntegral $ Map.size nodes) - <> mconcat (map nodeEntryToBinary $ 
Map.toAscList nodes) + encode64 (fromIntegral $ Seq.length nodes) + <> foldMap encodeNodeEntry nodes + where + encodeNodeEntry node = + let payload = serializeBundleNode node + in encode32 (fromIntegral $ BS.length payload) <> payload --- | Encode a single (hash, canonical-payload) node entry. -nodeEntryToBinary :: (MerkleHash, ByteString) -> ByteString -nodeEntryToBinary (h, payload) = - merkleHashToRaw h - <> encode32 (fromIntegral $ BS.length payload) - <> payload - -decodeNodeSection :: ByteString -> Either String (Map MerkleHash ByteString) +decodeNodeSection :: ByteString -> Either String (Seq BundleNode) decodeNodeSection bs = do (nodeCount, rest) <- decode64be "node_count" bs decodeNodeEntries nodeCount rest --- | Decode a sequence of node entries. -decodeNodeEntries :: Word64 -> ByteString -> Either String (Map MerkleHash ByteString) -decodeNodeEntries count bs = go count bs Map.empty +decodeNodeEntries :: Word64 -> ByteString -> Either String (Seq BundleNode) +decodeNodeEntries count bs = go count bs Seq.empty where go 0 rest acc | BS.null rest = Right acc | otherwise = Left "trailing bytes after node section" go n bytes acc - | BS.length bytes < 36 = - Left "not enough bytes for node entry header (hash + length)" + | BS.length bytes < 4 = + Left "not enough bytes for node entry length" | otherwise = do - let (hashBytes, rest) = BS.splitAt 32 bytes - (plen, rest') <- decode32be "payload_len" rest + (plen, rest) <- decode32be "payload_len" bytes let payloadLen = fromIntegral plen - if BS.length rest' < payloadLen + if BS.length rest < payloadLen then Left "payload extends beyond node section end" else do - let (payload, after) = BS.splitAt payloadLen rest' - h = rawToMerkleHash hashBytes - when (Map.member h acc) $ - Left $ "duplicate node entry: " ++ unpack h - go (n - 1) after (Map.insert h payload acc) + let (payload, after) = BS.splitAt payloadLen rest + node <- deserializeBundleNode payload + go (n - 1) after (acc |> node) + +deserializeBundleNode :: 
ByteString -> Either String BundleNode +deserializeBundleNode payload = + case BS.uncons payload of + Just (0x00, rest) + | BS.null rest -> Right BNLeaf + | otherwise -> Left "invalid leaf payload length" + Just (0x01, rest) + | BS.length rest == 4 -> Right $ BNStem (decodeU32 rest) + | otherwise -> Left "invalid stem payload length" + Just (0x02, rest) + | BS.length rest == 8 -> + let (leftBytes, rightBytes) = BS.splitAt 4 rest + in Right $ BNFork (decodeU32 leftBytes) (decodeU32 rightBytes) + | otherwise -> Left "invalid fork payload length" + _ -> Left "invalid node payload" + +decodeU32 :: ByteString -> Word32 +decodeU32 bs = + let b0 = fromIntegral (BS.index bs 0) :: Word32 + b1 = fromIntegral (BS.index bs 1) :: Word32 + b2 = fromIntegral (BS.index bs 2) :: Word32 + b3 = fromIntegral (BS.index bs 3) :: Word32 + in (b0 `shiftL` 24) .|. (b1 `shiftL` 16) .|. (b2 `shiftL` 8) .|. b3 -- --------------------------------------------------------------------------- -- Bundle verification @@ -693,27 +699,47 @@ decodeNodeEntries count bs = go count bs Map.empty verifyBundle :: Bundle -> Either String () verifyBundle bundle | bundleVersion bundle < 1 = Left $ "unsupported bundle version: " ++ show (bundleVersion bundle) - | Map.null (bundleNodes bundle) = Left "bundle has no nodes" + | Seq.null (bundleNodes bundle) = Left "bundle has no nodes" verifyBundle bundle = do - verifyManifest (bundleManifest bundle) - let nodeMap = bundleNodes bundle - rootSet = Set.fromList (bundleRoots bundle) - manifestRootSet = Set.fromList (map rootHash $ manifestRoots $ bundleManifest bundle) - exportRoots = map exportRoot $ manifestExports $ bundleManifest bundle - unless (rootSet == manifestRootSet) $ - Left "bundle root list does not match manifest roots" - traverse_ (requirePresent "root hash missing from bundle") (bundleRoots bundle) - traverse_ (requirePresent "export root hash missing from bundle") exportRoots - decoded <- traverse verifyNodePayload (Map.toList nodeMap) - traverse_ 
(verifyChildrenPresent nodeMap) decoded - verifyCompleteClosure nodeMap (bundleRoots bundle) - where - requirePresent label h = - unless (Map.member h (bundleNodes bundle)) $ - Left $ label ++ ": " ++ unpack h + verifyManifestConstraints (bundleManifest bundle) + let nodeCount = fromIntegral $ Seq.length (bundleNodes bundle) + traverse_ (\idx -> when (idx >= nodeCount) $ Left $ "root index out of bounds: " ++ show idx) + (bundleRoots bundle) + traverse_ (\exp -> when (exportRoot exp >= nodeCount) $ Left $ "export index out of bounds: " ++ show (exportRoot exp)) + (manifestExports $ bundleManifest bundle) -verifyManifest :: BundleManifest -> Either String () -verifyManifest manifest = do + let verifyNode i node = case node of + BNLeaf -> Right () + BNStem child -> do + when (child >= i) $ Left $ "stem at index " ++ show i ++ " references child " ++ show child + when (child >= nodeCount) $ Left $ "stem at index " ++ show i ++ " references child out of bounds" + Right () + BNFork left right -> do + when (left >= i) $ Left $ "fork at index " ++ show i ++ " references left " ++ show left + when (right >= i) $ Left $ "fork at index " ++ show i ++ " references right " ++ show right + when (left >= nodeCount) $ Left $ "fork at index " ++ show i ++ " references left out of bounds" + when (right >= nodeCount) $ Left $ "fork at index " ++ show i ++ " references right out of bounds" + Right () + + mapM_ (\i -> case Seq.lookup (fromIntegral i) (bundleNodes bundle) of + Nothing -> Left $ "internal error: node " ++ show i ++ " not found" + Just node -> verifyNode i node) [0 :: Word32 .. 
nodeCount - 1] + + let dupCheck = foldM (\seen (i, node) -> case node of + BNLeaf -> if Set.member (0 :: Word8, 0 :: Word32, 0 :: Word32) seen + then Left $ "duplicate leaf at index " ++ show i + else Right $ Set.insert (0, 0, 0) seen + BNStem child -> if Set.member (1, child, 0) seen + then Left $ "duplicate stem at index " ++ show i + else Right $ Set.insert (1, child, 0) seen + BNFork left right -> if Set.member (2, left, right) seen + then Left $ "duplicate fork at index " ++ show i + else Right $ Set.insert (2, left, right) seen) Set.empty (zip [0 :: Word32 ..] (Foldable.toList $ bundleNodes bundle)) + _ <- dupCheck + Right () + +verifyManifestConstraints :: BundleManifest -> Either String () +verifyManifestConstraints manifest = do when (manifestSchema manifest /= "arboricx.bundle.manifest.v1") $ Left $ "unsupported manifest schema: " ++ unpack (manifestSchema manifest) when (manifestBundleType manifest /= "tree-calculus-executable-object") $ @@ -723,11 +749,11 @@ verifyManifest manifest = do runtimeSpec = manifestRuntime manifest when (treeCalculus treeSpec /= "tree-calculus.v1") $ Left $ "unsupported calculus: " ++ unpack (treeCalculus treeSpec) - when (nodeHashAlgorithm hashSpec /= "sha256") $ + when (nodeHashAlgorithm hashSpec /= "indexed") $ Left $ "unsupported node hash algorithm: " ++ unpack (nodeHashAlgorithm hashSpec) - when (nodeHashDomain hashSpec /= "arboricx.merkle.node.v1") $ + when (nodeHashDomain hashSpec /= "arboricx.indexed.node.v1") $ Left $ "unsupported node hash domain: " ++ unpack (nodeHashDomain hashSpec) - when (treeNodePayload treeSpec /= "arboricx.merkle.payload.v1") $ + when (treeNodePayload treeSpec /= "arboricx.indexed.payload.v1") $ Left $ "unsupported node payload: " ++ unpack (treeNodePayload treeSpec) when (runtimeSemantics runtimeSpec /= "tree-calculus.v1") $ Left $ "unsupported runtime semantics: " ++ unpack (runtimeSemantics runtimeSpec) @@ -736,7 +762,7 @@ verifyManifest manifest = do when (not (null (runtimeCapabilities 
runtimeSpec))) $ Left "unsupported runtime capabilities" when (manifestClosure manifest /= ClosureComplete) $ - Left "bundle v1 requires closure = complete" + Left "bundle requires closure = complete" when (null $ manifestRoots manifest) $ Left "manifest has no roots" when (null $ manifestExports manifest) $ @@ -746,133 +772,38 @@ verifyManifest manifest = do verifyExport exported = do when (T.null $ exportName exported) $ Left "manifest export has empty name" - when (T.null $ exportRoot exported) $ - Left "manifest export has empty root" - -verifyNodePayload :: (MerkleHash, ByteString) -> Either String (MerkleHash, Node) -verifyNodePayload (h, payload) = do - node <- safeDeserializeNode payload - let actual = nodeHash node - unless (actual == h) $ - Left $ "node hash mismatch for " ++ unpack h ++ "; payload hashes to " ++ unpack actual - Right (h, node) - -verifyChildrenPresent :: Map MerkleHash ByteString -> (MerkleHash, Node) -> Either String () -verifyChildrenPresent nodeMap (h, node) = - case node of - NLeaf -> Right () - NStem child -> requireChild h child - NFork left right -> requireChild h left >> requireChild h right - where - requireChild parent child = - unless (Map.member child nodeMap) $ - Left $ "missing child node referenced by " ++ unpack parent ++ ": " ++ unpack child - -verifyCompleteClosure :: Map MerkleHash ByteString -> [MerkleHash] -> Either String () -verifyCompleteClosure nodeMap roots = do - _ <- foldM visit Set.empty roots - Right () - where - visit seen h - | Set.member h seen = Right seen - | otherwise = do - payload <- case Map.lookup h nodeMap of - Nothing -> Left $ "closure missing node: " ++ unpack h - Just p -> Right p - node <- safeDeserializeNode payload - let seen' = Set.insert h seen - case node of - NLeaf -> Right seen' - NStem child -> visit seen' child - NFork left right -> visit seen' left >>= \seenL -> visit seenL right - -safeDeserializeNode :: ByteString -> Either String Node -safeDeserializeNode payload = - case 
BS.uncons payload of - Just (0x00, rest) - | BS.null rest -> Right NLeaf - | otherwise -> Left "invalid leaf payload length" - Just (0x01, rest) - | BS.length rest == 32 -> Right $ NStem (rawToMerkleHash rest) - | otherwise -> Left "invalid stem payload length" - Just (0x02, rest) - | BS.length rest == 64 -> - let (left, right) = BS.splitAt 32 rest - in Right $ NFork (rawToMerkleHash left) (rawToMerkleHash right) - | otherwise -> Left "invalid fork payload length" - _ -> Left "invalid merkle node payload" -- --------------------------------------------------------------------------- --- Reachability traversal +-- Import into content store -- --------------------------------------------------------------------------- -collectReachableNodes :: Connection -> MerkleHash -> IO [(MerkleHash, ByteString)] -collectReachableNodes conn root = do - let go seen current = do - case Map.lookup current seen of - Just _ -> return seen - Nothing -> do - maybeNode <- getNodeMerkle conn current - case maybeNode of - Nothing -> error $ "exportBundle: missing Merkle node: " ++ unpack current - Just node -> do - let payload = serializeNode node - seen' = Map.insert current payload seen - case node of - NLeaf -> return seen' - NStem childHash -> go seen' childHash - NFork lHash rHash -> go seen' lHash >>= \seenL -> go seenL rHash - seen <- go Map.empty root - return $ Map.toAscList seen +reconstructTerms :: Seq BundleNode -> Vector T +reconstructTerms nodes = V.create $ do + let n = Seq.length nodes + vec <- MV.new n + forM_ (zip [0 :: Int ..] 
(Foldable.toList nodes)) $ \(i, node) -> do + t <- case node of + BNLeaf -> return Leaf + BNStem child -> Stem <$> MV.read vec (fromIntegral child) + BNFork left right -> do + l <- MV.read vec (fromIntegral left) + r <- MV.read vec (fromIntegral right) + return $ Fork l r + MV.write vec i t + return vec --- --------------------------------------------------------------------------- --- High-level export / import --- --------------------------------------------------------------------------- - -exportBundle :: Connection -> [MerkleHash] -> IO ByteString -exportBundle conn hashes = exportNamedBundle conn (zip (defaultExportNames $ length hashes) hashes) - -exportNamedBundle :: Connection -> [(Text, MerkleHash)] -> IO ByteString -exportNamedBundle conn namedHashes = do - let hashes = map snd namedHashes - entries <- concat <$> mapM (collectReachableNodes conn) hashes - let nodeMap = Map.fromList entries - manifest = defaultManifest namedHashes - manifestBytes = encodeManifest manifest - bundle = Bundle - { bundleVersion = bundleMajorVersion * 1000 + bundleMinorVersion - , bundleRoots = hashes - , bundleNodes = nodeMap - , bundleManifest = manifest - , bundleManifestBytes = manifestBytes - } - return $ encodeBundle bundle - -importBundle :: Connection -> ByteString -> IO [MerkleHash] +importBundle :: Connection -> ByteString -> IO [Text] importBundle conn bs = case decodeBundle bs of Left err -> error $ "Wire.importBundle: " ++ err Right bundle -> case verifyBundle bundle of Left err -> error $ "Wire.importBundle verify: " ++ err Right () -> do - traverse_ (\payload -> do - node <- deserializeForImport payload - putMerkleNode conn node - ) - (Map.elems $ bundleNodes bundle) - registerBundleExports conn bundle - return $ bundleRoots bundle - -registerBundleExports :: Connection -> Bundle -> IO () -registerBundleExports conn bundle = - traverse_ registerExport (manifestExports $ bundleManifest bundle) - where - registerExport exported = do - maybeTree <- loadTree conn 
(exportRoot exported) - case maybeTree of - Nothing -> error $ "Wire.importBundle: export root missing after node import: " ++ unpack (exportRoot exported) - Just tree -> do - _ <- storeTerm conn [unpack $ exportName exported] tree - return () + let terms = reconstructTerms (bundleNodes bundle) + forM_ (manifestExports $ bundleManifest bundle) $ \exp -> do + let term = terms V.! fromIntegral (exportRoot exp) + _ <- storeTerm conn [T.unpack $ exportName exp] term + return () + return $ map exportName $ manifestExports $ bundleManifest bundle -- --------------------------------------------------------------------------- -- Primitive binary helpers @@ -912,7 +843,6 @@ decode16be label bs b1 = fromIntegral (BS.index bs 1) :: Word16 in Right ((b0 `shiftL` 8) .|. b1, BS.drop 2 bs) --- | Decode a big-endian u32 from the head of a ByteString. decode32be :: String -> ByteString -> Either String (Word32, ByteString) decode32be label bs | BS.length bs < 4 = Left (label ++ ": not enough bytes for u32") @@ -921,53 +851,30 @@ decode32be label bs b1 = fromIntegral (BS.index bs 1) :: Word32 b2 = fromIntegral (BS.index bs 2) :: Word32 b3 = fromIntegral (BS.index bs 3) :: Word32 - val = (b0 `shiftL` 24) .|. (b1 `shiftL` 16) - .|. (b2 `shiftL` 8) .|. b3 - in Right (val, BS.drop 4 bs) + in Right ((b0 `shiftL` 24) .|. (b1 `shiftL` 16) .|. (b2 `shiftL` 8) .|. b3, BS.drop 4 bs) decode64be :: String -> ByteString -> Either String (Word64, ByteString) decode64be label bs | BS.length bs < 8 = Left (label ++ ": not enough bytes for u64") | otherwise = - let byte i = fromIntegral (BS.index bs i) :: Word64 - val = (byte 0 `shiftL` 56) .|. (byte 1 `shiftL` 48) - .|. (byte 2 `shiftL` 40) .|. (byte 3 `shiftL` 32) - .|. (byte 4 `shiftL` 24) .|. (byte 5 `shiftL` 16) - .|. (byte 6 `shiftL` 8) .|. 
byte 7 - in Right (val, BS.drop 8 bs) + let b0 = fromIntegral (BS.index bs 0) :: Word64 + b1 = fromIntegral (BS.index bs 1) :: Word64 + b2 = fromIntegral (BS.index bs 2) :: Word64 + b3 = fromIntegral (BS.index bs 3) :: Word64 + b4 = fromIntegral (BS.index bs 4) :: Word64 + b5 = fromIntegral (BS.index bs 5) :: Word64 + b6 = fromIntegral (BS.index bs 6) :: Word64 + b7 = fromIntegral (BS.index bs 7) :: Word64 + in Right ((b0 `shiftL` 56) .|. (b1 `shiftL` 48) .|. (b2 `shiftL` 40) .|. (b3 `shiftL` 32) + .|. (b4 `shiftL` 24) .|. (b5 `shiftL` 16) .|. (b6 `shiftL` 8) .|. b7, BS.drop 8 bs) -- --------------------------------------------------------------------------- --- Hash conversion +-- Helpers -- --------------------------------------------------------------------------- --- | Convert a hex MerkleHash to its raw 32-byte representation. -merkleHashToRaw :: MerkleHash -> ByteString -merkleHashToRaw h = - case Base16.decode (encodeUtf8 h) of - Left _ -> error $ "Wire.merkleHashToRaw: invalid hex: " ++ show h - Right bs - | BS.length bs == 32 -> bs - | otherwise -> error $ "Wire.merkleHashToRaw: expected 32 bytes: " ++ show h - --- | Convert raw 32 bytes back to a hex MerkleHash. -rawToMerkleHash :: ByteString -> MerkleHash -rawToMerkleHash bs = decodeUtf8 (Base16.encode bs) - -sha256 :: ByteString -> ByteString -sha256 bytes = convert ((hash bytes) :: Digest SHA256) - - - defaultExportNames :: Int -> [Text] defaultExportNames n = case n of 0 -> [] 1 -> ["root"] _ -> ["root" <> T.pack (show i) | i <- [0 :: Int .. 
n - 1]] - -deserializeForImport :: ByteString -> IO Node -deserializeForImport payload = do - result <- try (evaluate $ deserializeNode payload) :: IO (Either SomeException Node) - case result of - Left err -> error $ "Wire.importBundle: invalid merkle node payload: " ++ show err - Right node -> return node diff --git a/test/Spec.hs b/test/Spec.hs index d8ea3d5..8bf5cfc 100644 --- a/test/Spec.hs +++ b/test/Spec.hs @@ -10,6 +10,7 @@ import Wire import ContentStore import Control.Exception (evaluate, try, SomeException) +import Control.Monad (forM_) import Control.Monad.IO.Class (liftIO) import Data.Bits (xor) import Data.Char (digitToInt) @@ -23,6 +24,7 @@ import Text.Megaparsec (runParser) import Data.ByteString (ByteString) import qualified Data.ByteString as BS import qualified Data.Map as Map +import qualified Data.Sequence as Seq import qualified Data.Set as Set import Database.SQLite.Simple (close, Connection) @@ -47,9 +49,8 @@ tests = testGroup "Tricu Tests" , stressElimLambda , byteMarshallingTests , wireTests + , tricuReaderTests , byteListUtilities - , binaryReaderTests - , manifestReadingTests ] lexer :: TestTree @@ -756,275 +757,199 @@ byteMarshallingTests = testGroup "Byte Marshalling Tests" -- -------------------------------------------------------------------------- -- | Helper: create a temporary file-backed DB, store a term, return the --- connection and the term (so callers can compare after round-trip). -storeTermInTempDB :: String -> IO (Connection, Text, T) -storeTermInTempDB src = do - conn <- newContentStore - let asts = parseTricu src - finalEnv = evalTricu Map.empty asts - term = result finalEnv - -- storeMerkleNodes returns MerkleHash as Text; storeTerm expects [String] - _ <- storeTerm conn [] term - return (conn, hashTerm term, term) - --- | Load a term from a DB by its stored hash Text. 
-loadTermByHash :: Connection -> Text -> IO T -loadTermByHash conn h = do - maybeTerm <- loadTree conn h - case maybeTerm of - Just t -> return t - Nothing -> errorWithoutStackTrace $ "hash not found in store: " ++ Data.Text.unpack h - --- | Flip one byte in a ByteString at the given index. -corruptByte :: ByteString -> Int -> ByteString -corruptByte bs i = BS.take i bs <> BS.pack [(BS.index bs i `xor` 0x01)] <> BS.drop (i + 1) bs wireTests :: TestTree wireTests = testGroup "Wire Tests" - [ testCase "Portable bundle: header and manifest declare Tree Calculus object format" $ do - (srcConn, termHash, _) <- storeTermInTempDB $ unlines - [ "id = a : a" - , "main = id t" - ] - wireData <- exportBundle srcConn [termHash] + [ testCase "Indexed bundle: header and manifest declare indexed format" $ do + let term = result $ evalTricu Map.empty $ parseTricu "id = a : a\nmain = id t" + bundle = buildBundle [("main", term)] + wireData = encodeBundle bundle BS.take 8 wireData @?= BS.pack [0x41, 0x52, 0x42, 0x4f, 0x52, 0x49, 0x43, 0x58] case decodeBundle wireData of Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right bundle -> do - let manifest = bundleManifest bundle + Right decoded -> do + let manifest = bundleManifest decoded tree = manifestTree manifest hashSpec = treeNodeHash tree - runtime = manifestRuntime manifest manifestSchema manifest @?= "arboricx.bundle.manifest.v1" manifestBundleType manifest @?= "tree-calculus-executable-object" manifestClosure manifest @?= ClosureComplete treeCalculus tree @?= "tree-calculus.v1" - treeNodePayload tree @?= "arboricx.merkle.payload.v1" - nodeHashAlgorithm hashSpec @?= "sha256" - nodeHashDomain hashSpec @?= "arboricx.merkle.node.v1" - runtimeSemantics runtime @?= "tree-calculus.v1" - runtimeAbi runtime @?= "arboricx.abi.tree.v1" - runtimeCapabilities runtime @?= [] - bundleRoots bundle @?= [termHash] - map exportRoot (manifestExports manifest) @?= [termHash] - close srcConn - - , testCase "Portable bundle: named 
exports are manifest aliases for Merkle roots" $ do - (srcConn, termHash, _) <- storeTermInTempDB $ unlines - [ "validateEmail = a : a" - , "main = validateEmail t" - ] - wireData <- exportNamedBundle srcConn [("validateEmail", termHash)] - case decodeBundle wireData of - Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right bundle -> do - bundleRoots bundle @?= [termHash] - case manifestExports (bundleManifest bundle) of + treeNodePayload tree @?= "arboricx.indexed.payload.v1" + nodeHashAlgorithm hashSpec @?= "indexed" + nodeHashDomain hashSpec @?= "arboricx.indexed.node.v1" + bundleRoots decoded @?= bundleRoots bundle + case manifestExports manifest of [exported] -> do - exportName exported @?= "validateEmail" - exportRoot exported @?= termHash + exportName exported @?= "main" + exportRoot exported @?= head (bundleRoots bundle) exportKind exported @?= "term" exportAbi exported @?= "arboricx.abi.tree.v1" exports -> assertFailure $ "Expected one export, got: " ++ show exports - close srcConn - , testCase "Portable bundle: renaming an export changes bundle bytes but not tree identity" $ do - (srcConn, termHash, _) <- storeTermInTempDB $ unlines - [ "f = a : a" - , "main = f t" - ] - mainBundleData <- exportNamedBundle srcConn [("main", termHash)] - renamedBundleData <- exportNamedBundle srcConn [("validate", termHash)] - assertBool "Renaming an export should change the manifest/bundle bytes" - (mainBundleData /= renamedBundleData) - case (decodeBundle mainBundleData, decodeBundle renamedBundleData) of - (Right mainBundle, Right renamedBundle) -> do - bundleRoots mainBundle @?= [termHash] - bundleRoots renamedBundle @?= [termHash] - map exportRoot (manifestExports $ bundleManifest mainBundle) - @?= map exportRoot (manifestExports $ bundleManifest renamedBundle) - map exportName (manifestExports $ bundleManifest mainBundle) @?= ["main"] - map exportName (manifestExports $ bundleManifest renamedBundle) @?= ["validate"] - (Left err, _) -> assertFailure $ 
"decodeBundle main failed: " ++ err - (_, Left err) -> assertFailure $ "decodeBundle renamed failed: " ++ err - close srcConn + , testCase "Indexed bundle: deterministic encoding" $ do + let term = result $ evalTricu Map.empty $ parseTricu "x = t t\nmain = t x" + bundle1 = buildBundle [("main", term)] + bundle2 = buildBundle [("main", term)] + encodeBundle bundle1 @?= encodeBundle bundle2 - , testCase "Portable bundle: exact byte export is deterministic" $ do - (srcConn, termHash, _) <- storeTermInTempDB $ unlines - [ "x = t t" - , "main = t x" - ] - first <- exportBundle srcConn [termHash] - second <- exportBundle srcConn [termHash] - first @?= second - close srcConn + , testCase "Indexed bundle: renaming export changes bytes" $ do + let term = result $ evalTricu Map.empty $ parseTricu "f = a : a\nmain = f t" + mainBundle = buildBundle [("main", term)] + renamedBundle = buildBundle [("validate", term)] + encodeBundle mainBundle /= encodeBundle renamedBundle @? "different export names should produce different bytes" + -- But nodes are identical + bundleNodes mainBundle @?= bundleNodes renamedBundle - , testCase "Portable bundle: raw section tampering is rejected by digest verification" $ do - (srcConn, termHash, _) <- storeTermInTempDB $ unlines - [ "x = t" - , "main = t x" - ] - wireData <- exportBundle srcConn [termHash] - let tampered = corruptByte wireData (BS.length wireData - 1) - case decodeBundle tampered of - Left err -> assertBool ("Expected section digest mismatch, got: " ++ err) - ("digest mismatch" `isInfixOf` err) - Right _ -> assertFailure "Expected decodeBundle to reject tampered section bytes" - close srcConn + , testCase "Indexed bundle: verify rejects out-of-bounds root" $ do + let term = Leaf + bundle = buildBundle [("main", term)] + badBundle = bundle { bundleRoots = [99] } + case verifyBundle badBundle of + Left err -> assertBool ("Expected bounds error, got: " ++ err) ("out of bounds" `isInfixOf` err) + Right () -> assertFailure "Expected 
out-of-bounds root to be rejected" - , testCase "Portable bundle: unsupported manifest semantics are rejected" $ do - (srcConn, termHash, _) <- storeTermInTempDB $ unlines - [ "x = t" - , "main = t x" - ] - wireData <- exportBundle srcConn [termHash] + , testCase "Indexed bundle: verify rejects out-of-bounds child index" $ do + let bundle = Bundle + { bundleVersion = 1000 + , bundleRoots = [1] + , bundleNodes = Seq.fromList [BNLeaf, BNStem 99] + , bundleManifest = (bundleManifest $ buildBundle [("main", Leaf)]) + { manifestRoots = [BundleRoot 1 "default"] + , manifestExports = [BundleExport "main" 1 "term" "arboricx.abi.tree.v1"] + } + , bundleManifestBytes = BS.empty + } + case verifyBundle bundle of + Left err -> assertBool ("Expected bounds error, got: " ++ err) ("references child 99" `isInfixOf` err) + Right () -> assertFailure "Expected out-of-bounds child to be rejected" + + , testCase "Indexed bundle: verify rejects acyclic (forward reference)" $ do + let bundle = Bundle + { bundleVersion = 1000 + , bundleRoots = [1] + , bundleNodes = Seq.fromList [BNStem 1, BNLeaf] -- index 0 refers to 1 (forward) + , bundleManifest = (bundleManifest $ buildBundle [("main", Leaf)]) + { manifestRoots = [BundleRoot 1 "default"] + , manifestExports = [BundleExport "main" 1 "term" "arboricx.abi.tree.v1"] + } + , bundleManifestBytes = BS.empty + } + case verifyBundle bundle of + Left err -> assertBool ("Expected acyclicity error, got: " ++ err) ("references child 1" `isInfixOf` err) + Right () -> assertFailure "Expected forward reference to be rejected" + + , testCase "Indexed bundle: verify rejects duplicate nodes" $ do + let bundle = Bundle + { bundleVersion = 1000 + , bundleRoots = [0] + , bundleNodes = Seq.fromList [BNLeaf, BNLeaf] + , bundleManifest = (bundleManifest $ buildBundle [("main", Leaf)]) + { manifestRoots = [BundleRoot 0 "default"] + , manifestExports = [BundleExport "main" 0 "term" "arboricx.abi.tree.v1"] + } + , bundleManifestBytes = BS.empty + } + case 
verifyBundle bundle of + Left err -> assertBool ("Expected duplicate error, got: " ++ err) ("duplicate" `isInfixOf` err) + Right () -> assertFailure "Expected duplicate nodes to be rejected" + + , testCase "Indexed bundle: import into content store" $ do + let term = result $ evalTricu Map.empty $ parseTricu "validateEmail = a : a\nmain = validateEmail t" + bundle = buildBundle [("validateEmail", term)] + wireData = encodeBundle bundle + dstConn <- newContentStore + roots <- importBundle dstConn wireData + roots @?= ["validateEmail"] + loaded <- loadTerm dstConn "validateEmail" + loaded @?= Just term + close dstConn + + , testCase "Indexed bundle: round-trip decode and verify" $ do + let term = result $ evalTricu Map.empty $ parseTricu "x = t\ny = t x\nz = t y\nmain = z" + bundle = buildBundle [("main", term)] + wireData = encodeBundle bundle case decodeBundle wireData of Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right bundle -> do - let manifest = bundleManifest bundle - partialBundle = bundle - { bundleManifest = manifest { manifestClosure = ClosurePartial } - , bundleManifestBytes = BS.empty + Right decoded -> case verifyBundle decoded of + Left err -> assertFailure $ "verifyBundle failed: " ++ err + Right () -> do + bundleRoots decoded @?= bundleRoots bundle + Seq.length (bundleNodes decoded) @?= Seq.length (bundleNodes bundle) + + , testCase "Indexed bundle: unsupported manifest semantics rejected" $ do + let term = Leaf + bundle = buildBundle [("main", term)] + manifest = bundleManifest bundle + partialBundle = bundle + { bundleManifest = manifest { manifestClosure = ClosurePartial } + , bundleManifestBytes = BS.empty + } + capabilityBundle = bundle + { bundleManifest = manifest + { manifestRuntime = (manifestRuntime manifest) + { runtimeCapabilities = ["host.io"] } } - capabilityBundle = bundle - { bundleManifest = manifest - { manifestRuntime = (manifestRuntime manifest) - { runtimeCapabilities = ["host.io"] - } + , bundleManifestBytes = 
BS.empty + } + wrongHashBundle = bundle + { bundleManifest = manifest + { manifestTree = (manifestTree manifest) + { treeNodeHash = (treeNodeHash $ manifestTree manifest) + { nodeHashAlgorithm = "blake3" } } - , bundleManifestBytes = BS.empty } - wrongHashBundle = bundle - { bundleManifest = manifest - { manifestTree = (manifestTree manifest) - { treeNodeHash = (treeNodeHash $ manifestTree manifest) - { nodeHashAlgorithm = "blake3" } - } - } - , bundleManifestBytes = BS.empty - } - case verifyBundle partialBundle of - Left err -> assertBool ("Expected closure error, got: " ++ err) ("closure = complete" `isInfixOf` err) - Right () -> assertFailure "Expected partial closure to be rejected" - case verifyBundle capabilityBundle of - Left err -> assertBool ("Expected capability error, got: " ++ err) ("capabilities" `isInfixOf` err) - Right () -> assertFailure "Expected runtime capabilities to be rejected" - case verifyBundle wrongHashBundle of - Left err -> assertBool ("Expected hash algorithm error, got: " ++ err) ("node hash algorithm" `isInfixOf` err) - Right () -> assertFailure "Expected unsupported node hash algorithm to be rejected" - close srcConn - - , testCase "Portable bundle: import registers manifest export names in fresh content store" $ do - (srcConn, termHash, originalTerm) <- storeTermInTempDB $ unlines - [ "validateEmail = a : a" - , "main = validateEmail t" - ] - wireData <- exportNamedBundle srcConn [("validateEmail", termHash)] - dstConn <- newContentStore - _ <- importBundle dstConn wireData - loadedByHash <- loadTermByHash dstConn termHash - loadedByName <- loadTerm dstConn "validateEmail" - loadedByHash @?= originalTerm - loadedByName @?= Just originalTerm - close srcConn - close dstConn - - , testCase "Round-trip: store, export, import, load" $ do - -- Store a term - (srcConn, termHash, originalTerm) <- storeTermInTempDB $ unlines - [ "x = t" - , "y = t x" - , "z = t y" - , "main = z" - ] - -- Export by root hash - wireData <- exportBundle 
srcConn [termHash] - -- Import into a fresh DB - dstConn <- newContentStore - _ <- importBundle dstConn wireData - -- Load the term back and compare - loadedTerm <- loadTermByHash dstConn termHash - loadedTerm @?= originalTerm - -- Cleanup - close srcConn - close dstConn - - , testCase "Round-trip: evaluate from original, export, import, load root" $ do - (srcConn, termHash, originalTerm) <- storeTermInTempDB $ unlines - [ "add = a b : t (t a) b" - , "val = add (t t) (t)" - , "main = val" - ] - -- Export - wireData <- exportBundle srcConn [termHash] - -- Import into fresh DB - dstConn <- newContentStore - _ <- importBundle dstConn wireData - -- Load the root term by hash and compare - loadedTerm <- loadTermByHash dstConn termHash - loadedTerm @?= originalTerm - close srcConn - close dstConn - - , testCase "Negative: corrupt payload byte causes import to fail" $ do - (srcConn, termHash, _) <- storeTermInTempDB $ unlines - [ "x = t" - , "y = t x" - , "z = t y" - , "main = z" - ] - wireData <- exportBundle srcConn [termHash] - -- Decode, mutate one node's payload byte, re-encode - case decodeBundle wireData of - Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right bundle -> do - let (h, payload) = - head - [ (h', p) - | (h', p) <- Map.toList (bundleNodes bundle) - , BS.length p > 0 - ] - payload' = BS.pack [(BS.head payload `xor` 0x01)] <> BS.tail payload - bundle' = bundle { bundleNodes = Map.insert h payload' (bundleNodes bundle) } - wireData' = encodeBundle bundle' - dstConn <- newContentStore - result <- try (importBundle dstConn wireData') :: IO (Either SomeException [MerkleHash]) - case result of - Left e -> - assertBool ("Expected hash mismatch or invalid payload, got: " ++ show e) - $ "mismatch" `isInfixOf` show e || "invalid" `isInfixOf` show e - Right _ -> - assertFailure "Expected import to fail on corrupted payload" - close dstConn - close srcConn - - , testCase "Negative: missing child node causes import to fail" $ do - (srcConn, termHash, 
_) <- storeTermInTempDB $ unlines - [ "x = t" - , "y = t x" - , "z = t y" - , "main = z" - ] - wireData <- exportBundle srcConn [termHash] - -- Decode, remove a node, re-encode - case decodeBundle wireData of - Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right bundle -> do - let nodeList = Map.toList (bundleNodes bundle) - trimmed = Map.fromList (tail nodeList) - newBundle = bundle { bundleNodes = trimmed } - newWire = encodeBundle newBundle - dstConn <- newContentStore - result <- try (importBundle dstConn newWire) :: IO (Either SomeException [MerkleHash]) - case result of - Left e -> - assertBool ("Expected verify error, got: " ++ show e) True - Right _ -> - assertFailure "Expected import to fail on missing child node" - close dstConn - close srcConn + , bundleManifestBytes = BS.empty + } + case verifyBundle partialBundle of + Left err -> assertBool ("Expected closure error, got: " ++ err) ("closure = complete" `isInfixOf` err) + Right () -> assertFailure "Expected partial closure to be rejected" + case verifyBundle capabilityBundle of + Left err -> assertBool ("Expected capability error, got: " ++ err) ("capabilities" `isInfixOf` err) + Right () -> assertFailure "Expected runtime capabilities to be rejected" + case verifyBundle wrongHashBundle of + Left err -> assertBool ("Expected hash algorithm error, got: " ++ err) ("node hash algorithm" `isInfixOf` err) + Right () -> assertFailure "Expected unsupported node hash algorithm to be rejected" ] +-- -------------------------------------------------------------------------- +-- Tricu reader tests +-- Smoke-test the tricu-native Arboricx reader against indexed bundles. 
+-- -------------------------------------------------------------------------- + +tricuReaderTests :: TestTree +tricuReaderTests = testGroup "Tricu Reader Tests" + [ testCase "Tricu reader parses indexed bundle (id fixture)" $ do + bundleBytes <- BS.readFile "./test/fixtures/id.arboricx" + let bundleT = ofBytes bundleBytes + readerEnv <- evaluateFile "./lib/arboricx.tri" + let env = Map.insert "testBundle" bundleT readerEnv + tagExpr = parseTricu "pairFirst (runArboricx testBundle t)" + tag = result (evalTricu env tagExpr) + codeExpr = parseTricu "pairFirst (pairSecond (runArboricx testBundle t))" + code = result (evalTricu env codeExpr) + tag @?= trueT + + , testCase "Tricu reader parses indexed bundle (append fixture)" $ do + bundleBytes <- BS.readFile "./test/fixtures/append.arboricx" + let bundleT = ofBytes bundleBytes + readerEnv <- evaluateFile "./lib/arboricx.tri" + let env = Map.insert "testBundle" bundleT readerEnv + tagExpr = parseTricu "pairFirst (runArboricx testBundle t)" + tag = result (evalTricu env tagExpr) + tag @?= trueT + + , testCase "Tricu reader parses indexed bundle (bool fixtures)" $ do + forM_ ["true", "false"] $ \name -> do + bundleBytes <- BS.readFile ("./test/fixtures/" ++ name ++ ".arboricx") + let bundleT = ofBytes bundleBytes + readerEnv <- evaluateFile "./lib/arboricx.tri" + let env = Map.insert "testBundle" bundleT readerEnv + tagExpr = parseTricu "pairFirst (runArboricx testBundle t)" + tag = result (evalTricu env tagExpr) + tag @?= trueT + ] + -- -------------------------------------------------------------------------- -- Byte-list utility tests -- Expected values built with canonical Haskell-side T constructors. 
@@ -1327,1533 +1252,3 @@ byteListUtilities = testGroup "Byte List Utility Tests" let env = evalTricu library (parseTricu input) result env @?= falseT ] - --- -------------------------------------------------------------------------- --- Binary reader tests (binary.tri) --- -------------------------------------------------------------------------- -okT :: T -> T -> T -okT value rest = pairT trueT (pairT value rest) - -errT :: T -> T -> T -errT code rest = pairT falseT (pairT code rest) - -eofT :: T -eofT = byteT 1 - -unitT :: T -unitT = Leaf - -unexpectedBytesT :: T -unexpectedBytesT = byteT 2 - -unexpectedByteT :: T -unexpectedByteT = byteT 3 - -missingSectionT :: T -missingSectionT = byteT 4 - -unsupportedVersionT :: T -unsupportedVersionT = byteT 5 - -duplicateSectionT :: T -duplicateSectionT = byteT 6 - -duplicateNodeT :: T -duplicateNodeT = byteT 7 - -invalidNodePayloadT :: T -invalidNodePayloadT = byteT 8 - -missingNodeT :: T -missingNodeT = byteT 9 - -binaryReaderTests :: TestTree -binaryReaderTests = testGroup "Binary Reader Tests" - [ testCase "readU8: empty input returns err" $ do - let input = "readU8 []" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT []) - - , testCase "readU8: single byte returns ok" $ do - let input = "readU8 [(7)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (byteT 7) (bytesT []) - - , testCase "readU8: multi-byte returns first byte and rest" $ do - let input = "readU8 [(7) (8)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (byteT 7) (bytesT [8]) - - , testCase "readBytes 0: returns ok with empty bytes and original input" $ do - let input = "readBytes 0 [(1) (2)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (bytesT []) (bytesT [1,2]) - - , 
testCase "readBytes 2: exact read returns ok with taken and rest" $ do - let input = "readBytes 2 [(1) (2) (3)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (bytesT [1,2]) (bytesT [3]) - - , testCase "readBytes 3: exact read with no remainder" $ do - let input = "readBytes 3 [(1) (2) (3)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (bytesT [1,2,3]) (bytesT []) - - , testCase "readBytes 5: overlong read returns err preserving input" $ do - let input = "readBytes 5 [(1) (2)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [1,2]) - - -- ------------------------------------------------------------------------ - -- Binary Result Matcher Tests - -- ------------------------------------------------------------------------ - - , testCase "matchResult: ok branch returns value" $ do - let input = "matchResult (code rest : 0) (value rest : value) (ok 7 [])" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= byteT 7 - - , testCase "matchResult: err branch returns code" $ do - let input = "matchResult (code rest : code) (value rest : 0) (err 1 [])" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= byteT 1 - - , testCase "matchResult: ok branch receives rest" $ do - let input = "matchResult (code rest : []) (value rest : rest) (ok 7 [(8)])" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= bytesT [8] - - , testCase "matchResult: err branch receives rest" $ do - let input = "matchResult (code rest : rest) (value rest : []) (err 1 [(7) (8)])" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= bytesT [7,8] - - , testCase 
"matchResult: transforms readU8 ok result" $ do - let input = "matchResult (code rest : code) (value rest : value) (readU8 [(7) (8)])" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= byteT 7 - - , testCase "matchResult: transforms readU8 err result" $ do - let input = "matchResult (code rest : code) (value rest : value) (readU8 [])" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= byteT 1 - - -- ------------------------------------------------------------------------ - -- Binary expectBytes Tests - -- ------------------------------------------------------------------------ - - , testCase "expectBytes: empty expected matches and preserves input" $ do - let input = "expectBytes [] [(1) (2)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT unitT (bytesT [1,2]) - - , testCase "expectBytes: single byte consumed, rest preserved" $ do - let input = "expectBytes [(1)] [(1) (2)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT unitT (bytesT [2]) - - , testCase "expectBytes: exact match with trailing data" $ do - let input = "expectBytes [(1) (2)] [(1) (2) (3)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT unitT (bytesT [3]) - - , testCase "expectBytes: mismatch returns err with original input" $ do - let input = "expectBytes [(1) (2)] [(1) (3)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT unexpectedBytesT (bytesT [1,3]) - - , testCase "expectBytes: overlong expected returns errEof with original input" $ do - let input = "expectBytes [(1) (2) (3)] [(1) (2)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT 
(bytesT [1,2]) - - -- ------------------------------------------------------------------------ - -- Binary expectU8 Tests - -- ------------------------------------------------------------------------ - - , testCase "expectU8: matches and preserves rest" $ do - let input = "expectU8 7 [(7) (8)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT unitT (bytesT [8]) - - , testCase "expectU8: mismatch returns err with original input" $ do - let input = "expectU8 7 [(8)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT unexpectedByteT (bytesT [8]) - - , testCase "expectU8: empty input returns errEof with original input" $ do - let input = "expectU8 7 []" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT []) - - -- ------------------------------------------------------------------------ - -- Binary fixed-size readers (read2 / read4) - -- ------------------------------------------------------------------------ - - , testCase "read2: reads two bytes and preserves rest" $ do - let input = "read2 [(1) (2) (3)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (bytesT [1,2]) (bytesT [3]) - - , testCase "read2: exact two-byte read" $ do - let input = "read2 [(1) (2)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (bytesT [1,2]) (bytesT []) - - , testCase "read2: one byte returns EOF preserving input" $ do - let input = "read2 [(1)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [1]) - - , testCase "read2: empty input returns EOF" $ do - let input = "read2 []" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - 
result env @?= errT eofT (bytesT []) - - , testCase "read4: reads four bytes and preserves rest" $ do - let input = "read4 [(1) (2) (3) (4) (5)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (bytesT [1,2,3,4]) (bytesT [5]) - - , testCase "read4: exact four-byte read" $ do - let input = "read4 [(1) (2) (3) (4)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (bytesT [1,2,3,4]) (bytesT []) - - , testCase "read4: short input returns EOF preserving input" $ do - let input = "read4 [(1) (2) (3)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [1,2,3]) - - , testCase "read4: empty input returns EOF" $ do - let input = "read4 []" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT []) - - -- ------------------------------------------------------------------------ - -- Binary Result sequencing combinators (mapResult / bindResult) - -- ------------------------------------------------------------------------ - - , testCase "mapResult: maps ok value and preserves rest" $ do - let input = "mapResult (x : bytesLength x) (ok [(1) (2)] [(3)])" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (ofNumber 2) (bytesT [3]) - - , testCase "mapResult: preserves err unchanged" $ do - let input = "mapResult (x : bytesLength x) (err 1 [(7)])" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [7]) - - , testCase "bindResult: ok invokes continuation" $ do - let input = "bindResult (ok 7 [(8)]) (value rest : ok rest [])" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (bytesT [8]) (bytesT []) - 
- , testCase "bindResult: err skips continuation" $ do - let input = "bindResult (err 1 [(8)]) (value rest : ok value [])" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [8]) - - -- ------------------------------------------------------------------------ - -- Binary fixed-size byte readers with BE byte-swap naming - -- ------------------------------------------------------------------------ - - , testCase "readU16BEBytes: reads two raw bytes" $ do - let input = "readU16BEBytes [(1) (2) (3)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (bytesT [1,2]) (bytesT [3]) - - , testCase "readU16BEBytes: short input EOF" $ do - let input = "readU16BEBytes [(1)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [1]) - - , testCase "readU32BEBytes: reads four raw bytes" $ do - let input = "readU32BEBytes [(1) (2) (3) (4) (5)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (bytesT [1,2,3,4]) (bytesT [5]) - - , testCase "readU32BEBytes: short input EOF" $ do - let input = "readU32BEBytes [(1) (2) (3)]" - library <- evaluateFile "./lib/binary.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [1,2,3]) - - -- ------------------------------------------------------------------------ - -- Arboricx magic recognition - -- ------------------------------------------------------------------------ - - , testCase "readArboricxMagic: accepts magic and preserves rest" $ do - let input = "readArboricxMagic ((append arboricxMagic) [(1) (2)])" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT unitT (bytesT [1,2]) - - , testCase "readArboricxMagic: rejects wrong magic preserving input" $ do - let 
input = "readArboricxMagic [(65) (83) (66) (79) (82) (73) (67) (88) (1) (9)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT unexpectedBytesT (bytesT [65,83,66,79,82,73,67,88,1,9]) - - , testCase "readArboricxMagic: short input returns EOF preserving input" $ do - let input = "readArboricxMagic [(65) (82) (66) (79)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [65,82,66,79]) - - -- ------------------------------------------------------------------------ - -- Arboricx header parsing - -- ------------------------------------------------------------------------ - - , testCase "readArboricxHeader: parses portable header" $ do - let input = "readArboricxHeader " ++ bytesExpr (arboricxHeaderBytes 0) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (arboricxHeaderT 0) (bytesT []) - - , testCase "readArboricxHeader: preserves trailing bytes" $ do - let input = "readArboricxHeader " ++ bytesExpr (arboricxHeaderBytes 0 ++ [9,8]) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (arboricxHeaderT 0) (bytesT [9,8]) - - , testCase "readArboricxHeader: short input returns EOF preserving input" $ do - let input = "readArboricxHeader [(65) (82)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [65,82]) - - -- ------------------------------------------------------------------------ - -- Arboricx section directory record parsing - -- ------------------------------------------------------------------------ - - , testCase "readSectionRecord: parses portable section entry" $ do - let input = "readSectionRecord " ++ bytesExpr (nodesEntryBytes 16 32) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu 
library (parseTricu input) - result env @?= okT (sectionRecordT nodesSectionIdBytes 16 32) (bytesT []) - - , testCase "readSectionRecord: preserves trailing bytes" $ do - let input = "readSectionRecord " ++ bytesExpr (nodesEntryBytes 16 32 ++ [9,8]) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (sectionRecordT nodesSectionIdBytes 16 32) (bytesT [9,8]) - - , testCase "readSectionRecord: empty input returns EOF" $ do - let input = "readSectionRecord []" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT []) - - , testCase "readSectionRecord: short section id returns EOF preserving input" $ do - let input = "readSectionRecord [(0)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [0]) - - , testCase "readSectionRecord: missing section version returns EOF preserving unread bytes" $ do - let input = "readSectionRecord [(0) (2)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [0,2]) - - , testCase "readSectionRecord: short section version returns EOF preserving unread bytes" $ do - let input = "readSectionRecord [(0) (2) (0) (0) (0)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [0]) - - , testCase "readSectionRecord: missing length returns EOF preserving unread length bytes" $ do - let input = "readSectionRecord [(0) (2) (0) (0) (0) (16)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT []) - - , testCase "readSectionRecord: short section flags returns EOF preserving unread bytes" $ do - let input = "readSectionRecord [(0) (2) (0) (0) (0) (16) (0) (0) (0)]" - library <- evaluateFile 
"./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [0]) - - -- ------------------------------------------------------------------------ - -- Arboricx section directory parsing - -- ------------------------------------------------------------------------ - - , testCase "readSectionDirectory: zero records preserves input" $ do - let input = "readSectionDirectory 0 [(9) (8)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (ofList []) (bytesT [9,8]) - - , testCase "readSectionDirectory: reads requested records and preserves trailing bytes" $ do - let input = "readSectionDirectory 2 " ++ bytesExpr (manifestEntryBytes 10 20 ++ nodesEntryBytes 30 40 ++ [9]) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT - (ofList - [ sectionRecordT manifestSectionIdBytes 10 20 - , sectionRecordT nodesSectionIdBytes 30 40 - ]) - (bytesT [9]) - - , testCase "readSectionDirectory: truncated record returns EOF" $ do - let input = "readSectionDirectory 2 [(0) (1) (0) (0) (0) (10) (0) (0) (0) (20) (0) (2) (0) (0)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [0,0]) - - -- ------------------------------------------------------------------------ - -- Arboricx section lookup and raw byte slicing - -- ------------------------------------------------------------------------ - - , testCase "lookupSectionRecord: finds record by raw section id" $ do - let input = "lookupSectionRecord " ++ bytesExpr nodesSectionIdBytes ++ " [(" ++ "pair " ++ bytesExpr manifestSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 10) ++ " (pair " ++ bytesExpr (u64 20) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ") (" ++ 
"pair " ++ bytesExpr nodesSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 30) ++ " (pair " ++ bytesExpr (u64 40) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ")]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= justT (sectionRecordT nodesSectionIdBytes 30 40) - - , testCase "lookupSectionRecord: missing section id returns nothing" $ do - let input = "lookupSectionRecord " ++ bytesExpr [0,0,0,3] ++ " [(" ++ "pair " ++ bytesExpr manifestSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 10) ++ " (pair " ++ bytesExpr (u64 20) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ") (" ++ "pair " ++ bytesExpr nodesSectionIdBytes ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr [0,0] ++ " (pair " ++ bytesExpr [0,1] ++ " (pair " ++ bytesExpr (u64 30) ++ " (pair " ++ bytesExpr (u64 40) ++ " " ++ bytesExpr (replicate 32 0) ++ "))))))" ++ ")]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= nothingT - - , testCase "byteSlice: extracts requested byte range" $ do - let input = "byteSlice 2 3 [(10) (11) (12) (13) (14) (15)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= bytesT [12,13,14] - - , testCase "byteSlice: overlong length returns remaining bytes" $ do - let input = "byteSlice 4 9 [(10) (11) (12) (13) (14) (15)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= bytesT [14,15] - - -- ------------------------------------------------------------------------ - -- Arboricx minimal container parsing foundation - -- 
------------------------------------------------------------------------ - - , testCase "u32BEBytesToNat: decodes zero" $ do - let input = "u32BEBytesToNat [(0) (0) (0) (0)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofNumber 0 - - , testCase "u32BEBytesToNat: decodes small section count" $ do - let input = "u32BEBytesToNat [(0) (0) (0) (2)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofNumber 2 - - , testCase "u64BEBytesToNat: decodes small node count" $ do - let input = "u64BEBytesToNat [(0) (0) (0) (0) (0) (0) (0) (2)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofNumber 2 - - , testCase "u64BEBytesToNat: decodes fixture-scale offset" $ do - let input = "u64BEBytesToNat [(0) (0) (0) (0) (0) (0) (3) (214)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofNumber 982 - - , testCase "readArboricxContainer: reads header directory and preserves payload" $ do - let input = "readArboricxContainer " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204]) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT - (pairT - (arboricxHeaderT 2) - (ofList - [ sectionRecordT manifestSectionIdBytes 152 3 - , sectionRecordT nodesSectionIdBytes 155 4 - ])) - (bytesT [101,102,103,201,202,203,204]) - - , testCase "readArboricxContainer: truncated directory returns EOF" $ do - let input = "readArboricxContainer " ++ bytesExpr (arboricxHeaderBytes 1 ++ [0,0]) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [0,0]) - - , testCase "readArboricxContainer: rejects unsupported major version" $ do - let badHeader = [65,82,66,79,82,73,67,88] ++ u16 2 ++ u16 0 ++ u32 
0 ++ u64 0 ++ u64 32 - input = "readArboricxContainer " ++ bytesExpr badHeader - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT unsupportedVersionT (bytesT []) - - , testCase "readArboricxContainer: rejects unsupported minor version" $ do - let badHeader = [65,82,66,79,82,73,67,88] ++ u16 1 ++ u16 1 ++ u32 0 ++ u64 0 ++ u64 32 - input = "readArboricxContainer " ++ bytesExpr badHeader - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT unsupportedVersionT (bytesT []) - - , testCase "readArboricxContainer: rejects duplicate section ids" $ do - let input = "readArboricxContainer " ++ bytesExpr (arboricxHeaderBytes 2 ++ manifestEntryBytes 152 1 ++ manifestEntryBytes 153 1 ++ [9]) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT duplicateSectionT (bytesT [9]) - - , testCase "extractSectionBytes: uses raw offset and length fields" $ do - let input = "extractSectionBytes " ++ sectionRecordExpr nodesSectionIdBytes 3 4 ++ " " ++ bytesExpr [10,11,12,13,14,15,16,17] - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= bytesT [13,14,15,16] - - , testCase "lookupSectionBytes: finds section and extracts raw bytes" $ do - let input = "lookupSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " [" ++ sectionRecordExpr manifestSectionIdBytes 1 2 ++ " " ++ sectionRecordExpr nodesSectionIdBytes 4 3 ++ "] " ++ bytesExpr [10,11,12,13,14,15,16,17] - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= justT (bytesT [14,15,16]) - - , testCase "lookupSectionBytes: missing section returns nothing" $ do - let input = "lookupSectionBytes " ++ bytesExpr [0,0,0,3] ++ " [" ++ sectionRecordExpr manifestSectionIdBytes 1 2 ++ " " ++ sectionRecordExpr nodesSectionIdBytes 4 3 ++ "] " 
++ bytesExpr [10,11,12,13,14,15,16,17] - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= nothingT - - , testCase "extractSectionBytesResult: rejects out-of-bounds section" $ do - let input = "extractSectionBytesResult " ++ sectionRecordExpr nodesSectionIdBytes 6 4 ++ " " ++ bytesExpr [10,11,12,13,14,15,16,17] ++ " []" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT []) - - , testCase "readArboricxSectionBytes: extracts requested section from container" $ do - let input = "readArboricxSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204]) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (bytesT [201,202,203,204]) (bytesT [101,102,103,201,202,203,204]) - - , testCase "readArboricxSectionBytes: missing section returns missing-section err" $ do - let input = "readArboricxSectionBytes " ++ bytesExpr nodesSectionIdBytes ++ " " ++ bytesExpr (singleSectionContainerBytes manifestSectionIdBytes [101,102,103]) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT missingSectionT (bytesT [101,102,103]) - - , testCase "readArboricxRequiredSections: extracts manifest and nodes bytes" $ do - let input = "readArboricxRequiredSections " ++ bytesExpr (simpleContainerBytes [101,102,103] [201,202,203,204]) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT - (pairT (bytesT [101,102,103]) (bytesT [201,202,203,204])) - (bytesT [101,102,103,201,202,203,204]) - - , testCase "readArboricxRequiredSections: missing nodes section returns missing-section err" $ do - let input = "readArboricxRequiredSections " ++ bytesExpr (singleSectionContainerBytes manifestSectionIdBytes [101,102,103]) - 
library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT missingSectionT (bytesT [101,102,103]) - - , testCase "readArboricxRequiredSections: out-of-bounds section returns EOF" $ do - let manifestBytes = [101,102,103] - nodesBytes = [201,202,203,204] - badContainer = arboricxHeaderBytes 2 ++ manifestEntryBytes 152 3 ++ nodesEntryBytes 155 9 ++ manifestBytes ++ nodesBytes - input = "readArboricxRequiredSections " ++ bytesExpr badContainer - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [101,102,103,201,202,203,204]) - - -- ------------------------------------------------------------------------ - -- Arboricx raw nodes section parsing - -- ------------------------------------------------------------------------ - - , testCase "readNodeRecord: parses hash length and raw payload" $ do - let input = "readNodeRecord [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (3) (101) (102) (103) (9)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT - (pairT (bytesT [1..32]) - (pairT (bytesT [0,0,0,3]) - (bytesT [101,102,103]))) - (bytesT [9]) - - , testCase "readNodeRecord: truncated payload returns EOF preserving unread payload" $ do - let input = "readNodeRecord [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (3) (101) (102)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT eofT (bytesT [101,102]) - - , testCase "readNodesSection: parses node count and records" $ do - let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) 
(12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (0) (9)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT - (pairT (bytesT [0,0,0,0,0,0,0,1]) - (ofList - [ pairT (bytesT [1..32]) - (pairT (bytesT [0,0,0,1]) - (bytesT [0])) - ])) - (bytesT [9]) - - , testCase "readNodesSectionComplete: rejects trailing bytes inside nodes section" $ do - let input = "readNodesSectionComplete [(0) (0) (0) (0) (0) (0) (0) (0) (9)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT unexpectedBytesT (bytesT [9]) - - , testCase "readNodesSection: rejects duplicate node hashes" $ do - let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (0) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (0) (9)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT duplicateNodeT (bytesT [9]) - - , testCase "nodePayloadValid?: accepts leaf stem and fork payload shapes" $ do - let input = "[(nodePayloadValid? [(0)]) (nodePayloadValid? [(1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)]) (nodePayloadValid? 
[(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)])]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofList [trueT, trueT, trueT] - - , testCase "nodePayloadValid?: rejects invalid payload shapes" $ do - let input = "[(nodePayloadValid? []) (nodePayloadValid? [(9)]) (nodePayloadValid? [(1) (1)]) (nodePayloadValid? [(2) (1) (2)])]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofList [falseT, falseT, falseT, falseT] - - , testCase "node payload child accessors expose raw hashes" $ do - let input = "[(nodePayloadStemChildHash [(1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)]) (nodePayloadForkLeftHash [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)]) (nodePayloadForkRightHash [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)])]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofList [bytesT [1..32], bytesT [1..32], bytesT [33..64]] - - , testCase "lookupNodeRecord: finds record by raw 
node hash" $ do - let input = "lookupNodeRecord [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] (pair [(0) (0) (0) (1)] [(0)]))]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= justT - (pairT (bytesT [33..64]) - (pairT (bytesT [0,0,0,1]) - (bytesT [0]))) - - , testCase "nodeRecordChildHashes: extracts stem and fork references" $ do - let input = "[(nodeRecordChildHashes (pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (33)] [(1) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)]))) (nodeRecordChildHashes (pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (65)] [(2) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64) (65) (66) (67) (68) (69) (70) (71) (72) (73) (74) (75) (76) (77) (78) (79) (80) (81) (82) (83) (84) (85) (86) (87) (88) (89) (90) (91) (92) (93) (94) (95) (96)])))]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofList - [ ofList [bytesT [33..64]] - , ofList [bytesT [33..64], bytesT [65..96]] 
- ] - - , testCase "readNodesSection: rejects invalid node payload shape" $ do - let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (1) (9)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT invalidNodePayloadT (bytesT []) - - , testCase "readNodesSection: rejects missing child node" $ do - let input = "readNodesSection [(0) (0) (0) (0) (0) (0) (0) (1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (0) (0) (0) (33) (1) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64) (9)]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= errT missingNodeT (bytesT [9]) - - , testCase "readArboricxNodesSection: extracts and parses raw nodes section" $ do - let nodesBytes = u64 1 ++ [1..32] ++ u32 1 ++ [0] - input = "readArboricxNodesSection " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT - (pairT (bytesT [0,0,0,0,0,0,0,1]) - (ofList - [ pairT (bytesT [1..32]) - (pairT (bytesT [0,0,0,1]) - (bytesT [0])) - ])) - (bytesT ([101,102,103] ++ nodesBytes)) - - -- ------------------------------------------------------------------------ - -- Arboricx node DAG reconstruction - -- ------------------------------------------------------------------------ - - , testCase "nodeHashToTree: reconstructs leaf node" $ do - let input = "nodeHashToTree [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) 
(25) (26) (27) (28) (29) (30) (31) (32)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)]))]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT Leaf Leaf - - , testCase "nodeHashToTree: reconstructs stem node" $ do - let input = "nodeHashToTree [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] (pair [(0) (0) (0) (33)] [(1) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)]))]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (Stem Leaf) Leaf - - , testCase "nodeHashToTree: reconstructs fork node" $ do - let input = "nodeHashToTree [(65) (66) (67) (68) (69) (70) (71) (72) (73) (74) (75) (76) (77) (78) (79) (80) (81) (82) (83) (84) (85) (86) (87) (88) (89) (90) (91) (92) (93) (94) (95) (96)] [(pair [(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)] (pair [(0) (0) (0) (1)] [(0)])) (pair [(65) (66) (67) (68) (69) (70) (71) (72) (73) (74) 
(75) (76) (77) (78) (79) (80) (81) (82) (83) (84) (85) (86) (87) (88) (89) (90) (91) (92) (93) (94) (95) (96)] (pair [(0) (0) (0) (65)] [(2) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25) (26) (27) (28) (29) (30) (31) (32) (33) (34) (35) (36) (37) (38) (39) (40) (41) (42) (43) (44) (45) (46) (47) (48) (49) (50) (51) (52) (53) (54) (55) (56) (57) (58) (59) (60) (61) (62) (63) (64)]))]" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT (Fork Leaf Leaf) Leaf - - , testCase "readArboricxTreeFromHash: reconstructs tree from bundle bytes" $ do - let nodesBytes = u64 1 ++ [1..32] ++ u32 1 ++ [0] - input = "readArboricxTreeFromHash " ++ bytesExpr [1..32] ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT Leaf (bytesT ([101,102,103] ++ nodesBytes)) - - , testCase "readArboricxExecutableFromHash: alias reconstructs tree" $ do - let nodesBytes = u64 1 ++ [1..32] ++ u32 1 ++ [0] - input = "readArboricxExecutableFromHash " ++ bytesExpr [1..32] ++ " " ++ bytesExpr (simpleContainerBytes [101,102,103] nodesBytes) - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= okT Leaf (bytesT ([101,102,103] ++ nodesBytes)) - - , testCase "readArboricxNodesSection: reads id fixture bundle" $ do - fixtureBytes <- BS.readFile "test/fixtures/id.arboricx" - case decodeBundle fixtureBytes of - Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right _ -> do - let input = "matchResult (code rest : code) (nodes rest : 0) (readArboricxNodesSection " - ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) - ++ ")" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofNumber 0 - - , testCase 
"readArboricxNodesSection: reads notQ fixture bundle" $ do - fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx" - case decodeBundle fixtureBytes of - Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right _ -> do - let input = "matchResult (code rest : code) (nodes rest : 0) (readArboricxNodesSection " - ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) - ++ ")" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofNumber 0 - - , testCase "readArboricxNodesSection: reads map fixture bundle" $ do - fixtureBytes <- BS.readFile "test/fixtures/map.arboricx" - case decodeBundle fixtureBytes of - Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right _ -> do - let input = "matchResult (code rest : code) (nodes rest : 0) (readArboricxNodesSection " - ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) - ++ ")" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofNumber 0 - - , testCase "readArboricxExecutableFromHash: reconstructs id fixture root" $ do - fixtureBytes <- BS.readFile "test/fixtures/id.arboricx" - case decodeBundle fixtureBytes of - Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right bundle -> case bundleRoots bundle of - [] -> assertFailure "fixture has no roots" - (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : 0) (readArboricxExecutableFromHash " - ++ bytesExpr (hexTextBytes rootHash) - ++ " " - ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) - ++ ")" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofNumber 0 - - , testCase "readArboricxExecutableFromHash: reconstructs notQ fixture root" $ do - fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx" - case decodeBundle fixtureBytes of - Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right bundle -> case bundleRoots 
bundle of - [] -> assertFailure "fixture has no roots" - (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : 0) (readArboricxExecutableFromHash " - ++ bytesExpr (hexTextBytes rootHash) - ++ " " - ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) - ++ ")" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofNumber 0 - - , testCase "readArboricxExecutableFromHash: reconstructs map fixture root" $ do - fixtureBytes <- BS.readFile "test/fixtures/map.arboricx" - case decodeBundle fixtureBytes of - Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right bundle -> case bundleRoots bundle of - [] -> assertFailure "fixture has no roots" - (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : 0) (readArboricxExecutableFromHash " - ++ bytesExpr (hexTextBytes rootHash) - ++ " " - ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) - ++ ")" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofNumber 0 - - , testCase "readArboricxExecutableFromHash: executes id fixture root" $ do - fixtureBytes <- BS.readFile "test/fixtures/id.arboricx" - case decodeBundle fixtureBytes of - Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right bundle -> case bundleRoots bundle of - [] -> assertFailure "fixture has no roots" - (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : tree 42) (readArboricxExecutableFromHash " - ++ bytesExpr (hexTextBytes rootHash) - ++ " " - ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) - ++ ")" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= ofNumber 42 - - , testCase "readArboricxExecutableFromHash: executes notQ fixture on true" $ do - fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx" - case decodeBundle fixtureBytes of - Left err -> assertFailure $ 
"decodeBundle failed: " ++ err - Right bundle -> case bundleRoots bundle of - [] -> assertFailure "fixture has no roots" - (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : tree true) (readArboricxExecutableFromHash " - ++ bytesExpr (hexTextBytes rootHash) - ++ " " - ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) - ++ ")" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= falseT - - , testCase "readArboricxExecutableFromHash: executes notQ fixture on false" $ do - fixtureBytes <- BS.readFile "test/fixtures/notQ.arboricx" - case decodeBundle fixtureBytes of - Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right bundle -> case bundleRoots bundle of - [] -> assertFailure "fixture has no roots" - (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : tree false) (readArboricxExecutableFromHash " - ++ bytesExpr (hexTextBytes rootHash) - ++ " " - ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) - ++ ")" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= trueT - - , testCase "readArboricxExecutableFromHash: executes map fixture root" $ do - fixtureBytes <- BS.readFile "test/fixtures/map.arboricx" - case decodeBundle fixtureBytes of - Left err -> assertFailure $ "decodeBundle failed: " ++ err - Right bundle -> case bundleRoots bundle of - [] -> assertFailure "fixture has no roots" - (rootHash:_) -> do - let input = "matchResult (code rest : code) (tree rest : head (tail (tree (a : (t t t)) [(t) (t) (t)]))) (readArboricxExecutableFromHash " - ++ bytesExpr (hexTextBytes rootHash) - ++ " " - ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) - ++ ")" - library <- evaluateFile "./lib/arboricx.tri" - let env = evalTricu library (parseTricu input) - result env @?= Fork Leaf Leaf - ] - --- --------------------------------------------------------------------------- --- Manifest 
reading tests (Steps 1-9) --- --------------------------------------------------------------------------- - --- Build a minimal manifest: --- magic "ARBMNFST" (8) + version 1.0 (4) + --- schema "arboricx.bundle.manifest.v1" (4+27=31) + --- bundleType "tree-calculus-executable-object" (4+31=35) + --- treeCalculus "tree-calculus.v1" (4+16=20) + --- treeHashAlgorithm "sha256" (4+6=10) + --- treeHashDomain "arboricx.merkle.node.v1" (4+23=27) + --- treeNodePayload "arboricx.merkle.payload.v1" (4+26=30) + --- runtimeSemantics "tree-calculus.v1" (4+16=20) + --- runtimeEvaluation "normal-order" (4+12=16) + --- runtimeAbi "arboricx.abi.tree.v1" (4+20=24) + --- capabilityCount 0 (4) + --- closure 0 (1) + --- rootCount 1 (4) + --- root: hash (32) + role "default" (4+7=11) = 43 + --- exportCount 1 (4) + --- export: name "term" (4+4=8) + root (32) + kind "term" (4+4=8) + abi "arboricx.abi.tree.v1" (4+20=24) = 72 + --- Total core = 8+4+31+35+20+10+27+30+20+16+24+4+1+4+43+4+72 = 378 bytes - -minimalManifestCoreBytes :: [Integer] -minimalManifestCoreBytes = [65,82,66,77,78,70,83,84] -- ARBMNFST magic - ++ u16 1 ++ u16 0 -- version 1.0 - ++ lengthPrefixed "arboricx.bundle.manifest.v1" -- schema - ++ lengthPrefixed "tree-calculus-executable-object" -- bundleType - ++ lengthPrefixed "tree-calculus.v1" -- treeCalculus - ++ lengthPrefixed "sha256" -- treeHashAlgorithm - ++ lengthPrefixed "arboricx.merkle.node.v1" -- treeHashDomain - ++ lengthPrefixed "arboricx.merkle.payload.v1" -- treeNodePayload - ++ lengthPrefixed "tree-calculus.v1" -- runtimeSemantics - ++ lengthPrefixed "normal-order" -- runtimeEvaluation - ++ lengthPrefixed "arboricx.abi.tree.v1" -- runtimeAbi - ++ u32 0 -- 0 capabilities - ++ [0] -- closure complete - ++ u32 1 -- 1 root - ++ replicate 32 0 -- placeholder root hash - ++ lengthPrefixed "default" -- root role - ++ u32 1 -- 1 export - ++ lengthPrefixed "term" -- export name - ++ replicate 32 0 -- placeholder export root hash - ++ lengthPrefixed "term" -- export kind 
-  ++ lengthPrefixed "arboricx.abi.tree.v1" -- export abi
-
-lengthPrefixed :: String -> [Integer]
-lengthPrefixed s = u32 (fromIntegral (length s)) ++ map (fromIntegral . fromEnum) s
-
--- Full manifest: core + 0 metadata + 0 extension = core + u32(0) + u32(0)
-fullMinimalManifestBytes :: [Integer]
-fullMinimalManifestBytes = minimalManifestCoreBytes ++ u32 0 ++ u32 0
-
--- Create TLV list with two entries:
--- tag 1 (package), value "my-pkg", then tag 2 (version), value "1.0"
--- then "rest" bytes
-
-tlvForTagAndValue :: Integer -> String -> [Integer]
-tlvForTagAndValue tag val =
-  u16 (fromIntegral tag) ++ lengthPrefixed val
-
--- Build a pair of (tag, value) TLV
-makeTLVPair :: Integer -> String -> String
-makeTLVPair tag val =
-  "[(pair " ++ bytesExpr [0, fromIntegral tag] ++ " "
-  ++ bytesExpr (map (fromIntegral . fromEnum) val) ++ ")]"
-
-exportEntryExpr :: String -> [Integer] -> String -> String -> String
-exportEntryExpr name rootHashBytes kind abi =
-  "(pair " ++ bytesExpr (map (fromIntegral . fromEnum) name) ++ " "
-  ++ "(pair " ++ bytesExpr rootHashBytes ++ " "
-  ++ "(pair " ++ bytesExpr (map (fromIntegral . fromEnum) kind) ++ " "
-  ++ bytesExpr (map (fromIntegral . fromEnum) abi) ++ ")))"
-
--- Build list of export entries for the test
-singleExportExpr :: String
-singleExportExpr =
-  "[" ++ exportEntryExpr "main" (replicate 32 0) "term" "arboricx.abi.tree.v1" ++ "]"
-
-multiExportExpr :: String
-multiExportExpr =
-  "["
-  ++ exportEntryExpr "main" (replicate 32 0) "term" "arboricx.abi.tree.v1"
---  ++ ", "
-  ++ exportEntryExpr "test" (replicate 32 1) "term" "arboricx.abi.tree.v1"
-  ++ "]"
-
--- Helper to build a minimal valid manifest core
--- Returns a tricu expression representing the parsed core structure
-buildValidCoreExpr :: String
-buildValidCoreExpr =
-  "(pair "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "arboricx.bundle.manifest.v1") ++ " " -- schema
-  ++ "(pair "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "tree-calculus-executable-object") ++ " " -- bundleType
-  ++ "(pair "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "tree-calculus.v1") ++ " " -- treeCalculus
-  ++ "(pair "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "sha256") ++ " " -- treeHashAlgorithm
-  ++ "(pair "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "arboricx.merkle.node.v1") ++ " " -- treeHashDomain
-  ++ "(pair "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "arboricx.merkle.payload.v1") ++ " " -- treeNodePayload
-  ++ "(pair "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "tree-calculus.v1") ++ " " -- runtimeSemantics
-  ++ "(pair "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "normal-order") ++ " " -- runtimeEvaluation
-  ++ "(pair "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "arboricx.abi.tree.v1") ++ " " -- runtimeAbi
-  ++ "(pair "
-  ++ "[] " -- capabilities
-  ++ "(pair "
-  ++ "0 " -- closure
-  ++ "(pair "
-  ++ "[(pair " ++ bytesExpr (replicate 32 0) ++ " "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "default") ++ ")" -- roots (1 root)
-  ++ "] "
-  ++ "[(pair "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "term") ++ " "
-  ++ "(pair " ++ bytesExpr (replicate 32 0) ++ " "
-  ++ "(pair "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "term") ++ " "
-  ++ bytesExpr (map (fromIntegral . fromEnum) "arboricx.abi.tree.v1") ++ ")))" -- exports (1 export)
-  ++ "])"
-  ++ "]"
-  ++ "]"
-  ++ "]"
-  ++ "]"
-  ++ "]"
-  ++ "]"
-  ++ "]"
-  ++ "]"
-  ++ "]"
-  ++ "]"
-  ++ "]"
-  ++ "]"
-  ++ ")"
-
--- Build a tricu expression that extracts a specific manifest field from
--- readArboricxBundle result and returns it as a byte-list T value.
--- The Haskell test then uses toString to convert it to a String.
-extractManifestField :: ByteString -> String -> String
-extractManifestField fixtureBytes fieldName =
-  "matchResult "
-  ++ " (errCode rest : errCode) "
-  ++ " (bundleResult rest : "
-  ++ " matchPair "
-  ++ " (validCore metadataWithExtensions : "
-  ++ " " ++ fieldName ++ " validCore) "
-  ++ " bundleResult) "
-  ++ " (readArboricxBundle " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")"
-
-manifestReadingTests :: TestTree
-manifestReadingTests = testGroup "Manifest Reading Tests"
-  [
-    -- ------------------------------------------------------------------------
-    -- Step 1: readManifestMagic
-    -- ------------------------------------------------------------------------
-    testCase "readManifestMagic: accepts correct manifest magic and preserves rest" $ do
-      let input = "readManifestMagic ((append arboricxManifestMagic) [(1) (2)])"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= okT unitT (bytesT [1,2])
-
-  , testCase "readManifestMagic: rejects wrong magic" $ do
-      let input = "readManifestMagic [(65) (83) (66) (77) (78) (70) (83) (84)]"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= errT unexpectedBytesT (bytesT [65,83,66,77,78,70,83,84])
-
-  , testCase "readManifestMagic: short input returns EOF" $ do
-      let input = "readManifestMagic [(65) (82) (66) (77)]"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= errT eofT (bytesT [65,82,66,77])
-
-    -- ------------------------------------------------------------------------
-    -- Step 2: readLengthPrefixedString
-    -- ------------------------------------------------------------------------
-
-  , testCase "readLengthPrefixedString: reads a 5-byte string" $ do
-      let input = "readLengthPrefixedString [(0) (0) (0) (5) (104) (101) (108) (108) (111) (99) (111) (110) (116) (101) (114)]"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= okT (bytesT [104,101,108,108,111]) (bytesT [99,111,110,116,101,114])
-
-  , testCase "readLengthPrefixedString: reads an empty string" $ do
-      let input = "readLengthPrefixedString [(0) (0) (0) (0) (97) (98)]"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= okT (bytesT []) (bytesT [97,98])
-
-  , testCase "readLengthPrefixedString: short payload returns EOF" $ do
-      let input = "readLengthPrefixedString [(0) (0) (0) (5) (104) (101) (108)]"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= errT eofT (bytesT [104,101,108])
-
-    -- ------------------------------------------------------------------------
-    -- Step 3: readManifestCore (construct a minimal valid manifest)
-    -- ------------------------------------------------------------------------
-
-  , testCase "readManifestCore: reads a minimal valid manifest core" $ do
-      let input = "readManifestCore " ++ bytesExpr minimalManifestCoreBytes
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork Leaf Leaf) -> assertFailure "should be ok, not t"
-        (Fork _ (Fork _ rest)) -> return () -- ok case: pair true (pair value rest)
-        _ -> assertFailure $ "expected ok result, got: " ++ show actualResult
-
-  , testCase "readManifestCore: returns error on wrong magic" $ do
-      let badMagic = [65,83,66,77,78,70,83,84] ++ (drop 8 minimalManifestCoreBytes)
-      let input = "readManifestCore " ++ bytesExpr badMagic
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork falseT _) -> return () -- err case: pair false (pair code rest)
-        _ -> assertFailure $ "expected err result, got: " ++ show actualResult
-
-    -- ------------------------------------------------------------------------
-    -- Step 4: TLV reader
-    -- ------------------------------------------------------------------------
-
-  , testCase "readTLV: reads a metadata TLV entry" $ do
-      -- tag = u16 1 = [(0)(1)], length = u32 3 = [(0)(0)(0)(3)], value = "foo" = [102,111,111]
-      let input = "readTLV [(0) (1) (0) (0) (0) (3) (102) (111) (111) (99) (111) (110) (116) (114) (101) (115) (116)]"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork _ (Fork _ rest)) -> do
-          -- ok case: verify the value pair
-          let value = case result env of
-                (Fork _ (Fork val _)) -> case val of
-                  (Fork tagVal _) -> tagVal
-                  _ -> Leaf
-          return ()
-        _ -> assertFailure $ "expected ok result, got: " ++ show actualResult
-
-  , testCase "readTLV: returns EOF on empty input" $ do
-      let input = "readTLV []"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= errT eofT (bytesT [])
-
-  , testCase "readTLV: returns EOF on short tag" $ do
-      let input = "readTLV [(0)]"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= errT eofT (bytesT [0])
-
-  , testCase "readTLVList: reads zero TLV entries" $ do
-      let input = "readTLVList 0 [(1) (2) (3)]"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= okT (ofList []) (bytesT [1,2,3])
-
-  , testCase "readTLVList: reads one TLV entry and preserves rest" $ do
-      -- tag=1, len=3, value="foo"
-      let input = "readTLVList 1 [(0) (1) (0) (0) (0) (3) (102) (111) (111) (99) (111) (110) (116) (114) (101) (115) (116)]"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork _ (Fork _ rest)) -> do
-          -- ok: value is list with one TLV, rest should be [(99)...]
-          return ()
-        _ -> assertFailure $ "expected ok result, got: " ++ show actualResult
-
-    -- ------------------------------------------------------------------------
-    -- Step 5: readManifest (full parser)
-    -- ------------------------------------------------------------------------
-
-  , testCase "readManifest: parses a minimal manifest with no metadata" $ do
-      let input = "readManifest " ++ bytesExpr fullMinimalManifestBytes
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork _ (Fork _ _)) -> return () -- ok result
-        _ -> assertFailure $ "expected ok result, got: " ++ show actualResult
-
-  , testCase "readManifest: preserves trailing extension bytes" $ do
-      let input = "readManifest (append " ++ bytesExpr fullMinimalManifestBytes ++ " [(99) (111) (110) (116) (101) (110) (116) (101) (114)])"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork trueTag (Fork _ _)) | trueTag == trueT -> return ()
-        _ -> assertFailure $ "expected ok result, got: " ++ show actualResult
-
-    -- ------------------------------------------------------------------------
-    -- Step 6: lookupMetadata
-    -- ------------------------------------------------------------------------
-
-  , testCase "lookupMetadata: finds metadata by tag" $ do
-      let tlv1 = makeTLVPair 1 "my-pkg"
-      let tlv2 = makeTLVPair 2 "1.0"
-      let input = "lookupMetadata (" ++ tlv1 ++ ") " ++ bytesExpr [(0), (1)]
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= justT (bytesT [109,121,45,112,107,103])
-
-  , testCase "lookupMetadata: returns nothing for unknown tag" $ do
-      let tlv1 = makeTLVPair 1 "my-pkg"
-      let input = "lookupMetadata " ++ tlv1 ++ " " ++ bytesExpr [(0), (2)]
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= nothingT
-
-  , testCase "lookupMetadata: returns nothing for empty list" $ do
-      let input = "lookupMetadata [] " ++ bytesExpr [(0), (1)]
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= nothingT
-
-    -- ------------------------------------------------------------------------
-    -- Step 7: Export selection
-    -- ------------------------------------------------------------------------
-
-    -- Build export entry: (pair name (pair rootHash (pair kind abi)))
-    -- Test: select export by explicit name ("main")
-  , testCase "selectExport: finds export by explicit name" $ do
-      let input = "selectExport " ++ multiExportExpr ++ " " ++ bytesExpr (map (fromIntegral . fromEnum) "main")
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork _ (Fork _ _)) -> return () -- ok result
-        _ -> assertFailure $ "expected ok result, got: " ++ show actualResult
-
-    -- Test: selectExport prefers "main" when no explicit name
-  , testCase "selectExport: selects 'main' when no explicit name and 'main' exists" $ do
-      let input = "selectExport " ++ multiExportExpr ++ " " ++ bytesExpr []
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork _ (Fork _ _)) -> return () -- ok result
-        _ -> assertFailure $ "expected ok result, got: " ++ show actualResult
-
-    -- Test: selectExport selects single export when only one exists
-  , testCase "selectExport: auto-selects single export" $ do
-      let input = "selectExport " ++ singleExportExpr ++ " " ++ bytesExpr []
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork _ (Fork _ _)) -> return () -- ok result
-        _ -> assertFailure $ "expected ok result, got: " ++ show actualResult
-
-    -- Test: getExportNames lists all export names
-  , testCase "getExportNames: returns list of all export names" $ do
-      let input = "getExportNames " ++ multiExportExpr
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      -- Should return a list of two byte strings
-      case actualResult of
-        (Fork (Fork _ _) (Fork (Fork _ _) _)) -> return () -- list with 2 items
-        _ -> assertFailure $ "expected list of 2 items, got: " ++ show actualResult
-
-    -- Test: selectExport errors when multiple exports but no "main" and no explicit name
-  , testCase "selectExport: errors with multiple exports but no 'main'" $ do
-      let multiNoMain =
-            "["
-            ++ exportEntryExpr "validate" (replicate 32 0) "term" "arboricx.abi.tree.v1"
-            ++ " "
-            ++ exportEntryExpr "test" (replicate 32 1) "term" "arboricx.abi.tree.v1"
-            ++ "]"
-      let input = "selectExport " ++ multiNoMain ++ " " ++ bytesExpr []
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork falseT _) -> return () -- err result
-        _ -> assertFailure $ "expected err result, got: " ++ show actualResult
-
-    -- Test: selectExportOpt works with Just bytes (explicit name given)
-  , testCase "selectExportOpt: selects by explicit name when given" $ do
-      let input = "selectExportOpt " ++ multiExportExpr ++ " " ++ bytesExpr (map (fromIntegral . fromEnum) "validate")
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork _ (Fork _ _)) -> return () -- ok result
-        _ -> assertFailure $ "expected ok result, got: " ++ show actualResult
-
-    -- ------------------------------------------------------------------------
-    -- Step 8: validateManifestCore
-    -- ------------------------------------------------------------------------
-
-  , testCase "validateManifestCore: passes on valid core" $ do
-      let input = "matchResult (code rest : err code rest) (core rest : validateManifestCore core " ++ bytesExpr [(1), (2)] ++ ") (readManifestCore " ++ bytesExpr minimalManifestCoreBytes ++ ")"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork trueTag (Fork _ rest)) | trueTag == trueT -> rest @?= bytesT [1,2]
-        _ -> assertFailure $ "expected ok result, got: " ++ show actualResult
-
-  , testCase "validateManifestCore: fails on wrong schema" $ do
-      let badCoreBytes = take 16 minimalManifestCoreBytes ++ map (fromIntegral . fromEnum) "z" ++ drop 17 minimalManifestCoreBytes
-      let input = "matchResult (code rest : err code rest) (core rest : validateManifestCore core " ++ bytesExpr [] ++ ") (readManifestCore " ++ bytesExpr badCoreBytes ++ ")"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork falseTag _) | falseTag == falseT -> return ()
-        _ -> assertFailure $ "expected err result, got: " ++ show actualResult
-
-    -- ------------------------------------------------------------------------
-    -- Step 9: readArboricxBundle (end-to-end with real fixture)
-    -- ------------------------------------------------------------------------
-
-  , testCase "readArboricxBundle: parses id.arboricx fixture" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
-      case decodeBundle fixtureBytes of
-        Left err -> assertFailure $ "decodeBundle failed: " ++ err
-        Right bundle -> do
-          let manifestBytes = bundleManifestBytes bundle
-          -- The manifest section should be parseable
-          let input = "readManifest " ++ bytesExpr (map toInteger (BS.unpack manifestBytes))
-          library <- evaluateFile "./lib/arboricx.tri"
-          let env = evalTricu library (parseTricu input)
-          let actualResult = result env
-          case actualResult of
-            (Fork trueTag (Fork _ _)) | trueTag == trueT -> return ()
-            _ -> assertFailure $ "readManifest failed on id.arboricx manifest: " ++ show actualResult
-
-  , testCase "readArboricxBundle: end-to-end bundle parse" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
-      let input = "readArboricxBundle " ++ bytesExpr (map toInteger (BS.unpack fixtureBytes))
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork _ (Fork _ _)) -> return () -- ok: (pair validManifest afterManifest)
-        _ -> assertFailure $ "readArboricxBundle failed: " ++ show actualResult
-
-  , testCase "readArboricxBundle: rejects bundle with wrong manifest core" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
-      -- Modify a byte in the manifest section to invalidate it
-      -- The manifest starts at offset 152 in the bundle (from header dirOffset=32)
-      -- Section directory: 2 entries * 60 = 120 bytes, starting at offset 32
-      -- Manifest entry at directory offset 32: type(4) + version(2) + flags(2) + compression(2) + digestAlg(2) + offset(8) + length(8) + digest(32) = 60
-      -- Manifest offset = 32 + 60 = 92
-      -- The manifest itself starts at offset 152 (0x98)
-      -- Change byte at position 152+8 = 160 from 'a' (97) to 'z' (122) to break the schema string
-      let bs = map toInteger (BS.unpack fixtureBytes)
-      let modifiedBs = take 160 bs ++ [122] ++ drop 161 bs
-      let input = "readArboricxBundle " ++ bytesExpr modifiedBs
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let actualResult = result env
-      case actualResult of
-        (Fork falseT _) -> return () -- err result (validation failure)
-        _ -> assertFailure $ "expected err result, got: " ++ show actualResult
-
-    -- ------------------------------------------------------------------------
-    -- Comprehensive end-to-end: extract manifest fields and verify as strings
-    -- ------------------------------------------------------------------------
-
-  , testCase "readArboricxBundle: extracts and validates manifest schema" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
-      let input = extractManifestField fixtureBytes "manifestSchema"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let schemaT = result env
-      toString schemaT @?= Right "arboricx.bundle.manifest.v1"
-
-  , testCase "readArboricxBundle: extracts and validates bundleType" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
-      let input = extractManifestField fixtureBytes "manifestBundleType"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let bundleTypeT = result env
-      toString bundleTypeT @?= Right "tree-calculus-executable-object"
-
-  , testCase "readArboricxBundle: extracts and validates runtime evaluation" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
-      let input = extractManifestField fixtureBytes "manifestRuntimeEvaluation"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let evalT = result env
-      toString evalT @?= Right "normal-order"
-
-  , testCase "readArboricxBundle: extracts and validates runtime ABI" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
-      let input = extractManifestField fixtureBytes "manifestRuntimeAbi"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let abiT = result env
-      toString abiT @?= Right "arboricx.abi.tree.v1"
-
-  , testCase "readArboricxBundle: extracts and validates root names" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
-      let input = "matchResult "
-            ++ " (errCode rest : errCode) "
-            ++ " (bundleResult rest : "
-            ++ " matchPair "
-            ++ " (validCore metadataWithExtensions : "
-            ++ " matchList "
-            ++ " (err 99 t) " -- empty roots
-            ++ " (rootEntry rest : "
-            ++ " matchPair "
-            ++ " (_ roleField : roleField) "
-            ++ " rootEntry) "
-            ++ " (manifestRoots validCore)) "
-            ++ " bundleResult) "
-            ++ " (readArboricxBundle " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let rootRoleT = result env
-      -- Should find at least one root with a role (either "default" or "root")
-      case toString rootRoleT of
-        Right role -> assertBool "root role should be 'default' or 'root'"
-          (role == "default" || role == "root")
-        Left err -> assertFailure $ "failed to extract root role: " ++ err
-
-  , testCase "readArboricxBundle: extracts and validates closure" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
-      let input = "matchResult "
-            ++ " (errCode rest : errCode) "
-            ++ " (bundleResult rest : "
-            ++ " matchPair "
-            ++ " (validCore metadataWithExtensions : "
-            ++ " matchPair "
-            ++ " (closure _ : closure) "
-            ++ " (manifestClosureByte validCore)) "
-            ++ " bundleResult) "
-            ++ " (readArboricxBundle " ++ bytesExpr (map toInteger $ BS.unpack fixtureBytes) ++ ")"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let closureT = result env
-      case toNumber closureT of
-        Right 0 -> return ()
-        Right n -> assertFailure $ "closure should be 0, got " ++ show n
-        Left err -> assertFailure $ "failed to extract closure: " ++ err
-
-  , testCase "readArboricxBundle: extracts and validates hash algorithm" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.arboricx"
-      let input = extractManifestField fixtureBytes "manifestTreeHashAlgorithm"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      let algoT = result env
-      toString algoT @?= Right "sha256"
-
-  , testCase "readArboricxExecutable: reconstructs default export tree" $ do
-      (srcConn, termHash, originalTerm) <- storeTermInTempDB $ unlines
-        [ "main = t t" ]
-      wireData <- exportBundle srcConn [termHash]
-      let input = "matchResult "
-            ++ " (code rest : err code rest) "
-            ++ " (tree rest : ok tree []) "
-            ++ " (readArboricxExecutable " ++ bytesExpr (map toInteger $ BS.unpack wireData) ++ ")"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= okT originalTerm (bytesT [])
-      close srcConn
-
-  , testCase "readArboricxExecutableByName: selects named export" $ do
-      srcConn <- newContentStore
-      let parsed = parseTricu $ unlines
-            [ "leaf = t"
-            , "stem = t t"
-            , "main = stem"
-            ]
-          env = evalTricu Map.empty parsed
-          leafTerm = maybe (error "leaf missing") id (Map.lookup "leaf" env)
-          stemTerm = maybe (error "stem missing") id (Map.lookup "stem" env)
-      leafHash <- storeTerm srcConn ["leaf"] leafTerm
-      stemHash <- storeTerm srcConn ["stem"] stemTerm
-      wireData <- exportNamedBundle srcConn [("leaf", leafHash), ("stem", stemHash)]
-      let input = "matchResult "
-            ++ " (code rest : err code rest) "
-            ++ " (tree rest : ok tree []) "
-            ++ " (readArboricxExecutableByName " ++ bytesExpr (map (fromIntegral . fromEnum) "stem") ++ " " ++ bytesExpr (map toInteger $ BS.unpack wireData) ++ ")"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let resultEnv = evalTricu library (parseTricu input)
-      result resultEnv @?= okT stemTerm (bytesT [])
-      close srcConn
-
-  , testCase "runArboricx: applies host-provided argument to default export" $ do
-      (srcConn, termHash, _) <- storeTermInTempDB $ unlines
-        [ "main = (x : x)" ]
-      wireData <- exportBundle srcConn [termHash]
-      let input = "matchResult "
-            ++ " (code rest : err code rest) "
-            ++ " (value rest : value) "
-            ++ " (runArboricx " ++ bytesExpr (map toInteger $ BS.unpack wireData) ++ " \"hello\")"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      toString (result env) @?= Right "hello"
-      close srcConn
-
-  , testCase "runArboricxArgs: applies host-provided argument list in order" $ do
-      (srcConn, termHash, _) <- storeTermInTempDB $ unlines
-        [ "main = (x y : x)" ]
-      wireData <- exportBundle srcConn [termHash]
-      let input = "matchResult "
-            ++ " (code rest : err code rest) "
-            ++ " (value rest : value) "
-            ++ " (runArboricxArgs " ++ bytesExpr (map toInteger $ BS.unpack wireData) ++ " [(\"left\") (\"right\")])"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      toString (result env) @?= Right "left"
-      close srcConn
-
-  , testCase "host ABI: constructors expose tag and payload" $ do
-      library <- evaluateFile "./lib/arboricx.tri"
-      let stringInput = "hostString \"hello\""
-          stringEnv = evalTricu library (parseTricu stringInput)
-      result stringEnv @?= pairT (ofNumber 1) (ofString "hello")
-      let tagEnv = evalTricu library (parseTricu "hostValueTag (hostNumber 42)")
-      result tagEnv @?= ofNumber 2
-      let payloadEnv = evalTricu library (parseTricu "hostValuePayload (hostBool true)")
-      result payloadEnv @?= trueT
-
-  , testCase "runArboricxToTree: wraps raw result as hostTree" $ do
-      (srcConn, termHash, originalTerm) <- storeTermInTempDB $ unlines
-        [ "main = t t" ]
-      wireData <- exportBundle srcConn [termHash]
-      let input = "matchResult "
-            ++ " (code rest : err code rest) "
-            ++ " (hostValue rest : ok hostValue []) "
-            ++ " (runArboricxToTree " ++ bytesExpr (map toInteger $ BS.unpack wireData) ++ " [])"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= okT (pairT (ofNumber 0) originalTerm) (bytesT [])
-      close srcConn
-
-  , testCase "runArboricxToString: wraps string result as hostString" $ do
-      (srcConn, termHash, _) <- storeTermInTempDB $ unlines
-        [ "main = (x : x)" ]
-      wireData <- exportBundle srcConn [termHash]
-      let input = "matchResult "
-            ++ " (code rest : err code rest) "
-            ++ " (hostValue rest : ok hostValue []) "
-            ++ " (runArboricxToString " ++ bytesExpr (map toInteger $ BS.unpack wireData) ++ " [(\"hello\")])"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= okT (pairT (ofNumber 1) (ofString "hello")) (bytesT [])
-      close srcConn
-
-  , testCase "runArboricxToNumber: wraps number result as hostNumber" $ do
-      (srcConn, termHash, _) <- storeTermInTempDB $ unlines
-        [ "main = 42" ]
-      wireData <- exportBundle srcConn [termHash]
-      let input = "matchResult "
-            ++ " (code rest : err code rest) "
-            ++ " (hostValue rest : ok hostValue []) "
-            ++ " (runArboricxToNumber " ++ bytesExpr (map toInteger $ BS.unpack wireData) ++ " [])"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      result env @?= okT (pairT (ofNumber 2) (ofNumber 42)) (bytesT [])
-      close srcConn
-
-  , testCase "runArboricxToBool: rejects non-bool result" $ do
-      (srcConn, termHash, _) <- storeTermInTempDB $ unlines
-        [ "main = 42" ]
-      wireData <- exportBundle srcConn [termHash]
-      let input = "runArboricxToBool " ++ bytesExpr (map toInteger $ BS.unpack wireData) ++ " []"
-      library <- evaluateFile "./lib/arboricx.tri"
-      let env = evalTricu library (parseTricu input)
-      case result env of
-        Fork falseTag (Fork code _) | falseTag == falseT -> code @?= ofNumber 14
-        actual -> assertFailure $ "expected host codec error, got: " ++ show actual
-      close srcConn
-  ]
diff --git a/test/fixtures/append.arboricx b/test/fixtures/append.arboricx
index 8bb2fd811d0861324e1aef1ba18b883ba0029e1d..51fdce1aac2d428339e549f5585cf5e292f66745 100644
GIT binary patch
literal 968
zcmZ{i+fKtU42IJ&80X`Faoji`Ti_7}2#E_o8eDKm+6HN;Yp2p8cpn~=-|n0Z2~ws0
z{n@e8q|MglL+keHSviVEiQoNK1JdEA&t3}YT=!7vcij0dcxjhxvvqlQfAjd%%URQ@
zpQP>J)AREKPGwnylP5d_#WJsC)l~OFL
zwBr!DA;LA=pvSy82~CmKZ!WQ}{)!SE7U91tPsT~w(UF^^p-EA`YRhZC?a@|>wFuEc
zDRqWf3C*j{Bib#Y5#8;Ym^Meja%R#F?0=OUHH_$}^nsRf4#BrHT-=pXEXWz7dy~6k
zIwZ6VK@L+Y&0<8tu&N?-ROlE4)#coAp;U$j(@6+!OJqvu^dO}O=gtb9BZOfW&cor9
zg+XdtFA80Ppj=*9g;>Vqg*Blo5R}h!Rm7TvVb!|O4WXM5wp(2&HL%ADt+p_^?luJP
z+`%O7oh{rr&PrmR;3EA)x})J*zAs(3c5XrK9+VW0vLUPz19hk<;HN21Y`%!r?-@V`8GnT*_my8_yKtE5(+2%9ZULEv zgov8tytttffzsa6^3()xu9w-U5~LKgd!V8K%1w~crZ-CH|FF z2n2NOKt6>)B>F2CJT?vW;!;&G`$>M|4zdG>T#w0Kihlo$KTkN*52}Gu5O7KF?~;Qb z`jQzwKN=!V=2f(vRv(UPbk$`?|mbDU4 zF9!O%l9H^3gO&?w7$Fc)Dg>Q)2n2NEK=~K~ncbs^6kYgWnnZ7FO~Dvuzn{uVu3rc| zaZs&QDe#n<5y&nD!6HDZ5)`MPv;Zj;ch<`rZ5nHI;|t2TmwUKvS9Z}Xg--mYw{J&T ZsGny*c1S5m3FucIR7fEbPtBo39J&Ai diff --git a/test/fixtures/false.arboricx b/test/fixtures/false.arboricx index 7b9335d3ef4169d15760b567b46716c34706eb7d..a590c3fc1b3e6fe13c8b2b32b60c29861acd1137 100644 GIT binary patch delta 179 zcmdnPvVnPmwj%=+BtU3JXDFYEfe{GdfN?RD$qS+#gPeT*+=4?uY9=m;v|!K7OG&Lr zO#wlZ_Yy6o3*Sz@CztmROoo0%U^hWlc-W QDNY4ROuorz#|jn&0BvLf;<5CHdtYncE5 diff --git a/test/fixtures/id.arboricx b/test/fixtures/id.arboricx index 873873f2ae25d388710df3289a852815490cc44b..05cd76b13e4f78bed1867c94435d64591e22f5cd 100644 GIT binary patch delta 207 zcmZ3@c7}O^wj%=+BtU3J2PmJ3fe{GdfN=qosRN=NgPeT*+=4?uY9=m;v|!K7OG&Lr zO#wlZ_YyWPuXQK+K+!nwD6aQvzgy>}ARX cnZ`W%CZnASREU8U<{&Tw#NlKDDFtE%03H<~i2wiq literal 811 zcmZ<^a`F%IbdF$PWB>ssFo_@(fJ{a(1kn&M141#D-(Ya)JEM8+f@tD0eV!Hp@h(n{0t$bGGv>iR(KhnK2Gg<7f z2M2jB04h{uly?4InupyIntgOu4 ww|>v%^zh(_GS%cs;=IDupR*0;mVcBo7gUTO!k>;zgq%yr5<>DomohQ{0IGZr3jhEB diff --git a/test/fixtures/map.arboricx b/test/fixtures/map.arboricx index 62c4c710639520750b72655f940f2513c3c3a3ce..be3635481501cbc9396e9f7193f6c3f76edc503e 100644 GIT binary patch literal 1079 zcmZ{jTW`}q5QW!COA=^lp#e(oP@q>ETIlsHtyGBzKuUSwv1@xH1vz%**j4!>yz@^v zv(DMH5;$7VneWWbY&J@#d-kpS`TU3Slsq;29>3NgJ^LB6t4eyE5vtwyy+0MN{g&A2 zo_+axaru2zv#9eh%lpx75SD42m|&pOe$Nbx;ARiit3_^%AL%42lXB>rTN9Oq4in>N z;ZGA4D4D%iQf+JW(-^fe?0Gj}#DX-7O;t7?PVZX(N-`Z+_P;t$rCC1EiJ#@M$Bnhrl-q9PAjNB(AOhpqA>XY`hRUaH3_%LA<%qIA*7jV-TjFs z1m!cBWZ zTZo-B2O*E`*@m$68H6O3KBss=8&S=Zz9e0MaFZ5E?ai@vnG{j2kgk%-*LEj^RVU}-#32(zb;Lv(9RCHF;%4pu literal 5593 zcma)A2UJtZ8YUqcMWicAD2sFnQko4W|W z1b$v{=>DJEFZ@>+^!xc-xiVKCFlPJ@?*;wgj*}0*wevzxX%-n)*eWs6-0r=S(e$#w 
zraXQ>_&>+OMOVg3zI4-K?I2#i?))J($1fO}-P3u-ouaQ2vrW2ebFZl^Sp`clv$MB! zbow_dibyBX=)N9dJ4hh`o>cMN^dx%`L#RKe@$(}f=rkH*6~hn=GToow|NrsB1lI6d*ZrlA z|Nhst2VqUK?Du7J`XDQM((|owmisY7X=E)&LocJn{7y*A z;O{7FAD`LJR0&CET(=+u61opDx25IVjaQQ?DLiL$4acWGvdpXu)tCz8W0Z2*L#av^ zyD~gyL%wR(^=QVh--*{RRN`=Wim_Voh@9s{Jp|GqSK4vn3kP|tpo~b2x;pFBT4V1;UsUuBr@T_23gB13deWnfu<`HUV*_ojd#@S9No0=zAd(oRdsRs z+k{C23mIh6Ii3rVbbZ@ncY3<0c5A=u{w4X1Z@6(P@v>Pg16WP{Xynu_^_)>W^;KD7 zaX;%kF0)zA@i=qZD!;_9o5IwpR(B-LfkZ&zxcIOn+`smAb*Hzpqf4lY$3^Lll94=$ zb}n;HMqgafMJe$yXG&+gp5m7;559N_n?&0FMnOdlGT2Kk3)`<(mqJQG9|lCC;&cSj zD4Z-8X}o?LKD~Fl=~0&+ynMs4Y((mI!t|mFL;_l8AQCMVH>;b1b2Gw@MX2-Q&bz{b zr9xl>zbFu8Bs!5X{Zxnq1ai)6iVWBG8NRC{e|FLLtLweI;_Gd-d&^h~v<^Kh)#+{0`yg}$|hEe7YB zXjEClRgr)ajQ3k-(##-x_ok~|wf^$;`)!z}+a4G78o*~NcQT(JKDjEA7Y_C#)BC#D zP4wm9$7fLqj#iBwQF$7$qsnIYZ2WCqC`1DL(Xeb73-P+a=2p~#fq8$NP^hcMylHJX zq3VJgpF|OMbN=$}2n(Yktf9w^TOCe z^4d}Y&AA-RdHGA1+9@|0^IIHdAf>&(R$4vBW4;HiGu~|%o4ZGTIe&U^OfY$Qw5R`z zk<=Y({DaynvP_AFu;fYYW~Q14S~vG@op67joXMEXodj=C`>Mk6P``Pd` z`KMUtUBZKzn(8CRmAN)OZu7-mpd|{0nW8~$`t`Wl$N{X|D*|d%J#DE2< zxAYprEQBULE!P)m>6V<054?XmBYKEkCgfUFni%WAv{xS$f|P;+4>AQ%f`gRSeGpPT z9eCHPEwJvg=oyFMTBW;_n)B}5Jr+4?X)i_=Lz~Brwl?MkJv7#+6tYmO_}=tzFeeABEuW?v;IT7yeb#q_IJ|9lWVr64tl#6BCsH9(HQYeSmRoo=? 
zM4+Kcdd<6IubNL{kU})Oz%$R%6}AmSAfN;SvM&e(w6{SY1_D_^iO$;^3~|g8H=>Ua zohOXf^dHBaT5~O8_D$*!e0pDucvcAat6KPeH)L*l+Yj9y!$VcA9w@Ac52u_pg1G^K zfD*u&?wV7Y+bbhvO`c1-S1)Eqx1A-iQa5`0=chy))%C4dHYl@$eE*zsyACm!JRb2D z9>Ym98{idVjA=P4({<(QbLnGFzw(u0;q6PU1;XRJ*gY40yp!%2t3+MBcwR(GTQ7AbWvGcw<=ePAo5nn|b_#)H`z8nn%8KDq~i5=8R9Lf$`# zt2tb~B2WoY2BV!-@tjIP8T;K;Z&~V-4Nqys1}1rK-=yW8n@0Kg+$ILa_1V`Gz+u8zJC+^Npf8DIj`6$%B%q7#(WY*7^s#4 zmGck?D5-)xYE{#{_x>yip5atdb=wy8sO`mObg-9$q$Ts|R)${=O2BQR%mm|$P};e$ z;3S-))EU6Ct==}_ah0S#zV=!5=g%3BA(91zN5a}SUCSxG=6UKBqY`&bZj@nhTV*2A zv23NC=O~~e7$gEB0Uazr6-?zaHS|J4saoL6tGsV}mhV||#oXQ}#yrxgcwByI=jH)E zi6U&Ef4C${S9e)VRrd6!dlB=`Wj@6UMi?5rw3So#LL_Vf*$bo;v?D<6AR}BA2`J{E zx&R^p86AkEv4oyjka;28yh-8rw}pi+Spt9jX!ur`R<(mMH|tnnzp9yl&S}paGAwM4 zo;OkCR$GdnrZXo>F+9d6vHtmnj-1Ln!>ter=<9=K0;(z@641fp>zrcyLf-gEuI`V| zqD@Un-!;cxT5hO6CV*-AU?|&$1LYow1mxDBaDa*%hy=9GK}#6q!F&=$7-$KC&fX9R zsO|vS8w3K%a-g&T#S9b^&|wc$Qb0;UMw0grk0g3 pf9qmCD?UKhdSQ}UH#0LzmpwlRYF|}ML4TfsN?(WsWa%I`{}U-3%SHeI diff --git a/test/fixtures/notQ.arboricx b/test/fixtures/notQ.arboricx index a91ad48502f7a6574045e86820c4f14d63915c45..82c55a0dc8768d01e63a867cd2d36f864e3e415b 100644 GIT binary patch delta 239 zcmbQr@rHSVwj%=+BtU3JCn%qZfe{GdfN>F&=>?)4gPeT*+=4?uY9=m;v|!K7OG&Lr zO#wlZ_Yy}AQz nFR=$nO}@!!2XYkHF|05raWVmgU;xYk$$$jmGH?zvP?QA#wxuG( literal 1045 zcmZ<^a`F%IbdF$PWB>ssFo_@(fJ{a(1kn&M141#DA2~AnQ%Hckwxg%&=93@%a%#QY zTosqEXP&Uk_|vj^TR#ESF`?>Z;)m#EV%sNs@O?;@Ij^&)=aJY{y@&5#Fs!NL*pOxR z=lt}mOE>;;407`Ia|;fE+A5t`l$2kTnOvclRGOEPld6}Sn3tKBT3n)6W(YJ=zN9EM zRW~^?C%H7Iv{*N_A~m_RBrz!`RX0B=D>bREQ!ebg6>G%AEYf6o@P~V|nuOi*gflbn}Z+Qj36wi@;4tOv=;)xdWmXC-181RCPVM98N7uesFP3O1&lFW5hozdhjbKIM%s zwSOhNvT~iHKKaCxzeNW>Y~09w;f~0e;t7gD@7q4KHlLE65$Blm{#^RHR<}c7O9;7? ekR^ol11(`>AR2K3O5XIMRN?It;LV-eoma|+o_kAXXLg|5&@_~=|*h)c;ja(a*e}})6H{-mG zBY~sMy!pNH&K^mp*ZtDlSo>x?BhQ3>kG@tQJ^hT>_gZ@0A(|NXy+1XdEF^Y%-R+(A z-LJ!%rLc(dq8}dwQI#dB4F+M>-?RH=@asORm&?LhKMvElN~?X}9@w}l!zi_W9{sR! 
ziITaolKQoFKTA-Xz+Q9>hAhbP#MWh_=JbAsztcQSYWrU^&t`cs2va{V5-a;zaMOoT zKai!=@4}LUGG-g2o!GswN@cZzT1B3V7OPd-Vu1L+c|0=(cbZ+G`RqXG9Zl6YWsC|* zX5`j*b4n*kTM&{cm8n%6qK#A`5HhdU9wud5?KTT4kCq&TP%7jY>G38NM{DOP zWWD;*35b(S5X}OH)}Crox1ck$7io()%j7h~ISid)iS#^#wAPMu5Eqyrnu{1(zf3BN zlE-c95)v)BOe$T4YRMIds~B2xjr2N%Tkb0D8?+J4P10MWw;|kS?~o!IcNIR8fX)iU zT_%X;9+M*w@~?86cz{G#_>kgJ(|%0)gyLz_en$G7R8C2LVLInaQaKCRfNT4jC2uI+ zHtlz$9nvm@P@O?GyT;@d#Cs+$AU-fbH0v0;mXD;LAjS@7gN_`|XAG_1B;A6T{|lL3 BZeIWZ literal 0 HcmV?d00001 diff --git a/test/fixtures/true.arboricx b/test/fixtures/true.arboricx index a87c11d3fb858d15deff37f7258f90b15f3b592f..8ec868fee4a38cd9f599f3e1bd9bbc573f147501 100644 GIT binary patch delta 187 zcmZo<-N8IT+mQhZ5+F3A6O_-yzzBqJz_ktlBn`?SDW4b8X>apngoS*9$qbU%`wQy z*Uv3D1gcXyv1npZuqj({Mxv3a8PE{%#G<79qRiw9z1-BI?3`4+y!@0@y)r|floXoO z#J>Rw($2qYJYCa2