Switch manifest serialization to CBOR

Replace JSON-based bundle manifest with a CBOR-encoded format. The manifest
is now a canonical CBOR map with order-strict key decoding, raw 32-byte hash
payloads (instead of hex-encoded JSON), and compact binary representation.
2026-05-07 21:41:50 -05:00
parent d9f25a2b5a
commit e3117e3ac8
23 changed files with 988 additions and 275 deletions

View File

@@ -5,7 +5,9 @@
## 1. Build & Test ## 1. Build & Test
```bash ```bash
# Full build + tests # Tests
nix flake check
# Full build
nix build .# nix build .#
``` ```

View File

@@ -0,0 +1,339 @@
# Arborix Portable Bundle v1 (CBOR Manifest Profile)
Status: **Draft, implementation-aligned** (derived from `src/Wire.hs` as of 2026-05-07)
This document specifies the **actual on-wire format and validation behavior** currently implemented by `tricu` for Arborix bundles, with a focus on the newer CBOR manifest path.
---
## 1. Scope
This profile defines:
1. The binary container envelope (header + section directory + section payloads).
2. The CBOR manifest section format.
3. The Merkle node section format.
4. Decode/verify/import behavior in `Wire.hs`.
5. Known gaps and sane resolutions.
Non-goals:
- tricu source parsing/lambda elimination/module semantics.
- Signature systems / trust policy.
- Compression codecs beyond `none`.
---
## 2. Container format
A bundle is a byte stream:
```
[32-byte header]
[section directory: section_count * 60 bytes]
[section payload bytes...]
```
### 2.1 Header (32 bytes)
| Field | Size | Encoding | Value / Notes |
|---|---:|---|---|
| Magic | 8 | raw bytes | `41 52 42 4f 52 49 58 00` (`"ARBORIX\0"`) |
| Major | 2 | u16 BE | Must be `1` |
| Minor | 2 | u16 BE | Currently `0` |
| SectionCount | 4 | u32 BE | Number of section directory entries |
| Flags | 8 | u64 BE | Currently emitted as `0`; not interpreted |
| DirectoryOffset | 8 | u64 BE | Offset of section directory (currently `32`) |
Reader behavior:
- Reject if total bytes < 32.
- Reject bad magic.
- Reject major != 1.
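A minimal header-parsing sketch (Node.js; names here are illustrative, not the actual `ext/js` API):
```js
import { strict as assert } from "node:assert";

const MAGIC = Buffer.from("ARBORIX\0", "latin1");

// Parse and validate the fixed 32-byte header described above.
function parseHeader(buf) {
  assert.ok(buf.length >= 32, "bundle truncated before header end");
  assert.ok(buf.subarray(0, 8).equals(MAGIC), "bad magic");
  const major = buf.readUInt16BE(8);
  assert.equal(major, 1, "unsupported bundle major version");
  return {
    major,
    minor: buf.readUInt16BE(10),
    sectionCount: buf.readUInt32BE(12),
    flags: buf.readBigUInt64BE(16),
    directoryOffset: Number(buf.readBigUInt64BE(24)),
  };
}
```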
### 2.2 Section directory entry (60 bytes each)
| Field | Size | Encoding | Notes |
|---|---:|---|---|
| Type | 4 | u32 BE | e.g. 1=manifest, 2=nodes |
| Version | 2 | u16 BE | Currently emitted as `1`; not enforced on read |
| Flags | 2 | u16 BE | bit0 = critical |
| Compression | 2 | u16 BE | `0` = none (required) |
| DigestAlgorithm | 2 | u16 BE | `1` = SHA-256 (required) |
| Offset | 8 | u64 BE | Absolute byte offset |
| Length | 8 | u64 BE | Section payload length |
| Digest | 32 | raw bytes | SHA-256 of section bytes |
Reader behavior:
- Reject unknown **critical** section types.
- Reject compression != 0.
- Reject digest algorithm != 1.
- Reject out-of-bounds sections.
- Reject digest mismatch.
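The same checks as an illustrative sketch (field offsets per the table above; not the actual `ext/js` API):
```js
import { createHash } from "node:crypto";

// Decode the 60-byte directory entry at byte offset `off` and verify its
// payload digest against the whole bundle buffer `buf`.
function readSectionEntry(buf, off) {
  const entry = {
    type: buf.readUInt32BE(off),
    version: buf.readUInt16BE(off + 4),
    flags: buf.readUInt16BE(off + 6),
    compression: buf.readUInt16BE(off + 8),
    digestAlgorithm: buf.readUInt16BE(off + 10),
    offset: Number(buf.readBigUInt64BE(off + 12)),
    length: Number(buf.readBigUInt64BE(off + 20)),
    digest: buf.subarray(off + 28, off + 60),
  };
  if (entry.compression !== 0) throw new Error("unsupported compression");
  if (entry.digestAlgorithm !== 1) throw new Error("unsupported digest algorithm");
  if (entry.offset + entry.length > buf.length) throw new Error("section out of bounds");
  const payload = buf.subarray(entry.offset, entry.offset + entry.length);
  if (!createHash("sha256").update(payload).digest().equals(entry.digest)) {
    throw new Error("section digest mismatch");
  }
  return { ...entry, payload };
}
```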
### 2.3 Required section types
| Type | Name | Required |
|---:|---|---|
| 1 | manifest | yes |
| 2 | nodes | yes |
The decoder currently rejects duplicate sections of type 1 or 2.
---
## 3. Manifest section (CBOR)
The manifest section payload is a CBOR map encoded with `cborg`.
### 3.1 Top-level manifest schema
The top-level map has **exactly 8 keys**, which the current decoder requires in exactly this order:
1. `schema` (text)
2. `bundleType` (text)
3. `tree` (map)
4. `runtime` (map)
5. `closure` (text: `"complete"|"partial"`)
6. `roots` (array)
7. `exports` (array)
8. `metadata` (map)
> Important: the current decoder is order-strict; a semantically identical map with a different key order fails to decode (see Gap B).
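As a sketch of the convention (mirroring `decodeKey` in `Wire.hs`; the `nextTextValue` cursor here is hypothetical):
```js
// An order-strict reader consumes the keys of a map in the fixed schema
// order and fails on any deviation, rather than decoding a generic map.
function expectKey(nextTextValue, expected) {
  const actual = nextTextValue();
  if (actual !== expected) {
    throw new Error(`expected key ${JSON.stringify(expected)}, got ${JSON.stringify(actual)}`);
  }
}
```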
### 3.2 Nested structures
#### `tree` map (3 keys, order-strict)
- `calculus`: text
- `nodeHash`: map
- `nodePayload`: text
`nodeHash` map (2 keys, order-strict):
- `algorithm`: text
- `domain`: text
#### `runtime` map (4 keys, order-strict)
- `semantics`: text
- `evaluation`: text
- `abi`: text
- `capabilities`: array(text)
#### `roots` array of maps
Each root map has 2 keys (order-strict):
- `hash`: bytes (raw 32-byte hash payload encoded as CBOR byte string)
- `role`: text
#### `exports` array of maps
Each export map has 4 keys (order-strict):
- `name`: text
- `root`: bytes (32-byte hash)
- `kind`: text
- `abi`: text
#### `metadata` map
Flexible key set; decoded as map(text -> text), then projected into optional fields:
- `package`
- `version`
- `description`
- `license`
- `createdBy`
Unknown metadata keys are ignored.
### 3.3 Default emitted manifest values
Writers in `Wire.hs` currently emit:
- `schema = "arborix.bundle.manifest.v1"`
- `bundleType = "tree-calculus-executable-object"`
- `tree.calculus = "tree-calculus.v1"`
- `tree.nodeHash.algorithm = "sha256"`
- `tree.nodeHash.domain = "arborix.merkle.node.v1"`
- `tree.nodePayload = "arborix.merkle.payload.v1"`
- `runtime.semantics = "tree-calculus.v1"`
- `runtime.evaluation = "normal-order"`
- `runtime.abi = "arborix.abi.tree.v1"`
- `runtime.capabilities = []`
- `closure = "complete"`
- `metadata.createdBy = "arborix"`
---
## 4. Nodes section (binary)
Node section payload layout:
```
node_count: u64 BE
repeat node_count times:
hash: 32 bytes
payload_len: u32 BE
payload: payload_len bytes
```
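A sketch of the section walk, enforcing the reject rules listed after the grammar below (illustrative; the real parsers live in `merkle.js` and `Wire.hs`):
```js
// Parse a digest-verified nodes-section payload into a hex-hash -> payload map.
function walkNodeSection(data) {
  const count = Number(data.readBigUInt64BE(0));
  const nodes = new Map();
  let off = 8;
  for (let i = 0; i < count; i++) {
    if (off + 36 > data.length) throw new Error("truncated node entry");
    const hash = data.subarray(off, off + 32).toString("hex");
    const len = data.readUInt32BE(off + 32);
    off += 36;
    if (off + len > data.length) throw new Error("node payload overrun");
    if (nodes.has(hash)) throw new Error(`duplicate node hash ${hash}`);
    nodes.set(hash, data.subarray(off, off + len));
    off += len;
  }
  if (off !== data.length) throw new Error("trailing bytes after final node");
  return nodes;
}
```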
Node payload grammar:
- `0x00` => Leaf
- `0x01 || child_hash(32)` => Stem
- `0x02 || left_hash(32) || right_hash(32)` => Fork
Section decoder rejects:
- duplicate node hashes,
- truncated entries,
- payload overruns,
- trailing bytes after final node.
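A matching sketch of payload deserialization per the grammar above (tags and lengths are fixed, so malformed payloads are cheap to detect):
```js
// Decode one node payload into a tagged record with 32-byte child hashes.
function decodePayload(payload) {
  if (payload.length === 0) throw new Error("empty node payload");
  switch (payload[0]) {
    case 0x00:
      if (payload.length !== 1) throw new Error("malformed Leaf payload");
      return { type: "leaf" };
    case 0x01:
      if (payload.length !== 33) throw new Error("malformed Stem payload");
      return { type: "stem", child: payload.subarray(1, 33) };
    case 0x02:
      if (payload.length !== 65) throw new Error("malformed Fork payload");
      return { type: "fork", left: payload.subarray(1, 33), right: payload.subarray(33, 65) };
    default:
      throw new Error(`unknown node tag 0x${payload[0].toString(16)}`);
  }
}
```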
---
## 5. Verification behavior (`verifyBundle`)
`verifyBundle` enforces all of:
1. bundle version >= 1.
2. bundle has at least one node.
3. manifest constants match hardcoded v1 values (schema/type/calculus/hash algo/domain/payload/runtime semantics/ABI).
4. runtime capabilities must be empty.
5. closure must be `complete`.
6. manifest has at least one root and one export.
7. root sets in `bundleRoots` and `manifest.roots` must match exactly.
8. each root and export root exists in node map.
9. each node payload deserializes and re-hashes to declared node hash.
10. all referenced child hashes exist.
11. full closure reachability from roots succeeds.
`importBundle` runs decode + verify before storing nodes.
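Checks 10 and 11 amount to a graph walk; a sketch, where `nodes` maps a hex hash to a decoded payload record (see the `decodePayload` sketch in section 4):
```js
// Depth-first reachability from the declared roots; throws on any missing node.
function checkRootClosure(nodes, rootHexHashes) {
  const seen = new Set();
  const stack = [...rootHexHashes];
  while (stack.length > 0) {
    const h = stack.pop();
    if (seen.has(h)) continue;
    seen.add(h);
    const node = nodes.get(h);
    if (!node) throw new Error(`missing node ${h}`);
    if (node.type === "stem") stack.push(node.child.toString("hex"));
    if (node.type === "fork") stack.push(node.left.toString("hex"), node.right.toString("hex"));
  }
  return seen;
}
```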
---
## 6. Export/import semantics
### 6.1 Export
`exportNamedBundle`:
- Traverses reachable nodes for each requested root hash.
- Builds node map.
- Builds default manifest and CBOR bytes.
- Emits two sections (manifest, nodes).
`exportBundle` auto-names exports:
- 1 root => `root`
- N>1 => `root0`, `root1`, ...
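The naming rule is small enough to state as code (a sketch of the behavior of `defaultExportNames` in `Wire.hs`):
```js
// One root exports as "root"; several export as "root0", "root1", ...
function defaultExportNames(n) {
  return n === 1 ? ["root"] : Array.from({ length: n }, (_, i) => `root${i}`);
}
```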
### 6.2 Import
`importBundle`:
1. Decode bundle.
2. Verify bundle.
3. Insert all node payloads into content store.
4. For each manifest export: reconstruct tree by export root and store name binding in DB.
5. Return bundle root list.
---
## 7. Determinism properties
The current implementation is deterministic for identical logical input because:
- Node map serialized in ascending hash order (`Map.toAscList`).
- Field order in manifest encoding is fixed by code.
- Section ordering is fixed: manifest then nodes.
Repeated exports of the same roots therefore produce byte-identical bundles.
---
## 8. Known gaps and sane resolutions
These are design gaps visible in the current code, each paired with a suggested resolution.
### Gap A: Node hash domain mismatch risk (critical)
Status: **resolved in current codebase**.
What was wrong:
- Manifest declared `tree.nodeHash.domain = "arborix.merkle.node.v1"`.
- Hashing implementation previously used `"tricu.merkle.node.v1"`.
Current state:
- Haskell hashing now uses `"arborix.merkle.node.v1"`.
- JS reference runtime hashing now uses `"arborix.merkle.node.v1"`.
- JS manifest validation now requires `"arborix.merkle.node.v1"`.
Remaining recommendation:
- Keep hash-domain constants centralized/shared to prevent future drift.
- Add explicit test vectors for Leaf/Stem/Fork hashes under the Arborix domain.
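The shared rule is a one-liner in either runtime; a sketch of a vector generator (assuming the hash rule from `merkle.js`/`Merkle.hs`: `SHA256(domain || 0x00 || payload)`):
```js
import { createHash } from "node:crypto";

// Domain-separated node hash. For example, nodeHashHex(Buffer.from([0x00]))
// should reproduce the Leaf vector asserted in the JS test suite.
function nodeHashHex(payload) {
  return createHash("sha256")
    .update(Buffer.from("arborix.merkle.node.v1", "utf-8"))
    .update(Buffer.from([0x00]))
    .update(payload)
    .digest("hex");
}
```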
### Gap B: CBOR decode is order-strict, not generic-map tolerant
Observed:
- Decoder expects exact key order for most maps.
Impact:
- Another canonical CBOR writer that reorders keys may decode-fail even if semantically equivalent.
Sane resolution:
- For v1 compatibility, decode maps as unordered key/value collections, require key presence and types, and reject unknown keys only where desired.
- Keep writer deterministic, but relax reader.
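A sketch of such a reader, built on the generic decoder already in `ext/js/src/cbor.js` (which decodes maps into plain objects regardless of key order):
```js
import { decodeCbor } from "./cbor.js";

// Decode generically, then check presence by name instead of by position.
function readManifestTolerant(bytes) {
  const manifest = decodeCbor(bytes);
  const required = ["schema", "bundleType", "tree", "runtime",
                    "closure", "roots", "exports", "metadata"];
  for (const key of required) {
    if (!(key in manifest)) throw new Error(`manifest missing key: ${key}`);
  }
  return manifest;
}
```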
### Gap C: “Canonical CBOR” claim is stronger than implementation
Observed:
- Writer uses fixed order but does not explicitly sort keys per RFC 8949 canonical ordering rules.
Sane resolution:
- Either (a) relabel the profile “deterministic CBOR”, or (b) implement explicit canonical key ordering and minimal-length encoding checks.
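For option (b), RFC 8949's deterministic-encoding rule sorts map keys by bytewise lexicographic order of their encoded form (unlike RFC 7049's length-first “canonical CBOR” rule). A sketch:
```js
// Sort already-encoded CBOR keys into RFC 8949 deterministic order.
function sortEncodedKeys(encodedKeys) {
  return [...encodedKeys].sort(Buffer.compare);
}
```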
### Gap D: Extra section preservation
Observed:
- Decoder tolerates unknown non-critical sections, but `Bundle` model/encoder drops them on re-encode.
Sane resolution:
- Add a `bundleExtraSections` field carrying each extra section's entry plus raw payload bytes if round-trip preservation is desired.
### Gap E: Section version not enforced
Observed:
- Section entry `Version` is parsed but unused.
Sane resolution:
- Enforce known version matrix (e.g., manifest v1, nodes v1), or explicitly document “advisory only”.
### Gap F: Runtime capability policy is hard fail
Observed:
- Any non-empty capabilities list is rejected.
Sane resolution:
- Keep strict for now, but define capability negotiation strategy for v1.1+ (unknown capabilities => reject unless explicitly allowed by host policy).
### Gap G: Error handling style in import/export path
Observed:
- Several paths throw `error` for malformed data/store misses.
Sane resolution:
- Return `Either`-style typed errors through public API (`decode`, `verify`, `import`), reserve exceptions for truly internal faults.
---
## 9. Conformance checklist (v1 current)
A conforming v1 reader/writer for this profile should:
- Implement the 32-byte header and 60-byte section records exactly.
- Support required sections 1 and 2.
- Verify section digests with SHA-256.
- Decode/encode manifest CBOR matching the field model above.
- Parse nodes section and validate node payload structure.
- Recompute and verify node hashes.
- Enforce complete closure for roots.
- Enforce manifest/runtime constants used by v1.
---
## 10. Suggested follow-up docs
To stabilize interoperability, add:
1. `docs/arborix-bundle-test-vectors.md` (golden header/manifest/nodes + expected hashes).
2. `docs/arborix-bundle-errors.md` (normative error codes/strings).
3. `docs/arborix-bundle-evolution.md` (rules for minor/major upgrades, capability negotiation, extra sections).

View File

@@ -1,49 +0,0 @@
1. Scope
This profile defines the minimum required behavior for runtimes that execute tricu bundles.
2. Non-goals
No tricu source parsing.
No lambda elimination.
No module system.
No package manager.
No local DB requirement.
No authoring names beyond bundle exports.
3. Required bundle sections
Header
Manifest/exports
Merkle nodes
4. Optional/skippable sections
Source, debug, package metadata, signatures, provenance, etc.
5. Entrypoint selection
Explicit export name first.
Else export named main.
Else single default root.
Else error.
6. Node payload format
Leaf/Stem/Fork byte layouts.
7. Hash verification
Domain string and payload hashing rules.
8. Closure verification
All referenced child hashes must exist.
9. Runtime representation
Suggested JS representation, but not normative.
10. Reduction semantics
The six Tree Calculus apply rules.
11. Codecs for v1
Raw tree required.
Maybe string/bool optional or experimental.
12. Required error cases
Bad magic/version, missing export, hash mismatch, malformed payload, missing child.
13. Test fixtures
List of bundles the implementation must pass.

View File

@@ -18,9 +18,12 @@
* Offset 8B u64 BE * Offset 8B u64 BE
* Length 8B u64 BE * Length 8B u64 BE
* SHA256Digest 32B raw * SHA256Digest 32B raw
* Manifest: canonical CBOR-encoded map (cborg output from Haskell)
* Nodes: binary section
*/ */
import { createHash } from "node:crypto"; import { createHash } from "node:crypto";
import { decodeCbor } from "./cbor.js";
// ── Constants ─────────────────────────────────────────────────────────────── // ── Constants ───────────────────────────────────────────────────────────────
@@ -170,12 +173,37 @@ export function parseBundle(buffer) {
} }
/** /**
* Convenience: parse and return just the manifest JSON. * Post-process a CBOR-decoded manifest to normalize hash fields
* from raw bytes to hex strings (matching the old JSON wire format).
*/
function normalizeManifest(raw) {
// Convert root hashes from raw bytes to hex
const roots = (raw.roots || []).map((r) => ({
...r,
hash: r.hash instanceof Uint8Array ? Buffer.from(r.hash).toString("hex") : r.hash,
}));
// Convert export root hashes from raw bytes to hex
const exports = (raw.exports || []).map((e) => ({
...e,
root: e.root instanceof Uint8Array ? Buffer.from(e.root).toString("hex") : e.root,
}));
return { ...raw, roots, exports };
}
/**
* Convenience: parse and return the manifest from CBOR.
*/ */
export function parseManifest(buffer) { export function parseManifest(buffer) {
const bundle = parseBundle(buffer); const bundle = parseBundle(buffer);
const manifestEntry = bundle.sections.get(SECTION_MANIFEST); const manifestEntry = bundle.sections.get(SECTION_MANIFEST);
return JSON.parse(manifestEntry.data.toString("utf-8")); return normalizeManifest(decodeCbor(manifestEntry.data));
} }
/** /**

ext/js/src/cbor.js (new file, 130 lines)
View File

@@ -0,0 +1,130 @@
/**
* cbor.js — Minimal CBOR decoder for the Arborix manifest format.
*
* Decodes the canonical CBOR produced by the Haskell cborg library:
* - Maps: major type 5 (0xa0 + length)
* - Arrays: major type 4 (0x80 + length)
* - Text strings: major type 3, UTF-8 encoded
* - Byte strings: major type 2
* - Unsigned ints: major type 0
* - Simple values: 0xf4 = false, 0xf5 = true, 0xf6 = null
*
* Only covers the subset needed for the manifest.
*/
// ── Decoding state ──────────────────────────────────────────────────────────
/**
* @param {Buffer} data
* @returns {object} stateful decoder with read helpers
*/
function makeDecoder(data) {
let offset = 0;
return {
/** @returns {number} current offset */
getPos() { return offset; },
/** @returns {number} remaining bytes */
remaining() { return data.length - offset; },
/** @returns {number} total length */
length() { return data.length; },
/** Read N bytes and advance */
read(n) {
if (offset + n > data.length) {
throw new Error(`CBOR read: expected ${n} bytes, ${data.length - offset} remaining at offset ${offset}`);
}
const slice = data.slice(offset, offset + n);
offset += n;
return slice;
},
/** Read a single byte */
readByte() {
if (offset >= data.length) {
throw new Error(`CBOR readByte: no bytes remaining at offset ${offset}`);
}
return data[offset++];
},
};
}
// ── CBOR helpers ────────────────────────────────────────────────────────────
/**
* Read a CBOR argument: additional info < 24 encodes the value directly;
* 24/25/26 read 1/2/4 following big-endian bytes. 64-bit arguments (27) are unsupported.
* @returns {number}
*/
function cborReadLength(dec, startByte) {
const additional = startByte & 0x1f;
if (additional < 24) return additional;
if (additional === 24) return dec.read(1)[0];
if (additional === 25) return dec.read(2).readUint16BE(0);
if (additional === 26) return dec.read(4).readUint32BE(0);
throw new Error(`CBOR: unsupported additional info ${additional}`);
}
// ── Top-level decode ────────────────────────────────────────────────────────
/**
* Decode a single CBOR value from buffer bytes.
* @param {Buffer} buf
* @returns {*}
*/
export function decodeCbor(buf) {
const dec = makeDecoder(buf);
const result = cborDecode(dec);
return result;
}
function cborDecode(dec) {
const first = dec.readByte();
const major = (first >> 5) & 0x07;
const info = first & 0x1f;
switch (major) {
case 0: // unsigned int
return cborReadLength(dec, first);
case 1: // negative int, encoded as -1 - n
return -1 - cborReadLength(dec, first);
case 2: // byte string
return dec.read(cborReadLength(dec, first));
case 3: { // text string (UTF-8)
const len = cborReadLength(dec, first);
return dec.read(len).toString("utf-8");
}
case 4: { // array
const arrLen = cborReadLength(dec, first);
const arr = [];
for (let i = 0; i < arrLen; i++) {
arr.push(cborDecode(dec));
}
return arr;
}
case 5: { // map
const mapLen = cborReadLength(dec, first);
const map = {};
for (let i = 0; i < mapLen; i++) {
const key = cborDecode(dec);
const val = cborDecode(dec);
map[key] = val;
}
return map;
}
case 7: // simple values / floats
if (info === 20) return false;
if (info === 21) return true;
if (info === 22) return null; // null
if (info === 23) return null; // undefined (mapped to null)
// 0xf9-fb are half/float/double floats — not used by our writer
throw new Error(`CBOR: unsupported simple value ${info}`);
default:
// Tags (major 6) and break (0xff) — not used in our manifest
throw new Error(`CBOR: unsupported major type ${major}, info ${info}`);
}
}

View File

@@ -33,7 +33,7 @@ export function validateManifest(manifest) {
`unsupported node hash algorithm: ${tree.nodeHash.algorithm}` `unsupported node hash algorithm: ${tree.nodeHash.algorithm}`
); );
} }
if (tree.nodeHash.domain !== "tricu.merkle.node.v1" && tree.nodeHash.domain !== "arborix.merkle.node.v1") { if (tree.nodeHash.domain !== "arborix.merkle.node.v1") {
throw new Error( throw new Error(
`unsupported node hash domain: ${tree.nodeHash.domain}` `unsupported node hash domain: ${tree.nodeHash.domain}`
); );

View File

@@ -7,14 +7,14 @@
* Fork: 0x02 || left_hash (32 bytes raw) || right_hash (32 bytes raw) * Fork: 0x02 || left_hash (32 bytes raw) || right_hash (32 bytes raw)
* *
* Hash computation: * Hash computation:
* hash = SHA256( "tricu.merkle.node.v1" || 0x00 || node_payload ) * hash = SHA256( "arborix.merkle.node.v1" || 0x00 || node_payload )
*/ */
import { createHash } from "node:crypto"; import { createHash } from "node:crypto";
// ── Constants ─────────────────────────────────────────────────────────────── // ── Constants ───────────────────────────────────────────────────────────────
const DOMAIN_TAG = "tricu.merkle.node.v1"; const DOMAIN_TAG = "arborix.merkle.node.v1";
const HASH_LENGTH = 32; // raw hash bytes const HASH_LENGTH = 32; // raw hash bytes
const HEX_LENGTH = 64; // hex-encoded hash length const HEX_LENGTH = 64; // hex-encoded hash length

View File

@@ -1,5 +1,6 @@
import { readFileSync } from "node:fs"; import { readFileSync } from "node:fs";
import { strictEqual, ok, throws } from "node:assert"; import { strictEqual, ok, throws } from "node:assert";
import { createHash } from "node:crypto";
import { describe, it } from "node:test"; import { describe, it } from "node:test";
import { import {
parseBundle, parseBundle,
@@ -13,12 +14,12 @@ import {
parseNodeSection as parseNodes, parseNodeSection as parseNodes,
} from "../src/merkle.js"; } from "../src/merkle.js";
const fixtureDir = "test/fixtures"; const fixtureDir = "../../test/fixtures";
describe("bundle parsing", () => { describe("bundle parsing", () => {
it("valid bundle parses header and sections", () => { it("valid bundle parses header and sections", () => {
const bundle = parseBundle( const bundle = parseBundle(
readFileSync(`${fixtureDir}/id.tri.bundle`) readFileSync(`${fixtureDir}/id.arborix`)
); );
strictEqual(bundle.version, "1.0"); strictEqual(bundle.version, "1.0");
strictEqual(bundle.sectionCount, 2); strictEqual(bundle.sectionCount, 2);
@@ -26,15 +27,16 @@ describe("bundle parsing", () => {
ok(bundle.sections.has(2)); // nodes ok(bundle.sections.has(2)); // nodes
}); });
it("parseManifest returns valid JSON", () => { it("parseManifest returns valid manifest", () => {
const manifest = parseManifest( const manifest = parseManifest(
readFileSync(`${fixtureDir}/id.tri.bundle`) readFileSync(`${fixtureDir}/id.arborix`)
); );
strictEqual(manifest.schema, "arborix.bundle.manifest.v1"); strictEqual(manifest.schema, "arborix.bundle.manifest.v1");
strictEqual(manifest.bundleType, "tree-calculus-executable-object"); strictEqual(manifest.bundleType, "tree-calculus-executable-object");
strictEqual(manifest.closure, "complete"); strictEqual(manifest.closure, "complete");
strictEqual(manifest.tree.calculus, "tree-calculus.v1"); strictEqual(manifest.tree.calculus, "tree-calculus.v1");
strictEqual(manifest.tree.nodeHash.algorithm, "sha256"); strictEqual(manifest.tree.nodeHash.algorithm, "sha256");
strictEqual(manifest.tree.nodeHash.domain, "arborix.merkle.node.v1");
strictEqual(manifest.runtime.semantics, "tree-calculus.v1"); strictEqual(manifest.runtime.semantics, "tree-calculus.v1");
strictEqual(manifest.runtime.abi, "arborix.abi.tree.v1"); strictEqual(manifest.runtime.abi, "arborix.abi.tree.v1");
}); });
@@ -43,7 +45,7 @@ describe("bundle parsing", () => {
describe("hash verification", () => { describe("hash verification", () => {
it("valid bundle nodes verify", () => { it("valid bundle nodes verify", () => {
const data = bundleParseNodeSection( const data = bundleParseNodeSection(
readFileSync(`${fixtureDir}/id.tri.bundle`) readFileSync(`${fixtureDir}/id.arborix`)
); );
const { nodeMap } = parseNodes(data); const { nodeMap } = parseNodes(data);
const { verified } = verifyNodeHashes(nodeMap); const { verified } = verifyNodeHashes(nodeMap);
@@ -64,4 +66,69 @@ describe("errors", () => {
buf.writeUInt16BE(2, 8); // major version 2 buf.writeUInt16BE(2, 8); // major version 2
throws(() => parseBundle(buf), /unsupported bundle major version/); throws(() => parseBundle(buf), /unsupported bundle major version/);
}); });
it("bad section digest fails", () => {
const buf = readFileSync(`${fixtureDir}/id.arborix`);
// Corrupt one byte in the manifest section
buf[152] ^= 0x01;
throws(() => parseBundle(buf), /digest mismatch/);
});
it("truncated bundle fails", () => {
const buf = readFileSync(`${fixtureDir}/id.arborix`);
const truncated = buf.slice(0, 40);
throws(() => parseBundle(truncated), /truncated/);
});
it("missing nodes section fails", () => {
// Build a bundle with only manifest entry in the directory (1 section instead of 2)
const header = Buffer.alloc(32, 0);
header.write("ARBORIX\0", 0, 8);
header.writeUInt16BE(1, 8); // major version
header.writeUInt16BE(0, 10); // minor version
header.writeUInt32BE(1, 12); // 1 section
// Build manifest bytes (JSON is fine here: parsing fails on the missing
// nodes section before the manifest is ever decoded)
const manifestObj = {
schema: "arborix.bundle.manifest.v1",
bundleType: "tree-calculus-executable-object",
tree: {
calculus: "tree-calculus.v1",
nodeHash: {
algorithm: "sha256",
domain: "arborix.merkle.node.v1"
},
nodePayload: "arborix.merkle.payload.v1"
},
runtime: {
semantics: "tree-calculus.v1",
evaluation: "normal-order",
abi: "arborix.abi.tree.v1",
capabilities: []
},
closure: "complete",
roots: [{ hash: Buffer.alloc(32).toString("hex"), role: "default" }],
exports: [{ name: "root", root: Buffer.alloc(32).toString("hex"), kind: "term", abi: "arborix.abi.tree.v1" }],
metadata: { createdBy: "arborix" }
};
const manifestJson = JSON.stringify(manifestObj);
const manifestBytes = Buffer.from(manifestJson);
// Section directory entry (60 bytes: u32 type, u16 version/flags/compression/digest-algo, u64 offset/length, 32-byte digest)
const entry = Buffer.alloc(60, 0);
entry.writeUInt32BE(1, 0); // type: manifest
entry.writeUInt16BE(1, 4); // version
entry.writeUInt16BE(1, 6); // flags: critical
entry.writeUInt16BE(0, 8); // compression: none
entry.writeUInt16BE(1, 10); // digest algorithm: sha256
entry.writeBigUInt64BE(BigInt(32 + 60), 12); // offset (u64)
entry.writeBigUInt64BE(BigInt(manifestBytes.length), 20); // length (u64)
entry.set(createHash("sha256").update(manifestBytes).digest(), 28); // digest (32 bytes)
// Set dirOffset to 32 so parseBundle reads directory from after header
header.writeBigUInt64BE(BigInt(32), 24);
const bundleBuf = Buffer.concat([header, entry, manifestBytes]);
throws(() => parseBundle(bundleBuf), /missing required section/);
});
}); });

View File

@@ -1,13 +1,14 @@
import { readFileSync } from "node:fs"; import { readFileSync } from "node:fs";
import { strictEqual, ok } from "node:assert"; import { strictEqual, ok } from "node:assert";
import { describe, it } from "node:test"; import { describe, it } from "node:test";
import { parseNodeSection } from "../src/bundle.js"; import { parseNodeSection as bundleParseNodeSection, parseBundle, parseManifest } from "../src/bundle.js";
import { import {
verifyNodeHashes, verifyNodeHashes,
verifyClosure, verifyClosure,
verifyRootClosure, verifyRootClosure,
deserializePayload, deserializePayload,
computeNodeHash, computeNodeHash,
parseNodeSection,
} from "../src/merkle.js"; } from "../src/merkle.js";
describe("merkle — deserializePayload", () => { describe("merkle — deserializePayload", () => {
@@ -49,46 +50,70 @@ describe("merkle — computeNodeHash", () => {
const hash = computeNodeHash(leaf); const hash = computeNodeHash(leaf);
strictEqual(hash.length, 64); strictEqual(hash.length, 64);
}); });
it("Leaf hash matches expected Arborix domain", () => {
const leaf = { type: "leaf" };
const hash = computeNodeHash(leaf);
strictEqual(hash, "e54db458aa8e94782f7c61ad6c1f19a1c0c6fca7ffe53674f0d2bc5ff7ab02ff");
});
}); });
describe("merkle — node section parsing", () => { describe("merkle — node section parsing", () => {
const fixtureDir = "test/fixtures"; const fixtureDir = "../../test/fixtures";
it("parses id.tri.bundle with correct node count", () => { it("parses id.arborix with correct node count", () => {
const data = parseNodeSection( const data = bundleParseNodeSection(
readFileSync(`${fixtureDir}/id.tri.bundle`) readFileSync(`${fixtureDir}/id.arborix`)
); );
const { nodeMap } = parseNodes(data); const { nodeMap } = parseNodeSection(data);
strictEqual(nodeMap.size, 4); strictEqual(nodeMap.size, 4);
}); });
it("parses true.tri.bundle with correct node count", () => { it("parses true.arborix with correct node count", () => {
const data = parseNodeSection( const data = bundleParseNodeSection(
readFileSync(`${fixtureDir}/true.tri.bundle`) readFileSync(`${fixtureDir}/true.arborix`)
); );
const { nodeMap } = parseNodes(data); const { nodeMap } = parseNodeSection(data);
strictEqual(nodeMap.size, 2); strictEqual(nodeMap.size, 2);
}); });
it("parses false.arborix with correct node count", () => {
const data = bundleParseNodeSection(
readFileSync(`${fixtureDir}/false.arborix`)
);
const { nodeMap } = parseNodeSection(data);
strictEqual(nodeMap.size, 1);
});
}); });
describe("merkle — hash verification", () => { describe("merkle — hash verification", () => {
const fixtureDir = "test/fixtures"; const fixtureDir = "../../test/fixtures";
it("id.tri.bundle nodes all verify", () => { it("id.arborix nodes all verify", () => {
const data = parseNodeSection( const data = bundleParseNodeSection(
readFileSync(`${fixtureDir}/id.tri.bundle`) readFileSync(`${fixtureDir}/id.arborix`)
); );
const { nodeMap } = parseNodes(data); const { nodeMap } = parseNodeSection(data);
const { verified, mismatches } = verifyNodeHashes(nodeMap); const { verified, mismatches } = verifyNodeHashes(nodeMap);
ok(verified, "id.tri.bundle node hashes should verify"); ok(verified, "id.arborix node hashes should verify");
strictEqual(mismatches.length, 0);
});
it("true.arborix nodes all verify", () => {
const data = bundleParseNodeSection(
readFileSync(`${fixtureDir}/true.arborix`)
);
const { nodeMap } = parseNodeSection(data);
const { verified, mismatches } = verifyNodeHashes(nodeMap);
ok(verified, "true.arborix node hashes should verify");
strictEqual(mismatches.length, 0); strictEqual(mismatches.length, 0);
}); });
it("corrupted node payload fails hash verification", () => { it("corrupted node payload fails hash verification", () => {
const data = parseNodeSection( const data = bundleParseNodeSection(
readFileSync(`${fixtureDir}/id.tri.bundle`) readFileSync(`${fixtureDir}/id.arborix`)
); );
const { nodeMap } = parseNodes(data); const { nodeMap } = parseNodeSection(data);
// Find a stem node to corrupt // Find a stem node to corrupt
let stemKey = null; let stemKey = null;
for (const [key, node] of nodeMap) { for (const [key, node] of nodeMap) {
@@ -110,32 +135,39 @@ describe("merkle — hash verification", () => {
}); });
describe("merkle — closure verification", () => { describe("merkle — closure verification", () => {
const fixtureDir = "test/fixtures"; const fixtureDir = "../../test/fixtures";
it("id.tri.bundle has complete closure", () => { it("id.arborix has complete closure", () => {
const data = parseNodeSection( const data = bundleParseNodeSection(
readFileSync(`${fixtureDir}/id.tri.bundle`) readFileSync(`${fixtureDir}/id.arborix`)
); );
const { nodeMap } = parseNodes(data); const { nodeMap } = parseNodeSection(data);
const { complete, missing } = verifyClosure(nodeMap); const { complete, missing } = verifyClosure(nodeMap);
ok(complete, "id.tri.bundle should have complete closure"); ok(complete, "id.arborix should have complete closure");
strictEqual(missing.length, 0); strictEqual(missing.length, 0);
}); });
it("verifyRootClosure checks transitive reachability", () => { it("verifyRootClosure checks transitive reachability", () => {
const data = parseNodeSection( const data = bundleParseNodeSection(
readFileSync(`${fixtureDir}/id.tri.bundle`) readFileSync(`${fixtureDir}/id.arborix`)
); );
const { nodeMap } = parseNodes(data); const { nodeMap } = parseNodeSection(data);
const rootHash = "039cc9aacf5be78ec1975713e6ad154a36988e3f3df18589b0d0c801d0825d78"; // Use the actual root hash from the fixture's manifest
const manifest = parseManifest(readFileSync(`${fixtureDir}/id.arborix`));
const rootHash = manifest.exports[0].root;
const { complete, missingRoots } = verifyRootClosure(nodeMap, rootHash); const { complete, missingRoots } = verifyRootClosure(nodeMap, rootHash);
ok(complete, "root should be reachable"); ok(complete, "root should be reachable");
strictEqual(missingRoots.length, 0); strictEqual(missingRoots.length, 0);
}); });
});
// Helper import it("parseNodeSection returns correct node count", () => {
import { parseNodeSection as parseNodes } from "../src/merkle.js"; const data = bundleParseNodeSection(
readFileSync(`${fixtureDir}/id.arborix`)
);
const result = parseNodeSection(data);
strictEqual(result.count, 4);
});
});
// Helper for throws // Helper for throws
function throws(fn, expected) { function throws(fn, expected) {

View File

@@ -7,10 +7,10 @@ import { validateManifest, selectExport } from "../src/manifest.js";
import { verifyNodeHashes, parseNodeSection as parseNodes } from "../src/merkle.js"; import { verifyNodeHashes, parseNodeSection as parseNodes } from "../src/merkle.js";
import { buildTreeFromNodeMap } from "../src/cli.js"; import { buildTreeFromNodeMap } from "../src/cli.js";
const fixtureDir = "test/fixtures"; const fixtureDir = "../../test/fixtures";
describe("run bundle — id.tri.bundle", () => { describe("run bundle — id.arborix", () => {
const bundle = readFileSync(`${fixtureDir}/id.tri.bundle`); const bundle = readFileSync(`${fixtureDir}/id.arborix`);
const manifest = parseManifest(bundle); const manifest = parseManifest(bundle);
const nodeSectionData = bundleParseNodeSection(bundle); const nodeSectionData = bundleParseNodeSection(bundle);
const { nodeMap } = parseNodes(nodeSectionData); const { nodeMap } = parseNodes(nodeSectionData);
@@ -24,25 +24,21 @@ describe("run bundle — id.tri.bundle", () => {
ok(verified); ok(verified);
}); });
it("export 'id' is selectable", () => { it("export 'root' is selectable", () => {
const exp = selectExport(manifest, "id"); const exp = selectExport(manifest, "root");
strictEqual(exp.name, "id"); strictEqual(exp.name, "root");
}); });
it("tree reconstructs as a Fork", () => { it("tree reconstructs as a Fork", () => {
const exp = selectExport(manifest, "id"); const exp = selectExport(manifest, "root");
const tree = buildTreeFromNodeMap(nodeMap, exp.root); const tree = buildTreeFromNodeMap(nodeMap, exp.root);
ok(Array.isArray(tree)); ok(Array.isArray(tree));
// id = t (t t) = Fork (Stem Leaf) Leaf...
// In Haskell: id = S = t (t (t t)) t
// This is Fork (Fork (Fork Leaf Leaf) Leaf) Leaf
// In array form: [[[], []], [], []]
ok(tree.length >= 2, "tree should be a Fork (length >= 2)"); ok(tree.length >= 2, "tree should be a Fork (length >= 2)");
}); });
}); });
describe("run bundle — true.tri.bundle", () => { describe("run bundle — true.arborix", () => {
const bundle = readFileSync(`${fixtureDir}/true.tri.bundle`); const bundle = readFileSync(`${fixtureDir}/true.arborix`);
const manifest = parseManifest(bundle); const manifest = parseManifest(bundle);
const nodeSectionData = bundleParseNodeSection(bundle); const nodeSectionData = bundleParseNodeSection(bundle);
const { nodeMap } = parseNodes(nodeSectionData); const { nodeMap } = parseNodes(nodeSectionData);
@@ -51,20 +47,60 @@ describe("run bundle — true.tri.bundle", () => {
validateManifest(manifest); validateManifest(manifest);
}); });
it("export 'const' is selectable", () => { it("export 'root' is selectable", () => {
const exp = selectExport(manifest, "const"); const exp = selectExport(manifest, "root");
strictEqual(exp.name, "const"); strictEqual(exp.name, "root");
}); });
it("tree reconstructs", () => { it("tree reconstructs as Stem Leaf", () => {
const exp = selectExport(manifest, "const"); const exp = selectExport(manifest, "root");
const tree = buildTreeFromNodeMap(nodeMap, exp.root); const tree = buildTreeFromNodeMap(nodeMap, exp.root);
ok(Array.isArray(tree)); ok(Array.isArray(tree));
strictEqual(tree.length, 1, "true should be a Stem (single child)");
strictEqual(tree[0].length, 0, "child should be Leaf");
});
});
describe("run bundle — false.arborix", () => {
const bundle = readFileSync(`${fixtureDir}/false.arborix`);
const manifest = parseManifest(bundle);
const nodeSectionData = bundleParseNodeSection(bundle);
const { nodeMap } = parseNodes(nodeSectionData);
it("manifest validates", () => {
validateManifest(manifest);
});
it("export 'root' is selectable", () => {
const exp = selectExport(manifest, "root");
strictEqual(exp.name, "root");
});
it("tree reconstructs as Leaf", () => {
const exp = selectExport(manifest, "root");
const tree = buildTreeFromNodeMap(nodeMap, exp.root);
strictEqual(tree.length, 0, "false should be Leaf (empty array)");
});
});
describe("run bundle — notQ.arborix", () => {
const bundle = readFileSync(`${fixtureDir}/notQ.arborix`);
const manifest = parseManifest(bundle);
const nodeSectionData = bundleParseNodeSection(bundle);
const { nodeMap } = parseNodes(nodeSectionData);
it("manifest validates", () => {
validateManifest(manifest);
});
it("node hashes verify", () => {
const { verified } = verifyNodeHashes(nodeMap);
ok(verified);
}); });
}); });
describe("run bundle — missing export", () => { describe("run bundle — missing export", () => {
const bundle = readFileSync(`${fixtureDir}/id.tri.bundle`); const bundle = readFileSync(`${fixtureDir}/id.arborix`);
const manifest = parseManifest(bundle); const manifest = parseManifest(bundle);
it("nonexistent export fails clearly", () => { it("nonexistent export fails clearly", () => {
@@ -73,8 +109,8 @@ describe("run bundle — missing export", () => {
}); });
describe("run bundle — auto-select", () => { describe("run bundle — auto-select", () => {
// true.tri.bundle has only one export, should auto-select // true.arborix has only one export, should auto-select
const bundle = readFileSync(`${fixtureDir}/true.tri.bundle`); const bundle = readFileSync(`${fixtureDir}/true.arborix`);
const manifest = parseManifest(bundle); const manifest = parseManifest(bundle);
it("single export auto-selects", () => { it("single export auto-selects", () => {

View File

@@ -18,9 +18,14 @@
tricuStatic = hsLib.justStaticExecutables self.packages.${system}.default; tricuStatic = hsLib.justStaticExecutables self.packages.${system}.default;
tricuPackage = tricuPackageTests =
haskellPackages.callCabal2nix packageName self {}; haskellPackages.callCabal2nix packageName self {};
tricuPackage =
hsLib.dontCheck (
haskellPackages.callCabal2nix packageName self {}
);
customGHC = haskellPackages.ghcWithPackages (hpkgs: with hpkgs; [ customGHC = haskellPackages.ghcWithPackages (hpkgs: with hpkgs; [
megaparsec megaparsec
]); ]);
@@ -28,8 +33,8 @@
packages.${packageName} = tricuPackage; packages.${packageName} = tricuPackage;
packages.default = tricuPackage; packages.default = tricuPackage;
checks.${packageName} = tricuPackage; checks.${packageName} = tricuPackageTests;
checks.default = tricuPackage; checks.default = tricuPackageTests;
defaultPackage = self.packages.${system}.default; defaultPackage = self.packages.${system}.default;

View File

@@ -85,12 +85,12 @@ serializeNode (NFork l r) = BS.pack [0x02] <> go (decode (encodeUtf8 l)) <> go (
go (Right bs) = bs go (Right bs) = bs
-- | Hash a node per the Merkle content-addressing spec. -- | Hash a node per the Merkle content-addressing spec.
-- hash = SHA256( "tricu.merkle.node.v1" <> 0x00 <> node_payload ) -- hash = SHA256( "arborix.merkle.node.v1" <> 0x00 <> node_payload )
nodeHash :: Node -> MerkleHash nodeHash :: Node -> MerkleHash
nodeHash node = decodeUtf8 (encode (sha256WithPrefix (serializeNode node))) nodeHash node = decodeUtf8 (encode (sha256WithPrefix (serializeNode node)))
where sha256WithPrefix payload = where sha256WithPrefix payload =
convert . (hash :: BS.ByteString -> Digest SHA256) $ utf8Tag <> BS.pack [0x00] <> payload convert . (hash :: BS.ByteString -> Digest SHA256) $ utf8Tag <> BS.pack [0x00] <> payload
utf8Tag = BS.pack $ map fromIntegral $ BS.unpack "tricu.merkle.node.v1" utf8Tag = BS.pack $ map fromIntegral $ BS.unpack "arborix.merkle.node.v1"
-- | Deserialize a Node from canonical bytes. -- | Deserialize a Node from canonical bytes.
deserializeNode :: BS.ByteString -> Node deserializeNode :: BS.ByteString -> Node

View File

@@ -1,4 +1,5 @@
{-# LANGUAGE DeriveGeneric #-} {-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE OverloadedStrings #-}
module Wire module Wire
( Bundle (..) ( Bundle (..)
@@ -8,7 +9,7 @@ module Wire
, RuntimeSpec (..) , RuntimeSpec (..)
, BundleRoot (..) , BundleRoot (..)
, BundleExport (..) , BundleExport (..)
, BundleMetadata (..) , BundleMetadata
, ClosureMode (..) , ClosureMode (..)
, encodeBundle , encodeBundle
, decodeBundle , decodeBundle
@@ -23,21 +24,26 @@ module Wire
import ContentStore (getNodeMerkle, loadTree, putMerkleNode, storeTerm) import ContentStore (getNodeMerkle, loadTree, putMerkleNode, storeTerm)
import Research import Research
import Codec.CBOR.Decoding ( Decoder
, decodeString
, decodeBytes
, decodeListLen
, decodeMapLen
)
import Control.Monad (replicateM, forM)
import Codec.CBOR.Encoding ( Encoding
, encodeMapLen
, encodeListLen
, encodeString
, encodeBytes
)
import Codec.CBOR.Write (toLazyByteString)
import Data.Monoid (mconcat)
import Codec.CBOR.Read (deserialiseFromBytes, DeserialiseFailure(..))
import Control.Exception (SomeException, evaluate, try) import Control.Exception (SomeException, evaluate, try)
import Control.Monad (foldM, unless, when) import Control.Monad (foldM, unless, when)
import Crypto.Hash (Digest, SHA256, hash) import Crypto.Hash (Digest, SHA256, hash)
import Data.Aeson ( FromJSON (..)
, ToJSON (..)
, Value (String)
, eitherDecodeStrict'
, encode
, object
, withObject
, (.:)
, (.:?)
, (.!=)
, (.=)
)
import Data.Bits ((.&.), (.|.), shiftL, shiftR) import Data.Bits ((.&.), (.|.), shiftL, shiftR)
import Data.ByteArray (convert) import Data.ByteArray (convert)
import Data.ByteString (ByteString) import Data.ByteString (ByteString)
@@ -84,54 +90,121 @@ compressionNone, digestSha256 :: Word16
compressionNone = 0 compressionNone = 0
digestSha256 = 1 digestSha256 = 1
-- | Closure declaration. V1 only accepts complete bundles for import. -- ---------------------------------------------------------------------------
-- CBOR encoding helpers
-- ---------------------------------------------------------------------------
-- | Canonical CBOR map length encoder.
cmkLen :: Int -> Encoding
cmkLen n = encodeMapLen (fromIntegral n)
-- | Decode a CBOR array of n elements.
decodeListN :: Decoder s a -> Int -> Decoder s [a]
decodeListN dec n = replicateM n dec
-- | Decode a CBOR map (sequence of key-value pairs).
decodeMapN :: Decoder s a -> Decoder s b -> Int -> Decoder s [(a, b)]
decodeMapN keyDec valDec n = forM [1..n] $ \_ ->
keyDec >>= \k -> valDec >>= \v -> pure (k, v)
decodeKey :: Text -> Decoder s ()
decodeKey expected = do
actual <- decodeString
unless (actual == expected) $
fail $ "expected key " ++ show expected ++ ", got " ++ show actual
-- | Canonical CBOR array length encoder.
cakLen :: Int -> Encoding
cakLen n = encodeListLen (fromIntegral n)
-- | Encode a canonical CBOR map with key-value pairs as flat sequence.
cmkPairs :: [(Text, Encoding)] -> Encoding
cmkPairs [] = cmkLen 0
cmkPairs kvs = cmkLen (length kvs) <> mconcat [encodeString k <> v | (k, v) <- kvs]
-- | Encode a canonical CBOR array.
cakSeq :: [Encoding] -> Encoding
cakSeq [] = cakLen 0
cakSeq xs = cakLen (length xs) <> mconcat xs
-- | Encode a canonical CBOR text string.
encText :: Text -> Encoding
encText = encodeString
-- | Encode a canonical CBOR byte string.
encBytes :: ByteString -> Encoding
encBytes = encodeBytes
-- ---------------------------------------------------------------------------
-- Data types with CBOR instances
-- ---------------------------------------------------------------------------
-- | Closure declaration.
data ClosureMode = ClosureComplete | ClosurePartial data ClosureMode = ClosureComplete | ClosurePartial
deriving (Show, Eq, Ord, Generic) deriving (Show, Eq, Ord, Generic)
instance ToJSON ClosureMode where toCBORClosure :: ClosureMode -> Encoding
toJSON ClosureComplete = String "complete" toCBORClosure = encText . \case
toJSON ClosurePartial = String "partial" ClosureComplete -> "complete"
ClosurePartial -> "partial"
instance FromJSON ClosureMode where closureFromCBOR :: Decoder s ClosureMode
parseJSON (String "complete") = pure ClosureComplete closureFromCBOR = decodeString >>= \case
parseJSON (String "partial") = pure ClosurePartial "complete" -> pure ClosureComplete
parseJSON _ = fail "closure must be \"complete\" or \"partial\"" "partial" -> pure ClosurePartial
other -> fail $ "ClosureMode: " ++ show other
-- | Hash specification (algorithm + domain strings).
data NodeHashSpec = NodeHashSpec data NodeHashSpec = NodeHashSpec
{ nodeHashAlgorithm :: Text { nodeHashAlgorithm :: Text
, nodeHashDomain :: Text , nodeHashDomain :: Text
} deriving (Show, Eq, Ord, Generic) } deriving (Show, Eq, Ord, Generic)
instance ToJSON NodeHashSpec where toCBORNodeHashSpec :: NodeHashSpec -> Encoding
toJSON s = object toCBORNodeHashSpec (NodeHashSpec alg dom) =
[ "algorithm" .= nodeHashAlgorithm s cmkPairs
, "domain" .= nodeHashDomain s [ ("algorithm", encText alg)
, ("domain", encText dom)
] ]
instance FromJSON NodeHashSpec where nodeHashSpecFromCBOR :: Decoder s NodeHashSpec
parseJSON = withObject "NodeHashSpec" $ \o -> NodeHashSpec nodeHashSpecFromCBOR = do
<$> o .: "algorithm" n <- decodeMapLen
<*> o .: "domain" unless (n == 2) $ fail "NodeHashSpec: must have exactly 2 entries"
decodeKey "algorithm"
alg <- decodeString
decodeKey "domain"
dom <- decodeString
pure (NodeHashSpec alg dom)
-- | Tree specification.
data TreeSpec = TreeSpec data TreeSpec = TreeSpec
{ treeCalculus :: Text { treeCalculus :: Text
, treeNodeHash :: NodeHashSpec , treeNodeHash :: NodeHashSpec
, treeNodePayload :: Text , treeNodePayload :: Text
} deriving (Show, Eq, Ord, Generic) } deriving (Show, Eq, Ord, Generic)
instance ToJSON TreeSpec where toCBORTreeSpec :: TreeSpec -> Encoding
toJSON s = object toCBORTreeSpec (TreeSpec calc hspec payload) =
[ "calculus" .= treeCalculus s cmkPairs
, "nodeHash" .= treeNodeHash s [ ("calculus", encText calc)
, "nodePayload" .= treeNodePayload s , ("nodeHash", toCBORNodeHashSpec hspec)
, ("nodePayload", encText payload)
] ]
instance FromJSON TreeSpec where treeSpecFromCBOR :: Decoder s TreeSpec
parseJSON = withObject "TreeSpec" $ \o -> TreeSpec treeSpecFromCBOR = do
<$> o .: "calculus" n <- decodeMapLen
<*> o .: "nodeHash" unless (n == 3) $ fail "TreeSpec: must have exactly 3 entries"
<*> o .: "nodePayload" decodeKey "calculus"
calc <- decodeString
decodeKey "nodeHash"
hspec <- nodeHashSpecFromCBOR
decodeKey "nodePayload"
payload <- decodeString
pure (TreeSpec calc hspec payload)
-- | Runtime specification.
data RuntimeSpec = RuntimeSpec data RuntimeSpec = RuntimeSpec
{ runtimeSemantics :: Text { runtimeSemantics :: Text
, runtimeEvaluation :: Text , runtimeEvaluation :: Text
@@ -139,65 +212,85 @@ data RuntimeSpec = RuntimeSpec
, runtimeCapabilities :: [Text] , runtimeCapabilities :: [Text]
} deriving (Show, Eq, Ord, Generic) } deriving (Show, Eq, Ord, Generic)
instance ToJSON RuntimeSpec where toCBORRuntimeSpec :: RuntimeSpec -> Encoding
toJSON s = object toCBORRuntimeSpec (RuntimeSpec sem eval abi caps) =
[ "semantics" .= runtimeSemantics s cmkPairs
, "evaluation" .= runtimeEvaluation s [ ("semantics", encText sem)
, "abi" .= runtimeAbi s , ("evaluation", encText eval)
, "capabilities" .= runtimeCapabilities s , ("abi", encText abi)
, ("capabilities", cakSeq (map encText caps))
] ]
instance FromJSON RuntimeSpec where runtimeSpecFromCBOR :: Decoder s RuntimeSpec
parseJSON = withObject "RuntimeSpec" $ \o -> RuntimeSpec runtimeSpecFromCBOR = do
<$> o .: "semantics" n <- decodeMapLen
<*> o .: "evaluation" unless (n == 4) $ fail "RuntimeSpec: must have exactly 4 entries"
<*> o .: "abi" decodeKey "semantics"
<*> o .:? "capabilities" .!= [] sem <- decodeString
decodeKey "evaluation"
eval <- decodeString
decodeKey "abi"
abi <- decodeString
decodeKey "capabilities"
clen <- decodeListLen
caps <- decodeListN decodeString clen
pure (RuntimeSpec sem eval abi caps)
-- | A root hash reference.
data BundleRoot = BundleRoot data BundleRoot = BundleRoot
{ rootHash :: MerkleHash { rootHash :: MerkleHash
, rootRole :: Text , rootRole :: Text
} deriving (Show, Eq, Ord, Generic) } deriving (Show, Eq, Ord, Generic)
instance ToJSON BundleRoot where toCBORBundleRoot :: BundleRoot -> Encoding
toJSON r = object toCBORBundleRoot (BundleRoot h role) =
[ "hash" .= rootHash r cmkPairs
, "role" .= rootRole r [ ("hash", encBytes (merkleHashToRaw h))
, ("role", encText role)
] ]
instance FromJSON BundleRoot where bundleRootFromCBOR :: Decoder s BundleRoot
parseJSON = withObject "BundleRoot" $ \o -> BundleRoot bundleRootFromCBOR = do
<$> o .: "hash" n <- decodeMapLen
<*> o .:? "role" .!= "root" unless (n == 2) $ fail "BundleRoot: must have exactly 2 entries"
decodeKey "hash"
hRaw <- decodeBytes
decodeKey "role"
role <- decodeString
pure (BundleRoot (rawToMerkleHash hRaw) role)
-- | An export entry.
data BundleExport = BundleExport data BundleExport = BundleExport
{ exportName :: Text { exportName :: Text
, exportRoot :: MerkleHash , exportRoot :: MerkleHash
, exportKind :: Text , exportKind :: Text
, exportAbi :: Text , exportAbi :: Text
, exportInput :: Maybe Text
, exportOutput :: Maybe Text
} deriving (Show, Eq, Ord, Generic) } deriving (Show, Eq, Ord, Generic)
instance ToJSON BundleExport where toCBORBundleExport :: BundleExport -> Encoding
toJSON e = object toCBORBundleExport (BundleExport name h kind abi) =
[ "name" .= exportName e cmkPairs
, "root" .= exportRoot e [ ("name", encText name)
, "kind" .= exportKind e , ("root", encBytes (merkleHashToRaw h))
, "abi" .= exportAbi e , ("kind", encText kind)
, "input" .= exportInput e , ("abi", encText abi)
, "output" .= exportOutput e
] ]
instance FromJSON BundleExport where bundleExportFromCBOR :: Decoder s BundleExport
parseJSON = withObject "BundleExport" $ \o -> BundleExport bundleExportFromCBOR = do
<$> o .: "name" n <- decodeMapLen
<*> o .: "root" unless (n == 4) $ fail "BundleExport: must have exactly 4 entries"
<*> o .:? "kind" .!= "term" decodeKey "name"
<*> o .:? "abi" .!= "arborix.abi.tree.v1" name <- decodeString
<*> o .:? "input" decodeKey "root"
<*> o .:? "output" hRaw <- decodeBytes
decodeKey "kind"
kind <- decodeString
decodeKey "abi"
abi <- decodeString
pure (BundleExport name (rawToMerkleHash hRaw) kind abi)
-- | Optional package metadata.
data BundleMetadata = BundleMetadata data BundleMetadata = BundleMetadata
{ metadataPackage :: Maybe Text { metadataPackage :: Maybe Text
, metadataVersion :: Maybe Text , metadataVersion :: Maybe Text
@@ -206,23 +299,34 @@ data BundleMetadata = BundleMetadata
, metadataCreatedBy :: Maybe Text , metadataCreatedBy :: Maybe Text
} deriving (Show, Eq, Ord, Generic) } deriving (Show, Eq, Ord, Generic)
instance ToJSON BundleMetadata where metadataFromCBOR :: Decoder s BundleMetadata
toJSON m = object metadataFromCBOR = do
[ "package" .= metadataPackage m mlen <- decodeMapLen
, "version" .= metadataVersion m entries <- decodeMapN decodeString decodeString mlen
, "description" .= metadataDescription m let lookupText k = go k entries
, "license" .= metadataLicense m go _ [] = Nothing
, "createdBy" .= metadataCreatedBy m go k ((k', v):rest)
] | k == k' = Just v
| otherwise = go k rest
pure BundleMetadata
{ metadataPackage = lookupText "package"
, metadataVersion = lookupText "version"
, metadataDescription = lookupText "description"
, metadataLicense = lookupText "license"
, metadataCreatedBy = lookupText "createdBy"
}
instance FromJSON BundleMetadata where metadataToCBOR :: BundleMetadata -> Encoding
parseJSON = withObject "BundleMetadata" $ \o -> BundleMetadata metadataToCBOR (BundleMetadata pkg ver desc lic by) =
<$> o .:? "package" let pairs =
<*> o .:? "version" maybe [] (\v -> [("package", encText v)]) pkg
<*> o .:? "description" ++ maybe [] (\v -> [("version", encText v)]) ver
<*> o .:? "license" ++ maybe [] (\v -> [("description", encText v)]) desc
<*> o .:? "createdBy" ++ maybe [] (\v -> [("license", encText v)]) lic
++ maybe [] (\v -> [("createdBy", encText v)]) by
in cmkPairs pairs
-- | The manifest: top-level bundle metadata.
data BundleManifest = BundleManifest data BundleManifest = BundleManifest
{ manifestSchema :: Text { manifestSchema :: Text
, manifestBundleType :: Text , manifestBundleType :: Text
@@ -231,37 +335,45 @@ data BundleManifest = BundleManifest
, manifestClosure :: ClosureMode , manifestClosure :: ClosureMode
, manifestRoots :: [BundleRoot] , manifestRoots :: [BundleRoot]
, manifestExports :: [BundleExport] , manifestExports :: [BundleExport]
, manifestImports :: [Value]
, manifestSections :: Value
, manifestMetadata :: BundleMetadata , manifestMetadata :: BundleMetadata
} deriving (Show, Eq, Generic) } deriving (Show, Eq, Generic)
instance ToJSON BundleManifest where manifestToCBOR :: BundleManifest -> Encoding
toJSON m = object manifestToCBOR m =
[ "schema" .= manifestSchema m cmkPairs
, "bundleType" .= manifestBundleType m [ ("schema", encText (manifestSchema m))
, "tree" .= manifestTree m , ("bundleType", encText (manifestBundleType m))
, "runtime" .= manifestRuntime m , ("tree", toCBORTreeSpec (manifestTree m))
, "closure" .= manifestClosure m , ("runtime", toCBORRuntimeSpec (manifestRuntime m))
, "roots" .= manifestRoots m , ("closure", toCBORClosure (manifestClosure m))
, "exports" .= manifestExports m , ("roots", cakSeq (map toCBORBundleRoot (manifestRoots m)))
, "imports" .= manifestImports m , ("exports", cakSeq (map toCBORBundleExport (manifestExports m)))
, "sections" .= manifestSections m , ("metadata", metadataToCBOR (manifestMetadata m))
, "metadata" .= manifestMetadata m
] ]
instance FromJSON BundleManifest where manifestFromCBOR :: Decoder s BundleManifest
parseJSON = withObject "BundleManifest" $ \o -> BundleManifest manifestFromCBOR = do
<$> o .: "schema" n <- decodeMapLen
<*> o .: "bundleType" unless (n == 8) $ fail "BundleManifest: must have exactly 8 entries"
<*> o .: "tree" decodeKey "schema"
<*> o .: "runtime" schema <- decodeString
<*> o .: "closure" decodeKey "bundleType"
<*> o .: "roots" bundleType <- decodeString
<*> o .: "exports" decodeKey "tree"
<*> o .:? "imports" .!= [] tree <- treeSpecFromCBOR
<*> o .:? "sections" .!= object [] decodeKey "runtime"
<*> o .:? "metadata" .!= BundleMetadata Nothing Nothing Nothing Nothing Nothing runtime <- runtimeSpecFromCBOR
decodeKey "closure"
closure <- closureFromCBOR
decodeKey "roots"
rlen <- decodeListLen
roots <- decodeListN bundleRootFromCBOR rlen
decodeKey "exports"
elen <- decodeListLen
exports <- decodeListN bundleExportFromCBOR elen
decodeKey "metadata"
metadata <- metadataFromCBOR
pure (BundleManifest schema bundleType tree runtime closure roots exports metadata)
-- | Portable executable-object bundle. -- | Portable executable-object bundle.
-- --
@@ -276,12 +388,33 @@ data Bundle = Bundle
, bundleManifestBytes :: ByteString , bundleManifestBytes :: ByteString
} deriving (Show, Eq) } deriving (Show, Eq)
-- ---------------------------------------------------------------------------
-- CBOR manifest serialization
-- ---------------------------------------------------------------------------
-- | Encode the manifest as canonical CBOR.
encodeManifest :: BundleManifest -> ByteString
encodeManifest m = BL.toStrict (toLazyByteString (manifestToCBOR m))
-- | Decode a manifest from CBOR bytes.
decodeManifest :: ByteString -> Either String BundleManifest
decodeManifest bs =
case deserialiseFromBytes manifestFromCBOR (BL.fromStrict bs) of
Right (rest, m)
| BS.null (BL.toStrict rest) -> Right m
| otherwise -> Left "trailing bytes after manifest CBOR"
Left (DeserialiseFailure _ msg) -> Left msg
-- ---------------------------------------------------------------------------
-- Bundle encoding
-- ---------------------------------------------------------------------------
-- | Encode a Bundle to portable Bundle v1 bytes. -- | Encode a Bundle to portable Bundle v1 bytes.
encodeBundle :: Bundle -> ByteString encodeBundle :: Bundle -> ByteString
encodeBundle bundle = encodeBundle bundle =
let nodeSection = encodeNodeSection (bundleNodes bundle) let nodeSection = encodeNodeSection (bundleNodes bundle)
manifestBytes = if BS.null (bundleManifestBytes bundle) manifestBytes = if BS.null (bundleManifestBytes bundle)
then BL.toStrict (encode (bundleManifest bundle)) then encodeManifest (bundleManifest bundle)
else bundleManifestBytes bundle else bundleManifestBytes bundle
sectionCount = 2 sectionCount = 2
dirOffset = fromIntegral headerLength dirOffset = fromIntegral headerLength
@@ -346,15 +479,14 @@ decodePortableBundle bs = do
dirBytes = fromIntegral sectionCount * sectionEntryLength dirBytes = fromIntegral sectionCount * sectionEntryLength
when (BS.length bs < dirStart + dirBytes) $ when (BS.length bs < dirStart + dirBytes) $
Left "bundle truncated in section directory" Left "bundle truncated in section directory"
entries <- decodeSectionEntries sectionCount (BS.take dirBytes $ BS.drop dirStart bs) let dirRaw = BS.take dirBytes $ BS.drop dirStart bs
entries <- decodeSectionEntries sectionCount dirRaw
traverse_ rejectUnknownCritical entries traverse_ rejectUnknownCritical entries
manifestEntry <- requireSection sectionManifest entries manifestEntry <- requireSection sectionManifest entries
nodesEntry <- requireSection sectionNodes entries nodesEntry <- requireSection sectionNodes entries
manifestBytes <- readAndVerifySection bs manifestEntry manifestBytes <- readAndVerifySection bs manifestEntry
nodesBytes <- readAndVerifySection bs nodesEntry nodesBytes <- readAndVerifySection bs nodesEntry
manifest <- case eitherDecodeStrict' manifestBytes of manifest <- decodeManifest manifestBytes
Left err -> Left $ "invalid manifest JSON: " ++ err
Right m -> Right m
nodes <- decodeNodeSection nodesBytes nodes <- decodeNodeSection nodesBytes
let roots = map rootHash (manifestRoots manifest) let roots = map rootHash (manifestRoots manifest)
return Bundle return Bundle
@@ -429,8 +561,8 @@ decodeSectionEntries count bytes = reverse <$> go count bytes []
-- Manifest construction -- Manifest construction
-- --------------------------------------------------------------------------- -- ---------------------------------------------------------------------------
defaultManifest :: [(Text, MerkleHash)] -> Int -> BundleManifest defaultManifest :: [(Text, MerkleHash)] -> BundleManifest
defaultManifest namedRoots nodeCount = BundleManifest defaultManifest namedRoots = BundleManifest
{ manifestSchema = "arborix.bundle.manifest.v1" { manifestSchema = "arborix.bundle.manifest.v1"
, manifestBundleType = "tree-calculus-executable-object" , manifestBundleType = "tree-calculus-executable-object"
, manifestTree = TreeSpec , manifestTree = TreeSpec
@@ -450,13 +582,6 @@ defaultManifest namedRoots nodeCount = BundleManifest
, manifestClosure = ClosureComplete , manifestClosure = ClosureComplete
, manifestRoots = zipWith mkRoot [0 :: Int ..] (map snd namedRoots) , manifestRoots = zipWith mkRoot [0 :: Int ..] (map snd namedRoots)
, manifestExports = map mkExport namedRoots , manifestExports = map mkExport namedRoots
, manifestImports = []
, manifestSections = object
[ "nodes" .= object
[ "count" .= nodeCount
, "payload" .= ("arborix.merkle.payload.v1" :: Text)
]
]
, manifestMetadata = BundleMetadata , manifestMetadata = BundleMetadata
{ metadataPackage = Nothing { metadataPackage = Nothing
, metadataVersion = Nothing , metadataVersion = Nothing
@@ -473,8 +598,6 @@ defaultManifest namedRoots nodeCount = BundleManifest
   , exportRoot = h
   , exportKind = "term"
   , exportAbi = "arborix.abi.tree.v1"
-  , exportInput = Nothing
-  , exportOutput = Nothing
   }

 -- ---------------------------------------------------------------------------
@@ -568,12 +691,10 @@ verifyManifest manifest = do
     Left $ "unsupported runtime semantics: " ++ unpack (runtimeSemantics runtimeSpec)
   when (runtimeAbi runtimeSpec /= "arborix.abi.tree.v1") $
     Left $ "unsupported runtime ABI: " ++ unpack (runtimeAbi runtimeSpec)
-  unless (null $ runtimeCapabilities runtimeSpec) $
-    Left "host/runtime capabilities are not supported by bundle v1"
+  when (not (null (runtimeCapabilities runtimeSpec))) $
+    Left "unsupported runtime capabilities"
   when (manifestClosure manifest /= ClosureComplete) $
-    Left "bundle v1 imports require closure = complete"
-  unless (null $ manifestImports manifest) $
-    Left "bundle v1 imports require an empty imports list"
+    Left "bundle v1 requires closure = complete"
   when (null $ manifestRoots manifest) $
     Left "manifest has no roots"
   when (null $ manifestExports manifest) $
@@ -674,8 +795,8 @@ exportNamedBundle conn namedHashes = do
   let hashes = map snd namedHashes
   entries <- concat <$> mapM (collectReachableNodes conn) hashes
   let nodeMap = Map.fromList entries
-      manifest = defaultManifest namedHashes (Map.size nodeMap)
-      manifestBytes = BL.toStrict (encode manifest)
+      manifest = defaultManifest namedHashes
+      manifestBytes = encodeManifest manifest
       bundle = Bundle
         { bundleVersion = bundleMajorVersion * 1000 + bundleMinorVersion
         , bundleRoots = hashes
@@ -793,6 +914,8 @@ rawToMerkleHash bs = decodeUtf8 (Base16.encode bs)
 sha256 :: ByteString -> ByteString
 sha256 bytes = convert ((hash bytes) :: Digest SHA256)

 defaultExportNames :: Int -> [Text]
 defaultExportNames n =
   case n of
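Note: the hunks above swap the aeson calls (`encode`, `eitherDecodeStrict'`) for `encodeManifest` / `decodeManifest`, whose definitions fall outside this diff. For orientation only, here is a minimal sketch of how an order-strict codec pair of that shape can be built on `cborg`; the `MiniManifest` record and the `encodeMini` / `decodeMini` / `expectKey` names are illustrative stand-ins, not the actual Wire.hs API.

```haskell
{-# LANGUAGE OverloadedStrings #-}
module ManifestSketch where

import qualified Codec.CBOR.Decoding as D
import qualified Codec.CBOR.Encoding as E
import qualified Codec.CBOR.Read as R
import qualified Codec.CBOR.Write as W
import qualified Data.ByteString as BS
import qualified Data.ByteString.Lazy as BL
import Data.Text (Text, unpack)

-- Hypothetical two-field stand-in for the real manifest record.
data MiniManifest = MiniManifest
  { miniSchema     :: Text
  , miniBundleType :: Text
  } deriving (Eq, Show)

-- Canonical encoding: definite-length map, keys in a fixed order.
encodeMini :: MiniManifest -> BS.ByteString
encodeMini m = W.toStrictByteString $
     E.encodeMapLen 2
  <> E.encodeString "schema"     <> E.encodeString (miniSchema m)
  <> E.encodeString "bundleType" <> E.encodeString (miniBundleType m)

-- Order-strict decoding: each key must appear exactly where expected,
-- and no bytes may trail the manifest.
decodeMini :: BS.ByteString -> Either String MiniManifest
decodeMini bs =
  case R.deserialiseFromBytes decoder (BL.fromStrict bs) of
    Left err -> Left ("invalid manifest CBOR: " ++ show err)
    Right (rest, m)
      | BL.null rest -> Right m
      | otherwise    -> Left "trailing bytes after manifest"
  where
    decoder :: D.Decoder s MiniManifest
    decoder = do
      n <- D.decodeMapLen
      if n == 2 then pure () else fail "manifest: expected a 2-key map"
      expectKey "schema"
      schema <- D.decodeString
      expectKey "bundleType"
      btype <- D.decodeString
      pure (MiniManifest schema btype)

    expectKey :: Text -> D.Decoder s ()
    expectKey k = do
      k' <- D.decodeString
      if k' == k then pure () else fail ("manifest: expected key " ++ unpack k)
```

The fixed `encodeMapLen` plus hard-coded key sequence is what makes an encoding like this canonical: a decoder in this style can insist on exact key order rather than accepting arbitrary map permutations.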

View File

@@ -2041,7 +2041,7 @@ binaryReaderTests = testGroup "Binary Reader Tests"
     result env @?= okT Leaf (bytesT ([101,102,103] ++ nodesBytes))
   , testCase "readArborixNodesSection: reads id fixture bundle" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.tri.bundle"
+      fixtureBytes <- BS.readFile "test/fixtures/id.arborix"
       case decodeBundle fixtureBytes of
         Left err -> assertFailure $ "decodeBundle failed: " ++ err
         Right _ -> do
@@ -2053,7 +2053,7 @@ binaryReaderTests = testGroup "Binary Reader Tests"
     result env @?= ofNumber 0
   , testCase "readArborixNodesSection: reads notQ fixture bundle" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/notQ.tri.bundle"
+      fixtureBytes <- BS.readFile "test/fixtures/notQ.arborix"
       case decodeBundle fixtureBytes of
         Left err -> assertFailure $ "decodeBundle failed: " ++ err
         Right _ -> do
@@ -2065,7 +2065,7 @@ binaryReaderTests = testGroup "Binary Reader Tests"
     result env @?= ofNumber 0
   , testCase "readArborixNodesSection: reads map fixture bundle" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/map.tri.bundle"
+      fixtureBytes <- BS.readFile "test/fixtures/map.arborix"
       case decodeBundle fixtureBytes of
         Left err -> assertFailure $ "decodeBundle failed: " ++ err
         Right _ -> do
@@ -2077,7 +2077,7 @@ binaryReaderTests = testGroup "Binary Reader Tests"
     result env @?= ofNumber 0
   , testCase "readArborixExecutableFromHash: reconstructs id fixture root" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.tri.bundle"
+      fixtureBytes <- BS.readFile "test/fixtures/id.arborix"
       case decodeBundle fixtureBytes of
         Left err -> assertFailure $ "decodeBundle failed: " ++ err
         Right bundle -> case bundleRoots bundle of
@@ -2093,7 +2093,7 @@ binaryReaderTests = testGroup "Binary Reader Tests"
     result env @?= ofNumber 0
   , testCase "readArborixExecutableFromHash: reconstructs notQ fixture root" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/notQ.tri.bundle"
+      fixtureBytes <- BS.readFile "test/fixtures/notQ.arborix"
       case decodeBundle fixtureBytes of
         Left err -> assertFailure $ "decodeBundle failed: " ++ err
         Right bundle -> case bundleRoots bundle of
@@ -2109,7 +2109,7 @@ binaryReaderTests = testGroup "Binary Reader Tests"
     result env @?= ofNumber 0
   , testCase "readArborixExecutableFromHash: reconstructs map fixture root" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/map.tri.bundle"
+      fixtureBytes <- BS.readFile "test/fixtures/map.arborix"
       case decodeBundle fixtureBytes of
         Left err -> assertFailure $ "decodeBundle failed: " ++ err
         Right bundle -> case bundleRoots bundle of
@@ -2125,7 +2125,7 @@ binaryReaderTests = testGroup "Binary Reader Tests"
     result env @?= ofNumber 0
   , testCase "readArborixExecutableFromHash: executes id fixture root" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/id.tri.bundle"
+      fixtureBytes <- BS.readFile "test/fixtures/id.arborix"
       case decodeBundle fixtureBytes of
         Left err -> assertFailure $ "decodeBundle failed: " ++ err
         Right bundle -> case bundleRoots bundle of
@@ -2141,7 +2141,7 @@ binaryReaderTests = testGroup "Binary Reader Tests"
     result env @?= ofNumber 42
   , testCase "readArborixExecutableFromHash: executes notQ fixture on true" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/notQ.tri.bundle"
+      fixtureBytes <- BS.readFile "test/fixtures/notQ.arborix"
       case decodeBundle fixtureBytes of
         Left err -> assertFailure $ "decodeBundle failed: " ++ err
         Right bundle -> case bundleRoots bundle of
@@ -2157,7 +2157,7 @@ binaryReaderTests = testGroup "Binary Reader Tests"
     result env @?= falseT
   , testCase "readArborixExecutableFromHash: executes notQ fixture on false" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/notQ.tri.bundle"
+      fixtureBytes <- BS.readFile "test/fixtures/notQ.arborix"
       case decodeBundle fixtureBytes of
         Left err -> assertFailure $ "decodeBundle failed: " ++ err
         Right bundle -> case bundleRoots bundle of
@@ -2173,7 +2173,7 @@ binaryReaderTests = testGroup "Binary Reader Tests"
     result env @?= trueT
   , testCase "readArborixExecutableFromHash: executes map fixture root" $ do
-      fixtureBytes <- BS.readFile "test/fixtures/map.tri.bundle"
+      fixtureBytes <- BS.readFile "test/fixtures/map.arborix"
       case decodeBundle fixtureBytes of
         Left err -> assertFailure $ "decodeBundle failed: " ++ err
         Right bundle -> case bundleRoots bundle of
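If a codec pair of the shape sketched in the Wire.hs note above exists, a round-trip check in the same tasty/HUnit style as these tests might look like the following. This is purely illustrative: it reuses the hypothetical `MiniManifest` / `encodeMini` / `decodeMini` names and assumes the test module's existing `Test.Tasty` / `Test.Tasty.HUnit` imports.

```haskell
-- Hypothetical round-trip check for the codec sketch; not part of the
-- actual test suite.
manifestRoundTrip :: TestTree
manifestRoundTrip = testCase "manifest CBOR round-trips" $ do
  let m = MiniManifest "arborix.bundle.manifest.v1"
                       "tree-calculus-executable-object"
  -- decode . encode must reproduce the original record exactly
  decodeMini (encodeMini m) @?= Right m
```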

Binary fixtures (contents not shown): new files test/fixtures/false.arborix,
test/fixtures/id.arborix, test/fixtures/map.arborix, test/fixtures/notQ.arborix,
and test/fixtures/true.arborix; the remaining binary entries correspond to the
superseded id/map/notQ `.tri.bundle` fixtures.

View File

@@ -37,11 +37,11 @@ executable tricu
       -fPIC
     build-depends:
         base >=4.7
-      , aeson
       , ansi-terminal
       , base16-bytestring
       , base64-bytestring
       , bytestring
+      , cborg
       , cmdargs
       , containers
       , cryptonite
@@ -90,11 +90,11 @@ test-suite tricu-tests
       ScopedTypeVariables
     build-depends:
         base >=4.7
-      , aeson
       , ansi-terminal
       , base16-bytestring
       , base64-bytestring
       , bytestring
+      , cborg
       , cmdargs
       , containers
       , cryptonite
@@ -115,8 +115,8 @@ test-suite tricu-tests
       , text
       , time
       , transformers
-      , warp
       , wai
+      , warp
       , zlib
     default-language: Haskell2010
     other-modules: