feat: HTTP server for exporting Arborix bundles

Introduces a read-only HTTP server (WAI/Warp) backed by the content
store, exposing three bundle-export endpoints:

- GET /bundle/name/:name   — export by stored term name
- GET /bundle/hash/:hash   — export by full Merkle hash
- GET /terms               — plain-text listing (debug)

Also adds a `tricu server` (aka `--serve`) CLI mode, moves `resolveExportTarget`,
`namesForHash`, and `looksLikeHash` out of `Main.hs` into `ContentStore.hs`,
and cleans up unused exports and imports across `FileEval.hs` and `Wire.hs`.
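The three endpoints above can be exercised with plain `curl`. A minimal sketch — the listen address and port are not stated in this commit, so `localhost:8080` is an assumption; adjust both, and the example term name, to your setup:

```sh
# Assumed base URL; the actual host/port depend on how `tricu server` is started.
base="http://localhost:8080"

curl "$base/terms"                       # debug listing of stored term names
curl "$base/bundle/name/someTerm"        # export by stored term name (hypothetical name)
curl "$base/bundle/hash/<full-merkle-hash>"  # export by full Merkle hash
```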
2026-05-06 14:09:56 -05:00
parent 0cd849447f
commit a36ff638a9
9 changed files with 279 additions and 60 deletions


@@ -314,3 +314,21 @@ Without `TRICU_DB_PATH` set, `eval` uses only the terms defined in the input fil
- **Evaluate files:** `nix run .# -- eval -f demos/equality.tri`
- **GHC options:** `-threaded -rtsopts -with-rtsopts=-N` enable the parallel runtime; the baked-in `-N` RTS flag makes the binary use all available cores by default.
- **Upx** is in the devShell for binary compression if needed.
## 12. Viewing Haskell Dependency Docs from Nix
When you need Haddock documentation for a Haskell dependency available in Nixpkgs, build the package's `doc` output directly with `^doc`.
Example, replacing `megaparsec` with the dependency name you need:
```sh
nix build "nixpkgs#haskellPackages.megaparsec^doc"
```
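If you look up docs for different dependencies often, the package name can be parameterized with a shell variable instead of editing the command each time. A small sketch, assuming the same `nixpkgs#haskellPackages` attribute path as above:

```sh
# Pick the dependency whose Haddock docs you want.
pkg=megaparsec

# Build only the `doc` output of the package via the `^doc` output selector.
nix build "nixpkgs#haskellPackages.${pkg}^doc"
```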
View the available documentation files:
```sh
find ./result-doc -type f \( -name '*.html' -o -name '*.haddock' \) | sort
```