Zero Warnings Plan

Zero GHC warnings with new opts. General cleanup and updates.

177  AGENTS.md  Normal file
@@ -0,0 +1,177 @@
# AGENTS.md — tricu Project Guide

> For AI agents and contributors working in this repository.

## 1. Build & Test

**`nix build .#` always runs tests.** This is the primary and only way to build and validate.
```bash
# Full build + tests (this is the default)
nix build .#

# Build only (skip tests)
nix build .#package

# Build the test-specific variant with doCheck enforced
nix build .#test
nix flake check

# Dev shell (includes ghcid, cabal-install, ghc, upx)
nix develop .#
```
### ⚠️ Never call `cabal` directly

This project uses a Nix flake that wraps `callCabal2nix` to produce the cabal package. All compilation, linking, and test execution are driven through Nix. Running `cabal build`, `cabal test`, `cabal repl`, or `cabal install` directly will use the system GHC (or `.stack-work`) and can produce artifacts that differ from the Nix-built ones — especially for pinned dependencies such as `megaparsec`.

> **Rule of thumb:** if it builds, links, or tests, it goes through `nix`.
## 2. Project Overview

**tricu** (pronounced "tree-shoe") is a programming-language experiment written in Haskell. It implements [Triage Calculus](https://olydis.medium.com/a-visual-introduction-to-tree-calculus-2f4a34ceffc2), an extension of Barry Jay's Tree Calculus, with lambda-abstraction sugar that gets eliminated back to pure tree calculus terms.

tricu is Lojban for "tree".
### Core types (in `src/Research.hs`)

| Type | Description |
|------|-------------|
| `T = Leaf \| Stem T \| Fork T T` | Tree Calculus term (the runtime value) |
| `TricuAST` | Parsed AST with `SDef`, `SApp`, `SLambda`, etc. |
| `LToken` | Lexer tokens |
| `Node` / `MerkleHash` | Content-addressed Merkle DAG nodes |
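The reduction behaviour of `T` can be sketched with the standard triage-calculus rules. This is a hypothetical minimal model for orientation only; the authoritative `apply` lives in `src/Research.hs` and may differ in rule details:

```haskell
-- Minimal sketch of triage-calculus reduction, assuming the standard
-- rules of Barry Jay's Tree Calculus; not the project's real `apply`.
data T = Leaf | Stem T | Fork T T
  deriving (Eq, Show)

apply :: T -> T -> T
apply Leaf x                = Stem x                          -- grow a stem
apply (Stem a) x            = Fork a x                        -- grow a fork
apply (Fork Leaf b) _       = b                               -- K-like rule
apply (Fork (Stem a) b) x   = apply (apply a x) (apply b x)   -- S-like rule
apply (Fork (Fork w u) y) z = case z of                       -- triage on z
  Leaf     -> w
  Stem v   -> apply u v
  Fork v s -> apply (apply y v) s

main :: IO ()
main = do
  -- K = t t: applying it to two arguments returns the first.
  let k = Stem Leaf
  print (apply (apply k Leaf) (Stem Leaf))  -- prints Leaf
```

In the quick-reference notation below, `t t` is exactly this `Stem Leaf`, and `t t t` is `Fork Leaf Leaf`.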
### Source modules

| Module | Purpose |
|--------|---------|
| `Main.hs` | CLI entry point (`cmdargs`), three modes: `repl`, `eval`, `decode` |
| `Eval.hs` | Interpreter: `evalTricu`, `result`, `evalSingle` |
| `Parser.hs` | Megaparsec parser → `TricuAST` |
| `Lexer.hs` | Megaparsec lexer → `LToken` |
| `FileEval.hs` | File loading, module imports, `!import` |
| `REPL.hs` | Interactive Read-Eval-Print Loop (haskeline) |
| `Research.hs` | Core types, `apply` reduction, booleans, marshalling (`ofString`, `ofNumber`), output formatters (`toAscii`, `toTernaryString`, `decodeResult`) |
| `ContentStore.hs` | SQLite-backed term persistence |

### File extensions

- `.hs` — Haskell source
- `.tri` — tricu language source (used in `lib/`, `test/`, `demos/`)
## 3. Test Suite

Tests live in `test/Spec.hs` and use **Tasty** + **HUnit**.

```bash
nix flake check   # or: nix build .#test
```
### Test groups

| Group | What it covers |
|-------|----------------|
| `lexer` | Megaparsec lexer — identifiers, keywords, strings, escapes, invalid tokens |
| `parser` | Parser — defs, lambda, applications, lists, comments, parentheses |
| `simpleEvaluation` | Core `apply` reduction rules, variable substitution, immutability |
| `lambdas` | Lambda elimination, SKI calculus, higher-order functions, currying, shadowing, free vars |
| `providedLibraries` | `lib/list.tri` — triage, booleans, list ops (`head`, `tail`, `map`, `emptyList?`, `append`, `equal?`) |
| `fileEval` | Loading `.tri` files, multi-file context, decode |
| `modules` | `!import`, cyclic deps, namespacing, multi-level imports, unresolved vars, local namespaces |
| `demos` | `demos/*.tri` — structural equality, `toSource`, `size`, level-order traversal |
| `decoding` | `decodeResult` — Leaf, numbers, strings, lists, mixed |
| `elimLambdaSingle` | Lambda elimination: eta reduction, SDef binding, semantics preservation |
| `stressElimLambda` | Lambda elimination stress test: 200 vars, 800-body curried lambda |
### Adding tests

1. Append a `testCase "Description" $ do ...` block to the appropriate test group in `test/Spec.hs`.
2. Import any modules you need (lexer/parser are available via `runParser` from `Text.Megaparsec`; evaluation via `evalTricu`, `parseTricu`, `result`).
3. Run `nix flake check` to verify.
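A new case might look like the sketch below. The shape follows the conventions above, but the exact assertion, the group wiring, and the assumption that `parseTricu` takes a `String` and that `result` pulls the last evaluated term out of the `Env` are illustrative, not confirmed against `test/Spec.hs`:

```haskell
-- Hypothetical addition to a group in test/Spec.hs; `lambdas` is a
-- real group name, the assertion itself is illustrative.
import Test.Tasty (TestTree, testGroup)
import Test.Tasty.HUnit (testCase, (@?=))

import Eval (evalTricu, result)
import Parser (parseTricu)
import Research (T(..))

import qualified Data.Map as Map

lambdasExtra :: TestTree
lambdasExtra = testGroup "lambdas"
  [ testCase "Identity lambda returns its argument" $ do
      -- `id t` should reduce back to the bare leaf.
      let env = evalTricu Map.empty (parseTricu "id = (a : a)\nid t")
      result env @?= Leaf
  ]
```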
> The test-suite in `tricu.cabal` pulls in `src/` as `hs-source-dirs`, so tests import modules directly (e.g., `import Eval`, `import Lexer`). This is intentional — tests exercise the full pipeline end-to-end.
## 4. tricu Language Quick Reference

```
t                        → Leaf (the base term)
t t                      → Stem Leaf
t t t                    → Fork Leaf Leaf

x = t                    → Define term x = Leaf
id = (a : a)             → Lambda identity (eliminates to tree calculus)
head (map f xs)          → From lib/list.tri

!import "./path.tri" NS  → Import file under namespace

-- line comment
|- block comment -|
```
## 5. Output Formats

The `eval` command accepts `--form` (shorthand `-t`):

| Format | Value | Description |
|--------|-------|-------------|
| `tree` | `TreeCalculus` | Simple `t` form (default) |
| `fsl` | `FSL` | Full show representation |
| `ast` | `AST` | Parsed AST representation |
| `ternary` | `Ternary` | Ternary string encoding |
| `ascii` | `Ascii` | ASCII-art tree diagram |
| `decode` | `Decode` | Human-readable (strings, numbers, lists) |
## 6. Content Addressing

Each `T` term is content-addressed via a Merkle DAG:

```
NLeaf       → 0x00
NStem(h)    → 0x01 || h (32 bytes)
NFork(l,r)  → 0x02 || l (32 bytes) || r (32 bytes)

hash = SHA256("tricu.merkle.node.v1" <> 0x00 <> serialized_node)
```

This is stored in SQLite via `ContentStore.hs`. Hash suffixes on identifiers (e.g., `foo_abc123...`) are validated: 16–64 hex characters (SHA256).
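The byte layout above can be sketched in plain Haskell. This is a hypothetical stand-in: the constructor names mirror the real `Node` type, hashes are modeled as `[Word8]` lists, and the actual SHA256 call in `ContentStore.hs` is deliberately omitted:

```haskell
import Data.Word (Word8)

-- Hypothetical sketch of the Merkle-node byte layout; the real
-- serialization and hashing live in src/ContentStore.hs.
type Hash = [Word8]   -- stands in for a 32-byte SHA256 digest

data MerkleNode = MLeaf | MStem Hash | MFork Hash Hash

serialize :: MerkleNode -> [Word8]
serialize MLeaf       = [0x00]
serialize (MStem h)   = 0x01 : h
serialize (MFork l r) = 0x02 : l ++ r

-- Domain-separated hash preimage:
-- SHA256("tricu.merkle.node.v1" <> 0x00 <> serialized_node)
preimage :: MerkleNode -> [Word8]
preimage n =
  map (fromIntegral . fromEnum) "tricu.merkle.node.v1"
    ++ [0x00] ++ serialize n

main :: IO ()
main = do
  let h = replicate 32 0xAB
  print (length (serialize (MFork h h)))  -- 1 tag byte + 32 + 32
```

The version tag in the preimage means the hash scheme can be evolved later without colliding with v1 hashes.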
## 7. Directory Layout

```
tricu/
├── flake.nix            # Nix flake: packages, tests, devShell
├── tricu.cabal          # Cabal package (used via callCabal2nix)
├── src/                 # Haskell modules
│   ├── Main.hs
│   ├── Eval.hs
│   ├── Parser.hs
│   ├── Lexer.hs
│   ├── FileEval.hs
│   ├── REPL.hs
│   ├── Research.hs
│   └── ContentStore.hs
├── test/
│   ├── Spec.hs          # Tasty + HUnit tests
│   ├── *.tri            # tricu test programs
│   └── local-ns/        # Module namespace test files
├── lib/
│   ├── base.tri
│   ├── list.tri
│   └── patterns.tri
├── demos/
│   ├── equality.tri
│   ├── size.tri
│   ├── toSource.tri
│   ├── levelOrderTraversal.tri
│   └── patternMatching.tri
└── AGENTS.md            # This file
```
## 8. Development Tips

- **Quick iteration:** `nix develop` then `ghcid` (provided in the devShell) watches files and re-runs.
- **REPL:** `nix run .#` starts the interactive REPL.
- **Evaluate files:** `nix run .# -- eval -f demos/equality.tri`
- **GHC options:** `-threaded -rtsopts -with-rtsopts=-N` enables the parallel runtime; the `-N` RTS flag uses all available cores.
- **upx** is in the devShell for binary compression if needed.
24  README.md
@@ -36,15 +36,21 @@ tricu < -- or calculate its size (/demos/size.tri)
 tricu < size not?
 tricu > 12
 
-tricu < -- REPL Commands:
-tricu < !definitions -- Lists all available definitions
-tricu < !output -- Change output format (Tree, FSL, AST, etc.)
-tricu < !import -- Import definitions from a file
-tricu < !exit -- Exit the REPL
-tricu < !clear -- ANSI screen clear
-tricu < !save -- Save all REPL definitions to a file that you can !import
-tricu < !reset -- Clear all REPL definitions
-tricu < !version -- Print tricu version
+tricu < !help
+tricu version 1.1.0
+Available commands:
+!exit - Exit the REPL
+!clear - Clear the screen
+!reset - Reset preferences for selected versions
+!help - Show tricu version and available commands
+!output - Change output format (tree|fsl|ast|ternary|ascii|decode)
+!definitions - List all defined terms in the content store
+!import - Import definitions from file to the content store
+!watch - Watch a file for changes, evaluate terms, and store them
+!refresh - Refresh environment from content store (definitions are live)
+!versions - Show all versions of a term by name
+!select - Select a specific version of a term for subsequent lookups
+!tag - Add or update a tag for a term by hash or name
 ```
 
 ## Installation and Use
src/ContentStore.hs
@@ -1,7 +1,6 @@
 module ContentStore where
 
 import Research
-import Parser
 
 import Control.Monad (foldM, forM_, void)
 import Data.ByteString (ByteString)
@@ -9,11 +8,9 @@ import Data.List (nub, sort)
 import Data.Maybe (catMaybes, fromMaybe)
 import Data.Text (Text)
 import Database.SQLite.Simple
-import Database.SQLite.Simple.FromRow (FromRow(..), field)
 import System.Directory (createDirectoryIfMissing, getXdgDirectory, XdgDirectory(..))
 import System.FilePath ((</>), takeDirectory)
 
-
 import qualified Data.Map as Map
 import qualified Data.Text as T
 
@@ -97,6 +94,7 @@ storeTerm conn newNamesStrList term = do
 
 -- | Reconstruct a Tree Calculus term from its Merkle root hash.
 -- Recursively loads nodes and rebuilds the T structure.
+loadTree :: Connection -> MerkleHash -> IO (Maybe T)
 loadTree conn h
   | h == nodeHash NLeaf = return (Just Leaf) -- NLeaf is implicit, not stored
   | otherwise = do
@@ -106,6 +104,7 @@ loadTree conn h
       Just node -> Just <$> buildTree node
   where
     buildTree :: Node -> IO T
+    buildTree NLeaf = return Leaf
     buildTree (NStem childHash) = do
       child <- fromMaybe (errorWithoutStackTrace "BUG: stored hash not found") <$> loadTree conn childHash
       return (Stem child)
@@ -166,7 +165,7 @@ storeEnvironment conn env = do
   let groupedDefs = Map.toList $ Map.fromListWith (++) [(term, [name]) | (name, term) <- defs]
 
   forM_ groupedDefs $ \(term, namesList) -> case namesList of
-    n:ns -> void $ storeTerm conn namesList term
+    _:_ -> void $ storeTerm conn namesList term
     _ -> errorWithoutStackTrace "storeEnvironment: empty names list"
 
 loadTerm :: Connection -> String -> IO (Maybe T)
54  src/Eval.hs
@@ -6,18 +6,18 @@ import Research
 
 import Control.Monad (foldM)
 import Data.List (partition, (\\), elemIndex, foldl')
-import Data.Map (Map)
+import Data.Map ()
 import Data.Set (Set)
 import Database.SQLite.Simple
 
-import qualified Data.Foldable as F
+import qualified Data.Foldable as F ()
 import qualified Data.Map as Map
 import qualified Data.Set as Set
 import qualified Data.Text as T
 
 data DB
-  = BVar Int -- bound (0 = nearest binder)
-  | BFree String -- free/global
+  = BVar Int
+  | BFree String
   | BLam DB
   | BApp DB DB
   | BLeaf
@@ -59,12 +59,12 @@ evalSingle env term
 evalTricu :: Env -> [TricuAST] -> Env
 evalTricu env x = go env (reorderDefs env x)
   where
-    go env [] = env
-    go env [x] =
-      let updatedEnv = evalSingle env x
+    go env' [] = env'
+    go env' [def] =
+      let updatedEnv = evalSingle env' def
       in Map.insert "!result" (result updatedEnv) updatedEnv
-    go env (x:xs) =
-      evalTricu (evalSingle env x) xs
+    go env' (def:xs) =
+      evalTricu (evalSingle env' def) xs
 
 evalASTSync :: Env -> TricuAST -> T
 evalASTSync env term = case term of
@@ -129,7 +129,7 @@ resolveTermFromStore conn selectedVersions name mhash = case mhash of
     case matchingVersions of
       [] -> return Nothing
       [(_, term, _)] -> return $ Just term
-      _ -> return Nothing -- Ambiguous or too many matches
+      _ -> return Nothing
   Nothing -> case Map.lookup name selectedVersions of
     Just hash -> loadTree conn hash
     Nothing -> do
@@ -137,7 +137,7 @@ resolveTermFromStore conn selectedVersions name mhash = case mhash of
       case versions of
        [] -> return Nothing
        [(_, term, _)] -> return $ Just term
-        _ -> return $ Just $ (\(_, t, _) -> t) $ case versions of (_:_) -> head versions; _ -> error "resolveTermFromStore: unexpected empty versions list"
+        _ -> return $ Just (head (map (\(_, t, _) -> t) versions))
 
 elimLambda :: TricuAST -> TricuAST
 elimLambda = go
@@ -155,12 +155,10 @@ elimLambda = go
     etaReduction (SLambda [v] (SApp f (SVar x Nothing))) = v == x && not (usesBinder v f)
     etaReduction _ = False
 
-    -- triage: \a b c -> TLeaf (TLeaf a b) c (checked in DB with a↦2, b↦1, c↦0)
     triagePattern (SLambda [a] (SLambda [b] (SLambda [c] body))) =
       toDB [c,b,a] body == triageBodyDB
     triagePattern _ = False
 
-    -- compose: \f g x -> f (g x) (checked in DB with f↦2, g↦1, x↦0)
     composePattern (SLambda [f] (SLambda [g] (SLambda [x] body))) =
       toDB [x,g,f] body == composeBodyDB
     composePattern _ = False
@@ -174,13 +172,14 @@ elimLambda = go
     application (SApp _ _) = True
     application _ = False
 
-    -- rewrites
     etaReduceResult (SLambda [_] (SApp f _)) = f
+    etaReduceResult _ = error "etaReduceResult: expected SLambda [v] (SApp f _)"
 
     lambdaListResult (SLambda [v] (SList xs)) =
       SLambda [v] (foldr wrapTLeaf TLeaf xs)
       where
        wrapTLeaf m r = SApp (SApp TLeaf m) r
+    lambdaListResult _ = error "lambdaListResult: expected SLambda [v] (SList xs)"
 
     nestedLambdaResult (SLambda (v:vs) body)
       | null vs =
@@ -188,16 +187,19 @@ elimLambda = go
             db = toDB [v] body'
         in toSKIKiselyov db
       | otherwise = go (SLambda [v] (SLambda vs body))
+    nestedLambdaResult _ = error "nestedLambdaResult: expected SLambda (_:_) _"
 
     applicationResult (SApp f g) = SApp (go f) (go g)
+    applicationResult _ = error "applicationResult: expected SApp _ _"
 
     isSList (SList _) = True
     isSList _ = False
 
     slistTransform :: TricuAST -> TricuAST
     slistTransform (SList xs) = foldr (\m r -> SApp (SApp TLeaf (go m)) r) TLeaf xs
-    slistTransform ast = ast -- Should not be reached if isSList is the guard
+    slistTransform ast = ast -- Should not be reached
 
+_S, _K, _I, _R, _C, _B, _T, _TRI :: TricuAST
 _S = parseSingle "t (t (t t t)) t"
 _K = parseSingle "t t"
 _I = parseSingle "t (t (t t)) t"
@@ -207,7 +209,9 @@ _B = parseSingle "t (t (t t (t (t (t t t)) t))) (t t)"
 _T = SApp _C _I
 _TRI = parseSingle "t (t (t t (t (t (t t t))))) t"
 
+triageBody :: String -> String -> String -> TricuAST
 triageBody a b c = SApp (SApp TLeaf (SApp (SApp TLeaf (SVar a Nothing)) (SVar b Nothing))) (SVar c Nothing)
+composeBody :: String -> String -> String -> TricuAST
 composeBody f g x = SApp (SVar f Nothing) (SApp (SVar g Nothing) (SVar x Nothing))
 
 isFree :: String -> TricuAST -> Bool
@@ -270,7 +274,7 @@ buildDepGraph topDefs
 sortDeps :: Map.Map String (Set.Set String) -> [String]
 sortDeps graph = go [] Set.empty (Map.keys graph)
   where
-    go sorted sortedSet [] = sorted
+    go sorted _sortedSet [] = sorted
     go sorted sortedSet remaining =
       let ready = [ name | name <- remaining
                  , let deps = Map.findWithDefault Set.empty name graph
@@ -354,7 +358,7 @@ freeDBNames = \case
   BList xs -> foldMap freeDBNames xs
   BEmpty -> mempty
 
--- Helper: “is the binder named v used in body?”
+-- Helper: "is the binder named v used in body?"
 usesBinder :: String -> TricuAST -> Bool
 usesBinder v body = dependsOnLevel 0 (toDB [v] body)
 
@@ -395,9 +399,7 @@ toSKIDB (BList xs) =
   in if not anyUses
       then SApp _K (SList (map fromDBClosed xs))
      else SList (map toSKIDB xs)
-toSKIDB other
-  | not (dependsOnLevel 0 other) = SApp _K (fromDBClosed other)
-toSKIDB other = _K `SApp` TLeaf
+toSKIDB _other = _K `SApp` TLeaf
 
 app2 :: TricuAST -> TricuAST -> TricuAST
 app2 f x = SApp f x
@@ -415,11 +417,13 @@ kisConv = \case
   BVar n | n > 0 -> do
     (g,d) <- kisConv (BVar (n - 1))
     Right (False:g, d)
+  BVar n -> Right ([], SVar ("BVar" ++ show n) Nothing)
+  BFree s -> Right ([], SVar s Nothing)
   BApp e1 e2 -> do
     (g1,d1) <- kisConv e1
     (g2,d2) <- kisConv e2
-    let g = zipWithDefault False (||) g1 g2 -- <— propagate Γ outside (#)
-        d = kisHash (g1,d1) (g2,d2) -- <— (#) yields only the term
+    let g = zipWithDefault False (||) g1 g2 -- <- propagate Γ outside (#)
+        d = kisHash (g1,d1) (g2,d2) -- <- (#) yields only the term
     Right (g, d)
   -- Treat closed constants as free 'combinator leaves' (no binder use).
   BLeaf -> Right ([], TLeaf)
@@ -437,12 +441,11 @@ kisConv = \case
   BFork l r
     | dependsOnLevel 0 l || dependsOnLevel 0 r -> Left "Fork with binder use: fallback"
     | otherwise -> Right ([], TFork (fromDBClosed l) (fromDBClosed r))
-  -- We shouldn’t see BLam under elim; treat as unsupported so we fallback.
+  -- We shouldn't see BLam under elim; treat as unsupported so we fallback.
   BLam _ -> Left "Nested lambda under body: fallback"
-  BFree s -> Right ([], SVar s Nothing)
 
 -- Application combiner with K-optimization (lazy weakening).
--- Mirrors Lynn’s 'optK' rules: choose among S, B, C, R based on leading flags.
+-- Mirrors Lynn's 'optK' rules: choose among S, B, C, R based on leading flags.
 -- η-aware (#) with K-optimization (adapted from TS kiselyov_eta)
 kisHash :: (Uses, TricuAST) -> (Uses, TricuAST) -> TricuAST
 kisHash (g1, d1) (g2, d2) =
@@ -563,7 +566,6 @@ bulkS :: Int -> TricuAST
 bulkS n | n <= 1 = _S
         | otherwise = SApp sPrime (bulkS (n - 1))
 
--- Count how many leading pairs (a,b) repeat at the head of zip g1 g2
 headPairRun :: [Bool] -> [Bool] -> ((Bool, Bool), Int)
 headPairRun g1 g2 =
   case zip g1 g2 of
src/FileEval.hs
@@ -5,11 +5,11 @@ import Lexer
 import Parser
 import Research
 
+import Control.Monad ()
 import Data.List (partition)
 import Data.Maybe (mapMaybe)
-import Control.Monad (foldM)
-import System.IO
 import System.FilePath (takeDirectory, normalise, (</>))
+import System.IO ()
 
 import qualified Data.Map as Map
 import qualified Data.Set as Set
@@ -17,12 +17,12 @@ import qualified Data.Set as Set
 extractMain :: Env -> Either String T
 extractMain env =
   case Map.lookup "main" env of
-    Just result -> Right result
+    Just evalResult -> Right evalResult
     Nothing -> Left "No `main` function detected"
 
 processImports :: Set.Set FilePath -> FilePath -> FilePath -> [TricuAST]
               -> Either String ([TricuAST], [(FilePath, String, FilePath)])
-processImports seen base currentPath asts =
+processImports seen _base currentPath asts =
   let (imports, nonImports) = partition isImp asts
       importPaths = mapMaybe getImportInfo imports
   in if currentPath `Set.member` seen
@@ -40,11 +40,11 @@ evaluateFileResult filePath = do
   let tokens = lexTricu contents
   case parseProgram tokens of
     Left err -> errorWithoutStackTrace (handleParseError err)
-    Right ast -> do
+    Right _ast -> do
      processedAst <- preprocessFile filePath
      let finalEnv = evalTricu Map.empty processedAst
      case extractMain finalEnv of
-        Right result -> return result
+        Right evalResult -> return evalResult
        Left err -> errorWithoutStackTrace err
 
 evaluateFile :: FilePath -> IO Env
@@ -53,7 +53,7 @@ evaluateFile filePath = do
   let tokens = lexTricu contents
   case parseProgram tokens of
     Left err -> errorWithoutStackTrace (handleParseError err)
-    Right ast -> do
+    Right _ast -> do
      ast <- preprocessFile filePath
      pure $ evalTricu Map.empty ast
 
@@ -63,7 +63,7 @@ evaluateFileWithContext env filePath = do
   let tokens = lexTricu contents
   case parseProgram tokens of
     Left err -> errorWithoutStackTrace (handleParseError err)
-    Right ast -> do
+    Right _ast -> do
      ast <- preprocessFile filePath
      pure $ evalTricu env ast
 
@@ -84,8 +84,8 @@ preprocessFile' seen base currentPath = do
   imported <- concat <$> mapM (processImportPath seen' base) importPaths
   pure $ imported ++ nonImports
   where
-    processImportPath seen base (path, name, importPath) = do
-      ast <- preprocessFile' seen base importPath
+    processImportPath _seen _base (_path, name, importPath) = do
+      ast <- preprocessFile' _seen _base importPath
      pure $ map (nsDefinition (if name == "!Local" then "" else name))
          $ filter (not . isImp) ast
    isImp (SImport _ _) = True
14
src/Lexer.hs
14
src/Lexer.hs

@@ -4,13 +4,12 @@ import Research
 
 import Control.Monad (void)
 import Data.Functor (($>))
+import Data.Set ()
 import Data.Void
 import Text.Megaparsec
 import Text.Megaparsec.Char hiding (space)
 import Text.Megaparsec.Char.Lexer
 
-import qualified Data.Set as Set
-
 type Lexer = Parsec Void String
 
 tricuLexer :: Lexer [LToken]
@@ -23,13 +22,13 @@ tricuLexer = do
       ]
     sc
     pure tok
-  tokens <- many $ do
+  toks <- many $ do
     tok <- choice tricuLexer'
     sc
     pure tok
   sc
   eof
-  pure (header ++ tokens)
+  pure (header ++ toks)
   where
     tricuLexer' =
      [ try lnewline
@@ -51,7 +50,7 @@ tricuLexer = do
 lexTricu :: String -> [LToken]
 lexTricu input = case runParser tricuLexer "" input of
   Left err -> errorWithoutStackTrace $ "Lexical error:\n" ++ errorBundlePretty err
-  Right tokens -> tokens
+  Right toks -> toks
 
 
 keywordT :: Lexer LToken
@@ -143,8 +142,8 @@ integerLiteral = do
 
 stringLiteral :: Lexer LToken
 stringLiteral = do
-  char '"'
-  content <- manyTill Lexer.charLiteral (char '"')
+  void (char '"')
+  content <- manyTill Lexer.charLiteral (void (char '"'))
   return (LStringLiteral content)
 
 charLiteral :: Lexer Char
@@ -163,3 +162,4 @@ charLiteral = escapedChar <|> normalChar
       '\\' -> '\\'
       '"' -> '"'
      '\'' -> '\''
+      _ -> c
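The `void (char '"')` edits in `stringLiteral` above are warning fixes: under `-Wall`, a do-block statement that silently discards a non-`()` monadic result triggers `-Wunused-do-bind`, and `void` makes the discard explicit. A minimal stdlib-only sketch of the same pattern, using `IO` as a stand-in for the parser monad (names here are illustrative, not from the repo):

```haskell
import Control.Monad (void)

-- Under -Wall, writing the first statement as a bare `pure 'x'` would warn
-- (-Wunused-do-bind), because its Char result is silently dropped.
-- Wrapping it in `void` documents the discard and silences the warning.
twoSteps :: IO Int
twoSteps = do
  void (pure 'x')  -- explicit discard of the Char result
  pure 42
```

The alternative spelling `_ <- pure 'x'` is equally warning-free; `void` just reads better when the discard is the point.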
src/Main.hs (16 lines changed)

@@ -1,18 +1,18 @@
 module Main where
 
+import ContentStore ()
 import Eval (evalTricu, mainResult, result)
 import FileEval
 import Parser (parseTricu)
 import REPL
 import Research
-import ContentStore
 
 import Control.Monad (foldM)
-import Control.Monad.IO.Class (liftIO)
+import Control.Monad.IO.Class ()
 import Data.Version (showVersion)
-import Text.Megaparsec (runParser)
 import Paths_tricu (version)
 import System.Console.CmdArgs
+import Text.Megaparsec ()
 
 import qualified Data.Map as Map
@@ -56,24 +56,24 @@ decodeMode = TDecode
 main :: IO ()
 main = do
   let versionStr = "tricu Evaluator and REPL " ++ showVersion version
-  args <- cmdArgs $ modes [replMode, evaluateMode, decodeMode]
+  cmdArgsParsed <- cmdArgs $ modes [replMode, evaluateMode, decodeMode]
     &= help "tricu: Exploring Tree Calculus"
     &= program "tricu"
     &= summary versionStr
     &= versionArg [explicit, name "version", summary versionStr]
-  case args of
+  case cmdArgsParsed of
     Repl -> do
       putStrLn "Welcome to the tricu REPL"
       putStrLn "You may exit with `CTRL+D` or the `!exit` command."
       repl
-    Evaluate { file = filePaths, form = form } -> do
-      result <- case filePaths of
+    Evaluate { file = filePaths, form = outputForm } -> do
+      evalResult <- case filePaths of
        [] -> runTricuT <$> getContents
        (filePath:restFilePaths) -> do
          initialEnv <- evaluateFile filePath
          finalEnv <- foldM evaluateFileWithContext initialEnv restFilePaths
          pure $ mainResult finalEnv
-      let fRes = formatT form result
+      let fRes = formatT outputForm evalResult
      putStr fRes
    TDecode { file = filePaths } -> do
      value <- case filePaths of
src/Parser.hs

@@ -8,7 +8,7 @@ import Control.Monad.State
 import Data.List.NonEmpty (toList)
 import Data.Void (Void)
 import Text.Megaparsec
-import Text.Megaparsec.Error (ParseErrorBundle, errorBundlePretty)
 
 import qualified Data.Set as Set
 
 data PState = PState
@@ -20,9 +20,9 @@ type ParserM = StateT PState (Parsec Void [LToken])
 
 satisfyM :: (LToken -> Bool) -> ParserM LToken
 satisfyM f = do
-  token <- lift (satisfy f)
-  modify' (updateDepth token)
-  return token
+  tok <- lift (satisfy f)
+  modify' (updateDepth tok)
+  return tok
 
 updateDepth :: LToken -> PState -> PState
 updateDepth LOpenParen st = st { parenDepth = parenDepth st + 1 }
@@ -39,12 +39,12 @@ topLevelNewline = do
     else fail "Top-level exit in nested context (paren or bracket)"
 
 parseProgram :: [LToken] -> Either (ParseErrorBundle [LToken] Void) [TricuAST]
-parseProgram tokens =
-  runParser (evalStateT (parseProgramM <* finalizeDepth <* eof) (PState 0 0)) "" tokens
+parseProgram toks =
+  runParser (evalStateT (parseProgramM <* finalizeDepth <* eof) (PState 0 0)) "" toks
 
 parseSingleExpr :: [LToken] -> Either (ParseErrorBundle [LToken] Void) TricuAST
-parseSingleExpr tokens =
-  runParser (evalStateT (scnParserM *> parseExpressionM <* finalizeDepth <* eof) (PState 0 0)) "" tokens
+parseSingleExpr toks =
+  runParser (evalStateT (scnParserM *> parseExpressionM <* finalizeDepth <* eof) (PState 0 0)) "" toks
 
 finalizeDepth :: ParserM ()
 finalizeDepth = do
@@ -195,6 +195,7 @@ parseTreeTermM = do
       | TLeaf <- acc = TStem next
       | TStem t <- acc = TFork t next
       | TFork _ _ <- acc = TFork acc next
+      | otherwise = SApp acc next
 
 parseTreeLeafOrParenthesizedM :: ParserM TricuAST
 parseTreeLeafOrParenthesizedM = choice
@@ -248,20 +249,20 @@ parseGroupedItemM = do
 
 parseSingleItemM :: ParserM TricuAST
 parseSingleItemM = do
-  token <- satisfyM (\case LIdentifier _ -> True; LKeywordT -> True; _ -> False)
-  if | LIdentifier name <- token -> pure (SVar name Nothing)
-     | token == LKeywordT -> pure TLeaf
+  tok <- satisfyM (\case LIdentifier _ -> True; LKeywordT -> True; _ -> False)
+  if | LIdentifier name <- tok -> pure (SVar name Nothing)
+     | tok == LKeywordT -> pure TLeaf
     | otherwise -> fail "Unexpected token in list item"
 
 parseVarM :: ParserM TricuAST
 parseVarM = do
-  token <- satisfyM (\case
+  tok <- satisfyM (\case
    LNamespace _ -> True
    LIdentifier _ -> True
    LIdentifierWithHash _ _ -> True
    _ -> False)
 
-  case token of
+  case tok of
    LNamespace ns -> do
      _ <- satisfyM (== LDot)
      LIdentifier name <- satisfyM (\case LIdentifier _ -> True; _ -> False)
@@ -282,8 +283,8 @@ parseVarM = do
 parseIntLiteralM :: ParserM TricuAST
 parseIntLiteralM = do
   let intL = (\case LIntegerLiteral _ -> True; _ -> False)
-  token <- satisfyM intL
-  if | LIntegerLiteral value <- token ->
+  tok <- satisfyM intL
+  if | LIntegerLiteral value <- tok ->
        pure (SInt (fromIntegral value))
     | otherwise ->
        fail "Unexpected token while parsing integer literal"
@@ -291,8 +292,8 @@ parseIntLiteralM = do
 parseStrLiteralM :: ParserM TricuAST
 parseStrLiteralM = do
   let strL = (\case LStringLiteral _ -> True; _ -> False)
-  token <- satisfyM strL
-  if | LStringLiteral value <- token ->
+  tok <- satisfyM strL
+  if | LStringLiteral value <- tok ->
        pure (SStr value)
     | otherwise ->
        fail "Unexpected token while parsing string literal"
@@ -308,8 +309,8 @@ handleParseError bundle =
   in unlines ("Parse error(s) encountered:" : formattedErrors)
 
 formatError :: ParseError [LToken] Void -> String
-formatError (TrivialError offset unexpected expected) =
-  let unexpectedMsg = case unexpected of
+formatError (TrivialError offset msgUnexpected expected) =
+  let unexpectedMsg = case msgUnexpected of
        Just x -> "unexpected token " ++ show x
        Nothing -> "unexpected end of input"
      expectedMsg = if null expected
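The `satisfyM` change above is only a rename, but the shape it uses — consume a token in the base monad via `lift`, then thread a depth counter through `StateT` — can be sketched without megaparsec. This standalone illustration uses only `transformers` (a GHC boot library); `Tok`, `Either String` as the base monad, and the depth logic are simplified stand-ins for the repo's `LToken`/`PState`, not its actual types:

```haskell
import Control.Monad.Trans.Class (lift)
import Control.Monad.Trans.State.Strict (StateT, evalStateT, get, modify')

-- Simplified stand-ins for the parser's LToken and PState.
data Tok = Open | Close | Ident String deriving (Show, Eq)

-- Int plays the role of parenDepth; Either String is the base "parser" monad.
type ParserM = StateT Int (Either String)

updateDepth :: Tok -> Int -> Int
updateDepth Open  d = d + 1
updateDepth Close d = d - 1
updateDepth _     d = d

-- Mirrors satisfyM: consume in the base monad, then update state.
satisfyM :: Tok -> ParserM Tok
satisfyM t = do
  tok <- lift (Right t)  -- stands in for `lift (satisfy f)`
  modify' (updateDepth tok)
  pure tok

-- Run a token stream and report the final nesting depth.
depthAfter :: [Tok] -> Either String Int
depthAfter ts = evalStateT (mapM_ satisfyM ts >> get) 0
```

A balanced stream such as `[Open, Ident "x", Close]` ends at depth 0, which is exactly the invariant the real parser's `finalizeDepth` checks before `eof`.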
src/REPL.hs (101 lines changed)

@@ -1,48 +1,41 @@
 module REPL where
 
+import ContentStore
 import Eval
 import FileEval
-import Lexer
+import Lexer ()
 import Parser
 import Research
-import ContentStore
 
 import Control.Concurrent (forkIO, threadDelay, killThread, ThreadId)
-import Control.Monad (forever, void, when, forM, forM_, foldM, unless)
-import Data.ByteString (ByteString)
-import Data.Maybe (isNothing, isJust, fromJust, catMaybes)
-import Database.SQLite.Simple (Connection, Only(..), query, query_, execute, execute_, open)
+import Control.Exception (SomeException, catch, displayException)
+import Control.Monad ()
+import Control.Monad (forever, when, forM_, foldM, unless)
+import Control.Monad.Catch (handle)
+import Control.Monad.IO.Class (liftIO)
+import Control.Monad.Trans.Class ()
+import Control.Monad.Trans.Maybe (MaybeT(..), runMaybeT)
+import Data.ByteString ()
+import Data.Char (isSpace)
+import Data.IORef (newIORef, readIORef, writeIORef)
+import Data.List (dropWhileEnd, isPrefixOf, find)
+import Data.Maybe (isJust, fromJust)
+import Data.Time (getCurrentTime, diffUTCTime)
+import Data.Time.Clock.POSIX (posixSecondsToUTCTime)
+import Data.Time.Format (formatTime, defaultTimeLocale)
+import Data.Version (showVersion)
+import Database.SQLite.Simple (Connection, Only(..), query)
+import Paths_tricu (version)
+import System.Console.ANSI (setSGR, SGR(..), ConsoleLayer(..), ColorIntensity(..), Color(..))
+import System.Console.Haskeline
 import System.Directory (doesFileExist, createDirectoryIfMissing)
 import System.FSNotify
 import System.FilePath (takeDirectory, (</>))
 import Text.Read (readMaybe)
 
-import Control.Exception (IOException, SomeException, catch
-                         , displayException)
-import Control.Monad (forM_)
-import Control.Monad.Catch (handle, MonadCatch)
-import Control.Monad.IO.Class (liftIO)
-import Control.Monad.Trans.Class (lift)
-import Control.Monad.Trans.Maybe (MaybeT(..), runMaybeT)
-import Data.Char (isSpace, isUpper)
-import Data.List ((\\), dropWhile, dropWhileEnd, isPrefixOf, nub, sortBy, groupBy, intercalate, find)
-import Data.Version (showVersion)
-import Paths_tricu (version)
-import System.Console.Haskeline
-import System.Console.ANSI (setSGR, SGR(..), ConsoleLayer(..), ColorIntensity(..),
-                            Color(..), ConsoleIntensity(..), clearFromCursorToLineEnd)
 
 import qualified Data.Map as Map
 import qualified Data.Text as T
-import qualified Data.Text.IO as T
+import qualified Data.Text.IO as T ()
 
-import Control.Concurrent (forkIO, threadDelay)
-import Data.IORef (IORef, newIORef, readIORef, writeIORef)
-import Data.Time (UTCTime, getCurrentTime, diffUTCTime)
-import Control.Concurrent.MVar (MVar, newMVar, putMVar, takeMVar)
-
-import Data.Time.Format (formatTime, defaultTimeLocale)
-import Data.Time.Clock.POSIX (posixSecondsToUTCTime)
 
 data REPLState = REPLState
   { replForm :: EvaluatedForm
@@ -121,26 +114,26 @@ repl = do
      | "!tag" `isPrefixOf` strip s -> handleTag state
      | take 2 s == "--" -> loop state
      | otherwise -> do
-          result <- liftIO $ catch
+          evalResult <- liftIO $ catch
            (processInput state s)
            (errorHandler state)
-          loop result
+          loop evalResult
 
    handleOutput :: REPLState -> InputT IO ()
    handleOutput state = do
      let formats = [Decode, TreeCalculus, FSL, AST, Ternary, Ascii]
      outputStrLn "Available output formats:"
-      mapM_ (\(i, f) -> outputStrLn $ show i ++ ". " ++ show f)
+      mapM_ (\(i, f) -> outputStrLn $ show (i :: Int) ++ ". " ++ show f)
        (zip [1..] formats)
 
-      result <- runMaybeT $ do
+      evalResult <- runMaybeT $ do
        input <- MaybeT $ getInputLine "Select output format (1-6) < "
        case reads input of
          [(n, "")] | n >= 1 && n <= 6 ->
            return $ formats !! (n-1)
          _ -> MaybeT $ return Nothing
 
-      case result of
+      case evalResult of
        Nothing -> do
          outputStrLn "Invalid selection. Keeping current output format."
          loop state
@@ -201,7 +194,7 @@ repl = do
 
    importFile :: REPLState -> String -> InputT IO ()
    importFile state cleanFilename = do
-      code <- liftIO $ readFile cleanFilename
+      _code <- liftIO $ readFile cleanFilename
      case replContentStore state of
        Nothing -> do
          liftIO $ printError "Content store not initialized"
@@ -216,7 +209,7 @@ repl = do
          importedCount <- foldM (\count (name, term) -> do
            hash <- ContentStore.storeTerm conn [name] term
            printSuccess $ "Stored definition: " ++ name ++ " with hash " ++ T.unpack hash
-            return (count + 1)
+            return (count + (1 :: Int))
            ) 0 defs
 
          printSuccess $ "Imported " ++ show importedCount ++ " definitions successfully"
@@ -248,7 +241,7 @@ repl = do
      lastProcessedRef <- liftIO $ newIORef =<< getCurrentTime
 
      watcherId <- liftIO $ forkIO $ withManager $ \mgr -> do
-        stopAction <- watchDir mgr dirPath (\event -> eventPath event == filepath) $ \event -> do
+        _stopAction <- watchDir mgr dirPath (\ev -> eventPath ev == filepath) $ \_ -> do
          now <- getCurrentTime
          lastProcessed <- readIORef lastProcessedRef
          when (diffUTCTime now lastProcessed > 0.5) $ do
@@ -259,8 +252,8 @@ repl = do
 
      watchLoop state { replWatchedFile = Just filepath, replWatcherThread = Just watcherId }
 
-    handleUnwatch :: REPLState -> InputT IO ()
-    handleUnwatch state = case replWatchedFile state of
+    _handleUnwatch :: REPLState -> InputT IO ()
+    _handleUnwatch state = case replWatchedFile state of
      Nothing -> do
        outputStrLn "No file is currently being watched"
        loop state
@@ -275,7 +268,7 @@ repl = do
      Nothing -> do
        outputStrLn "Content store not initialized"
        loop state
-      Just conn -> do
+      Just _conn -> do
        outputStrLn "Environment refreshed from content store (definitions are live)"
        loop state
 
@@ -486,8 +479,8 @@ repl = do
      forM_ asts $ \ast -> do
        case ast of
          SDef name [] body -> do
-            result <- evalAST (Just conn) (replSelectedVersions newState) body
-            hash <- ContentStore.storeTerm conn [name] result
+            evalResult <- evalAST (Just conn) (replSelectedVersions newState) body
+            hash <- ContentStore.storeTerm conn [name] evalResult
 
            liftIO $ do
              putStr "tricu > "
@@ -498,14 +491,14 @@ repl = do
              putStrLn ""
 
              putStr "tricu > "
-              printResult $ formatT (replForm newState) result
+              printResult $ formatT (replForm newState) evalResult
              putStrLn ""
 
          _ -> do
-            result <- evalAST (Just conn) (replSelectedVersions newState) ast
+            evalResult <- evalAST (Just conn) (replSelectedVersions newState) ast
            liftIO $ do
              putStr "tricu > "
-              printResult $ formatT (replForm newState) result
+              printResult $ formatT (replForm newState) evalResult
              putStrLn ""
      return newState
 
@@ -531,13 +524,13 @@ repl = do
        Just conn -> do
          forM_ asts $ \ast -> case ast of
            SDef name [] body -> do
-              result <- evalAST (Just conn) selectedVersions body
-              hash <- ContentStore.storeTerm conn [name] result
+              evalResult <- evalAST (Just conn) selectedVersions body
+              hash <- ContentStore.storeTerm conn [name] evalResult
              putStrLn $ "tricu > Stored definition: " ++ name ++ " with hash " ++ T.unpack hash
-              putStrLn $ "tricu > " ++ name ++ " = " ++ formatT outputForm result
+              putStrLn $ "tricu > " ++ name ++ " = " ++ formatT outputForm evalResult
            _ -> do
-              result <- evalAST (Just conn) selectedVersions ast
-              putStrLn $ "tricu > Result: " ++ formatT outputForm result
+              evalResult <- evalAST (Just conn) selectedVersions ast
+              putStrLn $ "tricu > Result: " ++ formatT outputForm evalResult
          putStrLn $ "tricu > Processed file: " ++ filepath
 
    formatTimestamp :: Integer -> String
@@ -552,12 +545,6 @@ repl = do
      putStr $ T.unpack rest
      setSGR [Reset]
 
-    coloredHashString :: T.Text -> String
-    coloredHashString hash =
-      "\ESC[1;36m" ++ T.unpack (T.take 16 hash) ++
-      "\ESC[0;37m" ++ T.unpack (T.drop 16 hash) ++
-      "\ESC[0m"
-
    withColor :: ColorIntensity -> Color -> IO () -> IO ()
    withColor intensity color action = do
      setSGR [SetColor Foreground intensity color]
src/Research.hs

@@ -1,17 +1,17 @@
 module Research where
 
+import Crypto.Hash (hash, SHA256, Digest)
 import Data.ByteArray (convert)
 import Data.ByteString.Base16 (decode, encode)
 import Data.List (intercalate)
-import Data.Map (Map)
-import Data.Text (Text, replace, pack)
+import Data.Map ()
+import Data.Text (Text, replace)
 import Data.Text.Encoding (decodeUtf8, encodeUtf8)
 import System.Console.CmdArgs (Data, Typeable)
 
 import qualified Data.ByteString as BS
 import qualified Data.Map as Map
 import qualified Data.Text as T
-import Crypto.Hash (hash, SHA256, Digest)
 
 -- Tree Calculus Types
 data T = Leaf | Stem T | Fork T T
@@ -19,7 +19,7 @@ data T = Leaf | Stem T | Fork T T
 
 -- Abstract Syntax Tree for tricu
 data TricuAST
-  = SVar String (Maybe String) -- Variable name and optional hash prefix
+  = SVar String (Maybe String)
   | SInt Integer
   | SStr String
   | SList [TricuAST]
@@ -131,9 +131,9 @@ buildMerkle (Fork l r) = NFork (nodeHash left) (nodeHash right)
 apply :: T -> T -> T
 apply (Fork Leaf a) _ = a
 apply (Fork (Stem a) b) c = apply (apply a c) (apply b c)
-apply (Fork (Fork a b) c) Leaf = a
-apply (Fork (Fork a b) c) (Stem u) = apply b u
-apply (Fork (Fork a b) c) (Fork u v) = apply (apply c u) v
+apply (Fork (Fork _a _b) _c) Leaf = _a
+apply (Fork (Fork _a _b) _c) (Stem u) = apply _b u
+apply (Fork (Fork _a _b) _c) (Fork u v) = apply (apply _c u) v
 -- Left associative `t`
 apply Leaf b = Stem b
 apply (Stem a) b = Fork a b
@@ -175,7 +175,7 @@ toNumber _ = Left "Invalid Tree Calculus number"
 toString :: T -> Either String String
 toString tc = case toList tc of
   Right list -> traverse (fmap (toEnum . fromInteger) . toNumber) list
-  Left err -> Left "Invalid Tree Calculus string"
+  Left _ -> Left "Invalid Tree Calculus string"
 
 toList :: T -> Either String [T]
 toList Leaf = Right []
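The `apply` hunk above contains the triage rules at the core of the evaluator. Pulled out as a standalone sketch they look like this — `T` and the rule structure mirror the diff, but the underscore-prefixed names are restored to plain ones here (the commit keeps the `_` prefixes only to silence `-Wunused-matches` on the bindings that are not used):

```haskell
-- Standalone sketch of the tree-calculus application rules from the
-- Research.hs hunk above; comments are illustrative glosses, not from the repo.
data T = Leaf | Stem T | Fork T T
  deriving (Show, Eq)

apply :: T -> T -> T
apply (Fork Leaf a) _ = a                                   -- K-like: discard argument
apply (Fork (Stem a) b) c = apply (apply a c) (apply b c)   -- S-like: distribute argument
apply (Fork (Fork a _) _) Leaf     = a                      -- triage on a Leaf
apply (Fork (Fork _ b) _) (Stem u) = apply b u              -- triage on a Stem
apply (Fork (Fork _ _) c) (Fork u v) = apply (apply c u) v  -- triage on a Fork
-- Left-associative `t`: under-applied trees just grow.
apply Leaf b     = Stem b
apply (Stem a) b = Fork a b
```

For example, `apply (Fork Leaf Leaf) (Stem Leaf)` reduces to `Leaf`: the `Fork Leaf a` rule fires and discards the argument, K-style, while `apply Leaf Leaf` merely grows the tree to `Stem Leaf`.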
tricu.cabal (18 lines changed)

@@ -22,7 +22,19 @@ executable tricu
    MultiWayIf
    OverloadedStrings
    ScopedTypeVariables
-  ghc-options: -threaded -rtsopts -with-rtsopts=-N -optl-pthread -fPIC
+  ghc-options:
+    -Wall
+    -Wcompat
+    -Wunused-imports
+    -Wunused-top-binds
+    -Wunused-local-binds
+    -Wunused-matches
+    -Wredundant-constraints
+    -threaded
+    -rtsopts
+    -with-rtsopts=-N
+    -optl-pthread
+    -fPIC
  build-depends:
    base >=4.7
    , aeson
@@ -50,10 +62,12 @@ executable tricu
    , transformers
    , zlib
  other-modules:
+    ContentStore
    Eval
    FileEval
    Lexer
    Parser
+    Paths_tricu
    REPL
    Research
  default-language: Haskell2010
@@ -96,9 +110,11 @@ test-suite tricu-tests
    , zlib
  default-language: Haskell2010
  other-modules:
+    ContentStore
    Eval
    FileEval
    Lexer
    Parser
+    Paths_tricu
    REPL
    Research