Zero Warnings Plan

Zero GHC warnings with new opts. General cleanup and updates.
2026-05-05 18:30:14 -05:00
parent 2627627493
commit efbe9350ed
11 changed files with 344 additions and 156 deletions

AGENTS.md Normal file

@@ -0,0 +1,177 @@
# AGENTS.md — tricu Project Guide
> For AI agents and contributors working in this repository.
## 1. Build & Test
**`nix build .#` always runs tests.** This is the primary and only way to build and validate.
```bash
# Full build + tests (this is the default)
nix build .#
# Build only (skip tests)
nix build .#package
# Build the test-specific variant with doCheck enforced
nix build .#test
nix flake check
# Dev shell (includes ghcid, cabal-install, ghc, upx)
nix develop .#
```
### ⚠️ Never call `cabal` directly
This project uses a Nix flake that wraps `callCabal2nix` to produce the cabal package. All compilation, linking, and test execution are driven through Nix. Running `cabal build`, `cabal test`, `cabal repl`, or `cabal install` directly will use the system GHC (or `.stack-work`) and can produce artifacts that differ from the Nix-built ones, especially for dependencies such as `megaparsec`.
> **Rule of thumb:** if it builds, links, or tests, it goes through `nix`.
## 2. Project Overview
**tricu** (pronounced "tree-shoe") is a programming-language experiment written in Haskell. It implements [Triage Calculus](https://olydis.medium.com/a-visual-introduction-to-tree-calculus-2f4a34ceffc2), an extension of Barry Jay's Tree Calculus, with lambda-abstraction sugar that gets eliminated back to pure tree calculus terms.
tricu is Lojban for "tree".
### Core types (in `src/Research.hs`)
| Type | Description |
|------|-------------|
| `T = Leaf \| Stem T \| Fork T T` | Tree Calculus term (the runtime value) |
| `TricuAST` | Parsed AST with `SDef`, `SApp`, `SLambda`, etc. |
| `LToken` | Lexer tokens |
| `Node` / `MerkleHash` | Content-addressed Merkle DAG nodes |
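The core reduction is compact enough to sketch inline. The following is a condensed, dependency-free sketch of `T` and the triage-style `apply` rules; the rule shapes follow `src/Research.hs`, while marshalling, hashing, and formatters are omitted:

```haskell
-- Condensed sketch of the runtime value and its reduction.
data T = Leaf | Stem T | Fork T T
  deriving (Eq, Show)

-- Triage application: the shape of the head decides how the argument is used.
apply :: T -> T -> T
apply (Fork Leaf a) _                = a                             -- K-like: discard argument
apply (Fork (Stem a) b) c            = apply (apply a c) (apply b c) -- S-like: share argument
apply (Fork (Fork a _) _) Leaf       = a                             -- triage: argument is a Leaf
apply (Fork (Fork _ b) _) (Stem u)   = apply b u                     -- triage: argument is a Stem
apply (Fork (Fork _ _) c) (Fork u v) = apply (apply c u) v           -- triage: argument is a Fork
apply Leaf b                         = Stem b                        -- growing: t x
apply (Stem a) b                     = Fork a b                      -- growing: t x y

main :: IO ()
main = do
  let k = Stem Leaf                          -- K = t t
  print (apply (apply k (Stem Leaf)) Leaf)   -- K x y = x, prints: Stem Leaf
```

`apply (apply k x) y` returning `x` is the `t t` behaviour exercised by the `simpleEvaluation` test group.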
### Source modules
| Module | Purpose |
|--------|---------|
| `Main.hs` | CLI entry point (`cmdargs`), three modes: `repl`, `eval`, `decode` |
| `Eval.hs` | Interpreter: `evalTricu`, `result`, `evalSingle` |
| `Parser.hs` | Megaparsec parser → `TricuAST` |
| `Lexer.hs` | Megaparsec lexer → `LToken` |
| `FileEval.hs` | File loading, module imports, `!import` |
| `REPL.hs` | Interactive Read-Eval-Print Loop (haskeline) |
| `Research.hs` | Core types, `apply` reduction, booleans, marshalling (`ofString`, `ofNumber`), output formatters (`toAscii`, `toTernaryString`, `decodeResult`) |
| `ContentStore.hs` | SQLite-backed term persistence |
### File extensions
- `.hs` — Haskell source
- `.tri` — tricu language source (used in `lib/`, `test/`, `demos/`)
## 3. Test Suite
Tests live in `test/Spec.hs` and use **Tasty** + **HUnit**.
```bash
nix flake check # or: nix build .#test
```
### Test groups
| Group | What it covers |
|-------|----------------|
| `lexer` | Megaparsec lexer — identifiers, keywords, strings, escapes, invalid tokens |
| `parser` | Parser — defs, lambda, applications, lists, comments, parentheses |
| `simpleEvaluation` | Core `apply` reduction rules, variable substitution, immutability |
| `lambdas` | Lambda elimination, SKI calculus, higher-order functions, currying, shadowing, free vars |
| `providedLibraries` | `lib/list.tri` — triage, booleans, list ops (`head`, `tail`, `map`, `emptyList?`, `append`, `equal?`) |
| `fileEval` | Loading `.tri` files, multi-file context, decode |
| `modules` | `!import`, cyclic deps, namespacing, multi-level imports, unresolved vars, local namespaces |
| `demos` | `demos/*.tri` — structural equality, `toSource`, `size`, level-order traversal |
| `decoding` | `decodeResult` — Leaf, numbers, strings, lists, mixed |
| `elimLambdaSingle` | Lambda elimination: eta reduction, SDef binding, semantics preservation |
| `stressElimLambda` | Lambda elimination stress test: 200 vars, 800-body curried lambda |
### Adding tests
1. Append a `testCase "Description" $ do ...` block to the appropriate test group in `test/Spec.hs`.
2. Import any modules you need (lexer/parser are available via `runParser` from `Text.Megaparsec`; evaluation via `evalTricu`, `parseTricu`, `result`).
3. Run `nix flake check` to verify.
> The test-suite in `tricu.cabal` pulls in `src/` as `hs-source-dirs`, so tests import modules directly (e.g., `import Eval`, `import Lexer`). This is intentional — tests exercise the full pipeline end-to-end.
## 4. tricu Language Quick Reference
```
t → Leaf (the base term)
t t → Stem Leaf
t t t → Fork Leaf Leaf
x = t → Define term x = Leaf
id = (a : a) → Lambda identity (eliminates to tree calculus)
head (map f xs) → From lib/list.tri
!import "./path.tri" NS → Import file under namespace
-- line comment
|- block comment -|
```
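The `(a : a)` form above is lambda sugar that `elimLambda` (in `Eval.hs`) compiles down to combinators via a Kiselyov-style translation. As a rough illustration of the general idea only, here is textbook SK bracket abstraction on a toy AST — a sketch of the classic technique, not the project's algorithm:

```haskell
-- Toy bracket abstraction: compile lambdas to S/K/I combinators.
-- Illustrative only; the real elimLambda uses a Kiselyov-style scheme.
data E = V String | A E E | Lam String E | S | K | I
  deriving (Eq, Show)

elim :: E -> E
elim (Lam x e) = abstr x (elim e)   -- eliminate innermost lambdas first
elim (A f g)   = A (elim f) (elim g)
elim e         = e

abstr :: String -> E -> E
abstr x (V y) | x == y        = I       -- \x -> x        ==>  I
abstr x e | not (free x e)    = A K e   -- \x -> e (x unused)  ==>  K e
abstr x (A f g) = A (A S (abstr x f)) (abstr x g)  -- \x -> f g  ==>  S [x]f [x]g
abstr _ e = e

free :: String -> E -> Bool
free x (V y)     = x == y
free x (A f g)   = free x f || free x g
free x (Lam y e) = x /= y && free x e
free _ _         = False

main :: IO ()
main = print (elim (Lam "a" (V "a")))   -- prints: I
```

`elim (Lam "a" (V "a"))` yielding `I` mirrors how `id = (a : a)` eliminates to a pure tree calculus term.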
## 5. Output Formats
The `eval` command accepts `--form` (shorthand `-t`):
| Format | Value | Description |
|--------|-------|-------------|
| `tree` | `TreeCalculus` | Simple `t` form (default) |
| `fsl` | `FSL` | Full show representation |
| `ast` | `AST` | Parsed AST representation |
| `ternary` | `Ternary` | Ternary string encoding |
| `ascii` | `Ascii` | ASCII-art tree diagram |
| `decode` | `Decode` | Human-readable (strings, numbers, lists) |
## 6. Content Addressing
Each `T` term is content-addressed via a Merkle DAG:
```
NLeaf → 0x00
NStem(h) → 0x01 || h (32 bytes)
NFork(l,r) → 0x02 || l (32 bytes) || r (32 bytes)
hash = SHA256("tricu.merkle.node.v1" <> 0x00 <> serialized_node)
```
This is stored in SQLite via `ContentStore.hs`. Hash suffixes on identifiers (e.g., `foo_abc123...`) are validated: 16-64 hex characters (SHA256).
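The byte layout above can be sketched with nothing but `base`. This sketch covers only the tag-byte serialization and the hash preimage; the actual SHA256 step (done with a crypto library in `ContentStore.hs`) is indicated by a comment, not implemented here:

```haskell
import Data.Word (Word8)

type Hash = [Word8]  -- stands in for a 32-byte SHA256 digest

data Node = NLeaf | NStem Hash | NFork Hash Hash

-- Tag-byte encoding from the table above.
serialize :: Node -> [Word8]
serialize NLeaf       = [0x00]
serialize (NStem h)   = 0x01 : h       -- 1 + 32 bytes
serialize (NFork l r) = 0x02 : l ++ r  -- 1 + 32 + 32 bytes

-- hash = SHA256("tricu.merkle.node.v1" <> 0x00 <> serialized_node);
-- the digest itself is computed by the content store, not here.
preimage :: Node -> [Word8]
preimage n =
  map (fromIntegral . fromEnum) "tricu.merkle.node.v1" ++ [0x00] ++ serialize n

main :: IO ()
main = print (length (serialize (NFork (replicate 32 0) (replicate 32 0))))  -- prints: 65
```

The domain-separation prefix ensures node hashes can never collide with hashes of other byte strings the project might store.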
## 7. Directory Layout
```
tricu/
├── flake.nix # Nix flake: packages, tests, devShell
├── tricu.cabal # Cabal package (used via callCabal2nix)
├── src/ # Haskell modules
│ ├── Main.hs
│ ├── Eval.hs
│ ├── Parser.hs
│ ├── Lexer.hs
│ ├── FileEval.hs
│ ├── REPL.hs
│ ├── Research.hs
│ └── ContentStore.hs
├── test/
│ ├── Spec.hs # Tasty + HUnit tests
│ ├── *.tri # tricu test programs
│ └── local-ns/ # Module namespace test files
├── lib/
│ ├── base.tri
│ ├── list.tri
│ └── patterns.tri
├── demos/
│ ├── equality.tri
│ ├── size.tri
│ ├── toSource.tri
│ ├── levelOrderTraversal.tri
│ └── patternMatching.tri
└── AGENTS.md # This file
```
## 8. Development Tips
- **Quick iteration:** `nix develop` then `ghcid` (provided in the devShell) watches files and re-runs.
- **REPL:** `nix run .#` starts the interactive REPL.
- **Evaluate files:** `nix run .# -- eval -f demos/equality.tri`
- **GHC options:** built with `-threaded -rtsopts -with-rtsopts=-N`, so the binary uses the parallel runtime across all available cores by default.
- **Upx** is in the devShell for binary compression if needed.


@@ -36,15 +36,21 @@ tricu < -- or calculate its size (/demos/size.tri)
tricu < size not?
tricu > 12
tricu < -- REPL Commands:
tricu < !definitions -- Lists all available definitions
tricu < !output -- Change output format (Tree, FSL, AST, etc.)
tricu < !import -- Import definitions from a file
tricu < !exit -- Exit the REPL
tricu < !clear -- ANSI screen clear
tricu < !save -- Save all REPL definitions to a file that you can !import
tricu < !reset -- Clear all REPL definitions
tricu < !version -- Print tricu version
tricu < !help
tricu version 1.1.0
Available commands:
!exit - Exit the REPL
!clear - Clear the screen
!reset - Reset preferences for selected versions
!help - Show tricu version and available commands
!output - Change output format (tree|fsl|ast|ternary|ascii|decode)
!definitions - List all defined terms in the content store
!import - Import definitions from file to the content store
!watch - Watch a file for changes, evaluate terms, and store them
!refresh - Refresh environment from content store (definitions are live)
!versions - Show all versions of a term by name
!select - Select a specific version of a term for subsequent lookups
!tag - Add or update a tag for a term by hash or name
```
## Installation and Use


@@ -1,7 +1,6 @@
module ContentStore where
import Research
import Parser
import Control.Monad (foldM, forM_, void)
import Data.ByteString (ByteString)
@@ -9,11 +8,9 @@ import Data.List (nub, sort)
import Data.Maybe (catMaybes, fromMaybe)
import Data.Text (Text)
import Database.SQLite.Simple
import Database.SQLite.Simple.FromRow (FromRow(..), field)
import System.Directory (createDirectoryIfMissing, getXdgDirectory, XdgDirectory(..))
import System.FilePath ((</>), takeDirectory)
import qualified Data.Map as Map
import qualified Data.Text as T
@@ -97,6 +94,7 @@ storeTerm conn newNamesStrList term = do
-- | Reconstruct a Tree Calculus term from its Merkle root hash.
-- Recursively loads nodes and rebuilds the T structure.
loadTree :: Connection -> MerkleHash -> IO (Maybe T)
loadTree conn h
| h == nodeHash NLeaf = return (Just Leaf) -- NLeaf is implicit, not stored
| otherwise = do
@@ -106,6 +104,7 @@ loadTree conn h
Just node -> Just <$> buildTree node
where
buildTree :: Node -> IO T
buildTree NLeaf = return Leaf
buildTree (NStem childHash) = do
child <- fromMaybe (errorWithoutStackTrace "BUG: stored hash not found") <$> loadTree conn childHash
return (Stem child)
@@ -166,7 +165,7 @@ storeEnvironment conn env = do
let groupedDefs = Map.toList $ Map.fromListWith (++) [(term, [name]) | (name, term) <- defs]
forM_ groupedDefs $ \(term, namesList) -> case namesList of
n:ns -> void $ storeTerm conn namesList term
_:_ -> void $ storeTerm conn namesList term
_ -> errorWithoutStackTrace "storeEnvironment: empty names list"
loadTerm :: Connection -> String -> IO (Maybe T)


@@ -6,18 +6,18 @@ import Research
import Control.Monad (foldM)
import Data.List (partition, (\\), elemIndex, foldl')
import Data.Map (Map)
import Data.Map ()
import Data.Set (Set)
import Database.SQLite.Simple
import qualified Data.Foldable as F
import qualified Data.Foldable as F ()
import qualified Data.Map as Map
import qualified Data.Set as Set
import qualified Data.Text as T
data DB
= BVar Int -- bound (0 = nearest binder)
| BFree String -- free/global
= BVar Int
| BFree String
| BLam DB
| BApp DB DB
| BLeaf
@@ -59,12 +59,12 @@ evalSingle env term
evalTricu :: Env -> [TricuAST] -> Env
evalTricu env x = go env (reorderDefs env x)
where
go env [] = env
go env [x] =
let updatedEnv = evalSingle env x
go env' [] = env'
go env' [def] =
let updatedEnv = evalSingle env' def
in Map.insert "!result" (result updatedEnv) updatedEnv
go env (x:xs) =
evalTricu (evalSingle env x) xs
go env' (def:xs) =
evalTricu (evalSingle env' def) xs
evalASTSync :: Env -> TricuAST -> T
evalASTSync env term = case term of
@@ -129,7 +129,7 @@ resolveTermFromStore conn selectedVersions name mhash = case mhash of
case matchingVersions of
[] -> return Nothing
[(_, term, _)] -> return $ Just term
_ -> return Nothing -- Ambiguous or too many matches
_ -> return Nothing
Nothing -> case Map.lookup name selectedVersions of
Just hash -> loadTree conn hash
Nothing -> do
@@ -137,7 +137,7 @@ resolveTermFromStore conn selectedVersions name mhash = case mhash of
case versions of
[] -> return Nothing
[(_, term, _)] -> return $ Just term
_ -> return $ Just $ (\(_, t, _) -> t) $ case versions of (_:_) -> head versions; _ -> error "resolveTermFromStore: unexpected empty versions list"
_ -> return $ Just (head (map (\(_, t, _) -> t) versions))
elimLambda :: TricuAST -> TricuAST
elimLambda = go
@@ -155,12 +155,10 @@ elimLambda = go
etaReduction (SLambda [v] (SApp f (SVar x Nothing))) = v == x && not (usesBinder v f)
etaReduction _ = False
-- triage: \a b c -> TLeaf (TLeaf a b) c (checked in DB with a↦2, b↦1, c↦0)
triagePattern (SLambda [a] (SLambda [b] (SLambda [c] body))) =
toDB [c,b,a] body == triageBodyDB
triagePattern _ = False
-- compose: \f g x -> f (g x) (checked in DB with f↦2, g↦1, x↦0)
composePattern (SLambda [f] (SLambda [g] (SLambda [x] body))) =
toDB [x,g,f] body == composeBodyDB
composePattern _ = False
@@ -174,30 +172,34 @@ elimLambda = go
application (SApp _ _) = True
application _ = False
-- rewrites
etaReduceResult (SLambda [_] (SApp f _)) = f
etaReduceResult _ = error "etaReduceResult: expected SLambda [v] (SApp f _)"
lambdaListResult (SLambda [v] (SList xs)) =
SLambda [v] (foldr wrapTLeaf TLeaf xs)
where
wrapTLeaf m r = SApp (SApp TLeaf m) r
lambdaListResult _ = error "lambdaListResult: expected SLambda [v] (SList xs)"
nestedLambdaResult (SLambda (v:vs) body)
| null vs =
let body' = go body
db = toDB [v] body'
in toSKIKiselyov db
in toSKIKiselyov db
| otherwise = go (SLambda [v] (SLambda vs body))
nestedLambdaResult _ = error "nestedLambdaResult: expected SLambda (_:_) _"
applicationResult (SApp f g) = SApp (go f) (go g)
applicationResult _ = error "applicationResult: expected SApp _ _"
isSList (SList _) = True
isSList _ = False
slistTransform :: TricuAST -> TricuAST
slistTransform (SList xs) = foldr (\m r -> SApp (SApp TLeaf (go m)) r) TLeaf xs
slistTransform ast = ast -- Should not be reached if isSList is the guard
slistTransform ast = ast -- Should not be reached
_S, _K, _I, _R, _C, _B, _T, _TRI :: TricuAST
_S = parseSingle "t (t (t t t)) t"
_K = parseSingle "t t"
_I = parseSingle "t (t (t t)) t"
@@ -207,7 +209,9 @@ _B = parseSingle "t (t (t t (t (t (t t t)) t))) (t t)"
_T = SApp _C _I
_TRI = parseSingle "t (t (t t (t (t (t t t))))) t"
triageBody :: String -> String -> String -> TricuAST
triageBody a b c = SApp (SApp TLeaf (SApp (SApp TLeaf (SVar a Nothing)) (SVar b Nothing))) (SVar c Nothing)
composeBody :: String -> String -> String -> TricuAST
composeBody f g x = SApp (SVar f Nothing) (SApp (SVar g Nothing) (SVar x Nothing))
isFree :: String -> TricuAST -> Bool
@@ -270,7 +274,7 @@ buildDepGraph topDefs
sortDeps :: Map.Map String (Set.Set String) -> [String]
sortDeps graph = go [] Set.empty (Map.keys graph)
where
go sorted sortedSet [] = sorted
go sorted _sortedSet [] = sorted
go sorted sortedSet remaining =
let ready = [ name | name <- remaining
, let deps = Map.findWithDefault Set.empty name graph
@@ -354,7 +358,7 @@ freeDBNames = \case
BList xs -> foldMap freeDBNames xs
BEmpty -> mempty
-- Helper: is the binder named v used in body?
-- Helper: "is the binder named v used in body?"
usesBinder :: String -> TricuAST -> Bool
usesBinder v body = dependsOnLevel 0 (toDB [v] body)
@@ -395,9 +399,7 @@ toSKIDB (BList xs) =
in if not anyUses
then SApp _K (SList (map fromDBClosed xs))
else SList (map toSKIDB xs)
toSKIDB other
| not (dependsOnLevel 0 other) = SApp _K (fromDBClosed other)
toSKIDB other = _K `SApp` TLeaf
toSKIDB _other = _K `SApp` TLeaf
app2 :: TricuAST -> TricuAST -> TricuAST
app2 f x = SApp f x
@@ -415,11 +417,13 @@ kisConv = \case
BVar n | n > 0 -> do
(g,d) <- kisConv (BVar (n - 1))
Right (False:g, d)
BVar n -> Right ([], SVar ("BVar" ++ show n) Nothing)
BFree s -> Right ([], SVar s Nothing)
BApp e1 e2 -> do
(g1,d1) <- kisConv e1
(g2,d2) <- kisConv e2
let g = zipWithDefault False (||) g1 g2 -- < propagate Γ outside (#)
d = kisHash (g1,d1) (g2,d2) -- < (#) yields only the term
let g = zipWithDefault False (||) g1 g2 -- <- propagate Γ outside (#)
d = kisHash (g1,d1) (g2,d2) -- <- (#) yields only the term
Right (g, d)
-- Treat closed constants as free 'combinator leaves' (no binder use).
BLeaf -> Right ([], TLeaf)
@@ -437,12 +441,11 @@ kisConv = \case
BFork l r
| dependsOnLevel 0 l || dependsOnLevel 0 r -> Left "Fork with binder use: fallback"
| otherwise -> Right ([], TFork (fromDBClosed l) (fromDBClosed r))
-- We shouldnt see BLam under elim; treat as unsupported so we fallback.
-- We shouldn't see BLam under elim; treat as unsupported so we fallback.
BLam _ -> Left "Nested lambda under body: fallback"
BFree s -> Right ([], SVar s Nothing)
-- Application combiner with K-optimization (lazy weakening).
-- Mirrors Lynns 'optK' rules: choose among S, B, C, R based on leading flags.
-- Mirrors Lynn's 'optK' rules: choose among S, B, C, R based on leading flags.
-- η-aware (#) with K-optimization (adapted from TS kiselyov_eta)
kisHash :: (Uses, TricuAST) -> (Uses, TricuAST) -> TricuAST
kisHash (g1, d1) (g2, d2) =
@@ -563,7 +566,6 @@ bulkS :: Int -> TricuAST
bulkS n | n <= 1 = _S
| otherwise = SApp sPrime (bulkS (n - 1))
-- Count how many leading pairs (a,b) repeat at the head of zip g1 g2
headPairRun :: [Bool] -> [Bool] -> ((Bool, Bool), Int)
headPairRun g1 g2 =
case zip g1 g2 of


@@ -5,11 +5,11 @@ import Lexer
import Parser
import Research
import Control.Monad ()
import Data.List (partition)
import Data.Maybe (mapMaybe)
import Control.Monad (foldM)
import System.IO
import System.FilePath (takeDirectory, normalise, (</>))
import System.IO ()
import qualified Data.Map as Map
import qualified Data.Set as Set
@@ -17,12 +17,12 @@ import qualified Data.Set as Set
extractMain :: Env -> Either String T
extractMain env =
case Map.lookup "main" env of
Just result -> Right result
Just evalResult -> Right evalResult
Nothing -> Left "No `main` function detected"
processImports :: Set.Set FilePath -> FilePath -> FilePath -> [TricuAST]
-> Either String ([TricuAST], [(FilePath, String, FilePath)])
processImports seen base currentPath asts =
processImports seen _base currentPath asts =
let (imports, nonImports) = partition isImp asts
importPaths = mapMaybe getImportInfo imports
in if currentPath `Set.member` seen
@@ -40,11 +40,11 @@ evaluateFileResult filePath = do
let tokens = lexTricu contents
case parseProgram tokens of
Left err -> errorWithoutStackTrace (handleParseError err)
Right ast -> do
Right _ast -> do
processedAst <- preprocessFile filePath
let finalEnv = evalTricu Map.empty processedAst
case extractMain finalEnv of
Right result -> return result
Right evalResult -> return evalResult
Left err -> errorWithoutStackTrace err
evaluateFile :: FilePath -> IO Env
@@ -53,7 +53,7 @@ evaluateFile filePath = do
let tokens = lexTricu contents
case parseProgram tokens of
Left err -> errorWithoutStackTrace (handleParseError err)
Right ast -> do
Right _ast -> do
ast <- preprocessFile filePath
pure $ evalTricu Map.empty ast
@@ -63,7 +63,7 @@ evaluateFileWithContext env filePath = do
let tokens = lexTricu contents
case parseProgram tokens of
Left err -> errorWithoutStackTrace (handleParseError err)
Right ast -> do
Right _ast -> do
ast <- preprocessFile filePath
pure $ evalTricu env ast
@@ -84,8 +84,8 @@ preprocessFile' seen base currentPath = do
imported <- concat <$> mapM (processImportPath seen' base) importPaths
pure $ imported ++ nonImports
where
processImportPath seen base (path, name, importPath) = do
ast <- preprocessFile' seen base importPath
processImportPath _seen _base (_path, name, importPath) = do
ast <- preprocessFile' _seen _base importPath
pure $ map (nsDefinition (if name == "!Local" then "" else name))
$ filter (not . isImp) ast
isImp (SImport _ _) = True


@@ -4,13 +4,12 @@ import Research
import Control.Monad (void)
import Data.Functor (($>))
import Data.Set ()
import Data.Void
import Text.Megaparsec
import Text.Megaparsec.Char hiding (space)
import Text.Megaparsec.Char.Lexer
import qualified Data.Set as Set
type Lexer = Parsec Void String
tricuLexer :: Lexer [LToken]
@@ -23,13 +22,13 @@ tricuLexer = do
]
sc
pure tok
tokens <- many $ do
toks <- many $ do
tok <- choice tricuLexer'
sc
pure tok
sc
eof
pure (header ++ tokens)
pure (header ++ toks)
where
tricuLexer' =
[ try lnewline
@@ -51,7 +50,7 @@ tricuLexer = do
lexTricu :: String -> [LToken]
lexTricu input = case runParser tricuLexer "" input of
Left err -> errorWithoutStackTrace $ "Lexical error:\n" ++ errorBundlePretty err
Right tokens -> tokens
Right toks -> toks
keywordT :: Lexer LToken
@@ -143,8 +142,8 @@ integerLiteral = do
stringLiteral :: Lexer LToken
stringLiteral = do
char '"'
content <- manyTill Lexer.charLiteral (char '"')
void (char '"')
content <- manyTill Lexer.charLiteral (void (char '"'))
return (LStringLiteral content)
charLiteral :: Lexer Char
@@ -163,3 +162,4 @@ charLiteral = escapedChar <|> normalChar
'\\' -> '\\'
'"' -> '"'
'\'' -> '\''
_ -> c


@@ -1,18 +1,18 @@
module Main where
import ContentStore ()
import Eval (evalTricu, mainResult, result)
import FileEval
import Parser (parseTricu)
import REPL
import Research
import ContentStore
import Control.Monad (foldM)
import Control.Monad.IO.Class (liftIO)
import Control.Monad.IO.Class ()
import Data.Version (showVersion)
import Text.Megaparsec (runParser)
import Paths_tricu (version)
import System.Console.CmdArgs
import Text.Megaparsec ()
import qualified Data.Map as Map
@@ -56,24 +56,24 @@ decodeMode = TDecode
main :: IO ()
main = do
let versionStr = "tricu Evaluator and REPL " ++ showVersion version
args <- cmdArgs $ modes [replMode, evaluateMode, decodeMode]
cmdArgsParsed <- cmdArgs $ modes [replMode, evaluateMode, decodeMode]
&= help "tricu: Exploring Tree Calculus"
&= program "tricu"
&= summary versionStr
&= versionArg [explicit, name "version", summary versionStr]
case args of
case cmdArgsParsed of
Repl -> do
putStrLn "Welcome to the tricu REPL"
putStrLn "You may exit with `CTRL+D` or the `!exit` command."
repl
Evaluate { file = filePaths, form = form } -> do
result <- case filePaths of
Evaluate { file = filePaths, form = outputForm } -> do
evalResult <- case filePaths of
[] -> runTricuT <$> getContents
(filePath:restFilePaths) -> do
initialEnv <- evaluateFile filePath
finalEnv <- foldM evaluateFileWithContext initialEnv restFilePaths
pure $ mainResult finalEnv
let fRes = formatT form result
let fRes = formatT outputForm evalResult
putStr fRes
TDecode { file = filePaths } -> do
value <- case filePaths of


@@ -8,7 +8,7 @@ import Control.Monad.State
import Data.List.NonEmpty (toList)
import Data.Void (Void)
import Text.Megaparsec
import Text.Megaparsec.Error (ParseErrorBundle, errorBundlePretty)
import qualified Data.Set as Set
data PState = PState
@@ -20,9 +20,9 @@ type ParserM = StateT PState (Parsec Void [LToken])
satisfyM :: (LToken -> Bool) -> ParserM LToken
satisfyM f = do
token <- lift (satisfy f)
modify' (updateDepth token)
return token
tok <- lift (satisfy f)
modify' (updateDepth tok)
return tok
updateDepth :: LToken -> PState -> PState
updateDepth LOpenParen st = st { parenDepth = parenDepth st + 1 }
@@ -39,12 +39,12 @@ topLevelNewline = do
else fail "Top-level exit in nested context (paren or bracket)"
parseProgram :: [LToken] -> Either (ParseErrorBundle [LToken] Void) [TricuAST]
parseProgram tokens =
runParser (evalStateT (parseProgramM <* finalizeDepth <* eof) (PState 0 0)) "" tokens
parseProgram toks =
runParser (evalStateT (parseProgramM <* finalizeDepth <* eof) (PState 0 0)) "" toks
parseSingleExpr :: [LToken] -> Either (ParseErrorBundle [LToken] Void) TricuAST
parseSingleExpr tokens =
runParser (evalStateT (scnParserM *> parseExpressionM <* finalizeDepth <* eof) (PState 0 0)) "" tokens
parseSingleExpr toks =
runParser (evalStateT (scnParserM *> parseExpressionM <* finalizeDepth <* eof) (PState 0 0)) "" toks
finalizeDepth :: ParserM ()
finalizeDepth = do
@@ -195,6 +195,7 @@ parseTreeTermM = do
| TLeaf <- acc = TStem next
| TStem t <- acc = TFork t next
| TFork _ _ <- acc = TFork acc next
| otherwise = SApp acc next
parseTreeLeafOrParenthesizedM :: ParserM TricuAST
parseTreeLeafOrParenthesizedM = choice
@@ -248,20 +249,20 @@ parseGroupedItemM = do
parseSingleItemM :: ParserM TricuAST
parseSingleItemM = do
token <- satisfyM (\case LIdentifier _ -> True; LKeywordT -> True; _ -> False)
if | LIdentifier name <- token -> pure (SVar name Nothing)
| token == LKeywordT -> pure TLeaf
tok <- satisfyM (\case LIdentifier _ -> True; LKeywordT -> True; _ -> False)
if | LIdentifier name <- tok -> pure (SVar name Nothing)
| tok == LKeywordT -> pure TLeaf
| otherwise -> fail "Unexpected token in list item"
parseVarM :: ParserM TricuAST
parseVarM = do
token <- satisfyM (\case
tok <- satisfyM (\case
LNamespace _ -> True
LIdentifier _ -> True
LIdentifierWithHash _ _ -> True
_ -> False)
case token of
case tok of
LNamespace ns -> do
_ <- satisfyM (== LDot)
LIdentifier name <- satisfyM (\case LIdentifier _ -> True; _ -> False)
@@ -282,8 +283,8 @@ parseVarM = do
parseIntLiteralM :: ParserM TricuAST
parseIntLiteralM = do
let intL = (\case LIntegerLiteral _ -> True; _ -> False)
token <- satisfyM intL
if | LIntegerLiteral value <- token ->
tok <- satisfyM intL
if | LIntegerLiteral value <- tok ->
pure (SInt (fromIntegral value))
| otherwise ->
fail "Unexpected token while parsing integer literal"
@@ -291,8 +292,8 @@ parseIntLiteralM = do
parseStrLiteralM :: ParserM TricuAST
parseStrLiteralM = do
let strL = (\case LStringLiteral _ -> True; _ -> False)
token <- satisfyM strL
if | LStringLiteral value <- token ->
tok <- satisfyM strL
if | LStringLiteral value <- tok ->
pure (SStr value)
| otherwise ->
fail "Unexpected token while parsing string literal"
@@ -308,8 +309,8 @@ handleParseError bundle =
in unlines ("Parse error(s) encountered:" : formattedErrors)
formatError :: ParseError [LToken] Void -> String
formatError (TrivialError offset unexpected expected) =
let unexpectedMsg = case unexpected of
formatError (TrivialError offset msgUnexpected expected) =
let unexpectedMsg = case msgUnexpected of
Just x -> "unexpected token " ++ show x
Nothing -> "unexpected end of input"
expectedMsg = if null expected


@@ -1,48 +1,41 @@
module REPL where
import ContentStore
import Eval
import FileEval
import Lexer
import Lexer ()
import Parser
import Research
import ContentStore
import Control.Concurrent (forkIO, threadDelay, killThread, ThreadId)
import Control.Monad (forever, void, when, forM, forM_, foldM, unless)
import Data.ByteString (ByteString)
import Data.Maybe (isNothing, isJust, fromJust, catMaybes)
import Database.SQLite.Simple (Connection, Only(..), query, query_, execute, execute_, open)
import Control.Exception (SomeException, catch, displayException)
import Control.Monad ()
import Control.Monad (forever, when, forM_, foldM, unless)
import Control.Monad.Catch (handle)
import Control.Monad.IO.Class (liftIO)
import Control.Monad.Trans.Class ()
import Control.Monad.Trans.Maybe (MaybeT(..), runMaybeT)
import Data.ByteString ()
import Data.Char (isSpace)
import Data.IORef (newIORef, readIORef, writeIORef)
import Data.List (dropWhileEnd, isPrefixOf, find)
import Data.Maybe (isJust, fromJust)
import Data.Time (getCurrentTime, diffUTCTime)
import Data.Time.Clock.POSIX (posixSecondsToUTCTime)
import Data.Time.Format (formatTime, defaultTimeLocale)
import Data.Version (showVersion)
import Database.SQLite.Simple (Connection, Only(..), query)
import Paths_tricu (version)
import System.Console.ANSI (setSGR, SGR(..), ConsoleLayer(..), ColorIntensity(..), Color(..))
import System.Console.Haskeline
import System.Directory (doesFileExist, createDirectoryIfMissing)
import System.FSNotify
import System.FilePath (takeDirectory, (</>))
import Text.Read (readMaybe)
import Control.Exception (IOException, SomeException, catch
, displayException)
import Control.Monad (forM_)
import Control.Monad.Catch (handle, MonadCatch)
import Control.Monad.IO.Class (liftIO)
import Control.Monad.Trans.Class (lift)
import Control.Monad.Trans.Maybe (MaybeT(..), runMaybeT)
import Data.Char (isSpace, isUpper)
import Data.List ((\\), dropWhile, dropWhileEnd, isPrefixOf, nub, sortBy, groupBy, intercalate, find)
import Data.Version (showVersion)
import Paths_tricu (version)
import System.Console.Haskeline
import System.Console.ANSI (setSGR, SGR(..), ConsoleLayer(..), ColorIntensity(..),
Color(..), ConsoleIntensity(..), clearFromCursorToLineEnd)
import qualified Data.Map as Map
import qualified Data.Text as T
import qualified Data.Text.IO as T
import Control.Concurrent (forkIO, threadDelay)
import Data.IORef (IORef, newIORef, readIORef, writeIORef)
import Data.Time (UTCTime, getCurrentTime, diffUTCTime)
import Control.Concurrent.MVar (MVar, newMVar, putMVar, takeMVar)
import Data.Time.Format (formatTime, defaultTimeLocale)
import Data.Time.Clock.POSIX (posixSecondsToUTCTime)
import qualified Data.Text.IO as T ()
data REPLState = REPLState
{ replForm :: EvaluatedForm
@@ -121,26 +114,26 @@ repl = do
| "!tag" `isPrefixOf` strip s -> handleTag state
| take 2 s == "--" -> loop state
| otherwise -> do
result <- liftIO $ catch
evalResult <- liftIO $ catch
(processInput state s)
(errorHandler state)
loop result
loop evalResult
handleOutput :: REPLState -> InputT IO ()
handleOutput state = do
let formats = [Decode, TreeCalculus, FSL, AST, Ternary, Ascii]
outputStrLn "Available output formats:"
mapM_ (\(i, f) -> outputStrLn $ show i ++ ". " ++ show f)
mapM_ (\(i, f) -> outputStrLn $ show (i :: Int) ++ ". " ++ show f)
(zip [1..] formats)
result <- runMaybeT $ do
evalResult <- runMaybeT $ do
input <- MaybeT $ getInputLine "Select output format (1-6) < "
case reads input of
[(n, "")] | n >= 1 && n <= 6 ->
return $ formats !! (n-1)
_ -> MaybeT $ return Nothing
case result of
case evalResult of
Nothing -> do
outputStrLn "Invalid selection. Keeping current output format."
loop state
@@ -201,7 +194,7 @@ repl = do
importFile :: REPLState -> String -> InputT IO ()
importFile state cleanFilename = do
code <- liftIO $ readFile cleanFilename
_code <- liftIO $ readFile cleanFilename
case replContentStore state of
Nothing -> do
liftIO $ printError "Content store not initialized"
@@ -216,7 +209,7 @@ repl = do
importedCount <- foldM (\count (name, term) -> do
hash <- ContentStore.storeTerm conn [name] term
printSuccess $ "Stored definition: " ++ name ++ " with hash " ++ T.unpack hash
return (count + 1)
return (count + (1 :: Int))
) 0 defs
printSuccess $ "Imported " ++ show importedCount ++ " definitions successfully"
@@ -248,7 +241,7 @@ repl = do
lastProcessedRef <- liftIO $ newIORef =<< getCurrentTime
watcherId <- liftIO $ forkIO $ withManager $ \mgr -> do
stopAction <- watchDir mgr dirPath (\event -> eventPath event == filepath) $ \event -> do
_stopAction <- watchDir mgr dirPath (\ev -> eventPath ev == filepath) $ \_ -> do
now <- getCurrentTime
lastProcessed <- readIORef lastProcessedRef
when (diffUTCTime now lastProcessed > 0.5) $ do
@@ -259,8 +252,8 @@ repl = do
watchLoop state { replWatchedFile = Just filepath, replWatcherThread = Just watcherId }
handleUnwatch :: REPLState -> InputT IO ()
handleUnwatch state = case replWatchedFile state of
_handleUnwatch :: REPLState -> InputT IO ()
_handleUnwatch state = case replWatchedFile state of
Nothing -> do
outputStrLn "No file is currently being watched"
loop state
@@ -275,7 +268,7 @@ repl = do
Nothing -> do
outputStrLn "Content store not initialized"
loop state
Just conn -> do
Just _conn -> do
outputStrLn "Environment refreshed from content store (definitions are live)"
loop state
@@ -486,8 +479,8 @@ repl = do
forM_ asts $ \ast -> do
case ast of
SDef name [] body -> do
result <- evalAST (Just conn) (replSelectedVersions newState) body
hash <- ContentStore.storeTerm conn [name] result
evalResult <- evalAST (Just conn) (replSelectedVersions newState) body
hash <- ContentStore.storeTerm conn [name] evalResult
liftIO $ do
putStr "tricu > "
@@ -498,14 +491,14 @@ repl = do
putStrLn ""
putStr "tricu > "
printResult $ formatT (replForm newState) result
printResult $ formatT (replForm newState) evalResult
putStrLn ""
_ -> do
result <- evalAST (Just conn) (replSelectedVersions newState) ast
evalResult <- evalAST (Just conn) (replSelectedVersions newState) ast
liftIO $ do
putStr "tricu > "
printResult $ formatT (replForm newState) result
printResult $ formatT (replForm newState) evalResult
putStrLn ""
return newState
@@ -531,13 +524,13 @@ repl = do
Just conn -> do
forM_ asts $ \ast -> case ast of
SDef name [] body -> do
result <- evalAST (Just conn) selectedVersions body
hash <- ContentStore.storeTerm conn [name] result
evalResult <- evalAST (Just conn) selectedVersions body
hash <- ContentStore.storeTerm conn [name] evalResult
putStrLn $ "tricu > Stored definition: " ++ name ++ " with hash " ++ T.unpack hash
putStrLn $ "tricu > " ++ name ++ " = " ++ formatT outputForm result
putStrLn $ "tricu > " ++ name ++ " = " ++ formatT outputForm evalResult
_ -> do
result <- evalAST (Just conn) selectedVersions ast
putStrLn $ "tricu > Result: " ++ formatT outputForm result
evalResult <- evalAST (Just conn) selectedVersions ast
putStrLn $ "tricu > Result: " ++ formatT outputForm evalResult
putStrLn $ "tricu > Processed file: " ++ filepath
formatTimestamp :: Integer -> String
@@ -552,12 +545,6 @@ repl = do
putStr $ T.unpack rest
setSGR [Reset]
coloredHashString :: T.Text -> String
coloredHashString hash =
"\ESC[1;36m" ++ T.unpack (T.take 16 hash) ++
"\ESC[0;37m" ++ T.unpack (T.drop 16 hash) ++
"\ESC[0m"
withColor :: ColorIntensity -> Color -> IO () -> IO ()
withColor intensity color action = do
setSGR [SetColor Foreground intensity color]


@@ -1,17 +1,17 @@
module Research where
import Crypto.Hash (hash, SHA256, Digest)
import Data.ByteArray (convert)
import Data.ByteString.Base16 (decode, encode)
import Data.List (intercalate)
import Data.Map (Map)
import Data.Text (Text, replace, pack)
import Data.Map ()
import Data.Text (Text, replace)
import Data.Text.Encoding (decodeUtf8, encodeUtf8)
import System.Console.CmdArgs (Data, Typeable)
import qualified Data.ByteString as BS
import qualified Data.Map as Map
import qualified Data.Text as T
import Crypto.Hash (hash, SHA256, Digest)
-- Tree Calculus Types
data T = Leaf | Stem T | Fork T T
@@ -19,7 +19,7 @@ data T = Leaf | Stem T | Fork T T
-- Abstract Syntax Tree for tricu
data TricuAST
= SVar String (Maybe String) -- Variable name and optional hash prefix
= SVar String (Maybe String)
| SInt Integer
| SStr String
| SList [TricuAST]
@@ -131,9 +131,9 @@ buildMerkle (Fork l r) = NFork (nodeHash left) (nodeHash right)
apply :: T -> T -> T
apply (Fork Leaf a) _ = a
apply (Fork (Stem a) b) c = apply (apply a c) (apply b c)
apply (Fork (Fork a b) c) Leaf = a
apply (Fork (Fork a b) c) (Stem u) = apply b u
apply (Fork (Fork a b) c) (Fork u v) = apply (apply c u) v
apply (Fork (Fork _a _b) _c) Leaf = _a
apply (Fork (Fork _a _b) _c) (Stem u) = apply _b u
apply (Fork (Fork _a _b) _c) (Fork u v) = apply (apply _c u) v
-- Left associative `t`
apply Leaf b = Stem b
apply (Stem a) b = Fork a b
@@ -175,7 +175,7 @@ toNumber _ = Left "Invalid Tree Calculus number"
toString :: T -> Either String String
toString tc = case toList tc of
Right list -> traverse (fmap (toEnum . fromInteger) . toNumber) list
Left err -> Left "Invalid Tree Calculus string"
Left _ -> Left "Invalid Tree Calculus string"
toList :: T -> Either String [T]
toList Leaf = Right []


@@ -15,14 +15,26 @@ extra-source-files:
executable tricu
main-is: Main.hs
hs-source-dirs:
src
src
default-extensions:
DeriveDataTypeable
LambdaCase
MultiWayIf
OverloadedStrings
ScopedTypeVariables
ghc-options: -threaded -rtsopts -with-rtsopts=-N -optl-pthread -fPIC
DeriveDataTypeable
LambdaCase
MultiWayIf
OverloadedStrings
ScopedTypeVariables
ghc-options:
-Wall
-Wcompat
-Wunused-imports
-Wunused-top-binds
-Wunused-local-binds
-Wunused-matches
-Wredundant-constraints
-threaded
-rtsopts
-with-rtsopts=-N
-optl-pthread
-fPIC
build-depends:
base >=4.7
, aeson
@@ -50,10 +62,12 @@ executable tricu
, transformers
, zlib
other-modules:
ContentStore
Eval
FileEval
Lexer
Parser
Paths_tricu
REPL
Research
default-language: Haskell2010
@@ -96,9 +110,11 @@ test-suite tricu-tests
, zlib
default-language: Haskell2010
other-modules:
ContentStore
Eval
FileEval
Lexer
Parser
Paths_tricu
REPL
Research