Compare commits

...

106 commits

Author SHA1 Message Date
d809d3b52d
json-pawarser: test grammar::object 2024-10-30 12:27:27 +01:00
ef1a9f5029
json-pawarser: test grammar::member 2024-10-30 10:57:52 +01:00
662cb8ba0e
json-pawarser: make return of object grammar easier to understand 2024-10-29 19:40:50 +01:00
fcf91f25e3
json-pawarser: test stuffs uwu 2024-10-29 19:36:46 +01:00
958857cb58
handle debug pretty printing 2024-10-27 17:21:42 +01:00
883b0c804e
add implicit root node to avoid crash on multiple root nodes 2024-10-27 16:59:18 +01:00
f7d05ead2c
rename trait meta syntaxkinds 2024-10-27 16:56:39 +01:00
cee9b97dbf
extract modules to files 2024-10-23 13:27:36 +02:00
e5ccebe679
add arrays 2024-10-23 13:01:39 +02:00
3164328568
implement multiple members, member_values and trailing commata 2024-10-23 10:52:44 +02:00
c564d0f24c
implement Marker::abandon 2024-10-23 10:51:43 +02:00
b8720b2df9
pawarser, json-pawarser: get first debug print working! 2024-10-21 18:29:46 +02:00
af6886214b
flake.lock: Update 2024-10-21 15:31:08 +02:00
ac75978c01
pawarser: Implement Parser::finish 2024-10-21 15:16:36 +02:00
9b1f6a1dc1
pawarser: Implement CompletedMarker::precede 2024-10-21 15:15:40 +02:00
fed8cf2466
pawarser: require/derive PartialEq + Eq for NodeKind and its contents 2024-10-21 15:15:06 +02:00
91f766c18e
pawarser: make raw_tokens vec owned in input 2024-10-21 15:13:38 +02:00
becc4b4041
json-pawarser: init 2024-10-18 14:05:27 +02:00
21bcf62ea5
pawarser(setup): continue working on the bare basics 2024-10-17 09:54:09 +02:00
34ddaacb58
pawarser(chore): split up files 2024-10-13 16:47:53 +02:00
ec2ff5778b
pawarser(setup): basic parser stuff and types around it. also, a builder. 2024-10-13 16:44:59 +02:00
a3ab844ba7
pawarser(init): start extracting the parser lib 2024-10-13 15:32:26 +02:00
a693b57447
yet another attempt at building an evaluator 2024-10-10 10:23:54 +02:00
3412eb9395
executor (poc): init proof of concept executor crate 2024-07-18 19:12:58 +02:00
ccc6d4f532
update/fix cargo.lock 2024-07-08 20:51:51 +02:00
54401d2a21
app: apply review 2024-07-08 20:49:13 +02:00
18309ec919
app, prowocessing: move dev commands to tests 2024-07-08 20:49:13 +02:00
0705702d4a
experimentation: implement some basic traits for io and data types 2024-07-08 20:49:13 +02:00
31a044577a
experimentation: use dynamic type ids for signatures and add qol macro 2024-07-08 20:49:13 +02:00
911339fc2a
simplified by entirely removing DataRef 2024-07-08 20:49:12 +02:00
619b7acf94
prowocessing: let-else refatoring as according to review 2024-07-08 20:49:12 +02:00
b7bc0366c2
prowocessing: apply most basic reviews 2024-07-08 20:48:56 +02:00
734a734f09
prowocessing: add documentation of trait experiment 2024-07-08 20:48:10 +02:00
dddbcccf72
prowocessing: refactor trait based experiment to individual files 2024-07-08 20:48:10 +02:00
26996fbd00
prowocessing: add trait based experiment 2024-07-08 20:48:10 +02:00
d9a07c8898
prowocessing: extract experiment into its own file 2024-07-08 20:48:10 +02:00
db9228dec4
cli: add dev command for enums experiment 2024-07-08 20:48:10 +02:00
56ec11e143
cli: add subcommand support 2024-07-08 20:48:09 +02:00
1e9648966f
experimentation: write experiment for enum architecture 2024-07-08 20:48:09 +02:00
a2695a2a11
processing library: init 2024-07-08 20:46:39 +02:00
dc44244e7b
lang: work on some basics 2024-07-08 20:23:29 +02:00
1e0741e600
lang: add credit to macro 2024-07-08 20:20:45 +02:00
3eee768ce1
lang: work on various things
- work on new world
  - add file db
  - source_file parsing
  - locs
- fix some test stuff
2024-07-06 21:57:42 +02:00
eb7806572b
lang: remove this attempt 2024-06-23 20:53:05 +02:00
1c6180aabc
lang: current state for archival purposes 2024-06-23 20:32:10 +02:00
37651a83bc
lang: current state idk try to parallelize 2024-06-10 09:47:48 +02:00
3e2c5946c8
lang: add registry/namespace 2024-06-06 12:59:30 +02:00
1a533eb788
lang: smol module tree things and details 2024-06-06 09:53:28 +02:00
7bc603f7e7
lang: module resolvin 2024-06-05 21:10:52 +02:00
d6bc644fb6
lang: basic ast work 2024-06-05 18:00:14 +02:00
cfefab9fd0
lang: fix some details in the parser 2024-06-05 09:57:08 +02:00
0de076ace1
lang: finish module/top level syntax 2024-06-03 12:05:38 +02:00
946ac879a7
lang: basic module syntax grammar 2024-06-03 11:22:36 +02:00
f6da90a354
lang: improve and simplify error handling and storage
fixes wrong error ordering with errors using `forward_parents`.
2024-06-03 10:53:59 +02:00
ed151c2e3c
lang: handle and recover some errors in lists 2024-05-04 23:12:47 +02:00
4bcaf945d7
lang: add highlighting to errors 2024-05-04 22:35:18 +02:00
29cdcfbe0c
lang: make output errors debuggable 2024-05-04 21:56:12 +02:00
afd493be16
lang: parse pipelines 2024-05-04 21:44:02 +02:00
30f17773a8
lang: add pipelines and rename parser to lst_parser 2024-04-30 12:21:06 +02:00
db2643359c
lang: basic attrset parser 2024-04-30 10:18:59 +02:00
9af71ed3f4
lang: implement vec, list and matrix 2024-04-30 09:45:36 +02:00
8a541546d9
app: fix error_reporting not being used 2024-04-28 13:22:04 +02:00
4df0118aa4
lang: first test and stuff 2024-04-24 21:09:55 +02:00
ba0da33509
split up files a bit 2024-04-24 20:00:17 +02:00
9510d9254c
lang: fix matrix recovery 2024-04-24 19:55:25 +02:00
e62b50a51a
lang: make Markers debuggable 2024-04-24 19:55:16 +02:00
2bea3994c2
lang: matrix parser! 2024-04-24 19:37:52 +02:00
86b1481943
lang: remove empty recursive parser module 2024-04-24 11:15:07 +02:00
06c9094227
lang: fix main 2024-04-24 11:09:48 +02:00
381ab45edc
lang: rewrite parser 2024-04-24 11:07:38 +02:00
6d8b79e8f7
lang: apparently add event debug printer 2024-04-15 16:22:33 +02:00
be637846b1
lang: kinda fun parsing things that can now parse attribute sets with one attribute 2024-04-12 21:31:55 +02:00
1711d17fa6
lang: parsing to events now 2024-04-12 20:55:55 +02:00
f7b61f9e0e
lang: start working on parser that parses to events 2024-04-12 15:43:34 +02:00
2d59a7f560
lang: start implementing parser combinators (i have no idea what i'm doing)
also, the current test.owo crashes for some reason. this is a headache.
manual/imperative parsers are a nightmare.
2024-04-12 01:02:07 +02:00
9da157ff4a
lang: massive amounts of parser and ast pain 2024-04-11 03:23:03 +02:00
881a987b2f
flake.lock: Update
Flake lock file updates:

• Updated input 'devenv':
    'github:cachix/devenv/18ef9849d1ecac7a9a7920eb4f2e4adcf67a8c3a' (2024-01-09)
  → 'github:cachix/devenv/a71323c618664a6b7a39bc183b0ce22ac8511cf9' (2024-04-08)
• Added input 'devenv/cachix':
    'github:cachix/cachix/661bbb7f8b55722a0406456b15267b5426a3bda6' (2024-03-15)
• Added input 'devenv/cachix/devenv':
    'github:cachix/devenv/2ee4450b0f4b95a1b90f2eb5ffea98b90e48c196' (2024-02-23)
• Added input 'devenv/cachix/devenv/flake-compat':
    follows 'devenv/cachix/flake-compat'
• Added input 'devenv/cachix/devenv/nix':
    'github:domenkozar/nix/ecd0af0c1f56de32cbad14daa1d82a132bf298f8' (2024-02-22)
• Added input 'devenv/cachix/devenv/nix/flake-compat':
    'github:edolstra/flake-compat/35bb57c0c8d8b62bbfd284272c928ceb64ddbde9' (2023-01-17)
• Added input 'devenv/cachix/devenv/nix/nixpkgs':
    follows 'devenv/cachix/devenv/nixpkgs'
• Added input 'devenv/cachix/devenv/nix/nixpkgs-regression':
    'github:NixOS/nixpkgs/215d4d0fd80ca5163643b03a33fde804a29cc1e2' (2022-01-24)
• Added input 'devenv/cachix/devenv/nixpkgs':
    'github:NixOS/nixpkgs/9201b5ff357e781bf014d0330d18555695df7ba8' (2023-08-23)
• Added input 'devenv/cachix/devenv/poetry2nix':
    'github:nix-community/poetry2nix/d5006be9c2c2417dafb2e2e5034d83fabd207ee3' (2023-08-24)
• Added input 'devenv/cachix/devenv/poetry2nix/flake-utils':
    'github:numtide/flake-utils/919d646de7be200f3bf08cb76ae1f09402b6f9b4' (2023-07-11)
• Added input 'devenv/cachix/devenv/poetry2nix/flake-utils/systems':
    'github:nix-systems/default/da67096a3b9bf56a91d16901293e51ba5b49a27e' (2023-04-09)
• Added input 'devenv/cachix/devenv/poetry2nix/nix-github-actions':
    'github:nix-community/nix-github-actions/165b1650b753316aa7f1787f3005a8d2da0f5301' (2023-07-09)
• Added input 'devenv/cachix/devenv/poetry2nix/nix-github-actions/nixpkgs':
    follows 'devenv/cachix/devenv/poetry2nix/nixpkgs'
• Added input 'devenv/cachix/devenv/poetry2nix/nixpkgs':
    follows 'devenv/cachix/devenv/nixpkgs'
• Added input 'devenv/cachix/devenv/pre-commit-hooks':
    follows 'devenv/cachix/pre-commit-hooks'
• Added input 'devenv/cachix/flake-compat':
    'github:edolstra/flake-compat/0f9255e01c2351cc7d116c072cb317785dd33b33' (2023-10-04)
• Added input 'devenv/cachix/nixpkgs':
    follows 'devenv/nixpkgs'
• Added input 'devenv/cachix/pre-commit-hooks':
    'github:cachix/pre-commit-hooks.nix/5df5a70ad7575f6601d91f0efec95dd9bc619431' (2024-02-15)
• Added input 'devenv/cachix/pre-commit-hooks/flake-compat':
    'github:edolstra/flake-compat/0f9255e01c2351cc7d116c072cb317785dd33b33' (2023-10-04)
• Added input 'devenv/cachix/pre-commit-hooks/flake-utils':
    'github:numtide/flake-utils/4022d587cbbfd70fe950c1e2083a02621806a725' (2023-12-04)
• Added input 'devenv/cachix/pre-commit-hooks/flake-utils/systems':
    'github:nix-systems/default/da67096a3b9bf56a91d16901293e51ba5b49a27e' (2023-04-09)
• Added input 'devenv/cachix/pre-commit-hooks/gitignore':
    'github:hercules-ci/gitignore.nix/43e1aa1308018f37118e34d3a9cb4f5e75dc11d5' (2023-12-29)
• Added input 'devenv/cachix/pre-commit-hooks/gitignore/nixpkgs':
    follows 'devenv/cachix/pre-commit-hooks/nixpkgs'
• Added input 'devenv/cachix/pre-commit-hooks/nixpkgs':
    follows 'devenv/cachix/nixpkgs'
• Added input 'devenv/cachix/pre-commit-hooks/nixpkgs-stable':
    'github:NixOS/nixpkgs/3dc440faeee9e889fe2d1b4d25ad0f430d449356' (2024-01-10)
• Updated input 'devenv/flake-compat':
    'github:edolstra/flake-compat/35bb57c0c8d8b62bbfd284272c928ceb64ddbde9' (2023-01-17)
  → 'github:edolstra/flake-compat/0f9255e01c2351cc7d116c072cb317785dd33b33' (2023-10-04)
• Updated input 'devenv/nix':
    'github:domenkozar/nix/7c91803598ffbcfe4a55c44ac6d49b2cf07a527f' (2023-02-16)
  → 'github:domenkozar/nix/c5bbf14ecbd692eeabf4184cc8d50f79c2446549' (2024-03-15)
• Added input 'devenv/nix/flake-compat':
    'github:edolstra/flake-compat/35bb57c0c8d8b62bbfd284272c928ceb64ddbde9' (2023-01-17)
• Removed input 'devenv/nix/lowdown-src'
• Updated input 'devenv/nixpkgs':
    'github:NixOS/nixpkgs/126f49a01de5b7e35a43fd43f891ecf6d3a51459' (2023-03-15)
  → 'github:cachix/devenv-nixpkgs/829e73affeadfb4198a7105cbe3a03153d13edc9' (2024-03-12)
• Updated input 'devenv/pre-commit-hooks':
    'github:cachix/pre-commit-hooks.nix/ea96f0c05924341c551a797aaba8126334c505d2' (2024-01-08)
  → 'github:cachix/pre-commit-hooks.nix/e35aed5fda3cc79f88ed7f1795021e559582093a' (2024-04-02)
• Updated input 'devenv/pre-commit-hooks/flake-utils':
    'github:numtide/flake-utils/a1720a10a6cfe8234c0e93907ffe81be440f4cef' (2023-05-31)
  → 'github:numtide/flake-utils/b1d9ab70662946ef0850d488da1c9019f3a9752a' (2024-03-11)
• Updated input 'devenv/pre-commit-hooks/gitignore':
    'github:hercules-ci/gitignore.nix/a20de23b925fd8264fd7fad6454652e142fd7f73' (2022-08-14)
  → 'github:hercules-ci/gitignore.nix/637db329424fd7e46cf4185293b9cc8c88c95394' (2024-02-28)
• Updated input 'devenv/pre-commit-hooks/nixpkgs-stable':
    'github:NixOS/nixpkgs/c37ca420157f4abc31e26f436c1145f8951ff373' (2023-06-03)
  → 'github:NixOS/nixpkgs/614b4613980a522ba49f0d194531beddbb7220d3' (2024-03-17)
• Updated input 'fenix':
    'github:nix-community/fenix/93e89638c15512db65e931f26ce36edf8cfbb4a5' (2024-01-10)
  → 'github:nix-community/fenix/99c6241db5ca5363c05c8f4acbdf3a4e8fc42844' (2024-04-06)
• Updated input 'fenix/nixpkgs':
    'github:nixos/nixpkgs/46ae0210ce163b3cba6c7da08840c1d63de9c701' (2024-01-06)
  → 'github:nixos/nixpkgs/fd281bd6b7d3e32ddfa399853946f782553163b5' (2024-04-03)
• Updated input 'fenix/rust-analyzer-src':
    'github:rust-lang/rust-analyzer/ae6e73772432cfe35bb0ff6de6fdcfa908642b67' (2024-01-09)
  → 'github:rust-lang/rust-analyzer/8e581ac348e223488622f4d3003cb2bd412bf27e' (2024-04-03)
• Updated input 'nixpkgs':
    'github:NixOS/nixpkgs/317484b1ead87b9c1b8ac5261a8d2dd748a0492d' (2024-01-08)
  → 'github:NixOS/nixpkgs/ff0dbd94265ac470dda06a657d5fe49de93b4599' (2024-04-06)
2024-04-08 23:27:56 +02:00
bfd4b3765f
lang: state with confusing error 2024-04-08 15:43:42 +02:00
198c74c7ae
lang: make attrset delims braces 2024-04-08 14:04:52 +02:00
8d7401531e
lang: some small, unfinished stuff 2024-04-07 01:04:02 +02:00
b6e304fa78
lang: rework ast structure 2024-04-07 00:55:12 +02:00
ace69b0094
svg-filters: format matrices in complex test 2024-04-03 20:08:33 +02:00
84448af714
lang: funky basic error reporting stuff 2024-04-03 17:00:20 +02:00
ae60db7721
lang: extract tests into file 2024-04-03 00:30:11 +02:00
de008263ca
lang: add test 2024-04-03 00:28:45 +02:00
ca84af4e1b
lang: basic parser 2024-04-03 00:08:00 +02:00
ae86ae29ab
svg-filters: more testssss 2024-03-24 17:07:35 +01:00
02c5e9e159
svg-filters: figured out flood not working in local test env 2024-03-24 16:11:07 +01:00
0197df5ee2
svg-filters: add turbulence and displacement map 2024-03-24 15:49:41 +01:00
919a3bb377
svg-filters: seperate out abstractions and add flood abstraction 2024-03-22 16:47:21 +01:00
9ae8c2fbd3
svg-filters: add nice abstractions for component transfer 2024-03-22 16:24:04 +01:00
9727ef82ca
svg-filters: implement feComponentTransfer rendering 2024-03-22 16:05:36 +01:00
c31a158d9b
svg-filters: rework macro slightly 2024-03-20 19:11:42 +01:00
aeeee54200
svg-filters: new conditional attrs macro 2024-03-19 18:43:30 +01:00
dc7d76dc26
svg-filters: create gen_attr and gen_attrs convenience macros 2024-03-19 15:47:09 +01:00
e17fffb66b
svg-filters: blend node 2024-03-19 15:15:36 +01:00
f59062cf88
svg-filters: cleanup 2024-03-17 00:51:20 +01:00
384fef5a81
svg-filters(primitives): start with feComponentTransfer 2024-03-17 00:50:51 +01:00
77bcb54b5e
svg-filters(tests): start 2024-03-17 00:50:16 +01:00
d87033d320
svg-filters(codegen): add pretty and ugly printing 2024-03-17 00:49:53 +01:00
bf60bdd814
svg-filters: simplify and refactor a bit 2024-03-16 23:57:09 +01:00
5368951254
svg-filters: rework codegen 2024-03-16 20:52:45 +01:00
a42ec014e5
svg-filters: get svg generation working!!!! 2024-03-16 00:35:23 +01:00
01b1880089
svg-filters: get bare basic xml generation going 2024-03-15 22:45:28 +01:00
56848a1b05
svg-filters: add basic graph and stuffs 2024-03-15 19:52:40 +01:00
69f0baf425
svg-filters: init 2024-03-15 16:44:47 +01:00
112 changed files with 7024 additions and 470 deletions

1496
Cargo.lock generated

File diff suppressed because it is too large

@@ -4,12 +4,18 @@ members = [
     "crates/eval",
     "crates/ir",
     "crates/lang",
+    "crates/svg-filters",
+    "crates/prowocessing",
+    "crates/executor-poc",
+    "crates/pawarser",
+    "crates/json-pawarser",
 ]
 resolver = "2"
 
 [workspace.dependencies]
 clap = { version = "4", features = ["derive"] }
 serde = { version = "1.0", features = ["derive"] }
+petgraph = "0.6.4"
 
 # to enable all the lints below, this must be present in a workspace member's Cargo.toml:
 # [lints]

@@ -2,6 +2,7 @@
 name = "app"
 version = "0.1.0"
 edition = "2021"
+default-run = "app"
 
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
@@ -11,6 +12,7 @@ clap = { workspace = true, features = [ "derive", "env" ] }
 dirs = "5"
 eval = { path = "../eval" }
 ir = { path = "../ir" }
+prowocessing = { path = "../prowocessing"}
 owo-colors = "4"
 ron = "0.8"
 serde = { workspace = true, features = [ "derive" ] }

@@ -1,18 +1,11 @@
-use std::path::PathBuf;
-
-use clap::Parser;
-
-use self::{
-    cli::Args,
-    config_file::{find_config_file, Configs},
-};
+use self::config_file::{find_config_file, Configs};
+pub(crate) use cli::CliConfigs;
 
 mod cli;
 mod config_file;
 
 /// this struct may hold all configuration
 pub struct Config {
-    pub source: PathBuf,
     pub evaluator: eval::Available,
     pub startup_msg: bool,
@@ -20,13 +13,17 @@ pub struct Config {
 
 impl Config {
     /// Get the configs from all possible places (args, file, env...)
-    pub fn read() -> Self {
-        let args = Args::parse();
-        let config = if let Some(config) = args.config_path {
-            Ok(config)
-        } else {
-            find_config_file()
-        };
+    pub fn read(args: &CliConfigs) -> Self {
+        // let config = if let Some(config) = &args.config_path {
+        //     Ok(config.clone())
+        // } else {
+        //     find_config_file()
+        // };
+        let config = args
+            .config_path
+            .clone()
+            .ok_or(())
+            .or_else(|()| find_config_file());
 
         // try to read a maybe existing config file
         let config = config.ok().and_then(|path| {
@@ -42,7 +39,6 @@ impl Config {
         if let Some(file) = config {
             Self {
-                source: args.source,
                 evaluator: args.evaluator.and(file.evaluator).unwrap_or_default(),
                 // this is negated because to an outward api, the negative is more intuitive,
                 // while in the source the other way around is more intuitive
@@ -50,7 +46,6 @@ impl Config {
             }
         } else {
             Self {
-                source: args.source,
                 startup_msg: !args.no_startup_message,
                 evaluator: args.evaluator.unwrap_or_default(),
             }

@@ -1,12 +1,9 @@
 use std::path::PathBuf;
 
-use clap::{builder::BoolishValueParser, ArgAction, Parser};
+use clap::{builder::BoolishValueParser, ArgAction, Args};
 
-#[derive(Parser)]
-pub(crate) struct Args {
-    /// What file contains the pipeline to evaluate.
-    pub source: PathBuf,
-
+#[derive(Args)]
+pub(crate) struct CliConfigs {
     /// How to actually run the pipeline.
     /// Overrides the config file. Defaults to the debug evaluator.
     #[arg(short, long)]

@@ -5,7 +5,9 @@ use std::{
 
 use serde::{Deserialize, Serialize};
 
-use super::error::Config;
+use crate::error_reporting::{report_serde_json_err, report_serde_ron_err};
+
+use super::error::{self, Config};
 
 #[derive(Debug, Serialize, Deserialize)]
 pub struct Configs {
@@ -40,14 +42,20 @@ pub(super) fn find_config_file() -> Result<PathBuf, Config> {
 }
 
 impl Configs {
-    pub fn read(p: PathBuf) -> Result<Self, Config> {
+    pub fn read(p: PathBuf) -> Result<Self, error::Config> {
         match p
             .extension()
             .map(|v| v.to_str().expect("config path to be UTF-8"))
         {
-            Some("ron") => Ok(serde_json::from_str(&fs::read_to_string(p)?)?),
-            Some("json") => Ok(ron::from_str(&fs::read_to_string(p)?)?),
-            e => Err(Config::UnknownExtension(e.map(str::to_owned))),
+            Some("ron") => {
+                let f = fs::read_to_string(p)?;
+                ron::from_str(&f).or_else(|e| report_serde_ron_err(&f, &e))
+            }
+            Some("json") => {
+                let f = fs::read_to_string(p)?;
+                serde_json::from_str(&f).or_else(|e| report_serde_json_err(&f, &e))
+            }
+            e => Err(error::Config::UnknownExtension(e.map(str::to_owned))),
         }
     }
 }

@@ -25,7 +25,7 @@ fn report_serde_err(src: &str, line: usize, col: usize, msg: String) -> ! {
         .finish()
         .eprint(("test", Source::from(src)))
         .expect("writing error to stderr failed");
-    process::exit(1);
+    process::exit(1)
 }
 
 /// Reconstruct a byte offset from the line + column numbers typical from serde crates

@@ -1,6 +1,8 @@
-use std::fs;
+use std::{fs, path::PathBuf};
 
-use config::Config;
+use clap::{Parser, Subcommand};
+use config::{CliConfigs, Config};
+use dev::DevCommands;
 use welcome_msg::print_startup_msg;
 
 mod config;
@@ -9,19 +11,60 @@ mod config;
 mod error_reporting;
 mod welcome_msg;
 
+#[derive(Parser)]
+struct Args {
+    #[command(flatten)]
+    configs: CliConfigs,
+    #[command(subcommand)]
+    command: Commands,
+}
+
+#[derive(Subcommand)]
+enum Commands {
+    Run {
+        /// What file contains the pipeline to evaluate.
+        source: PathBuf,
+    },
+    Dev {
+        #[command(subcommand)]
+        command: DevCommands,
+    },
+}
+
 fn main() {
     // TODO: proper error handling across the whole function
     // don't forget to also look inside `Config`
-    let cfg = Config::read();
+    let args = Args::parse();
+    let cfg = Config::read(&args.configs);
 
     if cfg.startup_msg {
         print_startup_msg();
    }
 
-    let source = fs::read_to_string(cfg.source).expect("can't find source file");
-    let ir = ir::from_ron(&source).expect("failed to parse source to graph ir");
-    let mut machine = cfg.evaluator.pick();
-    machine.feed(ir);
-    machine.eval_full();
+    match args.command {
+        Commands::Run { source } => {
+            let source = fs::read_to_string(source).expect("can't find source file");
+            let ir = ir::from_ron(&source).expect("failed to parse source to graph ir");
+            let mut machine = cfg.evaluator.pick();
+            machine.feed(ir);
+            machine.eval_full();
+        }
+        Commands::Dev {
+            command: dev_command,
+        } => dev_command.run(),
+    }
+}
+
+mod dev {
+    use clap::Subcommand;
+
+    #[derive(Subcommand)]
+    pub(crate) enum DevCommands {}
+
+    impl DevCommands {
+        pub fn run(self) {
+            println!("There are currently no dev commands.");
+        }
+    }
 }
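
The dev module lands with an empty DevCommands enum, so `app dev` only prints a placeholder. A hypothetical sketch of what a first dev command could look like, purely for illustration (the DumpIr variant and its behavior are invented, not part of any commit):

use std::path::PathBuf;

use clap::Subcommand;

#[derive(Subcommand)]
pub(crate) enum DevCommands {
    /// Hypothetical: pretty-print the graph IR of a pipeline file.
    DumpIr { source: PathBuf },
}

impl DevCommands {
    pub fn run(self) {
        match self {
            // Placeholder behavior; a real command would parse and print the IR.
            Self::DumpIr { source } => println!("would dump IR for {source:?}"),
        }
    }
}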

@@ -37,7 +37,7 @@ impl Available {
     #[must_use]
     pub fn pick(&self) -> Box<dyn Evaluator> {
         match self {
-            Self::Debug => Box::new(kind::debug::Evaluator::default()),
+            Self::Debug => Box::<kind::debug::Evaluator>::default(),
         }
     }
 }

@@ -0,0 +1,13 @@
[package]
name = "executor-poc"
version = "0.1.0"
edition = "2021"
[dependencies]
image = "0.25.1"
indexmap = "2.2.6"
nalgebra = "0.33.0"
petgraph.workspace = true
[lints]
workspace = true

@@ -0,0 +1,128 @@
use indexmap::IndexMap;
use instructions::Instruction;
use petgraph::graph::DiGraph;
use types::Type;
trait Node {
fn inputs() -> IndexMap<String, Type>;
fn outputs() -> IndexMap<String, Type>;
}
struct NodeGraph {
graph: DiGraph<Instruction, TypedEdge>,
}
struct TypedEdge {
from: String,
to: String,
typ: Type,
}
mod instructions {
//! This is the lowest level of the IR, the one the executor will use.
use std::path::Path;
use indexmap::{indexmap, IndexMap};
pub enum Instruction {
// File handling
LoadFile,
SaveFile,
ColorMatrix,
PosMatrix,
Blend,
SplitChannels,
}
impl Instruction {
fn inputs(&self) -> IndexMap<String, Type> {
match self {
Instruction::LoadFile => indexmap! {
"path" => Type::Path
},
Instruction::SaveFile => indexmap! {
"path" => Type::Path
},
Instruction::ColorMatrix => indexmap! {
"image" => Type::ImageData,
"matrix" => Type::Mat(4,5)
},
Instruction::PosMatrix => indexmap! {
"image" => Type::ImageData,
"matrix" => Type::Mat(2, 3),
},
Instruction::Blend => todo!(),
Instruction::SplitChannels => todo!(),
}
}
fn outputs(&self) -> IndexMap<String, Type> {
match self {
Instruction::LoadFile => indexmap! {
"image" => Type::ImageData
},
Instruction::SaveFile => indexmap! {},
Instruction::ColorMatrix => indexmap! {
"resut" => Type::ImageData
},
Instruction::PosMatrix => todo!(),
Instruction::Blend => todo!(),
Instruction::SplitChannels => todo!(),
}
}
}
}
mod types {
pub enum Type {
// TODO: later do lower level type system for this stuff?
// Image(Size, PixelType),
// // image data for processing.
// // always PixelType::Rgba32F
// ImageData(Size),
// // stuff that's still to be generated, not sized and no pixeltype
// ProceduralImage,
ImageData,
Text,
Integer,
Float,
Double,
Path,
Bool,
Vec(
// length,
u8,
),
Mat(
// Rows
u8,
// Columns
u8,
),
}
// pub struct Size {
// width: u16,
// height: u16,
// }
// Pixel types. Taken from variants [here](https://docs.rs/image/latest/image/enum.DynamicImage.html).
// pub enum PixelType {
// Luma8,
// LumaA8,
// Rgb8,
// Rgba8,
// Luma16,
// LumaA16,
// Rgb16,
// Rgba16,
// Rgb32F,
// #[default]
// Rgba32F,
// }
}
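
The crate so far only defines the node, edge, and type vocabulary. A minimal sketch of wiring two instructions into the petgraph-backed graph; the port names follow the indexmap! keys above, but none of this is committed code:

use petgraph::graph::DiGraph;

fn example_graph() -> DiGraph<Instruction, TypedEdge> {
    let mut graph = DiGraph::new();
    // LoadFile produces an "image" output; ColorMatrix consumes an "image" input.
    let load = graph.add_node(Instruction::LoadFile);
    let color = graph.add_node(Instruction::ColorMatrix);
    graph.add_edge(
        load,
        color,
        TypedEdge {
            from: "image".to_string(),
            to: "image".to_string(),
            typ: Type::ImageData,
        },
    );
    graph
}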

@@ -0,0 +1,13 @@
[package]
name = "json-pawarser"
version = "0.1.0"
edition = "2021"
[dependencies]
logos = "0.14.2"
enumset = "1.1.3"
rowan = "0.15.15"
pawarser = { path = "../pawarser" }
[lints]
workspace = true

@@ -0,0 +1,78 @@
use array::array;
use enumset::{enum_set, EnumSet};
use pawarser::parser::ParserBuilder;
use crate::{
syntax_error::SyntaxError,
syntax_kind::{lex, SyntaxKind},
};
use self::object::object;
mod array;
mod object;
pub(crate) type Parser<'src> = pawarser::Parser<'src, SyntaxKind, SyntaxError>;
pub(crate) type CompletedMarker = pawarser::CompletedMarker<SyntaxKind, SyntaxError>;
const BASIC_VALUE_TOKENS: EnumSet<SyntaxKind> =
enum_set!(SyntaxKind::BOOL | SyntaxKind::NULL | SyntaxKind::NUMBER | SyntaxKind::STRING);
pub fn value(p: &mut Parser) -> bool {
if BASIC_VALUE_TOKENS.contains(p.current()) {
p.do_bump();
return true;
} else {
object(p).or_else(|| array(p)).is_some()
}
}
#[cfg(test)]
mod tests {
use super::{
test_utils::{check_parser, gen_checks},
value,
};
#[test]
fn value_lit() {
gen_checks! {value;
r#""helo world""# => r#"ROOT { STRING "\"helo world\""; }"#,
"42" => r#"ROOT { NUMBER "42"; }"#,
"null" => r#"ROOT { NULL "null"; }"#,
"true" => r#"ROOT { BOOL "true"; }"#,
"false" => r#"ROOT { BOOL "false"; }"#
};
}
}
#[cfg(test)]
mod test_utils {
use pawarser::parser::ParserBuilder;
use crate::syntax_kind::{lex, SyntaxKind};
use super::Parser;
macro_rules! gen_checks {
($fn_to_test:ident; $($in:literal => $out:literal),+) => {
$(crate::grammar::test_utils::check_parser($in, |p| { $fn_to_test(p); }, $out);)+
}
}
pub(super) use gen_checks;
pub(super) fn check_parser(input: &str, parser_fn: fn(&mut Parser), expected_output: &str) {
let toks = lex(input);
let mut p: Parser = ParserBuilder::new(toks)
.add_meaningless(SyntaxKind::WHITESPACE)
.add_meaningless(SyntaxKind::NEWLINE)
.build();
parser_fn(&mut p);
let out = p.finish();
assert_eq!(format!("{out:?}").trim_end(), expected_output);
}
}

@@ -0,0 +1,36 @@
use crate::{syntax_error::SyntaxError, syntax_kind::SyntaxKind};
use super::{value, CompletedMarker, Parser};
pub(super) fn array(p: &mut Parser) -> Option<CompletedMarker> {
let array_start = p.start("array");
if !p.eat(SyntaxKind::BRACKET_OPEN) {
array_start.abandon(p);
return None;
}
let el = p.start("arr_el");
value(p);
el.complete(p, SyntaxKind::ELEMENT);
while p.at(SyntaxKind::COMMA) {
let potential_trailing_comma = p.start("potential_trailing_comma");
p.eat(SyntaxKind::COMMA);
let maybe_el = p.start("arr_el");
if !value(p) {
maybe_el.abandon(p);
potential_trailing_comma.complete(p, SyntaxKind::TRAILING_COMMA);
} else {
maybe_el.complete(p, SyntaxKind::ELEMENT);
potential_trailing_comma.abandon(p);
}
}
Some(if !p.eat(SyntaxKind::BRACKET_CLOSE) {
array_start.error(p, SyntaxError::UnclosedArray)
} else {
array_start.complete(p, SyntaxKind::ARRAY)
})
}
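
To see what this grammar produces, the same setup as check_parser in grammar.rs applies. A sketch of driving `array` by hand, if placed inside array.rs; the input string is illustrative and this is not a committed test:

#[cfg(test)]
mod sketch {
    use pawarser::parser::ParserBuilder;

    use crate::syntax_kind::{lex, SyntaxKind};

    use super::{array, Parser};

    #[test]
    fn array_trailing_comma_sketch() {
        let toks = lex("[1, 2,]");
        let mut p: Parser = ParserBuilder::new(toks)
            .add_meaningless(SyntaxKind::WHITESPACE)
            .add_meaningless(SyntaxKind::NEWLINE)
            .build();
        array(&mut p);
        // The trailing comma ends up as a TRAILING_COMMA node in the tree
        // rather than aborting the parse.
        println!("{:?}", p.finish());
    }
}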

@@ -0,0 +1,92 @@
use crate::{grammar::value, syntax_error::SyntaxError, syntax_kind::SyntaxKind};
use super::{CompletedMarker, Parser, BASIC_VALUE_TOKENS};
pub(super) fn object(p: &mut Parser) -> Option<CompletedMarker> {
let obj_start = p.start("object");
if !p.eat(SyntaxKind::BRACE_OPEN) {
obj_start.abandon(p);
return None;
}
member(p);
while p.at(SyntaxKind::COMMA) {
// not always an error, later configurable
let potential_trailing_comma = p.start("potential_trailing_comma");
p.eat(SyntaxKind::COMMA);
if member(p).is_none() {
potential_trailing_comma.complete(p, SyntaxKind::TRAILING_COMMA);
} else {
potential_trailing_comma.abandon(p);
}
}
Some(if p.eat(SyntaxKind::BRACE_CLOSE) {
obj_start.complete(p, SyntaxKind::OBJECT)
} else {
obj_start.error(p, SyntaxError::UnclosedObject)
})
}
fn member(p: &mut Parser) -> Option<CompletedMarker> {
let member_start = p.start("member");
if p.at(SyntaxKind::BRACE_CLOSE) {
member_start.abandon(p);
return None;
} else if p.at(SyntaxKind::STRING) {
let member_name_start = p.start("member_name");
p.eat(SyntaxKind::STRING);
member_name_start.complete(p, SyntaxKind::MEMBER_NAME);
} else {
return todo!("handle other tokens: {:?}", p.current());
}
if !p.eat(SyntaxKind::COLON) {
todo!("handle wrong tokens")
}
let member_value_start = p.start("member_value_start");
if value(p) {
member_value_start.complete(p, SyntaxKind::MEMBER_VALUE);
Some(member_start.complete(p, SyntaxKind::MEMBER))
} else {
member_value_start.abandon(p);
let e = member_start.error(p, SyntaxError::MemberMissingValue);
Some(
e.precede(p, "member but failed already")
.complete(p, SyntaxKind::MEMBER),
)
}
}
#[cfg(test)]
mod tests {
use crate::grammar::{
object::{member, object},
test_utils::gen_checks,
};
#[test]
fn object_basic() {
gen_checks! {object;
r#"{"a": "b"}"# => r#"ROOT { OBJECT { BRACE_OPEN "{"; MEMBER { MEMBER_NAME { STRING "\"a\""; } COLON ":"; WHITESPACE " "; MEMBER_VALUE { STRING "\"b\""; } } BRACE_CLOSE "}"; } }"#,
r#"{"a": 42}"# => r#"ROOT { OBJECT { BRACE_OPEN "{"; MEMBER { MEMBER_NAME { STRING "\"a\""; } COLON ":"; WHITESPACE " "; MEMBER_VALUE { NUMBER "42"; } } BRACE_CLOSE "}"; } }"#,
r#"{"a": "b""# => r#"ROOT { PARSE_ERR: UnclosedObject { BRACE_OPEN "{"; MEMBER { MEMBER_NAME { STRING "\"a\""; } COLON ":"; WHITESPACE " "; MEMBER_VALUE { STRING "\"b\""; } } } }"#,
r#"{"a": }"# => r#"ROOT { OBJECT { BRACE_OPEN "{"; MEMBER { PARSE_ERR: MemberMissingValue { MEMBER_NAME { STRING "\"a\""; } COLON ":"; } } WHITESPACE " "; BRACE_CLOSE "}"; } }"#,
r#"{"a":"# => r#"ROOT { PARSE_ERR: UnclosedObject { BRACE_OPEN "{"; MEMBER { PARSE_ERR: MemberMissingValue { MEMBER_NAME { STRING "\"a\""; } COLON ":"; } } } }"#,
r#"{"a":true,}"# => r#"ROOT { OBJECT { BRACE_OPEN "{"; MEMBER { MEMBER_NAME { STRING "\"a\""; } COLON ":"; MEMBER_VALUE { BOOL "true"; } } TRAILING_COMMA { COMMA ","; } BRACE_CLOSE "}"; } }"#
}
}
#[test]
fn member_basic() {
gen_checks! {member;
r#""a": "b""# => r#"ROOT { MEMBER { MEMBER_NAME { STRING "\"a\""; } COLON ":"; WHITESPACE " "; MEMBER_VALUE { STRING "\"b\""; } } }"#,
r#""a": 42"# => r#"ROOT { MEMBER { MEMBER_NAME { STRING "\"a\""; } COLON ":"; WHITESPACE " "; MEMBER_VALUE { NUMBER "42"; } } }"#,
r#""a":"# => r#"ROOT { MEMBER { PARSE_ERR: MemberMissingValue { MEMBER_NAME { STRING "\"a\""; } COLON ":"; } } }"#
}
}
}

@@ -0,0 +1,3 @@
mod grammar;
mod syntax_error;
mod syntax_kind;

@@ -0,0 +1,11 @@
use crate::syntax_kind::SyntaxKind;
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum SyntaxError {
UnclosedObject,
UnclosedArray,
DisallowedKeyType(SyntaxKind),
MemberMissingValue,
UnexpectedTrailingComma,
}
impl pawarser::parser::SyntaxError for SyntaxError {}

@@ -0,0 +1,117 @@
use logos::Logos;
pub fn lex(src: &str) -> Vec<(SyntaxKind, &str)> {
let mut lex = SyntaxKind::lexer(src);
let mut r = Vec::new();
while let Some(tok_res) = lex.next() {
r.push((tok_res.unwrap_or(SyntaxKind::LEX_ERR), lex.slice()))
}
r
}
#[derive(enumset::EnumSetType, Debug, Logos, PartialEq, Eq, Clone, Copy, Hash)]
#[repr(u16)]
#[enumset(no_super_impls)]
#[allow(non_camel_case_types)]
pub enum SyntaxKind {
OBJECT,
MEMBER,
MEMBER_NAME,
MEMBER_VALUE,
ARRAY,
ELEMENT,
// SyntaxKinds for future json5/etc support
TRAILING_COMMA,
// Tokens
// Regexes adapted from [the logos handbook](https://logos.maciej.codes/examples/json_borrowed.html)
#[token("true")]
#[token("false")]
BOOL,
#[token("{")]
BRACE_OPEN,
#[token("}")]
BRACE_CLOSE,
#[token("[")]
BRACKET_OPEN,
#[token("]")]
BRACKET_CLOSE,
#[token(":")]
COLON,
#[token(",")]
COMMA,
#[token("null")]
NULL,
#[regex(r"-?(?:0|[1-9]\d*)(?:\.\d+)?(?:[eE][+-]?\d+)?")]
NUMBER,
#[regex(r#""([^"\\]|\\["\\bnfrt]|u[a-fA-F0-9]{4})*""#)]
STRING,
// Whitespace tokens
#[regex("[ \\t\\f]+")]
WHITESPACE,
#[token("\n")]
NEWLINE,
// Error SyntaxKinds
LEX_ERR,
PARSE_ERR,
// Meta SyntaxKinds
ROOT,
EOF,
}
impl pawarser::parser::SyntaxElement for SyntaxKind {
const SYNTAX_EOF: Self = Self::EOF;
const SYNTAX_ERROR: Self = Self::PARSE_ERR;
const SYNTAX_ROOT: Self = Self::ROOT;
}
impl From<SyntaxKind> for rowan::SyntaxKind {
fn from(kind: SyntaxKind) -> Self {
Self(kind as u16)
}
}
impl From<rowan::SyntaxKind> for SyntaxKind {
fn from(raw: rowan::SyntaxKind) -> Self {
assert!(raw.0 <= SyntaxKind::EOF as u16);
#[allow(unsafe_code, reason = "The transmute is necessary here")]
unsafe {
std::mem::transmute::<u16, SyntaxKind>(raw.0)
}
}
}
#[cfg(test)]
mod tests {
use crate::syntax_kind::{lex, SyntaxKind};
#[test]
fn simple_object() {
const TEST_DATA: &str = r#"{"hello_world": "meow", "some_num":7.42}"#;
assert_eq!(
dbg!(lex(TEST_DATA)),
vec![
(SyntaxKind::BRACE_OPEN, "{"),
(SyntaxKind::STRING, "\"hello_world\""),
(SyntaxKind::COLON, ":"),
(SyntaxKind::WHITESPACE, " "),
(SyntaxKind::STRING, "\"meow\""),
(SyntaxKind::COMMA, ","),
(SyntaxKind::WHITESPACE, " "),
(SyntaxKind::STRING, "\"some_num\""),
(SyntaxKind::COLON, ":"),
(SyntaxKind::NUMBER, "7.42"),
(SyntaxKind::BRACE_CLOSE, "}")
]
);
}
}
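
The two From impls above are what rowan uses to move kinds across the tree boundary; the round trip is only sound because of #[repr(u16)] plus the EOF upper-bound assert guarding the transmute. A tiny illustrative check (not a committed test):

#[test]
fn kind_round_trip() {
    // SyntaxKind -> rowan::SyntaxKind -> SyntaxKind must be the identity.
    let raw: rowan::SyntaxKind = SyntaxKind::COMMA.into();
    assert_eq!(SyntaxKind::from(raw), SyntaxKind::COMMA);
}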

@@ -7,6 +7,19 @@ edition = "2021"
 
 [dependencies]
 logos = "0.14"
+petgraph = { workspace = true}
+indexmap = "2.2.6"
+clap = { version = "4", features = ["derive"] }
+ariadne = "0.4.0"
+ego-tree = "0.6.2"
+rowan = "0.15.15"
+drop_bomb = "0.1.5"
+enumset = "1.1.3"
+indoc = "2"
+dashmap = "5.5.3"
+crossbeam = "0.8.4"
+owo-colors = {version = "4", features = ["supports-colors"]}
+strip-ansi-escapes = "0.2.0"
 
 [lints]
 workspace = true

80
crates/lang/src/ast.rs Normal file

@@ -0,0 +1,80 @@
use crate::lst_parser::syntax_kind::SyntaxKind::*;
use crate::SyntaxNode;
use rowan::Language;
// Heavily modified version of https://github.com/rust-analyzer/rowan/blob/e2d2e93e16c5104b136d0bc738a0d48346922200/examples/s_expressions.rs#L250-L266
macro_rules! ast_nodes {
($($ast:ident, $kind:ident);+) => {
$(
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
#[repr(transparent)]
pub struct $ast(SyntaxNode);
impl rowan::ast::AstNode for $ast {
type Language = crate::Lang;
fn can_cast(kind: <Self::Language as Language>::Kind) -> bool {
kind == $kind
}
fn cast(node: SyntaxNode) -> Option<Self> {
if node.kind() == $kind {
Some(Self(node))
} else {
None
}
}
fn syntax(&self) -> &SyntaxNode {
&self.0
}
}
)+
};
}
ast_nodes!(
Def, DEF;
DefName, DEF_NAME;
DefBody, DEF_BODY;
Mod, MODULE;
ModName, MODULE_NAME;
ModBody, MODULE_BODY;
Use, USE;
UsePat, USE_PAT;
PatItem, PAT_ITEM;
PatGlob, PAT_GLOB;
PatGroup, PAT_GROUP;
Literal, LITERAL;
IntLit, INT_NUM;
FloatLit, FLOAT_NUM;
StringLit, STRING;
Matrix, MATRIX;
MatrixRow, MAT_ROW;
Vector, VEC;
List, LIST;
CollectionItem, COLLECTION_ITEM;
ParenthesizedExpr, PARENTHESIZED_EXPR;
Expression, EXPR;
Pipeline, PIPELINE;
Instruction, INSTR;
InstructionName, INSTR_NAME;
InstructionParams, INSTR_PARAMS;
AttributeSet, ATTR_SET;
Attribute, ATTR;
AttributeName, ATTR_NAME;
AttributeValue, ATTR_VALUE;
ParseError, PARSE_ERR;
LexError, LEX_ERR;
Root, ROOT;
Eof, EOF
);
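
Each generated type implements rowan::ast::AstNode, so typed access to an untyped SyntaxNode is just a cast. A small illustrative helper, not part of the commit:

use rowan::ast::AstNode;

/// Sketch: find the first top-level `def` in a parsed source file.
fn first_def(root: &SyntaxNode) -> Option<Def> {
    root.children().find_map(Def::cast)
}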

@@ -1 +1,25 @@
-pub mod tokens;
+#![feature(type_alias_impl_trait, lint_reasons, box_into_inner)]
+
+use crate::lst_parser::syntax_kind::SyntaxKind;
+
+pub mod ast;
+pub mod lst_parser;
+pub mod world;
+
+#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash)]
+pub enum Lang {}
+impl rowan::Language for Lang {
+    type Kind = SyntaxKind;
+    #[allow(unsafe_code)]
+    fn kind_from_raw(raw: rowan::SyntaxKind) -> Self::Kind {
+        assert!(raw.0 <= SyntaxKind::ROOT as u16);
+        unsafe { std::mem::transmute::<u16, SyntaxKind>(raw.0) }
+    }
+    fn kind_to_raw(kind: Self::Kind) -> rowan::SyntaxKind {
+        kind.into()
+    }
+}
+
+pub type SyntaxNode = rowan::SyntaxNode<Lang>;
+pub type SyntaxToken = rowan::SyntaxNode<Lang>;
+pub type SyntaxElement = rowan::NodeOrToken<SyntaxNode, SyntaxToken>;

@@ -0,0 +1,169 @@
use drop_bomb::DropBomb;
use self::{
error::SyntaxError,
events::{Event, NodeKind},
input::Input,
syntax_kind::SyntaxKind,
};
use std::cell::Cell;
pub mod syntax_kind;
#[cfg(test)]
mod tests;
pub mod error;
pub mod events;
pub mod grammar;
pub mod input;
pub mod output;
const PARSER_STEP_LIMIT: u32 = 4096;
pub struct Parser<'src, 'toks> {
input: Input<'src, 'toks>,
pos: usize,
events: Vec<Event>,
steps: Cell<u32>,
}
impl<'src, 'toks> Parser<'src, 'toks> {
pub fn new(input: Input<'src, 'toks>) -> Self {
Self {
input,
pos: 0,
events: Vec::new(),
steps: Cell::new(0),
}
}
pub fn finish(self) -> Vec<Event> {
self.events
}
pub(crate) fn nth(&self, n: usize) -> SyntaxKind {
self.step();
self.input.kind(self.pos + n)
}
pub fn eat_succeeding_ws(&mut self) {
self.push_ev(Event::Eat {
count: self.input.meaningless_tail_len(),
});
}
pub(crate) fn current(&self) -> SyntaxKind {
self.step();
self.input.kind(self.pos)
}
pub(crate) fn start(&mut self, name: &str) -> Marker {
let pos = self.events.len();
self.push_ev(Event::tombstone());
Marker::new(pos, name)
}
pub(crate) fn at(&self, kind: SyntaxKind) -> bool {
self.nth_at(0, kind)
}
pub(crate) fn eat(&mut self, kind: SyntaxKind) -> bool {
if !self.at(kind) {
return false;
}
self.do_bump();
true
}
pub(crate) fn nth_at(&self, n: usize, kind: SyntaxKind) -> bool {
self.nth(n) == kind
}
fn do_bump(&mut self) {
self.push_ev(Event::Eat {
count: self.input.preceding_meaningless(self.pos),
});
self.pos += 1;
}
fn push_ev(&mut self, event: Event) {
self.events.push(event)
}
fn step(&self) {
let steps = self.steps.get();
assert!(steps <= PARSER_STEP_LIMIT, "the parser seems stuck...");
self.steps.set(steps + 1);
}
}
pub(crate) struct Marker {
pos: usize,
bomb: DropBomb,
}
impl Marker {
pub(crate) fn new(pos: usize, name: &str) -> Self {
Self {
pos,
bomb: DropBomb::new(format!("Marker {name} must be completed or abandoned")),
}
}
fn close_node(mut self, p: &mut Parser, kind: NodeKind) -> CompletedMarker {
self.bomb.defuse();
match &mut p.events[self.pos] {
Event::Start { kind: slot, .. } => *slot = kind.clone(),
_ => unreachable!(),
}
p.push_ev(Event::Finish);
CompletedMarker {
pos: self.pos,
kind,
}
}
pub(crate) fn complete(self, p: &mut Parser<'_, '_>, kind: SyntaxKind) -> CompletedMarker {
self.close_node(p, NodeKind::Syntax(kind))
}
pub(crate) fn error(self, p: &mut Parser, kind: SyntaxError) -> CompletedMarker {
self.close_node(p, NodeKind::Error(kind))
}
pub(crate) fn abandon(mut self, p: &mut Parser<'_, '_>) {
self.bomb.defuse();
if self.pos == p.events.len() - 1 {
match p.events.pop() {
Some(Event::Start {
kind: NodeKind::Syntax(SyntaxKind::TOMBSTONE),
forward_parent: None,
}) => (),
_ => unreachable!(),
}
}
}
}
pub(crate) struct CompletedMarker {
pos: usize,
kind: NodeKind,
}
impl CompletedMarker {
pub(crate) fn precede(self, p: &mut Parser<'_, '_>, name: &str) -> Marker {
let new_pos = p.start(name);
match &mut p.events[self.pos] {
Event::Start { forward_parent, .. } => {
*forward_parent = Some(new_pos.pos - self.pos);
}
_ => unreachable!(),
}
new_pos
}
}
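
precede is what lets the parser retrofit a parent around an already-finished node: it records the distance to the new Start event in forward_parent, which Output::from_parser_output later unwinds when building the tree. A minimal sketch of the pattern; the function name and the EXPR kind are illustrative, not committed code:

fn wrap_in_parent(p: &mut Parser<'_, '_>, done: CompletedMarker) -> CompletedMarker {
    // Start a new node *before* `done` in the event stream, then close it
    // around the completed node; this is the same pattern json-pawarser's
    // `member` uses to wrap its error node in a MEMBER.
    done.precede(p, "wrapper").complete(p, SyntaxKind::EXPR)
}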

@@ -0,0 +1,15 @@
use crate::lst_parser::syntax_kind::SyntaxKind;
#[derive(Debug, PartialEq, Eq, Clone)]
pub enum SyntaxError {
Expected(Vec<SyntaxKind>),
PipelineNeedsSink,
// if there was two space seperated items in a list
SpaceSepInList,
SemicolonInList,
CommaInMatOrVec,
UnterminatedTopLevelItem,
UnclosedModuleBody,
UnfinishedPath,
PathSepContainsSemicolon,
}

@@ -0,0 +1,70 @@
use crate::lst_parser::syntax_kind::SyntaxKind;
use super::error::SyntaxError;
#[derive(Debug)]
pub enum Event {
Start {
kind: NodeKind,
forward_parent: Option<usize>,
},
Finish,
Eat {
count: usize,
},
}
#[derive(Debug, Clone, PartialEq)]
pub enum NodeKind {
Syntax(SyntaxKind),
Error(SyntaxError),
}
impl NodeKind {
pub fn is_syntax(&self) -> bool {
matches!(self, Self::Syntax(_))
}
pub fn is_error(&self) -> bool {
matches!(self, Self::Error(_))
}
}
impl From<SyntaxKind> for NodeKind {
fn from(value: SyntaxKind) -> Self {
NodeKind::Syntax(value)
}
}
impl From<SyntaxError> for NodeKind {
fn from(value: SyntaxError) -> Self {
NodeKind::Error(value)
}
}
impl PartialEq<SyntaxKind> for NodeKind {
fn eq(&self, other: &SyntaxKind) -> bool {
match self {
NodeKind::Syntax(s) => s == other,
NodeKind::Error(_) => false,
}
}
}
impl PartialEq<SyntaxError> for NodeKind {
fn eq(&self, other: &SyntaxError) -> bool {
match self {
NodeKind::Syntax(_) => false,
NodeKind::Error(e) => e == other,
}
}
}
impl Event {
pub(crate) fn tombstone() -> Self {
Self::Start {
kind: SyntaxKind::TOMBSTONE.into(),
forward_parent: None,
}
}
}

@@ -0,0 +1,38 @@
use std::fmt::Debug;
use crate::lst_parser::syntax_kind::SyntaxKind::*;
use self::module::{mod_body, top_level_item};
use super::{
input::Input,
output::Output,
syntax_kind::{self, lex},
Parser,
};
mod expression;
mod module;
pub fn source_file(p: &mut Parser) {
let root = p.start("root");
mod_body(p);
// expression::expression(p, false);
p.eat_succeeding_ws();
root.complete(p, ROOT);
}
fn check_parser(input: &str, parser_fn: fn(&mut Parser), output: &str) {
let toks = lex(input);
let mut parser = Parser::new(Input::new(&toks));
parser_fn(&mut parser);
let p_out = dbg!(parser.finish());
let o = Output::from_parser_output(toks, p_out);
let s = strip_ansi_escapes::strip_str(format!("{o:?}"));
assert_eq!(&s, output);
}

@@ -0,0 +1,44 @@
use crate::lst_parser::{error::SyntaxError, syntax_kind::SyntaxKind::*, CompletedMarker, Parser};
use self::{collection::collection, instruction::instr, lit::literal, pipeline::PIPES};
mod collection;
mod instruction;
mod lit;
mod pipeline;
pub fn expression(p: &mut Parser, in_pipe: bool) -> Option<CompletedMarker> {
let expr = p.start("expr");
if atom(p).or_else(|| instr(p)).is_none() {
expr.abandon(p);
return None;
}
let r = expr.complete(p, EXPR);
if PIPES.contains(p.current()) && !in_pipe {
pipeline::pipeline(p, r)
} else {
Some(r)
}
}
pub fn atom(p: &mut Parser) -> Option<CompletedMarker> {
literal(p)
.or_else(|| collection(p))
.or_else(|| parenthesized_expr(p))
}
pub fn parenthesized_expr(p: &mut Parser) -> Option<CompletedMarker> {
if p.eat(L_PAREN) {
let par_expr = p.start("parenthesized");
expression(p, false);
if !p.eat(R_PAREN) {
return Some(par_expr.error(p, SyntaxError::Expected(vec![R_PAREN])));
}
return Some(par_expr.complete(p, PARENTHESIZED_EXPR));
}
None
}

@@ -0,0 +1,25 @@
use enumset::enum_set;
use crate::lst_parser::{
syntax_kind::{SyntaxKind::*, TokenSet},
CompletedMarker, Parser,
};
use self::{attr_set::attr_set, vec::vec_matrix_list};
mod attr_set;
mod vec;
const COLLECTION_START: TokenSet = enum_set!(L_BRACK | L_BRACE);
pub fn collection(p: &mut Parser) -> Option<CompletedMarker> {
if !COLLECTION_START.contains(p.current()) {
return None;
}
Some(match p.current() {
L_BRACK => vec_matrix_list(p),
L_BRACE => attr_set(p),
_ => unreachable!(),
})
}

@@ -0,0 +1,45 @@
use crate::lst_parser::{
error::SyntaxError,
grammar::expression::{atom, expression},
CompletedMarker, Marker, Parser,
SyntaxKind::*,
};
pub fn attr_set(p: &mut Parser) -> CompletedMarker {
let start = p.start("attr_set_start");
assert!(p.eat(L_BRACE));
loop {
if attr(p).is_some() {
// TODO: handle others
if p.eat(COMMA) {
continue;
} else if p.eat(R_BRACE) {
return start.complete(p, ATTR_SET);
}
// TODO: check for newline and stuff following that for recov of others
} else if p.eat(R_BRACE) {
return start.complete(p, ATTR_SET);
}
}
}
fn attr(p: &mut Parser) -> Option<CompletedMarker> {
if p.at(IDENT) {
let attr_start = p.start("attr");
let attr_name_start = p.start("attr_name");
p.do_bump();
attr_name_start.complete(p, ATTR_NAME);
// TODO: handle comma, expr/atom, other
p.eat(COLON);
// TODO: handle failed expr parser too
let attr_value = p.start("attr_value");
let _ = expression(p, false);
attr_value.complete(p, ATTR_VALUE);
Some(attr_start.complete(p, ATTR))
} else {
None
}
}

@@ -0,0 +1,97 @@
use crate::lst_parser::{
error::SyntaxError, grammar::expression::atom, CompletedMarker, Marker, Parser, SyntaxKind::*,
};
pub fn vec_matrix_list(p: &mut Parser) -> CompletedMarker {
let start = p.start("vec_matrix_list_start");
assert!(p.eat(L_BRACK));
let row_start = p.start("matrix_row_start");
if let Some(item) = atom(p) {
item.precede(p, "coll_item_start")
.complete(p, COLLECTION_ITEM);
if p.at(COMMA) {
row_start.abandon(p);
return finish_list(p, start);
}
finish_mat_or_vec(p, start, row_start)
} else if p.eat(R_BRACK) {
row_start.abandon(p);
start.complete(p, LIST)
} else {
row_start.abandon(p);
start.error(p, SyntaxError::Expected(vec![EXPR, R_BRACK]))
}
}
fn finish_list(p: &mut Parser, list_start: Marker) -> CompletedMarker {
loop {
if p.eat(COMMA) {
if let Some(item) = atom(p) {
item.precede(p, "coll_item_start")
.complete(p, COLLECTION_ITEM);
} else if p.eat(R_BRACK) {
return list_start.complete(p, LIST);
}
} else if p.eat(R_BRACK) {
return list_start.complete(p, LIST);
} else if let Some(item) = atom(p) {
item.precede(p, "next_item")
.complete(p, COLLECTION_ITEM)
.precede(p, "err_space_sep")
.error(p, SyntaxError::SpaceSepInList);
} else if p.at(SEMICOLON) {
let semi_err = p.start("semicolon_err");
p.eat(SEMICOLON);
semi_err.error(p, SyntaxError::SemicolonInList);
if let Some(item) = atom(p) {
item.precede(p, "coll_item_start")
.complete(p, COLLECTION_ITEM);
} else if p.eat(R_BRACK) {
return list_start.complete(p, LIST);
}
}
}
}
// TODO: handle commas, general other wrong toks
fn finish_mat_or_vec(p: &mut Parser, coll_start: Marker, mut row_start: Marker) -> CompletedMarker {
let mut is_matrix = false;
let mut row_item_count = 1;
loop {
if let Some(item) = atom(p) {
item.precede(p, "coll_item_start")
.complete(p, COLLECTION_ITEM);
row_item_count += 1;
} else if p.at(SEMICOLON) {
is_matrix = true;
row_start.complete(p, MAT_ROW);
p.eat(SEMICOLON);
row_start = p.start("matrix_row_start");
row_item_count = 0;
} else if p.at(R_BRACK) {
if is_matrix && row_item_count == 0 {
row_start.abandon(p);
p.eat(R_BRACK);
return coll_start.complete(p, MATRIX);
} else if is_matrix {
row_start.complete(p, MAT_ROW);
p.eat(R_BRACK);
return coll_start.complete(p, MATRIX);
} else {
row_start.abandon(p);
p.eat(R_BRACK);
return coll_start.complete(p, VEC);
}
} else if p.at(COMMA) {
let err_unexpected_comma = p.start("err_unexpected_comma");
p.do_bump();
err_unexpected_comma.error(p, SyntaxError::CommaInMatOrVec);
} else {
let err_unexpected = p.start("err_unexpected_tok");
p.do_bump();
err_unexpected.error(p, SyntaxError::Expected(vec![EXPR, SEMICOLON, R_BRACK]));
}
}
}

@@ -0,0 +1,34 @@
use crate::lst_parser::{syntax_kind::SyntaxKind::*, CompletedMarker, Parser};
use super::{atom, lit::literal};
pub fn instr(p: &mut Parser) -> Option<CompletedMarker> {
if !p.at(IDENT) {
return None;
}
let instr = p.start("instr");
instr_name(p);
instr_params(p);
Some(instr.complete(p, INSTR))
}
fn instr_name(p: &mut Parser) {
let instr_name = p.start("instr_name");
while p.at(IDENT) {
p.do_bump();
}
instr_name.complete(p, INSTR_NAME);
}
fn instr_params(p: &mut Parser) {
if let Some(start) = atom(p) {
while atom(p).is_some() {}
start.precede(p, "params_start").complete(p, INSTR_PARAMS);
}
}

@@ -0,0 +1,59 @@
use enumset::enum_set;
use indoc::indoc;
use crate::lst_parser::{
grammar::check_parser,
syntax_kind::{SyntaxKind::*, TokenSet},
CompletedMarker, Parser,
};
const LIT_TOKENS: TokenSet = enum_set!(INT_NUM | FLOAT_NUM | STRING);
pub fn literal(p: &mut Parser) -> Option<CompletedMarker> {
if !LIT_TOKENS.contains(p.current()) {
return None;
}
let lit = p.start("lit");
p.do_bump();
Some(lit.complete(p, LITERAL))
}
#[test]
fn test_parse_lst_lit() {
check_parser(
"42",
|p| {
literal(p);
},
indoc! {r#"
LITERAL {
INT_NUM "42";
}
"#},
);
check_parser(
"3.14",
|p| {
literal(p);
},
indoc! {r#"
LITERAL {
FLOAT_NUM "3.14";
}
"#},
);
check_parser(
r#""Meow""#,
|p| {
literal(p);
},
indoc! {r#"
LITERAL {
STRING "\"Meow\"";
}
"#},
);
}

@@ -0,0 +1,36 @@
use enumset::enum_set;
use crate::lst_parser::{
error::SyntaxError,
syntax_kind::{SyntaxKind::*, TokenSet},
CompletedMarker, Parser,
};
use super::expression;
pub fn pipeline(p: &mut Parser, start_expr: CompletedMarker) -> Option<CompletedMarker> {
if !pipe(p) {
return Some(start_expr);
}
let pipeline_marker = start_expr.precede(p, "pipeline_start");
loop {
if expression(p, true).is_none() {
return Some(pipeline_marker.error(p, SyntaxError::PipelineNeedsSink));
}
if !pipe(p) {
return Some(pipeline_marker.complete(p, PIPELINE));
}
}
}
pub const PIPES: TokenSet = enum_set!(PIPE | MAPPING_PIPE | NULL_PIPE);
fn pipe(p: &mut Parser) -> bool {
if PIPES.contains(p.current()) {
p.do_bump();
true
} else {
false
}
}

@@ -0,0 +1,191 @@
use enumset::enum_set;
use crate::lst_parser::{
error::SyntaxError,
grammar::expression::expression,
syntax_kind::{SyntaxKind::*, TokenSet},
CompletedMarker, Parser,
};
const TOP_LEVEL_ITEM_START: TokenSet = enum_set!(DEF_KW | MOD_KW | USE_KW);
pub fn mod_body(p: &mut Parser) {
loop {
if top_level_item(p).is_none() {
break;
}
}
}
fn mod_decl(p: &mut Parser) -> Option<CompletedMarker> {
let mod_start = p.start("module");
if !p.eat(MOD_KW) {
mod_start.abandon(p);
return None;
}
let mod_name = p.start("module_name");
if p.eat(IDENT) {
mod_name.complete(p, MODULE_NAME);
} else {
mod_name.error(p, SyntaxError::Expected(vec![IDENT]));
}
let mod_body_marker = p.start("mod_body");
if p.eat(SEMICOLON) {
mod_body_marker.abandon(p);
Some(mod_start.complete(p, MODULE))
} else if p.eat(L_BRACE) {
mod_body(p);
if !p.eat(R_BRACE) {
mod_body_marker
.complete(p, MODULE_BODY)
.precede(p, "unclosed_mod_body_err")
.error(p, SyntaxError::UnclosedModuleBody);
} else {
mod_body_marker.complete(p, MODULE_BODY);
}
Some(mod_start.complete(p, MODULE))
} else {
Some(mod_start.error(p, SyntaxError::Expected(vec![MODULE_BODY])))
}
}
pub fn top_level_item(p: &mut Parser) -> Option<CompletedMarker> {
if !TOP_LEVEL_ITEM_START.contains(p.current()) {
return None;
}
def(p).or_else(|| mod_decl(p)).or_else(|| r#use(p))
}
fn def(p: &mut Parser) -> Option<CompletedMarker> {
let def_start = p.start("top_level_def");
if !p.eat(DEF_KW) {
def_start.abandon(p);
return None;
}
let def_name = p.start("def_name");
if p.eat(IDENT) {
def_name.complete(p, DEF_NAME);
} else {
def_name.error(p, SyntaxError::Expected(vec![IDENT]));
}
let maybe_expected_eq = p.start("maybe_expect_eq");
if !p.eat(EQ) {
maybe_expected_eq.error(p, SyntaxError::Expected(vec![EQ]));
} else {
maybe_expected_eq.abandon(p);
}
let body = p.start("def_body");
if expression(p, false).is_some() {
body.complete(p, DEF_BODY);
} else {
body.error(p, SyntaxError::Expected(vec![DEF_BODY]));
}
Some(if p.eat(SEMICOLON) {
def_start.complete(p, DEF)
} else if TOP_LEVEL_ITEM_START.contains(p.current()) || p.at(EOF) {
def_start
.complete(p, DEF)
.precede(p, "unterminated_tl_item")
.error(p, SyntaxError::UnterminatedTopLevelItem)
} else {
def_start
.complete(p, DEF)
.precede(p, "err_unexpected")
.error(p, SyntaxError::Expected(vec![SEMICOLON]))
})
}
fn r#use(p: &mut Parser) -> Option<CompletedMarker> {
let use_start = p.start("use_start");
if !p.eat(USE_KW) {
use_start.abandon(p);
return None;
}
if use_pat(p).is_none() {
p.start("expected_use_pat")
.error(p, SyntaxError::Expected(vec![USE_PAT]));
}
let use_item = use_start.complete(p, USE);
Some(if p.eat(SEMICOLON) {
use_item
} else if TOP_LEVEL_ITEM_START.contains(p.current()) || p.at(EOF) {
use_item
.precede(p, "unterminated_tl_item")
.error(p, SyntaxError::UnterminatedTopLevelItem)
} else {
use_item
.precede(p, "err_unexpected")
.error(p, SyntaxError::Expected(vec![SEMICOLON]))
})
}
fn use_pat(p: &mut Parser) -> Option<CompletedMarker> {
let use_pat_marker = p.start("use_pat");
if !p.eat(IDENT) {
return None;
}
loop {
if p.eat(PATH_SEP) {
if pat_item(p).is_none() {
break Some(use_pat_marker.error(p, SyntaxError::UnfinishedPath));
}
} else if p.at(SEMICOLON) && p.nth_at(1, COLON) {
let broken_sep = p.start("broken_path_sep");
let wrong_semi = p.start("semi_typo");
p.eat(SEMICOLON);
wrong_semi.error(p, SyntaxError::PathSepContainsSemicolon);
p.eat(COLON);
broken_sep.complete(p, PATH_SEP);
if pat_item(p).is_none() {
break Some(use_pat_marker.error(p, SyntaxError::UnfinishedPath));
}
} else if p.at(COLON) && p.nth_at(1, SEMICOLON) {
let broken_sep = p.start("broken_path_sep");
p.eat(COLON);
let wrong_semi = p.start("semi_typo");
p.eat(SEMICOLON);
wrong_semi.error(p, SyntaxError::PathSepContainsSemicolon);
broken_sep.complete(p, PATH_SEP);
if pat_item(p).is_none() {
break Some(use_pat_marker.error(p, SyntaxError::UnfinishedPath));
}
} else if p.at(SEMICOLON) && p.nth_at(1, SEMICOLON) {
let broken_sep = p.start("broken_path_sep");
p.eat(SEMICOLON);
p.eat(SEMICOLON);
broken_sep
.complete(p, PATH_SEP)
.precede(p, "semi_typo_err")
.error(p, SyntaxError::PathSepContainsSemicolon);
if pat_item(p).is_none() {
break Some(use_pat_marker.error(p, SyntaxError::UnfinishedPath));
}
} else if p.eat(SEMICOLON) {
break Some(use_pat_marker.complete(p, USE_PAT));
} else {
break Some(use_pat_marker.error(p, SyntaxError::Expected(vec![PATH_SEP, SEMICOLON])));
}
}
}
fn pat_item(p: &mut Parser) -> Option<CompletedMarker> {
let item_start = p.start("pat_item_start");
if p.eat(IDENT) {
Some(item_start.complete(p, PAT_ITEM))
} else if p.eat(STAR) {
Some(item_start.complete(p, PAT_GLOB))
} else if p.eat(L_BRACE) {
todo!("write PAT_GROUPs")
} else {
None
}
}

@@ -0,0 +1,70 @@
use enumset::enum_set;
use crate::lst_parser::syntax_kind::SyntaxKind;
use super::syntax_kind::TokenSet;
pub struct Input<'src, 'toks> {
raw: &'toks Vec<(SyntaxKind, &'src str)>,
/// indices of the "meaningful" tokens (not whitespace etc)
/// includes newlines because those might indeed help with finding errors
meaningful: Vec<usize>,
/// indices of newlines for the purpose of easily querying them
/// can be helpful with missing commas etc
newlines: Vec<usize>,
}
pub const MEANINGLESS_TOKS: TokenSet = enum_set!(SyntaxKind::WHITESPACE | SyntaxKind::NEWLINE);
impl<'src, 'toks> Input<'src, 'toks> {
pub fn new(raw_toks: &'toks Vec<(SyntaxKind, &'src str)>) -> Self {
let meaningful = raw_toks
.iter()
.enumerate()
.filter_map(|(i, tok)| {
if MEANINGLESS_TOKS.contains(tok.0) {
None
} else {
Some(i)
}
})
.collect();
let newlines = raw_toks
.iter()
.enumerate()
.filter_map(|(i, tok)| match tok.0 {
SyntaxKind::NEWLINE => Some(i),
_ => None,
})
.collect();
Self {
raw: raw_toks,
meaningful,
newlines,
}
}
#[allow(clippy::unwrap_used, reason = "meaningful indices cannot be invalid")]
pub(crate) fn kind(&self, idx: usize) -> SyntaxKind {
let Some(meaningful_idx) = self.meaningful.get(idx) else {
return SyntaxKind::EOF;
};
self.raw.get(*meaningful_idx).unwrap().0
}
pub(crate) fn preceding_meaningless(&self, idx: usize) -> usize {
assert!(self.meaningful.len() > idx);
if idx == 0 {
1
} else {
self.meaningful[idx] - self.meaningful[idx - 1]
}
}
pub(crate) fn meaningless_tail_len(&self) -> usize {
self.raw.len() - (self.meaningful.last().unwrap() + 1)
}
}
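
Input hides trivia from the parser by indexing only meaningful tokens. A sketch of the behavior, assuming `def` and an identifier lex to DEF_KW and IDENT as the grammar above expects (illustrative, not a committed test):

#[test]
fn input_skips_whitespace() {
    // lex() comes from lst_parser::syntax_kind; WHITESPACE and NEWLINE
    // are excluded from the `meaningful` index.
    let toks = lex("def  x");
    let input = Input::new(&toks);
    assert_eq!(input.kind(0), SyntaxKind::DEF_KW); // whitespace is skipped
    assert_eq!(input.kind(1), SyntaxKind::IDENT);
    assert_eq!(input.kind(2), SyntaxKind::EOF); // out of range -> EOF
}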

@@ -0,0 +1,208 @@
use clap::builder;
use owo_colors::{unset_override, OwoColorize};
use rowan::{GreenNode, GreenNodeBuilder, GreenNodeData, GreenTokenData, Language, NodeOrToken};
use std::mem;
use crate::{
lst_parser::{input::MEANINGLESS_TOKS, syntax_kind::SyntaxKind},
Lang, SyntaxNode,
};
use super::{
error::SyntaxError,
events::{Event, NodeKind},
};
pub struct Output {
pub green_node: GreenNode,
pub errors: Vec<SyntaxError>,
}
impl std::fmt::Debug for Output {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let mut errs: Vec<&SyntaxError> = self.errors.iter().collect();
errs.reverse();
debug_print_green_node(NodeOrToken::Node(&self.green_node), f, 0, &mut errs, false)
}
}
const INDENT_STR: &str = " ";
/// colored argument currently broken
fn debug_print_green_node(
node: NodeOrToken<&GreenNodeData, &GreenTokenData>,
f: &mut dyn std::fmt::Write,
lvl: i32,
errs: &mut Vec<&SyntaxError>,
colored: bool,
) -> std::fmt::Result {
for _ in 0..lvl {
f.write_str(INDENT_STR)?;
}
let r = match node {
NodeOrToken::Node(n) => {
let kind = Lang::kind_from_raw(node.kind());
if kind != SyntaxKind::PARSE_ERR {
writeln!(
f,
"{:?} {}",
Lang::kind_from_raw(node.kind()).bright_yellow().bold(),
"{".yellow()
)?;
} else {
let err = errs
.pop()
.expect("all error syntax nodes should correspond to an error")
.bright_red();
writeln!(
f,
"{:?}{} {err:?} {}",
kind.bright_red().bold(),
":".red(),
"{".bright_red().bold()
)?;
}
for c in n.children() {
debug_print_green_node(c, f, lvl + 1, errs, colored)?;
}
for _ in 0..lvl {
f.write_str(INDENT_STR)?;
}
if kind != SyntaxKind::PARSE_ERR {
write!(f, "{}", "}\n".yellow())
} else {
write!(f, "{}", "}\n".bright_red().bold())
}
}
NodeOrToken::Token(t) => {
let tok = Lang::kind_from_raw(t.kind());
if MEANINGLESS_TOKS.contains(tok) {
writeln!(
f,
"{:?} {:?}{}",
tok.white(),
t.text().white(),
";".white()
)
} else {
writeln!(
f,
"{:?} {:?}{}",
tok.bright_cyan().bold(),
t.text().green(),
";".yellow()
)
}
}
};
r
}
impl Output {
pub fn debug_colored(&self) -> String {
let mut out = String::new();
let mut errs: Vec<&SyntaxError> = self.errors.iter().collect();
errs.reverse();
let _ = debug_print_green_node(
NodeOrToken::Node(&self.green_node),
&mut out,
0,
&mut errs,
true,
);
out
}
pub fn from_parser_output(
mut raw_toks: Vec<(SyntaxKind, &str)>,
mut events: Vec<Event>,
) -> Self {
let mut builder = GreenNodeBuilder::new();
let mut fw_parents = Vec::new();
let mut errors = Vec::new();
raw_toks.reverse();
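// `forward_parent` links are created when a completed node gets wrapped later
// (see `Marker`/`CompletedMarker`): the `Start` event of the completed node
// stores the offset to the `Start` of the node that should become its parent.
// the chain is walked below so parents enter the tree before their children.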
for i in 0..events.len() {
match mem::replace(&mut events[i], Event::tombstone()) {
Event::Start {
kind,
forward_parent,
} => {
if kind == SyntaxKind::TOMBSTONE && forward_parent.is_none() {
continue;
}
fw_parents.push(kind);
let mut idx = i;
let mut fp = forward_parent;
while let Some(fwd) = fp {
idx += fwd as usize;
fp = match mem::replace(&mut events[idx], Event::tombstone()) {
Event::Start {
kind,
forward_parent,
} => {
fw_parents.push(kind);
forward_parent
}
_ => unreachable!(),
}
}
// remove whitespace bc it's ugly
while let Some((SyntaxKind::WHITESPACE | SyntaxKind::NEWLINE, _)) =
raw_toks.last()
{
match events.iter_mut().find(|ev| matches!(ev, Event::Eat { .. })) {
Some(Event::Eat { count }) => *count -= 1,
_ => unreachable!(),
}
let (tok, text): (SyntaxKind, &str) = raw_toks.pop().unwrap();
builder.token(tok.into(), text);
}
for kind in fw_parents.drain(..).rev() {
match kind {
NodeKind::Syntax(kind) if kind != SyntaxKind::TOMBSTONE => {
builder.start_node(kind.into())
}
NodeKind::Error(err) => {
errors.push(err);
builder.start_node(SyntaxKind::PARSE_ERR.into())
}
_ => {}
}
}
}
Event::Finish => builder.finish_node(),
Event::Eat { count } => (0..count).for_each(|_| {
let (tok, text): (SyntaxKind, &str) = raw_toks.pop().unwrap();
builder.token(tok.into(), text);
}),
}
}
Self {
green_node: builder.finish(),
errors,
}
}
pub fn syntax(&self) -> SyntaxNode {
SyntaxNode::new_root(self.green_node.clone())
}
pub fn errors(&self) -> Vec<SyntaxError> {
self.errors.clone()
}
pub fn dissolve(self) -> (GreenNode, Vec<SyntaxError>) {
let Self { green_node, errors } = self;
(green_node, errors)
}
}


@ -0,0 +1,140 @@
use enumset::EnumSet;
use logos::Logos;
pub fn lex(src: &str) -> Vec<(SyntaxKind, &str)> {
let mut lex = SyntaxKind::lexer(src);
let mut r = Vec::new();
while let Some(tok_res) = lex.next() {
r.push((tok_res.unwrap_or(SyntaxKind::LEX_ERR), lex.slice()))
}
r
}
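// sketch of the expected output: `lex("def x")` yields
// [(DEF_KW, "def"), (WHITESPACE, " "), (IDENT, "x")],
// and unrecognised input becomes a LEX_ERR token instead of aborting the lex.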
#[derive(enumset::EnumSetType, Logos, Debug, PartialEq, Eq, Clone, Copy, Hash, PartialOrd, Ord)]
#[repr(u16)]
#[enumset(no_super_impls)]
#[allow(non_camel_case_types)]
pub enum SyntaxKind {
#[token("def")]
DEF_KW = 0,
DEF,
DEF_NAME,
DEF_BODY,
#[token("let")]
LET_KW,
#[token("in")]
IN_KW,
LET_IN,
#[token("::")]
PATH_SEP,
#[token("mod")]
MOD_KW,
MODULE,
MODULE_NAME,
MODULE_BODY,
USE,
#[token("use")]
USE_KW,
USE_PAT,
PAT_ITEM,
PAT_GLOB,
PAT_GROUP,
#[regex("[\\d]+")]
INT_NUM,
#[regex("[+-]?([\\d]+\\.[\\d]*|[\\d]*\\.[\\d]+)")]
FLOAT_NUM,
#[regex(r#""([^"\\]|\\["\\bnfrt]|u[a-fA-F0-9]{4})*""#)]
STRING,
MATRIX,
MAT_ROW,
VEC,
LIST,
// an item of either a vec, a matrix or a list
COLLECTION_ITEM,
PARENTHESIZED_EXPR,
EXPR,
LITERAL,
#[token("(")]
L_PAREN,
#[token(")")]
R_PAREN,
#[token("{")]
L_BRACE,
#[token("}")]
R_BRACE,
#[token("[")]
L_BRACK,
#[token("]")]
R_BRACK,
#[token("<")]
L_ANGLE,
#[token(">")]
R_ANGLE,
#[token("+")]
PLUS,
#[token("-")]
MINUS,
#[token("*")]
STAR,
#[token("/")]
SLASH,
#[token("%")]
PERCENT,
#[token("^")]
CARET,
INSTR,
INSTR_NAME,
INSTR_PARAMS,
ATTR_SET,
ATTR,
ATTR_NAME,
ATTR_VALUE,
#[regex("[a-zA-Z_]+[a-zA-Z_\\-\\d]*")]
IDENT,
#[regex("\\$[a-zA-Z0-9_\\-]+")]
VAR,
#[regex("\\@[a-zA-Z0-9_\\-]+")]
INPUT_VAR,
#[token("$")]
DOLLAR,
#[token("@")]
AT,
#[token(",")]
COMMA,
#[token("|")]
PIPE,
#[token("@|")]
MAPPING_PIPE,
#[token("!|")]
NULL_PIPE,
PIPELINE,
#[token("=")]
EQ,
#[token(":")]
COLON,
#[token(";")]
SEMICOLON,
#[token(".")]
DOT,
#[token("!")]
BANG,
#[regex("[ \\t\\f]+")]
WHITESPACE,
#[token("\n")]
NEWLINE,
PARSE_ERR,
LEX_ERR,
ROOT,
EOF,
TOMBSTONE,
}
pub type TokenSet = EnumSet<SyntaxKind>;
impl From<SyntaxKind> for rowan::SyntaxKind {
fn from(kind: SyntaxKind) -> Self {
Self(kind as u16)
}
}


@ -0,0 +1 @@

crates/lang/src/main.rs Normal file

@ -0,0 +1,29 @@
use clap::Parser;
use std::{fs, path::PathBuf};
use lang::lst_parser::{self, grammar, input, output::Output, syntax_kind};
#[derive(Parser)]
struct Args {
file: PathBuf,
}
#[allow(clippy::unwrap_used)]
fn main() {
let args = Args::parse();
let n = args.file.clone();
let f = fs::read_to_string(&n).expect("failed to read file");
let toks = dbg!(syntax_kind::lex(&f));
let input = input::Input::new(&toks);
let mut parser = lst_parser::Parser::new(input);
grammar::source_file(&mut parser);
let p_out = dbg!(parser.finish());
let o = Output::from_parser_output(toks, p_out);
println!("{}", o.debug_colored());
// World::new(n);
}


@ -1,45 +0,0 @@
use logos::Logos;
#[derive(Logos, Debug, PartialEq, Eq)]
#[logos(skip r"[ \t\n\f]+")]
pub enum Token<'a> {
#[regex("[a-zA-Z0-9_\\-]+", |lex| lex.slice())]
Word(&'a str),
#[regex("\\$[a-zA-Z0-9_\\-]+", |lex| &lex.slice()[1..])]
VarIdent(&'a str),
#[token("@..")]
InputSpread,
#[regex("\\@[a-zA-Z0-9_\\-]+", |lex| &lex.slice()[1..])]
InputIdent(&'a str),
#[token(",")]
Comma,
#[token("|")]
Pipe,
#[token("@|")]
MappingPipe,
#[token("!|")]
NullPipe,
#[token("@")]
At,
#[token(">")]
GreaterThan,
#[token("=")]
Equals,
#[token(":")]
Colon,
#[token("[")]
BracketOpen,
#[token("]")]
BracketClose,
#[token("(")]
ParenOpen,
#[token(")")]
ParenClose,
#[token("{")]
BraceOpen,
#[token("}")]
BraceClose,
}
#[cfg(test)]
mod tests;


@ -1,107 +0,0 @@
use logos::Logos;
use super::Token;
/// generates tests for the lexer to avoid writing boilerplate
macro_rules! lexer_test {
($name:ident, $input:literal, $out:expr) => {
#[test]
fn $name() {
let lex = Token::lexer($input);
let toks = lex.map(|tok| tok.unwrap()).collect::<Vec<_>>();
assert_eq!(toks, $out);
}
};
}
lexer_test! {
test_lex_simple_pipeline,
"streamer | processor | sink",
[
Token::Word("streamer"),
Token::Pipe,
Token::Word("processor"),
Token::Pipe,
Token::Word("sink")
]
}
lexer_test! {
test_lex_var_ident,
"$identifier",
[ Token::VarIdent("identifier") ]
}
lexer_test! {
test_lex_subgroup,
"subgroup(first, second) = a | b { 1: $first } | c { 1: $second }",
[
Token::Word("subgroup"),
Token::ParenOpen,
Token::Word("first"),
Token::Comma,
Token::Word("second"),
Token::ParenClose,
Token::Equals,
Token::Word("a"),
Token::Pipe,
Token::Word("b"),
Token::BraceOpen,
Token::Word("1"),
Token::Colon,
Token::VarIdent("first"),
Token::BraceClose,
Token::Pipe,
Token::Word("c"),
Token::BraceOpen,
Token::Word("1"),
Token::Colon,
Token::VarIdent("second"),
Token::BraceClose
]
}
lexer_test! {
text_lex_crossing_pipeline_reordering,
"a >first, second|second, first> c",
[
Token::Word("a"),
Token::GreaterThan,
Token::Word("first"),
Token::Comma,
Token::Word("second"),
Token::Pipe,
Token::Word("second"),
Token::Comma,
Token::Word("first"),
Token::GreaterThan,
Token::Word("c")
]
}
lexer_test! {
test_lex_crossing_input_args,
"a >second| c { second: @first }",
[
Token::Word("a"),
Token::GreaterThan,
Token::Word("second"),
Token::Pipe,
Token::Word("c"),
Token::BraceOpen,
Token::Word("second"),
Token::Colon,
Token::InputIdent("first"),
Token::BraceClose
]
}
lexer_test! {
test_lex_map_io_named,
"a @| c",
[
Token::Word("a"),
Token::MappingPipe,
Token::Word("c")
]
}

crates/lang/src/world.rs Normal file

@ -0,0 +1,27 @@
use std::path::Path;
use self::files::{Files, OpenFileError};
mod error;
mod files;
struct World;
impl World {
pub fn new(entry_point: &Path) -> Result<Self, WorldCreationError> {
let mut files = Files::default();
let (entry_point_id, errors) = files.add_file(entry_point)?;
todo!()
}
}
enum WorldCreationError {
FailedToOpenEntryPoint(OpenFileError),
}
impl From<OpenFileError> for WorldCreationError {
fn from(value: OpenFileError) -> Self {
Self::FailedToOpenEntryPoint(value)
}
}


@ -0,0 +1,10 @@
use crate::{ast::ParseError, lst_parser::error::SyntaxError};
use super::files::{Loc, OpenFileError};
pub enum Error {
Syntax(Loc<ParseError>, SyntaxError),
OpenFileError(OpenFileError),
}


@ -0,0 +1,57 @@
use std::{
collections::HashMap,
io,
path::{Path, PathBuf},
};
mod loc;
pub use loc::Loc;
use rowan::ast::AstNode;
use crate::{
ast::ParseError,
lst_parser::{self, error::SyntaxError, input, output::Output},
world::{error::Error, files::source_file::SourceFile},
};
#[derive(Default)]
pub struct Files {
inner: Vec<source_file::SourceFile>,
path_to_id_map: HashMap<PathBuf, FileId>,
}
impl Files {
pub fn add_file(&mut self, path: &Path) -> Result<(FileId, Vec<Error>), OpenFileError> {
if !path.exists() {
return Err(OpenFileError::NotFound(path.to_owned()));
}
let file_id = FileId(self.inner.len());
let (source_file, errs) = match SourceFile::open(path) {
Ok((source_file, errs)) => {
let errs = errs
.into_iter()
.map(|(ptr, err)| Error::Syntax(Loc::from_ptr(ptr, file_id), err))
.collect::<Vec<_>>();
(source_file, errs)
}
Err(e) => return Err(OpenFileError::IoError(path.to_path_buf(), e)),
};
self.inner.push(source_file);
self.path_to_id_map.insert(path.to_path_buf(), file_id);
Ok((file_id, errs))
}
}
pub enum OpenFileError {
NotFound(PathBuf),
IoError(PathBuf, std::io::Error),
}
#[derive(Copy, Clone, Debug)]
pub struct FileId(usize);
mod source_file;


@ -0,0 +1,29 @@
use rowan::ast::{AstNode, AstPtr};
use crate::Lang;
use super::FileId;
#[derive(Clone)]
pub struct Loc<N: AstNode<Language = Lang>> {
file: FileId,
syntax: AstPtr<N>,
}
impl<N: AstNode<Language = Lang>> Loc<N> {
pub fn new(node: N, file: FileId) -> Self {
Self::from_ptr(AstPtr::new(&node), file)
}
pub fn from_ptr(ptr: AstPtr<N>, file: FileId) -> Self {
Self { file, syntax: ptr }
}
pub fn file(&self) -> FileId {
self.file
}
pub fn syntax(&self) -> AstPtr<N> {
self.syntax.clone()
}
}


@ -0,0 +1,113 @@
use std::{
fs, io,
path::{Path, PathBuf},
};
use rowan::{
ast::{AstNode, AstPtr},
GreenNode,
};
use crate::{
ast::ParseError,
lst_parser::{self, error::SyntaxError, grammar, input, output::Output, syntax_kind},
SyntaxNode,
};
pub(crate) struct SourceFile {
pub(crate) path: PathBuf,
pub(crate) lst: rowan::GreenNode,
}
impl SourceFile {
pub(crate) fn open(p: &Path) -> io::Result<(Self, Vec<(AstPtr<ParseError>, SyntaxError)>)> {
assert!(p.exists());
let f = fs::read_to_string(p)?;
let (lst, errs) = Self::parse(f);
Ok((
Self {
path: p.to_path_buf(),
lst,
},
errs,
))
}
pub(crate) fn parse(f: String) -> (GreenNode, Vec<(AstPtr<ParseError>, SyntaxError)>) {
let toks = syntax_kind::lex(&f);
let input = input::Input::new(&toks);
let mut parser = lst_parser::Parser::new(input);
grammar::source_file(&mut parser);
let p_out = parser.finish();
let (lst, errs) = Output::from_parser_output(toks, p_out).dissolve();
(lst.clone(), Self::find_errs(lst, errs))
}
pub(crate) fn find_errs(
lst: GreenNode,
mut errs: Vec<SyntaxError>,
) -> Vec<(AstPtr<ParseError>, SyntaxError)> {
let mut out = Vec::new();
errs.reverse();
let lst = SyntaxNode::new_root(lst);
Self::find_errs_recursive(&mut out, lst, &mut errs);
out
}
pub(crate) fn find_errs_recursive(
out: &mut Vec<(AstPtr<ParseError>, SyntaxError)>,
lst: SyntaxNode,
errs: &mut Vec<SyntaxError>,
) {
lst.children()
.filter_map(ParseError::cast)
.for_each(|e| out.push((AstPtr::new(&e), errs.pop().unwrap())));
lst.children()
.for_each(|c| Self::find_errs_recursive(out, c, errs));
}
}
#[cfg(test)]
mod tests {
use crate::world::files::source_file::SourceFile;
fn check_find_errs(input: &str, expected: &[&str]) {
let (_, errs) = SourceFile::parse(input.to_string());
let errs = errs
.into_iter()
.map(|(loc, err)| format!("{:?}@{:?}", err, loc.syntax_node_ptr().text_range()))
.collect::<Vec<String>>();
assert_eq!(
errs,
expected
.into_iter()
.map(|s| s.to_string())
.collect::<Vec<_>>()
)
}
#[test]
fn test_find_errs() {
check_find_errs(
"def meow = ;\n mod ;",
&["Expected([DEF_BODY])@11..11", "Expected([IDENT])@18..18"],
);
check_find_errs(
"def awawa = a |",
&["UnterminatedTopLevelItem@0..15", "PipelineNeedsSink@12..15"],
)
}
}


@ -0,0 +1,12 @@
[package]
name = "pawarser"
version = "0.1.0"
edition = "2021"
[dependencies]
rowan = "0.15.15"
drop_bomb = "0.1.5"
enumset = "1.1.3"
[lints]
workspace = true


@ -0,0 +1,8 @@
#![feature(iter_collect_into)]
pub mod parser;
pub use parser::{
error::SyntaxError,
marker::{CompletedMarker, Marker},
Parser, SyntaxElement,
};


@ -0,0 +1,253 @@
use std::{cell::Cell, fmt, marker::PhantomData, mem};
use enumset::{EnumSet, EnumSetType};
use rowan::{GreenNode, GreenNodeBuilder};
use crate::parser::event::NodeKind;
use self::{event::Event, input::Input, marker::Marker};
pub use {error::SyntaxError, output::ParserOutput};
pub mod error;
mod event;
mod input;
pub mod marker;
pub mod output;
/// Defines the special `SyntaxKind`s the parser needs: an EOF token, an error placeholder and a root node
pub trait SyntaxElement
where
Self: EnumSetType
+ Into<rowan::SyntaxKind>
+ From<rowan::SyntaxKind>
+ fmt::Debug
+ Clone
+ PartialEq
+ Eq,
{
/// EOF value. This will be used by the rest of the parser library to represent an EOF.
const SYNTAX_EOF: Self;
/// Error value. This will be used as a placeholder for associated respective errors.
const SYNTAX_ERROR: Self;
const SYNTAX_ROOT: Self;
}
pub struct Parser<'src, SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError> {
input: Input<'src, SyntaxKind>,
pos: usize,
events: Vec<Event<SyntaxKind, SyntaxErr>>,
step_limit: u32,
steps: Cell<u32>,
}
impl<'src, SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError>
Parser<'src, SyntaxKind, SyntaxErr>
{
/// eat all meaningless tokens at the end of the file.
pub fn eat_succeeding_meaningless(&mut self) {
self.push_ev(Event::Eat {
count: self.input.meaningless_tail_len(),
});
}
/// Get token from current position of the parser.
pub fn current(&self) -> SyntaxKind {
self.step();
self.input.kind(self.pos)
}
pub fn start(&mut self, name: &str) -> Marker {
let pos = self.events.len();
self.push_ev(Event::tombstone());
Marker::new(pos, name)
}
/// Eat next token if it's of kind `kind` and return `true`.
/// Otherwise, `false`.
pub fn eat(&mut self, kind: SyntaxKind) -> bool {
if !self.at(kind) {
return false;
}
self.do_bump();
true
}
pub fn do_bump(&mut self) {
self.push_ev(Event::Eat {
count: self.input.preceding_meaningless(self.pos),
});
self.pos += 1;
}
/// Check if the token at the current parser position is of `kind`
pub fn at(&self, kind: SyntaxKind) -> bool {
self.nth_at(0, kind)
}
/// Check if the token that is `n` ahead is of `kind`
pub fn nth_at(&self, n: usize, kind: SyntaxKind) -> bool {
self.nth(n) == kind
}
pub fn nth(&self, n: usize) -> SyntaxKind {
self.step();
self.input.kind(self.pos + n)
}
fn push_ev(&mut self, event: Event<SyntaxKind, SyntaxErr>) {
self.events.push(event);
}
fn step(&self) {
let steps = self.steps.get();
assert!(steps <= self.step_limit, "the parser seems stuck.");
self.steps.set(steps + 1);
}
pub fn finish(self) -> ParserOutput<SyntaxKind, SyntaxErr> {
let Self {
input,
pos,
mut events,
step_limit,
steps,
} = self;
let (mut raw_toks, meaningless_tokens) = input.dissolve();
let mut builder = GreenNodeBuilder::new();
// a forward parent is set by `CompletedMarker::precede`: the `Start` event of
// an already-completed node stores the offset to the `Start` of a later node
// that is supposed to wrap it. resolving the chains here makes sure wrapping
// nodes are started before the nodes they contain.
let mut fw_parents = Vec::new();
let mut errors: Vec<SyntaxErr> = Vec::new();
raw_toks.reverse();
// always have an implicit root node to avoid [`GreenNodeBuilder::finish()`] panicking due to multiple root elements.
builder.start_node(SyntaxKind::SYNTAX_ROOT.into());
for i in 0..events.len() {
match mem::replace(&mut events[i], Event::tombstone()) {
Event::Start {
kind,
forward_parent,
} => {
if kind == NodeKind::Tombstone && forward_parent.is_none() {
continue;
}
// resolving forward parents
// temporarily jump around in the event buffer and replace visited `Start` events with tombstones
fw_parents.push(kind);
let mut idx = i;
let mut fp = forward_parent;
while let Some(fwd) = fp {
idx += fwd as usize;
fp = match mem::replace(&mut events[idx], Event::tombstone()) {
Event::Start {
kind,
forward_parent,
} => {
fw_parents.push(kind);
forward_parent
}
_ => unreachable!(),
}
}
// clear semantically meaningless tokens before the new tree node for aesthetic reasons
while raw_toks
.last()
.is_some_and(|v| meaningless_tokens.contains(v.0))
{
// update first next Eat event
match events.iter_mut().find(|ev| matches!(ev, Event::Eat { .. })) {
Some(Event::Eat { count }) => *count -= 1,
_ => unreachable!(),
}
// put whitespace into lst
let (tok, text) = raw_toks.pop().unwrap();
builder.token(tok.into(), text);
}
// insert forward parents into the tree in correct order
for kind in fw_parents.drain(..).rev() {
match kind {
NodeKind::Syntax(kind) => builder.start_node(kind.into()),
NodeKind::Error(err) => {
errors.push(err);
builder.start_node(SyntaxKind::SYNTAX_ERROR.into())
}
_ => {}
}
}
}
Event::Finish => builder.finish_node(),
Event::Eat { count } => (0..count).for_each(|_| {
let (tok, text) = raw_toks.pop().unwrap();
builder.token(tok.into(), text);
}),
}
}
// finish SYNTAX_ROOT
builder.finish_node();
ParserOutput {
green_node: builder.finish(),
errors,
_syntax_kind: PhantomData::<SyntaxKind>,
}
}
}
pub struct ParserBuilder<
'src,
SyntaxKind: SyntaxElement,
// SyntaxErr: SyntaxError,
> {
raw_toks: Vec<(SyntaxKind, &'src str)>,
meaningless_token_kinds: EnumSet<SyntaxKind>,
step_limit: u32,
}
impl<'src, SyntaxKind: SyntaxElement> ParserBuilder<'src, SyntaxKind> {
pub fn new(raw_toks: Vec<(SyntaxKind, &'src str)>) -> Self {
Self {
raw_toks,
meaningless_token_kinds: EnumSet::new(),
step_limit: 4096,
}
}
/// Sets the parser step limit.
/// Defaults to 4096
pub fn step_limit(mut self, new: u32) -> Self {
self.step_limit = new;
self
}
pub fn add_meaningless(mut self, kind: SyntaxKind) -> Self {
self.meaningless_token_kinds.insert(kind);
self
}
pub fn add_meaningless_many(mut self, kind: Vec<SyntaxKind>) -> Self {
self.meaningless_token_kinds
.insert_all(kind.into_iter().collect());
self
}
pub fn build<SyntaxErr: SyntaxError>(self) -> Parser<'src, SyntaxKind, SyntaxErr> {
let Self {
raw_toks,
meaningless_token_kinds,
step_limit,
} = self;
Parser {
input: Input::new(raw_toks, Some(meaningless_token_kinds)),
pos: 0,
events: Vec::new(),
step_limit,
steps: Cell::new(0),
}
}
}
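For context, a minimal sketch of what a downstream crate provides to drive this parser (all names below are illustrative, not part of this diff): a token enum implementing `SyntaxElement`, an error enum implementing `SyntaxError`, and a `ParserBuilder` call wiring them together.

#[derive(enumset::EnumSetType, Debug, PartialEq, Eq, Clone, Copy)]
#[enumset(no_super_impls)]
#[repr(u16)]
enum Kind { Word, Space, Item, Error, Eof, Root }
impl From<Kind> for rowan::SyntaxKind {
    fn from(kind: Kind) -> Self {
        Self(kind as u16)
    }
}
impl From<rowan::SyntaxKind> for Kind {
    fn from(raw: rowan::SyntaxKind) -> Self {
        // inverse of the cast above; fine for a sketch, a real impl would match explicitly
        [Kind::Word, Kind::Space, Kind::Item, Kind::Error, Kind::Eof, Kind::Root][raw.0 as usize]
    }
}
impl pawarser::SyntaxElement for Kind {
    const SYNTAX_EOF: Self = Kind::Eof;
    const SYNTAX_ERROR: Self = Kind::Error;
    const SYNTAX_ROOT: Self = Kind::Root;
}
#[derive(Debug, Clone, PartialEq, Eq)]
enum ParseErr {}
impl pawarser::SyntaxError for ParseErr {}

fn parse(toks: Vec<(Kind, &str)>) -> pawarser::parser::ParserOutput<Kind, ParseErr> {
    let mut p = pawarser::parser::ParserBuilder::new(toks)
        .add_meaningless(Kind::Space)
        .build::<ParseErr>();
    let m = p.start("item");
    while p.eat(Kind::Word) {}
    m.complete(&mut p, Kind::Item);
    p.eat_succeeding_meaningless();
    p.finish()
}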


@ -0,0 +1,9 @@
use std::fmt;
/// A marker trait... for now!
// TODO: constrain that conversion to `NodeKind::Error` is enforced to be possible
pub trait SyntaxError
where
Self: fmt::Debug + Clone + PartialEq + Eq,
{
}


@ -0,0 +1,42 @@
use super::{error::SyntaxError, SyntaxElement};
pub enum Event<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError> {
Start {
kind: NodeKind<SyntaxKind, SyntaxErr>,
forward_parent: Option<usize>,
},
Finish,
Eat {
count: usize,
},
}
impl<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError> Event<SyntaxKind, SyntaxErr> {
pub fn tombstone() -> Self {
Self::Start {
kind: NodeKind::Tombstone,
forward_parent: None,
}
}
}
#[derive(Clone, PartialEq, Eq)]
pub enum NodeKind<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError> {
Tombstone,
Syntax(SyntaxKind),
Error(SyntaxErr),
}
impl<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError> NodeKind<SyntaxKind, SyntaxErr> {
pub fn is_tombstone(&self) -> bool {
matches!(self, Self::Tombstone)
}
pub fn is_syntax(&self) -> bool {
matches!(self, Self::Syntax(_))
}
pub fn is_error(&self) -> bool {
matches!(self, Self::Error(_))
}
}


@ -0,0 +1,67 @@
use enumset::EnumSet;
use super::SyntaxElement;
pub struct Input<'src, SyntaxKind: SyntaxElement> {
raw: Vec<(SyntaxKind, &'src str)>,
// enumset of meaningless tokens
semantically_meaningless: EnumSet<SyntaxKind>,
// indices of non-meaningless tokens
meaningful_toks: Vec<usize>,
}
impl<'src, SyntaxKind: SyntaxElement> Input<'src, SyntaxKind> {
pub fn new(
raw_toks: Vec<(SyntaxKind, &'src str)>,
meaningless: Option<EnumSet<SyntaxKind>>,
) -> Self {
let mut meaningful_toks = Vec::new();
if let Some(meaningless) = meaningless {
raw_toks
.iter()
.enumerate()
.filter_map(|(i, tok)| (!meaningless.contains(tok.0)).then_some(i))
.collect_into(&mut meaningful_toks);
}
Self {
raw: raw_toks,
semantically_meaningless: meaningless.unwrap_or_default(),
meaningful_toks,
}
}
pub fn kind(&self, idx: usize) -> SyntaxKind {
let Some(meaningful_idx) = self.meaningful_toks.get(idx) else {
return SyntaxKind::SYNTAX_EOF;
};
self.raw.get(*meaningful_idx).unwrap().0
}
pub fn preceding_meaningless(&self, idx: usize) -> usize {
assert!(self.meaningful_toks.len() > idx);
if idx == 0 {
// maybe should be `self.meaningful_toks[idx]` instead??
1
} else {
self.meaningful_toks[idx] - self.meaningful_toks[idx - 1]
}
}
/// get the count of meaningless tokens at the end of the file.
pub fn meaningless_tail_len(&self) -> usize {
self.raw.len() - (self.meaningful_toks.last().unwrap() + 1)
}
pub fn dissolve(self) -> (Vec<(SyntaxKind, &'src str)>, EnumSet<SyntaxKind>) {
let Self {
raw,
semantically_meaningless,
..
} = self;
(raw, semantically_meaningless)
}
}


@ -0,0 +1,97 @@
use drop_bomb::DropBomb;
use super::{
error::SyntaxError,
event::{Event, NodeKind},
Parser, SyntaxElement,
};
pub struct Marker {
pos: usize,
bomb: DropBomb,
}
impl Marker {
pub(super) fn new(pos: usize, name: &str) -> Self {
Self {
pos,
bomb: DropBomb::new(format!("Marker {name} must be completed or abandoned.")),
}
}
fn close_node<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError>(
mut self,
p: &mut Parser<SyntaxKind, SyntaxErr>,
kind: NodeKind<SyntaxKind, SyntaxErr>,
) -> CompletedMarker<SyntaxKind, SyntaxErr> {
self.bomb.defuse();
match &mut p.events[self.pos] {
Event::Start { kind: slot, .. } => *slot = kind.clone(),
_ => unreachable!(),
}
p.push_ev(Event::Finish);
CompletedMarker {
pos: self.pos,
kind,
}
}
pub fn complete<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError>(
self,
p: &mut Parser<SyntaxKind, SyntaxErr>,
kind: SyntaxKind,
) -> CompletedMarker<SyntaxKind, SyntaxErr> {
self.close_node(p, NodeKind::Syntax(kind))
}
pub fn error<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError>(
self,
p: &mut Parser<SyntaxKind, SyntaxErr>,
kind: SyntaxErr,
) -> CompletedMarker<SyntaxKind, SyntaxErr> {
self.close_node(p, NodeKind::Error(kind))
}
pub fn abandon<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError>(
mut self,
p: &mut Parser<SyntaxKind, SyntaxErr>,
) {
self.bomb.defuse();
// clean up empty tombstone event from marker
if self.pos == p.events.len() - 1 {
match p.events.pop() {
Some(Event::Start {
kind: NodeKind::Tombstone,
forward_parent: None,
}) => (),
_ => unreachable!(),
}
}
}
}
pub struct CompletedMarker<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError> {
pos: usize,
kind: NodeKind<SyntaxKind, SyntaxErr>,
}
impl<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError> CompletedMarker<SyntaxKind, SyntaxErr> {
pub fn precede(self, p: &mut Parser<SyntaxKind, SyntaxErr>, name: &str) -> Marker {
let new_pos = p.start(name);
match &mut p.events[self.pos] {
Event::Start { forward_parent, .. } => {
// point forward parent of the node this marker completed to the new node
// will later be used to make the new node a parent of the current node.
*forward_parent = Some(new_pos.pos - self.pos)
}
_ => unreachable!(),
}
new_pos
}
}
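To make `precede` concrete, a sketch of the pattern it enables (hypothetical grammar function, node kinds passed in for brevity):

use pawarser::{Parser, SyntaxElement, SyntaxError};

fn expr<K: SyntaxElement, E: SyntaxError>(p: &mut Parser<K, E>, num: K, lit: K, plus: K, bin_expr: K) {
    let m = p.start("lhs");
    p.eat(num);
    let lhs = m.complete(p, lit); // LIT is a finished node now
    if p.eat(plus) {
        // reopen it: BIN_EXPR starts *before* the already-completed LIT
        let wrapper = lhs.precede(p, "bin_expr");
        p.eat(num);
        wrapper.complete(p, bin_expr); // yields BIN_EXPR { LIT PLUS NUM }
    }
}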


@ -0,0 +1,73 @@
use std::marker::PhantomData;
use rowan::{GreenNode, GreenNodeData, GreenTokenData, NodeOrToken};
use crate::{SyntaxElement, SyntaxError};
pub struct ParserOutput<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError> {
pub green_node: GreenNode,
pub errors: Vec<SyntaxErr>,
pub(super) _syntax_kind: PhantomData<SyntaxKind>,
}
impl<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError> std::fmt::Debug
for ParserOutput<SyntaxKind, SyntaxErr>
{
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let mut errs: Vec<&SyntaxErr> = self.errors.iter().collect();
errs.reverse();
debug_print_output::<SyntaxKind, SyntaxErr>(
NodeOrToken::Node(&self.green_node),
f,
0,
&mut errs,
)
}
}
fn debug_print_output<SyntaxKind: SyntaxElement, SyntaxErr: SyntaxError>(
node: NodeOrToken<&GreenNodeData, &GreenTokenData>,
f: &mut std::fmt::Formatter<'_>,
lvl: i32,
errs: &mut Vec<&SyntaxErr>,
) -> std::fmt::Result {
if f.alternate() {
for _ in 0..lvl {
f.write_str(" ")?;
}
}
let maybe_newline = if f.alternate() { "\n" } else { " " };
match node {
NodeOrToken::Node(n) => {
let kind: SyntaxKind = node.kind().into();
if kind != SyntaxKind::SYNTAX_ERROR {
write!(f, "{:?} {{{maybe_newline}", kind)?;
} else {
let err = errs
.pop()
.expect("all error syntax nodes should correspond to an error");
write!(f, "{:?}: {err:?} {{{maybe_newline}", kind)?;
}
for c in n.children() {
debug_print_output::<SyntaxKind, SyntaxErr>(c, f, lvl + 1, errs)?;
}
if f.alternate() {
for _ in 0..lvl {
f.write_str(" ")?;
}
}
write!(f, "}}{maybe_newline}")
}
NodeOrToken::Token(t) => {
write!(
f,
"{:?} {:?};{maybe_newline}",
Into::<SyntaxKind>::into(t.kind()),
t.text()
)
}
}
}


@ -0,0 +1,13 @@
[package]
name = "prowocessing"
version = "0.1.0"
edition = "2021"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
image = "0.24.8"
palette = "0.7.4"
[lints]
workspace = true


@ -0,0 +1,2 @@
pub mod enum_based;
pub mod trait_based;


@ -0,0 +1,64 @@
pub enum Instruction {
Uppercase,
Lowercase,
}
pub struct Pipeline {
pipeline: Vec<fn(String) -> String>,
}
impl Pipeline {
pub fn run(&self, val: String) -> String {
let mut current = val;
for instr in &self.pipeline {
current = instr(current);
}
current
}
}
pub struct PipelineBuilder {
pipeline: Vec<Instruction>,
}
impl PipelineBuilder {
pub fn new() -> Self {
Self {
pipeline: Vec::new(),
}
}
#[must_use]
pub fn insert(mut self, instr: Instruction) -> Self {
self.pipeline.push(instr);
self
}
pub fn build(&self) -> Pipeline {
fn uppercase(v: String) -> String {
str::to_uppercase(&v)
}
fn lowercase(v: String) -> String {
str::to_lowercase(&v)
}
let mut res = Vec::new();
for item in &self.pipeline {
res.push(match item {
Instruction::Uppercase => uppercase,
Instruction::Lowercase => lowercase,
});
}
Pipeline { pipeline: res }
}
}
impl Default for PipelineBuilder {
fn default() -> Self {
Self::new()
}
}


@ -0,0 +1,11 @@
//! An experiment for a hyper-modular trait-based architecture.
//!
//! Patterns defining this (or well, which I reference a lot while writing this):
//! - [Command pattern using trait objects](https://rust-unofficial.github.io/patterns/patterns/behavioural/command.html)
//! - [Builder pattern](https://rust-unofficial.github.io/patterns/patterns/creational/builder.html)
pub mod data;
#[macro_use]
pub mod element;
pub mod ops;
pub mod pipeline;


@ -0,0 +1,5 @@
//! Definitions of the data transfer and storage types.
pub mod io;
pub mod raw;


@ -0,0 +1,53 @@
//! Types for element and pipeline IO
use std::{borrow::ToOwned, convert::Into};
use super::raw::Data;
/// Newtype struct with borrowed types for pipeline/element inputs, so that it doesn't force a move or clone
#[derive(PartialEq, Eq, Debug)]
pub struct Inputs<'a>(pub Vec<&'a Data>);
impl<'a> From<Vec<&'a Data>> for Inputs<'a> {
fn from(value: Vec<&'a Data>) -> Self {
Self(value)
}
}
impl<'a, T: Into<&'a Data>> From<T> for Inputs<'a> {
fn from(value: T) -> Self {
Self(vec![value.into()])
}
}
impl<'a> From<&'a Outputs> for Inputs<'a> {
fn from(value: &'a Outputs) -> Self {
Self(value.0.iter().map(Into::into).collect())
}
}
/// Used for pipeline/element outputs
#[derive(PartialEq, Eq, Debug)]
pub struct Outputs(pub Vec<Data>);
impl Outputs {
/// consume self and return inner value(s)
pub fn into_inner(self) -> Vec<Data> {
self.0
}
}
impl From<Vec<Data>> for Outputs {
fn from(value: Vec<Data>) -> Self {
Self(value)
}
}
impl<T: Into<Data>> From<T> for Outputs {
fn from(value: T) -> Self {
Self(vec![value.into()])
}
}
impl From<Inputs<'_>> for Outputs {
fn from(value: Inputs) -> Self {
Self(value.0.into_iter().map(ToOwned::to_owned).collect())
}
}
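A small sketch of how these conversions chain between pipeline stages (types exactly as defined above):

let out: Outputs = 5.into();        // impl<T: Into<Data>> From<T> for Outputs
let ins: Inputs = (&out).into();    // borrow the previous stage's outputs
let owned: Outputs = ins.into();    // clone the borrowed data back into owned values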


@ -0,0 +1,20 @@
//! Dynamic data storage and transfer types for use in [`io`]
// Dynamic data type
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum Data {
String(String),
Int(i32),
}
impl From<String> for Data {
fn from(value: String) -> Self {
Self::String(value)
}
}
impl From<i32> for Data {
fn from(value: i32) -> Self {
Self::Int(value)
}
}


@ -0,0 +1,29 @@
//! The trait and type representations
use std::any::TypeId;
use crate::experimental::trait_based::data::io::Inputs;
use super::data::io::Outputs;
pub(crate) trait PipelineElement {
/// Return a static runner function pointer to avoid dynamic dispatch during pipeline execution. Types MUST match the signature.
fn runner(&self) -> fn(&Inputs) -> Outputs;
/// return the signature of the element
fn signature(&self) -> ElementSignature;
}
/// Type signature for an element used for static checking
pub(crate) struct ElementSignature {
pub inputs: Vec<TypeId>,
pub outputs: Vec<TypeId>,
}
macro_rules! signature {
($($inputs:ty),+ => $($outputs:ty),+) => (
ElementSignature {
inputs: vec![$(std::any::TypeId::of::<$inputs>(), )+],
outputs: vec![$(std::any::TypeId::of::<$outputs>(), )+]
}
)
}
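For reference, a call like `signature!(i32, i32 => i32)` (used throughout the ops modules below) expands to:

ElementSignature {
    inputs: vec![std::any::TypeId::of::<i32>(), std::any::TypeId::of::<i32>()],
    outputs: vec![std::any::TypeId::of::<i32>()],
}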


@ -0,0 +1,7 @@
mod num;
mod str;
pub mod prelude {
pub(crate) use super::num::*;
pub(crate) use super::str::*;
}


@ -0,0 +1,62 @@
//! Operations on numeric data
use core::panic;
use std::any::TypeId;
use crate::experimental::trait_based::{
data::{
io::{Inputs, Outputs},
raw::Data,
},
element::{ElementSignature, PipelineElement},
};
/// Addition
pub struct Add(pub i32);
impl PipelineElement for Add {
fn runner(&self) -> fn(&Inputs) -> Outputs {
|input| {
let [Data::Int(i0), Data::Int(i1), ..] = input.0[..] else {
panic!("Invalid data passed")
};
(i0 + i1).into()
}
}
fn signature(&self) -> ElementSignature {
signature!(i32, i32 => i32)
}
}
/// Subtraction
pub struct Subtract(pub i32);
impl PipelineElement for Subtract {
fn runner(&self) -> fn(&Inputs) -> Outputs {
|input| {
let [Data::Int(i0), Data::Int(i1), ..] = input.0[..] else {
panic!("Invalid data passed")
};
(i0 - i1).into()
}
}
fn signature(&self) -> ElementSignature {
signature!(i32, i32 => i32)
}
}
/// Turn input to string
pub struct Stringify;
impl PipelineElement for Stringify {
fn runner(&self) -> fn(&Inputs) -> Outputs {
|input| {
let [Data::Int(int), ..] = input.0[..] else {
panic!("Invalid data passed")
};
int.to_string().into()
}
}
fn signature(&self) -> ElementSignature {
signature!(i32 => String)
}
}


@ -0,0 +1,59 @@
//! Operations on String/text data
use crate::experimental::trait_based::{
data::{
io::{Inputs, Outputs},
raw::Data,
},
element::{ElementSignature, PipelineElement},
};
/// Concatenate the inputs
pub struct Concatenate(pub String);
impl PipelineElement for Concatenate {
fn runner(&self) -> fn(&Inputs) -> Outputs {
|input| {
let [Data::String(s0), Data::String(s1), ..] = input.0[..] else {
panic!("Invalid data passed")
};
format!("{s0}{s1}").into()
}
}
fn signature(&self) -> ElementSignature {
signature!(String, String => String)
}
}
/// Turn input text to uppercase
pub struct Upper;
impl PipelineElement for Upper {
fn runner(&self) -> fn(&Inputs) -> Outputs {
|input| {
let [Data::String(s), ..] = input.0[..] else {
panic!("Invalid data passed")
};
s.to_uppercase().into()
}
}
fn signature(&self) -> ElementSignature {
signature!(String => String)
}
}
/// Turn input text to lowercase
pub struct Lower;
impl PipelineElement for Lower {
fn runner(&self) -> fn(&Inputs) -> Outputs {
|input| {
let [Data::String(s), ..] = input.0[..] else {
panic!("Invalid data passed")
};
s.to_lowercase().into()
}
}
fn signature(&self) -> ElementSignature {
signature!(String => String)
}
}


@ -0,0 +1,107 @@
use super::data::io::{Inputs, Outputs};
use super::element::PipelineElement;
use super::ops::prelude::*;
/// Builder for the pipelines that are actually run
///
/// TODO:
/// - Bind additional inputs if an instruction has more than one and is passed without any additional
/// - allow binding to pointers to other pipelines?
/// - allow referencing earlier data
pub struct PipelineBuilder {
elements: Vec<Box<dyn PipelineElement>>,
}
impl PipelineBuilder {
/// Create new, empty builder
pub fn new() -> Self {
Self {
elements: Vec::new(),
}
}
/// Insert element into pipeline
fn insert<T: PipelineElement + 'static>(mut self, el: T) -> Self {
if let Some(previous_item) = self.elements.last() {
assert_eq!(
previous_item.signature().outputs[0],
el.signature().inputs[0]
);
}
self.elements.push(Box::new(el));
self
}
/// insert string concatenation element
#[must_use]
pub fn concatenate(self, sec: String) -> Self {
self.insert(Concatenate(sec))
}
/// insert string uppercase element
#[must_use]
pub fn upper(self) -> Self {
self.insert(Upper)
}
/// insert string lowercase element
#[must_use]
pub fn lower(self) -> Self {
self.insert(Lower)
}
/// insert numeric addition element
#[must_use]
#[allow(
clippy::should_implement_trait,
reason = "is not equivalent to addition"
)]
pub fn add(self, sec: i32) -> Self {
self.insert(Add(sec))
}
/// insert numeric subtraction element
#[must_use]
pub fn subtract(self, sec: i32) -> Self {
self.insert(Subtract(sec))
}
/// insert stringify element
#[must_use]
pub fn stringify(self) -> Self {
self.insert(Stringify)
}
/// Build the pipeline. Doesn't check again - `insert` should verify correctness.
pub fn build(&self) -> Pipeline {
let mut r = Vec::new();
self.elements.iter().for_each(|el| r.push(el.runner()));
Pipeline { runners: r }
}
}
impl Default for PipelineBuilder {
fn default() -> Self {
Self::new()
}
}
/// Runnable pipeline - at the core of this library
pub struct Pipeline {
runners: Vec<fn(&Inputs) -> Outputs>,
}
impl Pipeline {
/// run the pipeline
pub fn run(&self, inputs: Inputs) -> Outputs {
let mut out: Outputs = inputs.into();
for runner in &self.runners {
out = runner(&(&out).into());
}
out
}
}


@ -0,0 +1,40 @@
//! # This is the image processing library for iOwO
//!
//! One of the design goals for this library is, however, to be a simple, generic image processing library.
//! For now, it's just indev... let's see what comes of it!
#![feature(lint_reasons)]
/// just some experiments, to test whether the architecture I want is even possible (or how to do it). probably temporary.
/// Gonna first try string processing...
pub mod experimental;
#[cfg(test)]
mod tests {
use crate::experimental::{
enum_based,
trait_based::{self, data::io::Outputs},
};
#[test]
fn test_enums() {
let builder = enum_based::PipelineBuilder::new().insert(enum_based::Instruction::Uppercase);
let upr = builder.build();
let upr_lowr = builder.insert(enum_based::Instruction::Lowercase).build();
assert_eq!(upr.run(String::from("Test")), String::from("TEST"));
assert_eq!(upr_lowr.run(String::from("Test")), String::from("test"));
}
#[test]
fn add() {
let pipe = trait_based::pipeline::PipelineBuilder::new()
.add(0)
.stringify()
.build();
assert_eq!(
pipe.run(vec![&2.into(), &3.into()].into()),
Outputs(vec![String::from("5").into()])
);
}
}


@ -0,0 +1,15 @@
[package]
name = "svg-filters"
version = "0.1.0"
edition = "2021"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
csscolorparser = "0.6.2"
indexmap = "2.2.5"
petgraph = { workspace = true }
quick-xml = { version = "0.31.0", features = ["serialize"] }
[lints]
workspace = true


@ -0,0 +1,158 @@
use std::{
cmp,
collections::{BTreeSet, HashMap},
fmt::Display,
io::Read,
ops::Not,
};
use indexmap::IndexMap;
use petgraph::{
algo::toposort,
graph::DiGraph,
prelude::{EdgeIndex, NodeIndex},
};
use quick_xml::ElementWriter;
use crate::{
types::{
graph::{edge::Edge, FilterGraph, NodeInput},
nodes::{primitives::WriteElement, CommonAttrs},
},
Node,
};
use self::error::CodegenError;
pub struct SvgDocument {
filters: HashMap<String, FilterGraph>,
}
impl SvgDocument {
pub fn new() -> Self {
Self {
filters: HashMap::new(),
}
}
#[allow(clippy::unwrap_used, reason = "we literally just did the insertion")]
pub fn create_filter(&mut self, id: impl ToString) -> &mut FilterGraph {
let filter = FilterGraph::new();
self.filters.insert(id.to_string(), filter);
self.filters.get_mut(&id.to_string()).unwrap()
}
pub fn generate_svg_pretty(&self) -> String {
let mut result = Vec::new();
let doc_writer = quick_xml::Writer::new_with_indent(&mut result, b' ', 2);
self.generate(doc_writer);
String::from_utf8_lossy(&result).to_string()
}
pub fn generate_svg(&self) -> String {
let mut result = Vec::new();
let doc_writer = quick_xml::Writer::new(&mut result);
self.generate(doc_writer);
String::from_utf8_lossy(&result).to_string()
}
fn generate(&self, mut doc_writer: quick_xml::Writer<&mut Vec<u8>>) {
let _ = doc_writer
.create_element("svg")
.write_inner_content(|writer| {
self.filters
.iter()
.try_fold(writer, Self::gen_filter)
.map(|_| {})
});
}
fn gen_filter<'w, 'r>(
writer: &'w mut quick_xml::Writer<&'r mut Vec<u8>>,
(id, graph): (&String, &FilterGraph),
) -> Result<&'w mut quick_xml::Writer<&'r mut Vec<u8>>, CodegenError> {
writer
.create_element("filter")
.with_attribute(("id", id.as_str()))
.write_inner_content(|writer| Self::graph_to_svg(writer, graph))
}
fn graph_to_svg(
writer: &mut quick_xml::Writer<&mut Vec<u8>>,
graph: &FilterGraph,
) -> Result<(), CodegenError> {
let sorted = toposort(&graph.dag, None).expect("no cycles allowed in a DAG");
sorted
.into_iter()
.filter_map(|node_idx| {
graph
.dag
.node_weight(node_idx)
.and_then(|node| node.primitive())
.map(|(primitive, common_attrs)| (node_idx, primitive, common_attrs))
})
.try_fold(writer, |writer, (node_idx, primitive, common_attrs)| {
primitive.element_writer(
writer,
*common_attrs,
graph
.inputs(node_idx)
.into_iter()
.map(|v| v.to_string())
.collect(),
graph
.outputs(node_idx)
.is_empty()
.not()
.then_some(format!("r{}", node_idx.index())),
)
})?;
Ok(())
}
}
/// convenience method to avoid fuckups during future changes
fn format_edge_idx(idx: EdgeIndex) -> String {
format!("edge{}", idx.index())
}
fn format_node_idx(node_idx: NodeIndex) -> String {
format!("r{}", node_idx.index())
}
mod error {
use std::{error::Error, fmt::Display};
#[derive(Debug)]
pub enum CodegenError {
QuickXmlError(quick_xml::Error),
}
impl From<quick_xml::Error> for CodegenError {
fn from(value: quick_xml::Error) -> Self {
Self::QuickXmlError(value)
}
}
impl Display for CodegenError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
CodegenError::QuickXmlError(e) => e.fmt(f),
}
}
}
impl Error for CodegenError {}
}
impl Default for SvgDocument {
fn default() -> Self {
Self::new()
}
}


@ -0,0 +1,40 @@
#![feature(lint_reasons)]
#[macro_use]
pub mod util {
macro_rules! gen_attr {
($name:literal = $out:expr) => {
quick_xml::events::attributes::Attribute {
key: quick_xml::name::QName($name),
value: std::borrow::Cow::from(($out).to_string().into_bytes()),
}
};
}
macro_rules! gen_attrs {
($($name:literal: $out:expr),+) => {
vec![
$(gen_attr!($name = $out)),+
]
};
($($cond:expr => $name:literal: $out:expr),+) => {
{
let mut r = Vec::new();
$(if $cond {
r.push(gen_attr!($name = $out));
})+
r
}
};
($other:ident; $($cond:expr => $name:literal: $out:expr),+) => {
$other.append(&mut gen_attrs![$($cond => $name: $out),+]);
};
}
}
pub mod codegen;
pub mod types;
pub use types::nodes::Node;
#[cfg(test)]
mod tests;
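As a reading aid, a sketch of what the conditional form of `gen_attrs!` expands to (this is the form the `CommonAttrs` conversion later in this diff relies on):

// gen_attrs![!val.x.is_zero() => b"x": val.x] becomes roughly:
{
    let mut r = Vec::new();
    if !val.x.is_zero() {
        r.push(gen_attr!(b"x" = val.x));
    }
    r
}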


@ -0,0 +1,65 @@
use svg_filters::{
codegen::SvgDocument,
types::nodes::{
primitives::{
blend::BlendMode,
color_matrix::ColorMatrixType,
component_transfer::TransferFn,
displacement_map::Channel,
turbulence::{NoiseType, StitchTiles},
},
standard_input::StandardInput,
},
};
fn main() {
let mut doc = SvgDocument::new();
let f = doc.create_filter("cmyk-chromabb");
let noise = f.turbulence(0., 0.1, 2, 0, StitchTiles::Stitch, NoiseType::FractalNoise);
let noise = f.component_transfer_rgba(
noise,
TransferFn::Discrete {
table_values: vec![0., 0.2, 0.4, 0.6, 0.8, 1.],
},
TransferFn::Discrete {
table_values: vec![0., 0.2, 0.4, 0.6, 0.8, 1.],
},
TransferFn::Discrete {
table_values: vec![0., 0.2, 0.4, 0.6, 0.8, 1.],
},
TransferFn::Linear {
slope: 0.,
intercept: 0.5,
},
);
let cyan = f.color_matrix(
StandardInput::SourceGraphic,
ColorMatrixType::Matrix(Box::new([
0., 0., 0., 0., 0., //
0., 1., 0., 0., 0., //
0., 0., 1., 0., 0., //
0., 0., 0., 1., 0.,
])),
);
let cyan = f.offset(cyan, 25., 0.);
let cyan = f.displacement_map(cyan, noise, 50., Channel::R, Channel::A);
let magenta = f.color_matrix(
StandardInput::SourceGraphic,
ColorMatrixType::Matrix(Box::new([
1., 0., 0., 0., 0., //
0., 0., 0., 0., 0., //
0., 0., 1., 0., 0., //
0., 0., 0., 1., 0.,
])),
);
let magenta = f.displacement_map(magenta, noise, 50., Channel::R, Channel::A);
let magenta = f.offset(magenta, -25., 0.);
f.blend(cyan, magenta, BlendMode::Screen);
println!("{}", doc.generate_svg_pretty());
}


@ -0,0 +1,17 @@
mod blend;
mod color_matrix;
mod complex;
mod component_transfer;
mod displacement_map;
mod flood;
mod gaussian_blur;
mod offset;
mod turbulence;
mod composite {}
mod convolve_matrix {}
mod diffuse_lighting {}
mod image {}
mod merge {}
mod morphology {}
mod specular_lighting {}
mod tile {}


@ -0,0 +1,20 @@
use crate::{
codegen::SvgDocument,
types::nodes::{primitives::blend::BlendMode, standard_input::StandardInput},
};
#[test]
fn test_offset_blend() {
let mut doc = SvgDocument::new();
let blend = doc.create_filter("blend");
let offset0 = blend.offset(StandardInput::SourceGraphic, 100., 0.);
let offset1 = blend.offset(StandardInput::SourceGraphic, -100., 0.);
blend.blend(offset0, offset1, BlendMode::Multiply);
assert_eq!(
doc.generate_svg(),
r#"<svg><filter id="blend"><feOffset dx="-100" dy="0" in="SourceGraphic" result="r7"/><feOffset dx="100" dy="0" in="SourceGraphic" result="r6"/><feBlend mode="multiply" in="r6" in2="r7"/></filter></svg>"#
);
}


@ -0,0 +1,25 @@
use crate::{
codegen::SvgDocument,
types::nodes::{primitives::color_matrix::ColorMatrixType, standard_input::StandardInput},
};
#[test]
fn test_greyscale_channel_extraction() {
let mut doc = SvgDocument::new();
let greyscale = doc.create_filter("greyscale");
greyscale.color_matrix(
StandardInput::SourceGraphic,
ColorMatrixType::Matrix(Box::new([
1., 0., 0., 0., 0., //
1., 0., 0., 0., 0., //
1., 0., 0., 0., 0., //
0., 0., 0., 1., 0.,
])),
);
assert_eq!(
doc.generate_svg(),
r#"<svg><filter id="greyscale"><feColorMatrix values="1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 1 0" in="SourceGraphic"/></filter></svg>"#
);
}


@ -0,0 +1,51 @@
use crate::{
codegen::SvgDocument,
types::nodes::{primitives::color_matrix::ColorMatrixType, standard_input::StandardInput},
};
#[test]
fn test_chrom_abb() {
let mut doc = SvgDocument::new();
let chromabb = doc.create_filter("chromabb_gen");
let chan_r = chromabb.color_matrix(
StandardInput::SourceGraphic,
ColorMatrixType::Matrix(Box::new([
1., 0., 0., 0., 0., //
0., 0., 0., 0., 0., //
0., 0., 0., 0., 0., //
0., 0., 0., 1., 0.,
])),
);
let offset_r = chromabb.offset(chan_r, 25., 0.);
let blur_r = chromabb.gaussian_blur_xy(offset_r, 5, 0);
let chan_b = chromabb.color_matrix(
StandardInput::SourceGraphic,
ColorMatrixType::Matrix(Box::new([
0., 0., 0., 0., 0., //
0., 0., 0., 0., 0., //
0., 0., 1., 0., 0., //
0., 0., 0., 1., 0.,
])),
);
let offset_b = chromabb.offset(chan_b, -25., 0.);
let blur_b = chromabb.gaussian_blur_xy(offset_b, 5, 0);
let composite_rb = chromabb.composite_arithmetic(blur_r, blur_b, 0., 1., 1., 0.);
let chan_g = chromabb.color_matrix(
StandardInput::SourceGraphic,
ColorMatrixType::Matrix(Box::new([
0., 0., 0., 0., 0., //
0., 1., 0., 0., 0., //
0., 0., 0., 0., 0., //
0., 0., 0., 1., 0.,
])),
);
chromabb.composite_arithmetic(composite_rb, chan_g, 0., 1., 1., 0.);
assert_eq!(
doc.generate_svg(),
r#"<svg><filter id="chromabb_gen"><feColorMatrix values="0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0" in="SourceGraphic" result="r13"/><feColorMatrix values="0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0" in="SourceGraphic" result="r9"/><feOffset dx="-25" dy="0" in="r9" result="r10"/><feGaussianBlur stdDeviation="5 0" in="r10" result="r11"/><feColorMatrix values="1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0" in="SourceGraphic" result="r6"/><feOffset dx="25" dy="0" in="r6" result="r7"/><feGaussianBlur stdDeviation="5 0" in="r7" result="r8"/><feComposite operator="arithmetic" k1="0" k2="1" k3="1" k4="0" in="r8" in2="r11" result="r12"/><feComposite operator="arithmetic" k1="0" k2="1" k3="1" k4="0" in="r12" in2="r13"/></filter></svg>"#
);
}


@ -0,0 +1,36 @@
use crate::{
codegen::SvgDocument,
types::nodes::primitives::{
component_transfer::{ComponentTransfer, TransferFn},
FePrimitive,
},
Node,
};
#[test]
fn test_comp_trans_simple() {
let mut doc = SvgDocument::new();
let comptrans = doc.create_filter("comp_trans");
comptrans.add_node(Node::simple(FePrimitive::ComponentTransfer(
ComponentTransfer {
func_r: TransferFn::Table {
table_values: vec![0., 0.1, 0.4, 0.9],
},
func_g: TransferFn::Discrete {
table_values: vec![0.1, 0.3, 0.5, 0.7, 0.9],
},
func_b: TransferFn::Linear {
slope: 1.0,
intercept: 0.75,
},
func_a: TransferFn::Identity,
},
)));
assert_eq!(
doc.generate_svg(),
r#"<svg><filter id="comp_trans"><feComponentTransfer><feFuncR type="table" tableValues="0 0.1 0.4 0.9"/><feFuncG type="discrete" tableValues="0.1 0.3 0.5 0.7 0.9"/><feFuncB type="linear" slope="1" intercept="0.75"/><feFuncA type="identity"/></feComponentTransfer></filter></svg>"#
);
}


@ -0,0 +1,32 @@
use crate::{
codegen::SvgDocument,
types::nodes::{
primitives::{
displacement_map::Channel,
turbulence::{NoiseType, StitchTiles},
},
standard_input::StandardInput,
},
};
#[test]
fn test_displacement_map_simple() {
let mut doc = SvgDocument::new();
let displace = doc.create_filter("displace");
let simple_noise =
displace.turbulence(0.01, 0.01, 1, 0, StitchTiles::Stitch, NoiseType::Turbulence);
displace.displacement_map(
StandardInput::SourceGraphic,
simple_noise,
128.,
Channel::R,
Channel::R,
);
assert_eq!(
doc.generate_svg(),
r#"<svg><filter id="displace"><feTurbulence baseFrequency="0.01 0.01" stitchTiles="stitch" result="r6"/><feDisplacementMap scale="128" xChannelSelector="R" yChannelSelector="R" in="SourceGraphic" in2="r6"/></filter></svg>"#
);
}


@ -0,0 +1,17 @@
use csscolorparser::Color;
use crate::codegen::SvgDocument;
#[test]
fn test_flood_simple() {
let mut doc = SvgDocument::new();
let turbdispl = doc.create_filter("noiseDisplace");
turbdispl.flood(Color::new(0.9, 0.7, 0.85, 1.), 1.);
assert_eq!(
doc.generate_svg(),
r##"<svg><filter id="noiseDisplace"><feFlood flood-color="#e6b3d9" flood-opacity="1"/></filter></svg>"##
);
}


@ -0,0 +1,13 @@
use crate::{codegen::SvgDocument, types::nodes::standard_input::StandardInput};
#[test]
fn test_simple_blur() {
let mut doc = SvgDocument::new();
let blur = doc.create_filter("blur");
blur.gaussian_blur_xy(StandardInput::SourceGraphic, 30, 30);
assert_eq!(
doc.generate_svg(),
r#"<svg><filter id="blur"><feGaussianBlur stdDeviation="30 30" in="SourceGraphic"/></filter></svg>"#
);
}


@ -0,0 +1,14 @@
use crate::{codegen::SvgDocument, types::nodes::standard_input::StandardInput};
#[test]
fn test_offset_simple() {
let mut doc = SvgDocument::new();
let offset = doc.create_filter("offset");
offset.offset(StandardInput::SourceGraphic, 25., -25.);
assert_eq!(
doc.generate_svg(),
r#"<svg><filter id="offset"><feOffset dx="25" dy="-25" in="SourceGraphic"/></filter></svg>"#
);
}


@ -0,0 +1,25 @@
use crate::{
codegen::SvgDocument,
types::nodes::primitives::turbulence::{NoiseType, StitchTiles},
};
#[test]
fn test_simple_turbulence() {
let mut doc = SvgDocument::new();
let noise = doc.create_filter("noise");
noise.turbulence(
0.01,
0.01,
1,
0,
StitchTiles::Stitch,
NoiseType::FractalNoise,
);
assert_eq!(
doc.generate_svg(),
r#"<svg><filter id="noise"><feTurbulence baseFrequency="0.01 0.01" stitchTiles="stitch" type="fractalNoise"/></filter></svg>"#
);
}


@ -0,0 +1,6 @@
pub mod length;
pub mod nodes;
// pub mod old;
pub mod graph;


@ -0,0 +1,143 @@
use std::fmt::{Debug, Display};
use petgraph::{prelude::NodeIndex, prelude::*};
use crate::Node;
use super::nodes::standard_input::StandardInput;
#[derive(Debug)]
pub struct FilterGraph {
pub dag: DiGraph<Node, ()>,
source_graphic_idx: NodeIndex,
source_alpha_idx: NodeIndex,
background_image_idx: NodeIndex,
background_alpha_idx: NodeIndex,
fill_paint_idx: NodeIndex,
stroke_paint_idx: NodeIndex,
}
impl Default for FilterGraph {
fn default() -> Self {
Self::new()
}
}
#[derive(Debug, Clone, Copy)]
pub enum NodeInput {
Standard(StandardInput),
Idx(NodeIndex),
}
impl Display for NodeInput {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
NodeInput::Standard(s) => Debug::fmt(s, f),
NodeInput::Idx(idx) => write!(f, "r{}", idx.index()),
}
}
}
impl From<StandardInput> for NodeInput {
fn from(value: StandardInput) -> Self {
Self::Standard(value)
}
}
impl From<NodeIndex> for NodeInput {
fn from(value: NodeIndex) -> Self {
Self::Idx(value)
}
}
impl FilterGraph {
pub fn new() -> Self {
let mut dag = DiGraph::new();
let source_graphic_idx = dag.add_node(Node::StdInput(StandardInput::SourceGraphic));
let source_alpha_idx = dag.add_node(Node::StdInput(StandardInput::SourceAlpha));
let background_image_idx = dag.add_node(Node::StdInput(StandardInput::BackgroundImage));
let background_alpha_idx = dag.add_node(Node::StdInput(StandardInput::BackgroundAlpha));
let fill_paint_idx = dag.add_node(Node::StdInput(StandardInput::FillPaint));
let stroke_paint_idx = dag.add_node(Node::StdInput(StandardInput::StrokePaint));
Self {
dag,
source_graphic_idx,
source_alpha_idx,
background_image_idx,
background_alpha_idx,
fill_paint_idx,
stroke_paint_idx,
}
}
pub fn add_node(&mut self, node: Node) -> NodeIndex {
self.dag.add_node(node)
}
fn resolve_input(&self, input: NodeInput) -> NodeIndex {
match input {
NodeInput::Standard(StandardInput::SourceGraphic) => self.source_graphic_idx,
NodeInput::Standard(StandardInput::SourceAlpha) => self.source_alpha_idx,
NodeInput::Standard(StandardInput::BackgroundImage) => self.background_image_idx,
NodeInput::Standard(StandardInput::BackgroundAlpha) => self.background_alpha_idx,
NodeInput::Standard(StandardInput::FillPaint) => self.fill_paint_idx,
NodeInput::Standard(StandardInput::StrokePaint) => self.stroke_paint_idx,
NodeInput::Idx(i) => i,
}
}
#[allow(
clippy::unwrap_used,
reason = "we only operate on values we know exist, so unwrapping is safe"
)]
pub fn inputs(&self, node_idx: NodeIndex) -> Vec<NodeInput> {
let mut inputs = self
.dag
.neighbors_directed(node_idx, Direction::Incoming)
.map(|input_idx| (self.dag.find_edge(input_idx, node_idx).unwrap(), input_idx))
.collect::<Vec<_>>();
inputs.sort_by(|(a, _), (b, _)| a.cmp(b));
inputs
.into_iter()
.map(
|(_, input_idx)| match self.dag.node_weight(input_idx).unwrap() {
Node::StdInput(s) => NodeInput::Standard(*s),
Node::Primitive { .. } => NodeInput::Idx(input_idx),
},
)
.collect()
}
pub fn outputs(&self, node_idx: NodeIndex) -> Vec<NodeIndex> {
self.dag
.neighbors_directed(node_idx, Direction::Outgoing)
.collect()
}
pub fn source_graphic(&self) -> NodeIndex {
self.source_graphic_idx
}
pub fn source_alpha(&self) -> NodeIndex {
self.source_alpha_idx
}
pub fn background_image(&self) -> NodeIndex {
self.background_image_idx
}
pub fn background_alpha(&self) -> NodeIndex {
self.background_alpha_idx
}
pub fn fill_paint(&self) -> NodeIndex {
self.fill_paint_idx
}
pub fn stroke_paint(&self) -> NodeIndex {
self.stroke_paint_idx
}
}
pub mod abstracted_inputs;
pub mod edge;


@ -0,0 +1,196 @@
use csscolorparser::Color;
use petgraph::{data::Build, prelude::NodeIndex};
use crate::{
types::nodes::primitives::{
blend::BlendMode,
color_matrix::ColorMatrixType,
component_transfer::TransferFn,
displacement_map::Channel,
turbulence::{NoiseType, StitchTiles},
},
Node,
};
use super::{FilterGraph, NodeInput};
impl FilterGraph {
pub fn color_matrix(
&mut self,
r#in: impl Into<NodeInput>,
cm_type: ColorMatrixType,
) -> NodeIndex {
let node_idx = self.dag.add_node(Node::color_matrix(cm_type));
self.dag
.add_edge(self.resolve_input(r#in.into()), node_idx, ());
node_idx
}
pub fn offset(&mut self, r#in: impl Into<NodeInput>, dx: f32, dy: f32) -> NodeIndex {
let node_idx = self.dag.add_node(Node::offset(dx, dy));
self.dag
.add_edge(self.resolve_input(r#in.into()), node_idx, ());
node_idx
}
pub fn gaussian_blur_xy(&mut self, r#in: impl Into<NodeInput>, x: u16, y: u16) -> NodeIndex {
let node_idx = self.dag.add_node(Node::gaussian_blur_xy(x, y));
self.dag
.add_edge(self.resolve_input(r#in.into()), node_idx, ());
node_idx
}
pub fn blend(
&mut self,
r#in: impl Into<NodeInput>,
in2: impl Into<NodeInput>,
mode: BlendMode,
) -> NodeIndex {
let node_idx = self.dag.add_node(Node::blend(mode));
self.dag
.add_edge(self.resolve_input(r#in.into()), node_idx, ());
self.dag
.add_edge(self.resolve_input(in2.into()), node_idx, ());
node_idx
}
pub fn composite_arithmetic(
&mut self,
r#in: impl Into<NodeInput>,
in2: impl Into<NodeInput>,
k1: f32,
k2: f32,
k3: f32,
k4: f32,
) -> NodeIndex {
let node_idx = self
.dag
.add_node(Node::composite_arithmetic(k1, k2, k3, k4));
self.dag
.add_edge(self.resolve_input(r#in.into()), node_idx, ());
self.dag
.add_edge(self.resolve_input(in2.into()), node_idx, ());
node_idx
}
pub fn component_transfer_rgba(
&mut self,
r#in: impl Into<NodeInput>,
r: TransferFn,
g: TransferFn,
b: TransferFn,
a: TransferFn,
) -> NodeIndex {
let node_idx = self.dag.add_node(Node::component_transfer_rgba(r, g, b, a));
self.dag
.add_edge(self.resolve_input(r#in.into()), node_idx, ());
node_idx
}
pub fn component_transfer_rgb(
&mut self,
r#in: impl Into<NodeInput>,
r: TransferFn,
g: TransferFn,
b: TransferFn,
) -> NodeIndex {
let node_idx = self.dag.add_node(Node::component_transfer_rgb(r, g, b));
self.dag
.add_edge(self.resolve_input(r#in.into()), node_idx, ());
node_idx
}
pub fn component_transfer_r(
&mut self,
r#in: impl Into<NodeInput>,
func: TransferFn,
) -> NodeIndex {
let node_idx = self.dag.add_node(Node::component_transfer_r(func));
self.dag
.add_edge(self.resolve_input(r#in.into()), node_idx, ());
node_idx
}
pub fn component_transfer_g(
&mut self,
r#in: impl Into<NodeInput>,
func: TransferFn,
) -> NodeIndex {
let node_idx = self.dag.add_node(Node::component_transfer_g(func));
self.dag
.add_edge(self.resolve_input(r#in.into()), node_idx, ());
node_idx
}
pub fn component_transfer_b(
&mut self,
r#in: impl Into<NodeInput>,
func: TransferFn,
) -> NodeIndex {
let node_idx = self.dag.add_node(Node::component_transfer_b(func));
self.dag
.add_edge(self.resolve_input(r#in.into()), node_idx, ());
node_idx
}
pub fn component_transfer_a(
&mut self,
r#in: impl Into<NodeInput>,
func: TransferFn,
) -> NodeIndex {
let node_idx = self.dag.add_node(Node::component_transfer_a(func));
self.dag
.add_edge(self.resolve_input(r#in.into()), node_idx, ());
node_idx
}
pub fn component_transfer_single(
&mut self,
r#in: impl Into<NodeInput>,
func: TransferFn,
) -> NodeIndex {
let node_idx = self.dag.add_node(Node::component_transfer_single(func));
self.dag
.add_edge(self.resolve_input(r#in.into()), node_idx, ());
node_idx
}
pub fn flood(&mut self, flood_color: Color, flood_opacity: f32) -> NodeIndex {
self.dag.add_node(Node::flood(flood_color, flood_opacity))
}
pub fn flood_opaque(&mut self, flood_color: Color) -> NodeIndex {
self.dag.add_node(Node::flood_opaque(flood_color))
}
pub fn turbulence(
&mut self,
base_freq_x: f32,
base_freq_y: f32,
num_octaves: u16,
seed: u32,
stitch_tiles: StitchTiles,
noise_type: NoiseType,
) -> NodeIndex {
self.dag.add_node(Node::turbulence(
base_freq_x,
base_freq_y,
num_octaves,
seed,
stitch_tiles,
noise_type,
))
}
pub fn displacement_map(
&mut self,
source_image: impl Into<NodeInput>,
displacement_map: impl Into<NodeInput>,
scale: f32,
x_channel: Channel,
y_channel: Channel,
) -> NodeIndex {
let node_idx = self
.dag
.add_node(Node::displacement_map(scale, x_channel, y_channel));
self.dag
.add_edge(self.resolve_input(source_image.into()), node_idx, ());
self.dag
.add_edge(self.resolve_input(displacement_map.into()), node_idx, ());
node_idx
}
}


@ -0,0 +1,19 @@
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Default)]
pub struct Edge {
input_idx: u8,
}
impl Edge {
pub fn new(input_idx: u8) -> Self {
Self { input_idx }
}
}
// Implementing Display (rather than ToString directly) is the idiomatic
// route; the blanket impl still provides `.to_string()`.
impl std::fmt::Display for Edge {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self.input_idx {
0 => f.write_str("in"),
n => write!(f, "in{}", n + 1),
}
}
}
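A quick sketch of the naming scheme this yields (the first input serializes as `in`, later ones as `in2`, `in3`, …):

#[cfg(test)]
mod tests {
use super::Edge;
#[test]
fn attr_names() {
assert_eq!(Edge::new(0).to_string(), "in");
assert_eq!(Edge::new(1).to_string(), "in2");
assert_eq!(Edge::new(2).to_string(), "in3");
}
}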

@@ -0,0 +1,48 @@
use std::fmt::Display;
#[derive(Default, Debug, Clone, Copy)]
pub struct Length(f32, Unit);
impl Length {
pub fn is_zero(&self) -> bool {
self.0 == 0.
}
}
impl Display for Length {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}{}", self.0, self.1)
}
}
pub type Coordinate = Length;
#[derive(Default, Debug, Clone, Copy)]
pub enum Unit {
#[default]
None,
Em,
Ex,
Px,
In,
Cm,
Mm,
Pt,
Pc,
}
impl Display for Unit {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Unit::None => f.write_str(""),
Unit::Em => f.write_str("em"),
Unit::Ex => f.write_str("ex"),
Unit::Px => f.write_str("px"),
Unit::In => f.write_str("in"),
Unit::Cm => f.write_str("cm"),
Unit::Mm => f.write_str("mm"),
Unit::Pt => f.write_str("pt"),
Unit::Pc => f.write_str("pc"),
}
}
}
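Display glues the number to its unit suffix, with the default `Unit::None` giving a bare number; a small in-module check (the tuple fields are private outside this file):

#[cfg(test)]
mod tests {
use super::{Length, Unit};
#[test]
fn display_appends_unit() {
assert_eq!(Length(1.5, Unit::Px).to_string(), "1.5px");
assert_eq!(Length(4.0, Unit::None).to_string(), "4");
}
}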

@@ -0,0 +1,237 @@
use std::borrow::Cow;
use csscolorparser::Color;
use quick_xml::{events::attributes::Attribute, name::QName};
use self::{
primitives::{
blend::{Blend, BlendMode},
color_matrix::{ColorMatrix, ColorMatrixType},
component_transfer::{ComponentTransfer, TransferFn},
composite::{Composite, CompositeOperator},
displacement_map::{Channel, DisplacementMap},
flood::Flood,
gaussian_blur::GaussianBlur,
offset::Offset,
turbulence::{NoiseType, StitchTiles, Turbulence},
FePrimitive,
},
standard_input::StandardInput,
};
use super::length::{Coordinate, Length};
pub mod primitives;
pub mod standard_input;
#[derive(Debug)]
pub enum Node {
StdInput(StandardInput),
Primitive {
primitive: FePrimitive,
common_attrs: CommonAttrs,
},
}
impl Default for Node {
fn default() -> Self {
Self::StdInput(StandardInput::SourceGraphic)
}
}
#[derive(Default, Debug, Clone, Copy)]
pub struct CommonAttrs {
pub x: Coordinate,
pub y: Coordinate,
pub width: Length,
pub height: Length,
}
impl From<CommonAttrs> for Vec<Attribute<'_>> {
fn from(val: CommonAttrs) -> Self {
gen_attrs![
!val.x.is_zero() => b"x": val.x,
!val.y.is_zero() => b"y": val.y,
!val.width.is_zero() => b"width": val.width,
!val.height.is_zero() => b"height": val.height
]
}
}
impl Node {
pub fn simple(el: FePrimitive) -> Node {
Node::Primitive {
primitive: el,
common_attrs: CommonAttrs::default(),
}
}
pub fn primitive(&self) -> Option<(&FePrimitive, &CommonAttrs)> {
if let Node::Primitive {
primitive,
common_attrs,
} = self
{
Some((primitive, common_attrs))
} else {
None
}
}
pub fn input_count(&self) -> u8 {
match self {
Node::Primitive {
primitive:
FePrimitive::ColorMatrix(_)
| FePrimitive::ComponentTransfer(_)
| FePrimitive::ConvolveMatrix(_)
| FePrimitive::DiffuseLighting(_)
| FePrimitive::GaussianBlur(_)
| FePrimitive::Morphology(_)
| FePrimitive::Offset(_)
| FePrimitive::SpecularLighting(_)
| FePrimitive::Tile(_),
..
} => 1,
Node::Primitive {
primitive:
FePrimitive::Composite(_) | FePrimitive::Blend(_) | FePrimitive::DisplacementMap(_),
..
} => 2,
Node::StdInput(_)
| Node::Primitive {
primitive:
FePrimitive::Flood(_) | FePrimitive::Image(_) | FePrimitive::Turbulence(_),
..
} => 0,
Node::Primitive {
primitive: FePrimitive::Merge(_),
..
} => todo!(),
}
}
pub fn blend(mode: BlendMode) -> Self {
Self::simple(FePrimitive::Blend(Blend::new(mode)))
}
pub fn color_matrix(cm_type: ColorMatrixType) -> Self {
Self::simple(FePrimitive::ColorMatrix(ColorMatrix::new(cm_type)))
}
pub fn composite(op: CompositeOperator) -> Self {
Self::simple(FePrimitive::Composite(Composite::new(op)))
}
pub fn composite_arithmetic(k1: f32, k2: f32, k3: f32, k4: f32) -> Self {
Self::composite(CompositeOperator::Arithmetic { k1, k2, k3, k4 })
}
pub fn gaussian_blur(v: u16) -> Self {
Self::simple(FePrimitive::GaussianBlur(GaussianBlur::single(v)))
}
pub fn gaussian_blur_xy(x: u16, y: u16) -> Self {
Self::simple(FePrimitive::GaussianBlur(GaussianBlur::with_xy(x, y)))
}
pub fn offset(dx: f32, dy: f32) -> Self {
Self::simple(FePrimitive::Offset(Offset::new(dx, dy)))
}
pub fn component_transfer_rgba(
r: TransferFn,
g: TransferFn,
b: TransferFn,
a: TransferFn,
) -> Self {
Self::simple(FePrimitive::ComponentTransfer(ComponentTransfer {
func_r: r,
func_g: g,
func_b: b,
func_a: a,
}))
}
pub fn component_transfer_rgb(r: TransferFn, g: TransferFn, b: TransferFn) -> Self {
Self::component_transfer_rgba(r, g, b, TransferFn::Identity)
}
pub fn component_transfer_r(func: TransferFn) -> Self {
Self::component_transfer_rgba(
func,
TransferFn::Identity,
TransferFn::Identity,
TransferFn::Identity,
)
}
pub fn component_transfer_g(func: TransferFn) -> Self {
Self::component_transfer_rgba(
TransferFn::Identity,
func,
TransferFn::Identity,
TransferFn::Identity,
)
}
pub fn component_transfer_b(func: TransferFn) -> Self {
Self::component_transfer_rgba(
TransferFn::Identity,
TransferFn::Identity,
func,
TransferFn::Identity,
)
}
pub fn component_transfer_a(func: TransferFn) -> Self {
Self::component_transfer_rgba(
TransferFn::Identity,
TransferFn::Identity,
TransferFn::Identity,
func,
)
}
pub fn component_transfer_single(func: TransferFn) -> Self {
Self::component_transfer_rgb(func.clone(), func.clone(), func)
}
pub fn flood(flood_color: Color, flood_opacity: f32) -> Self {
Self::simple(FePrimitive::Flood(Flood {
flood_color,
flood_opacity,
}))
}
pub fn flood_opaque(flood_color: Color) -> Self {
Self::flood(flood_color, 1.)
}
pub fn turbulence(
base_freq_x: f32,
base_freq_y: f32,
num_octaves: u16,
seed: u32,
stitch_tiles: StitchTiles,
noise_type: NoiseType,
) -> Self {
Self::simple(FePrimitive::Turbulence(Turbulence {
base_frequency: (base_freq_x, base_freq_y),
num_octaves,
seed,
stitch_tiles,
noise_type,
}))
}
pub fn displacement_map(scale: f32, x_channel: Channel, y_channel: Channel) -> Self {
Self::simple(FePrimitive::DisplacementMap(DisplacementMap {
scale,
x_channel_selector: x_channel,
y_channel_selector: y_channel,
}))
}
}
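As a sketch of how the channel helpers expand: `component_transfer_r` pins the other three channels to `TransferFn::Identity`, and the resulting node counts as a one-input primitive:

#[cfg(test)]
mod tests {
use super::*;
#[test]
fn single_channel_transfer_takes_one_input() {
let node = Node::component_transfer_r(TransferFn::Linear { slope: 0.5, intercept: 0.25 });
assert_eq!(node.input_count(), 1);
}
}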

@@ -0,0 +1,150 @@
use quick_xml::{events::attributes::Attribute, ElementWriter, Writer};
use std::convert::Into;
use super::CommonAttrs;
pub mod blend;
pub mod color_matrix;
pub mod component_transfer;
pub mod composite;
pub mod convolve_matrix;
pub mod diffuse_lighting;
pub mod displacement_map;
pub mod flood;
pub mod gaussian_blur;
pub mod image;
pub mod merge;
pub mod morphology;
pub mod offset;
pub mod specular_lighting;
pub mod tile;
pub mod turbulence;
pub trait WriteElement {
fn attrs(&self) -> Vec<Attribute>;
fn tag_name(&self) -> &'static str;
fn element_writer<'writer, 'result>(
&self,
writer: &'writer mut Writer<&'result mut Vec<u8>>,
common: CommonAttrs,
inputs: Vec<String>,
output: Option<String>,
) -> quick_xml::Result<&'writer mut quick_xml::Writer<&'result mut Vec<u8>>> {
let attrs: Vec<_> = inputs
.into_iter()
.enumerate()
.map(|(i, edge)| {
(
match i {
0 => "in".to_owned(),
n => format!("in{}", n + 1),
}
.into_bytes(),
edge.into_bytes(),
)
})
.collect();
let mut el_writer = writer
.create_element(self.tag_name())
.with_attributes(Into::<Vec<Attribute<'_>>>::into(common))
.with_attributes(self.attrs())
.with_attributes(attrs.iter().map(|(k, v)| (&k[..], &v[..])));
if let Some(output) = output {
el_writer = el_writer.with_attribute(("result", output.as_str()));
}
el_writer.write_empty()
}
}
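To make the default `element_writer` concrete, a hedged sketch of the output for a one-input primitive with a named result, assuming quick_xml's usual empty-element serialization:

#[cfg(test)]
mod writer_sketch {
use super::*;
#[test]
fn offset_with_input_and_result() {
let mut buf = Vec::new();
let mut writer = Writer::new(&mut buf);
offset::Offset::new(5., 5.)
.element_writer(&mut writer, CommonAttrs::default(), vec!["blur".to_owned()], Some("shifted".to_owned()))
.unwrap();
// common attrs are all zero here, so only dx/dy, the input, and the result remain
assert_eq!(
String::from_utf8(buf).unwrap(),
r#"<feOffset dx="5" dy="5" in="blur" result="shifted"/>"#
);
}
}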
/// SVG filter effect primitives
#[derive(Debug)]
pub enum FePrimitive {
Blend(blend::Blend),
ColorMatrix(color_matrix::ColorMatrix),
ComponentTransfer(component_transfer::ComponentTransfer),
Composite(composite::Composite),
ConvolveMatrix(convolve_matrix::ConvolveMatrix),
DiffuseLighting(diffuse_lighting::DiffuseLighting),
DisplacementMap(displacement_map::DisplacementMap),
Flood(flood::Flood),
GaussianBlur(gaussian_blur::GaussianBlur),
Image(image::Image),
Merge(merge::Merge),
Morphology(morphology::Morphology),
Offset(offset::Offset),
SpecularLighting(specular_lighting::SpecularLighting),
Tile(tile::Tile),
Turbulence(turbulence::Turbulence),
}
impl WriteElement for FePrimitive {
fn attrs(&self) -> std::vec::Vec<quick_xml::events::attributes::Attribute<'_>> {
match self {
FePrimitive::Blend(el) => el.attrs(),
FePrimitive::ColorMatrix(el) => el.attrs(),
FePrimitive::ComponentTransfer(el) => el.attrs(),
FePrimitive::Composite(el) => el.attrs(),
FePrimitive::GaussianBlur(el) => el.attrs(),
FePrimitive::Offset(el) => el.attrs(),
FePrimitive::Turbulence(el) => el.attrs(),
FePrimitive::DisplacementMap(el) => el.attrs(),
FePrimitive::Flood(el) => el.attrs(),
FePrimitive::Morphology(el) => el.attrs(),
FePrimitive::ConvolveMatrix(_) => todo!(),
FePrimitive::DiffuseLighting(_) => todo!(),
FePrimitive::Image(_) => todo!(),
FePrimitive::Merge(_) => todo!(),
FePrimitive::SpecularLighting(_) => todo!(),
FePrimitive::Tile(_) => todo!(),
}
}
fn tag_name(&self) -> &'static str {
match self {
FePrimitive::Blend(el) => el.tag_name(),
FePrimitive::ColorMatrix(el) => el.tag_name(),
FePrimitive::ComponentTransfer(el) => el.tag_name(),
FePrimitive::Composite(el) => el.tag_name(),
FePrimitive::GaussianBlur(el) => el.tag_name(),
FePrimitive::Offset(el) => el.tag_name(),
FePrimitive::Turbulence(el) => el.tag_name(),
FePrimitive::DisplacementMap(el) => el.tag_name(),
FePrimitive::Flood(el) => el.tag_name(),
FePrimitive::Morphology(el) => el.tag_name(),
FePrimitive::ConvolveMatrix(_) => todo!(),
FePrimitive::DiffuseLighting(_) => todo!(),
FePrimitive::Image(_) => todo!(),
FePrimitive::Merge(_) => todo!(),
FePrimitive::SpecularLighting(_) => todo!(),
FePrimitive::Tile(_) => todo!(),
}
}
fn element_writer<'writer, 'result>(
&self,
writer: &'writer mut Writer<&'result mut Vec<u8>>,
common: CommonAttrs,
inputs: Vec<String>,
output: Option<String>,
) -> quick_xml::Result<&'writer mut quick_xml::Writer<&'result mut Vec<u8>>> {
match self {
FePrimitive::Blend(el) => el.element_writer(writer, common, inputs, output),
FePrimitive::ColorMatrix(el) => el.element_writer(writer, common, inputs, output),
FePrimitive::ComponentTransfer(el) => el.element_writer(writer, common, inputs, output),
FePrimitive::Composite(el) => el.element_writer(writer, common, inputs, output),
FePrimitive::Turbulence(el) => el.element_writer(writer, common, inputs, output),
FePrimitive::GaussianBlur(el) => el.element_writer(writer, common, inputs, output),
FePrimitive::Offset(el) => el.element_writer(writer, common, inputs, output),
FePrimitive::DisplacementMap(el) => el.element_writer(writer, common, inputs, output),
FePrimitive::Flood(el) => el.element_writer(writer, common, inputs, output),
FePrimitive::Morphology(el) => el.element_writer(writer, common, inputs, output),
FePrimitive::ConvolveMatrix(_) => todo!(),
FePrimitive::DiffuseLighting(_) => todo!(),
FePrimitive::Image(_) => todo!(),
FePrimitive::Merge(_) => todo!(),
FePrimitive::SpecularLighting(_) => todo!(),
FePrimitive::Tile(_) => todo!(),
}
}
}

@@ -0,0 +1,82 @@
use std::fmt::Display;
use super::WriteElement;
/// [feBlend](https://www.w3.org/TR/SVG11/filters.html#feBlendElement)
#[derive(Debug)]
pub struct Blend {
mode: BlendMode,
}
impl Blend {
pub fn new(mode: BlendMode) -> Self {
Self { mode }
}
}
impl Default for Blend {
fn default() -> Self {
Self {
mode: BlendMode::Normal,
}
}
}
impl WriteElement for Blend {
fn attrs(&self) -> Vec<quick_xml::events::attributes::Attribute> {
if let BlendMode::Normal = self.mode {
Vec::new()
} else {
gen_attrs![b"mode": self.mode]
}
}
fn tag_name(&self) -> &'static str {
"feBlend"
}
}
/// as defined in <https://drafts.fxtf.org/compositing-1/#blending>
#[derive(Debug)]
pub enum BlendMode {
Normal,
Multiply,
Screen,
Overlay,
Darken,
Lighten,
ColorDodge,
ColorBurn,
HardLight,
SoftLight,
Difference,
Exclusion,
Hue,
Saturation,
Color,
Luminosity,
}
impl Display for BlendMode {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.write_str(match self {
BlendMode::Normal => "normal",
BlendMode::Multiply => "multiply",
BlendMode::Screen => "screen",
BlendMode::Overlay => "overlay",
BlendMode::Darken => "darken",
BlendMode::Lighten => "lighten",
BlendMode::ColorDodge => "color-dodge",
BlendMode::ColorBurn => "color-burn",
BlendMode::HardLight => "hard-light",
BlendMode::SoftLight => "soft-light",
BlendMode::Difference => "difference",
BlendMode::Exclusion => "exclusion",
BlendMode::Hue => "hue",
BlendMode::Saturation => "saturation",
BlendMode::Color => "color",
BlendMode::Luminosity => "luminosity",
})
}
}
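The serialized names follow the spec's kebab-case, e.g.:

#[cfg(test)]
mod tests {
use super::BlendMode;
#[test]
fn css_blend_mode_names() {
assert_eq!(BlendMode::Normal.to_string(), "normal");
assert_eq!(BlendMode::ColorDodge.to_string(), "color-dodge");
}
}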

@@ -0,0 +1,47 @@
use super::WriteElement;
/// [feColorMatrix](https://www.w3.org/TR/SVG11/filters.html#feColorMatrixElement)
#[derive(Debug)]
pub struct ColorMatrix {
cm_type: ColorMatrixType,
}
impl ColorMatrix {
pub fn new(cm_type: ColorMatrixType) -> Self {
Self { cm_type }
}
}
impl WriteElement for ColorMatrix {
fn attrs(&self) -> Vec<quick_xml::events::attributes::Attribute> {
match &self.cm_type {
ColorMatrixType::Matrix(v) => gen_attrs![
b"values": v
.iter()
.map(std::string::ToString::to_string)
.reduce(|mut acc, e| {
acc.push(' ');
acc.push_str(&e);
acc
})
.expect("fixed length arr should always work")
],
ColorMatrixType::Saturate(v) | ColorMatrixType::HueRotate(v) => {
gen_attrs![b"values": v]
}
ColorMatrixType::LuminanceToAlpha => Vec::new(),
}
}
fn tag_name(&self) -> &'static str {
"feColorMatrix"
}
}
#[derive(Debug)]
pub enum ColorMatrixType {
Matrix(Box<[f32; 20]>),
Saturate(f32),
HueRotate(f32),
LuminanceToAlpha,
}

@@ -0,0 +1,134 @@
use quick_xml::{events::attributes::Attribute, Writer};
use super::WriteElement;
/// [feComponentTransfer](https://www.w3.org/TR/SVG11/filters.html#feComponentTransferElement)
#[derive(Debug)]
pub struct ComponentTransfer {
pub func_r: TransferFn,
pub func_g: TransferFn,
pub func_b: TransferFn,
pub func_a: TransferFn,
}
impl WriteElement for ComponentTransfer {
fn attrs(&self) -> Vec<quick_xml::events::attributes::Attribute> {
Vec::new()
}
fn tag_name(&self) -> &'static str {
"feComponentTransfer"
}
fn element_writer<'writer, 'result>(
&self,
writer: &'writer mut quick_xml::Writer<&'result mut Vec<u8>>,
common: crate::types::nodes::CommonAttrs,
inputs: Vec<String>,
output: Option<String>,
) -> quick_xml::Result<&'writer mut quick_xml::Writer<&'result mut Vec<u8>>> {
let inputs: Vec<_> = inputs
.into_iter()
.enumerate()
.map(|(i, edge)| {
(
match i {
0 => "in".to_owned(),
n => format!("in{}", n + 1),
}
.into_bytes(),
edge.into_bytes(),
)
})
.collect();
let mut el_writer = writer
.create_element(self.tag_name())
.with_attributes(inputs.iter().map(|(k, v)| (&k[..], &v[..])))
.with_attributes(Into::<Vec<Attribute<'_>>>::into(common));
if let Some(output) = output {
el_writer = el_writer.with_attribute(("result", output.as_str()));
}
el_writer.write_inner_content(|writer| {
self.func_r.write_self(writer, "feFuncR")?;
self.func_g.write_self(writer, "feFuncG")?;
self.func_b.write_self(writer, "feFuncB")?;
self.func_a.write_self(writer, "feFuncA")?;
Ok(())
})
}
}
/// [transfer functions](https://www.w3.org/TR/SVG11/filters.html#transferFuncElements)
#[derive(Debug, Clone)]
pub enum TransferFn {
Identity,
Table {
table_values: Vec<f32>,
},
Discrete {
table_values: Vec<f32>,
},
Linear {
slope: f32,
intercept: f32,
},
Gamma {
amplitude: f32,
exponent: f32,
offset: f32,
},
}
impl TransferFn {
#[allow(clippy::str_to_string, reason = "inside macro call")]
fn write_self<'writer, 'result>(
&self,
target: &'writer mut Writer<&'result mut Vec<u8>>,
name: &'static str,
) -> quick_xml::Result<&'writer mut Writer<&'result mut Vec<u8>>> {
target
.create_element(name)
.with_attributes(match self {
TransferFn::Identity => gen_attrs![b"type": "identity"],
TransferFn::Table { table_values } => gen_attrs![
b"type": "table",
b"tableValues": table_values
.iter()
.map(std::string::ToString::to_string)
.reduce(|mut acc, e| {
acc.push(' ');
acc.push_str(&e);
acc
}).expect("empty tables disallowed")
],
TransferFn::Discrete { table_values } => gen_attrs![
b"type": "discrete",
b"tableValues": table_values
.iter()
.map(std::string::ToString::to_string)
.reduce(|mut acc, e| {
acc.push(' ');
acc.push_str(&e);
acc
}).expect("empty tables disallowed")
],
TransferFn::Linear { slope, intercept } => gen_attrs![
b"type": "linear",
b"slope": slope,
b"intercept": intercept
],
TransferFn::Gamma {
amplitude,
exponent,
offset,
} => gen_attrs![
b"type": "gamma",
b"amplitude": amplitude,
b"exponent": exponent,
b"offset": offset
],
})
.write_empty()
}
}
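Each transfer function becomes an feFunc* child element; for instance, a linear transfer on the red channel should serialize as below (a sketch using the private `write_self` from inside this module):

#[cfg(test)]
mod tests {
use super::*;
#[test]
fn linear_transfer_serialization() {
let mut buf = Vec::new();
let mut writer = Writer::new(&mut buf);
TransferFn::Linear { slope: 0.5, intercept: 0.25 }
.write_self(&mut writer, "feFuncR")
.unwrap();
assert_eq!(
String::from_utf8(buf).unwrap(),
r#"<feFuncR type="linear" slope="0.5" intercept="0.25"/>"#
);
}
}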

@@ -0,0 +1,86 @@
use std::borrow::Cow;
use quick_xml::{events::attributes::Attribute, name::QName};
use super::WriteElement;
/// [feComposite](https://www.w3.org/TR/SVG11/filters.html#feCompositeElement)
#[derive(Debug)]
pub struct Composite {
operator: CompositeOperator,
}
impl Composite {
pub fn new(op: CompositeOperator) -> Self {
Self { operator: op }
}
pub fn arithmetic(k1: f32, k2: f32, k3: f32, k4: f32) -> Self {
Self {
operator: CompositeOperator::Arithmetic { k1, k2, k3, k4 },
}
}
}
#[derive(Debug)]
pub enum CompositeOperator {
Over,
In,
Out,
Atop,
Xor,
Arithmetic { k1: f32, k2: f32, k3: f32, k4: f32 },
}
impl WriteElement for Composite {
fn attrs(&self) -> Vec<quick_xml::events::attributes::Attribute> {
let (op_name, vals) = match self.operator {
CompositeOperator::Over => ("over", None),
CompositeOperator::In => ("in", None),
CompositeOperator::Out => ("out", None),
CompositeOperator::Atop => ("atop", None),
CompositeOperator::Xor => ("xor", None),
CompositeOperator::Arithmetic { k1, k2, k3, k4 } => {
("arithmetic", Some([k1, k2, k3, k4]))
}
};
let mut r = vec![Attribute {
key: QName(b"operator"),
value: Cow::from(op_name.as_bytes()),
}];
if let Some([k1, k2, k3, k4]) = vals {
r.append(&mut gen_attrs![
b"k1": k1,
b"k2": k2,
b"k3": k3,
b"k4": k4
]);
}
r
}
fn tag_name(&self) -> &'static str {
"feComposite"
}
}
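`Arithmetic` is the only operator that carries extra attributes; a quick sketch of the attribute counts:

#[cfg(test)]
mod tests {
use super::*;
#[test]
fn arithmetic_carries_k_values() {
// `operator` plus k1..k4
assert_eq!(Composite::arithmetic(0.5, 0.5, 0.0, 0.0).attrs().len(), 5);
// every other operator emits only `operator`
assert_eq!(Composite::new(CompositeOperator::Over).attrs().len(), 1);
}
}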

@@ -0,0 +1,20 @@
#[derive(Debug)]
pub struct ConvolveMatrix {
order: (u16, u16),
// invariant: length must be checked to equal order.0 * order.1
kernel_matrix: Vec<f32>,
divisor: f32,
bias: f32,
target_x: i32,
target_y: i32,
edge_mode: EdgeMode,
kernel_unit_length: (f32, f32),
preserve_alpha: bool,
}
#[derive(Debug)]
enum EdgeMode {
None,
Duplicate,
Wrap,
}

@@ -0,0 +1,3 @@
// TODO
#[derive(Debug)]
pub struct DiffuseLighting;

@@ -0,0 +1,34 @@
use super::WriteElement;
/// [feDisplacementMap](https://www.w3.org/TR/SVG11/filters.html#feDisplacementMapElement)
#[derive(Debug)]
pub struct DisplacementMap {
pub scale: f32,
pub x_channel_selector: Channel,
pub y_channel_selector: Channel,
}
impl WriteElement for DisplacementMap {
fn attrs(&self) -> Vec<quick_xml::events::attributes::Attribute> {
let mut r = Vec::new();
gen_attrs![
r;
self.scale != 0. => b"scale": self.scale,
self.x_channel_selector != Channel::A => b"xChannelSelector": format!("{:?}", self.x_channel_selector),
self.y_channel_selector != Channel::A => b"yChannelSelector": format!("{:?}", self.y_channel_selector)
];
r
}
fn tag_name(&self) -> &'static str {
"feDisplacementMap"
}
}
#[derive(Debug, PartialEq, Eq)]
pub enum Channel {
A,
R,
G,
B,
}
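The conditional `gen_attrs!` form skips spec defaults: `scale` is dropped at `0.` and a channel selector is dropped when it is the default `A`. A sketch:

#[cfg(test)]
mod tests {
use super::*;
#[test]
fn spec_defaults_are_omitted() {
let map = DisplacementMap {
scale: 20.0,
x_channel_selector: Channel::R,
y_channel_selector: Channel::A,
};
// only `scale` and `xChannelSelector` pass the default checks
assert_eq!(map.attrs().len(), 2);
}
}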

@@ -0,0 +1,23 @@
use csscolorparser::Color;
use super::WriteElement;
/// [feFlood](https://www.w3.org/TR/SVG11/filters.html#feFloodElement)
#[derive(Debug)]
pub struct Flood {
pub flood_color: Color,
pub flood_opacity: f32,
}
impl WriteElement for Flood {
fn attrs(&self) -> Vec<quick_xml::events::attributes::Attribute> {
gen_attrs![
b"flood-color": self.flood_color.to_hex_string(),
b"flood-opacity": self.flood_opacity
]
}
fn tag_name(&self) -> &'static str {
"feFlood"
}
}

@@ -0,0 +1,35 @@
use std::borrow::Cow;
use quick_xml::{events::attributes::Attribute, name::QName};
use super::WriteElement;
/// [feGaussianBlur](https://www.w3.org/TR/SVG11/filters.html#feGaussianBlurElement)
#[derive(Debug)]
pub struct GaussianBlur {
std_deviation: (u16, u16),
}
impl GaussianBlur {
pub fn single(v: u16) -> Self {
Self {
std_deviation: (v, v),
}
}
pub fn with_xy(x: u16, y: u16) -> Self {
Self {
std_deviation: (x, y),
}
}
}
impl WriteElement for GaussianBlur {
fn attrs(&self) -> Vec<quick_xml::events::attributes::Attribute> {
gen_attrs![b"stdDeviation": format!("{} {}", self.std_deviation.0, self.std_deviation.1)]
}
fn tag_name(&self) -> &'static str {
"feGaussianBlur"
}
}

@@ -0,0 +1,3 @@
// TODO
#[derive(Debug)]
pub struct Image;

@@ -0,0 +1,3 @@
// TODO
#[derive(Debug)]
pub struct Merge;

@@ -0,0 +1,37 @@
use super::WriteElement;
use std::fmt::Display;
/// [feMorphology](https://www.w3.org/TR/SVG11/filters.html#feMorphologyElement)
#[derive(Debug)]
pub struct Morphology {
operator: Operator,
radius: (f32, f32),
}
impl WriteElement for Morphology {
fn attrs(&self) -> Vec<quick_xml::events::attributes::Attribute> {
gen_attrs![
b"operator": self.operator,
b"radius": format!("{} {}", self.radius.0, self.radius.1)
]
}
fn tag_name(&self) -> &'static str {
"feMorphology"
}
}
#[derive(Debug)]
enum Operator {
Erode,
Dilate,
}
impl Display for Operator {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.write_str(match self {
Operator::Erode => "erode",
Operator::Dilate => "dilate",
})
}
}

@@ -0,0 +1,24 @@
use super::WriteElement;
/// [feOffset](https://www.w3.org/TR/SVG11/filters.html#feOffsetElement)
#[derive(Debug)]
pub struct Offset {
dx: f32,
dy: f32,
}
impl Offset {
pub fn new(dx: f32, dy: f32) -> Self {
Self { dx, dy }
}
}
impl WriteElement for Offset {
fn attrs(&self) -> Vec<quick_xml::events::attributes::Attribute> {
gen_attrs![b"dx": self.dx, b"dy": self.dy]
}
fn tag_name(&self) -> &'static str {
"feOffset"
}
}

Some files were not shown because too many files have changed in this diff.