5 releases

| Version | Date |
|---|---|
| 0.1.4 | Mar 31, 2022 |
| 0.1.3 | Mar 29, 2022 |
| 0.1.2 | Mar 28, 2022 |
| 0.1.1 | Mar 28, 2022 |
| 0.1.0 | Mar 28, 2022 |
lexer-generator

This crate is a small-scale lexer generator: tokens are defined as Regex patterns in a JSON file, with user-customizable token names.
Example: Basic Tokenizing
Potential code one might use to lex tokens for a calculator
key.json:

{
    "literals": {
        "number": "[0-9]+(\\.[0-9]+)?",
        "subtract": "-",
        "add": "\\+",
        "divide": "/",
        "multiply": "\\*"
    },
    "whitespace": "\n| |\r|\t"
}

(Note: the `number` pattern uses `+` and `?` so that it cannot match an empty string; a quantifier written as `{0, 1}` with a space inside is not valid Regex syntax.)
main.rs:

let json: String = std::fs::read_to_string("key.json").unwrap();
let source: String = String::from("123 + 456 * 789");
let mut lexer = Lexer::from(json, source);
while !lexer.done() {
    println!("{}", lexer.next_token().unwrap());
}
Output:

number(123)
add(+)
number(456)
multiply(*)
number(789)
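The `done()`/`next_token()` loop above is the crate's incremental interface: each call consumes one token and whitespace is skipped rather than emitted. As a rough illustration of that pattern, here is a self-contained, hand-rolled sketch with the calculator's patterns hard-coded instead of loaded from JSON; `MiniLexer` and its internals are invented for this example and are not this crate's API.

```rust
// Illustrative stand-in for the crate's incremental lexer: not its real API.
struct MiniLexer {
    source: Vec<char>,
    pos: usize,
}

impl MiniLexer {
    fn from(source: &str) -> Self {
        MiniLexer { source: source.chars().collect(), pos: 0 }
    }

    // True once every character has been consumed.
    fn done(&self) -> bool {
        self.pos >= self.source.len()
    }

    // Skips whitespace, then returns the next (token_name, lexeme) pair,
    // mirroring the crate's Token("number", "123")-style output.
    fn next_token(&mut self) -> Option<(String, String)> {
        while !self.done() && self.source[self.pos].is_whitespace() {
            self.pos += 1;
        }
        if self.done() {
            return None;
        }
        let c = self.source[self.pos];
        if c.is_ascii_digit() {
            // Greedily consume digits and '.' as one "number" lexeme.
            let start = self.pos;
            while !self.done()
                && (self.source[self.pos].is_ascii_digit() || self.source[self.pos] == '.')
            {
                self.pos += 1;
            }
            let lexeme: String = self.source[start..self.pos].iter().collect();
            return Some(("number".to_string(), lexeme));
        }
        // Single-character operator literals.
        self.pos += 1;
        let name = match c {
            '+' => "add",
            '-' => "subtract",
            '*' => "multiply",
            '/' => "divide",
            _ => "unknown",
        };
        Some((name.to_string(), c.to_string()))
    }
}

fn main() {
    let mut lexer = MiniLexer::from("123 + 456 * 789");
    while !lexer.done() {
        if let Some((name, lexeme)) = lexer.next_token() {
            println!("{}({})", name, lexeme);
        }
    }
}
```

Run against `"123 + 456 * 789"`, this prints the same five lines as the example output above.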