author     Martin Fischer <martin@push-f.com>  2023-09-10 19:37:34 +0200
committer  Martin Fischer <martin@push-f.com>  2023-09-28 10:36:08 +0200
commit     852d5c6f2e65a5ab466662ae1c649a0ed25c70a9 (patch)
tree       96d6bcdb2f2274f1081a0b6cfbde314f319159a1 /integration_tests/tests
parent     a03cea75d9d120a7519be91ec872b143b5d74276 (diff)
break!: move offsets out of Token
Previously the Token enum contained the offsets using the O generic type parameter, which could be a usize if you're tracking offsets, or a zero-sized type if you don't care about them. This commit moves all the byte offset and syntax information to a new Trace enum, which has several advantages:

* Traces can now easily be stored separately while the tokens are fed to the tree builder. (The tree builder only has to keep track of which tree nodes originate from which tokens.)
* No needless generics for functions that take a token but don't care about offsets (a tree construction implementation is bound to have many such functions).
* The FromIterator<(String, String)> impl for AttributeMap no longer has to specify arbitrary values for the spans and the value_syntax.
* The PartialEq implementation of Token is now much more useful, since it no longer includes all the offsets.
* The Debug formatting of Token is now more readable, since it no longer includes all the offsets.
* Function pointers to functions accepting tokens are now possible, since function pointer types may not have generic parameters.
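For illustration, here is a minimal sketch of the before/after shape this change describes. The names Token and Trace come from the commit message; everything else (TokenBefore, TokenSink, the specific variants and fields) is an assumption for the sketch and not the crate's actual definitions.

    use std::ops::Range;

    // Sketch only: hypothetical, simplified shapes, not the crate's real API.

    // Before: the token itself carried offsets via the generic `O`, so every
    // consumer had to be generic over `O` even when it ignored offsets.
    pub enum TokenBefore<O> {
        Comment { data: String, data_offset: O },
        Char { c: char, offset: O },
    }

    // After: tokens hold only content ...
    #[derive(Debug, PartialEq)]
    pub enum Token {
        Comment(String),
        Char(char),
    }

    // ... while byte offsets and syntax details live in a separate Trace value
    // that can be stored apart from the tokens fed to the tree builder.
    pub enum Trace {
        Comment { data_span: Range<usize> },
        Char { span: Range<usize> },
    }

    // Function pointers taking tokens become possible, since `Token`
    // no longer has a generic parameter.
    pub type TokenSink = fn(Token);

    fn main() {
        let token = Token::Comment("hello".into());
        let trace = Trace::Comment { data_span: 4..11 };
        if let (Token::Comment(data), Trace::Comment { data_span }) = (&token, &trace) {
            // The token compares and debug-prints cleanly because it no longer
            // embeds offsets; the span is available separately when needed.
            println!("comment {:?} at byte range {:?}", data, data_span);
        }
    }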
Diffstat (limited to 'integration_tests/tests')
-rw-r--r--  integration_tests/tests/test_html5lib.rs  4
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/integration_tests/tests/test_html5lib.rs b/integration_tests/tests/test_html5lib.rs
index eac11dd..42d93f1 100644
--- a/integration_tests/tests/test_html5lib.rs
+++ b/integration_tests/tests/test_html5lib.rs
@@ -107,7 +107,7 @@ fn run_test_inner<R, O, E, T>(
R: Reader + Position<O>,
O: Offset,
E: Emitter<O> + Iterator<Item = T> + DrainErrors<O>,
- T: Into<Token<O>>,
+ T: Into<Token>,
{
println!(
"==== FILE {}, TEST {}, STATE {:?}, TOKENIZER {} ====",
@@ -156,7 +156,7 @@ fn run_test_inner<R, O, E, T>(
actual_tokens.push(TestToken::Character(c.into()));
}
}
- Token::Comment(comment) => actual_tokens.push(TestToken::Comment(comment.data)),
+ Token::Comment(comment) => actual_tokens.push(TestToken::Comment(comment)),
Token::Doctype(doctype) => actual_tokens.push(TestToken::Doctype {
name: doctype.name,
public_id: doctype.public_id,