path: root/integration_tests
author     Martin Fischer <martin@push-f.com>   2023-08-28 19:04:58 +0200
committer  Martin Fischer <martin@push-f.com>   2023-09-03 23:00:05 +0200
commit     f31bffb8426f04aaadea911e7c42b130a9ee80a5 (patch)
tree       9e6e999f7a23baa797e2e7c99d0c6445866b06db /integration_tests
parent     5600dd2fd373879bede0949544cde71c232eb4f4 (diff)
fix!: remove adjusted_current_node_present_and_not_in_html_namespace
Conceptually the tokenizer emits tokens, which are then handled in the
tree construction stage (which this crate doesn't yet implement).

While the tokenizer can operate almost entirely based on its state
(which may be changed via Tokenizer::set_state) and its internal state,
there is the exception of the 'Markup declaration open state'[1], the
third condition of which depends on the "adjusted current node", which
in turn depends on the "stack of open elements" known only to the tree
constructor.

In 82898967320f90116bbc686ab7ffc2f61ff456c4 I tried to address this by
adding the adjusted_current_node_present_and_not_in_html_namespace
method to the Emitter trait. What I missed was that adding this method
to the Emitter trait effectively crippled the composability of the API.
You should be able to do the following:

    struct TreeConstructor<R, O> {
        tokenizer: Tokenizer<R, O, SomeEmitter<O>>,
        stack_of_open_elements: Vec<NodeId>,
        // ...
    }

However this doesn't work if the implementation of SomeEmitter depends
on the stack_of_open_elements field.

This commit remedies this oversight by removing this method and instead
making the Tokenizer yield values of a new Event enum:

    enum Event<T> {
        Token(T),
        CdataOpen,
    }

Event::CdataOpen signals that the new Tokenizer::handle_cdata_open
method has to be called, which accepts a CdataAction:

    enum CdataAction {
        Cdata,
        BogusComment,
    }

the variants of which correspond exactly to the possible outcomes of
the third condition of the 'Markup declaration open state'.

Removing this method also has the added benefit that the DefaultEmitter
is now again spec-compliant, which lets us expose it again in the next
commit in good conscience (previously it just hard-coded the method
implementation to return false, which is why I had removed the
DefaultEmitter from the public API in the last release).

[1]: https://html.spec.whatwg.org/multipage/parsing.html#markup-declaration-open-state
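For illustration only (not part of this commit), a minimal sketch of how
a downstream tree constructor could drive the tokenizer under the new
Event API. SomeEmitter, NodeId, handle_token and the
adjusted_current_node_is_foreign helper are hypothetical placeholders
for the tree constructor's own types and logic, and trait bounds are
only approximate:

    struct TreeConstructor<R, O> {
        tokenizer: Tokenizer<R, O, SomeEmitter<O>>,
        stack_of_open_elements: Vec<NodeId>,
    }

    impl<R: Reader, O> TreeConstructor<R, O> {
        fn run(&mut self) {
            while let Some(event) = self.tokenizer.next() {
                match event.unwrap() {
                    Event::CdataOpen => {
                        // The third condition of the 'Markup declaration
                        // open state' is decided here, because only the
                        // tree constructor knows the stack of open
                        // elements: CDATA in foreign content, a bogus
                        // comment otherwise.
                        let action = if self.adjusted_current_node_is_foreign() {
                            CdataAction::Cdata
                        } else {
                            CdataAction::BogusComment
                        };
                        self.tokenizer.handle_cdata_open(action);
                    }
                    Event::Token(token) => self.handle_token(token),
                }
            }
        }

        // ... adjusted_current_node_is_foreign, handle_token, etc.
    }

Because the emitter no longer has to answer the adjusted-current-node
question, the emitter type stays independent of the stack of open
elements, which is exactly the composability the struct above relies on.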
Diffstat (limited to 'integration_tests')
-rw-r--r--  integration_tests/tests/test_html5lib.rs  13
1 file changed, 10 insertions, 3 deletions
diff --git a/integration_tests/tests/test_html5lib.rs b/integration_tests/tests/test_html5lib.rs
index fd69524..f351f85 100644
--- a/integration_tests/tests/test_html5lib.rs
+++ b/integration_tests/tests/test_html5lib.rs
@@ -4,7 +4,8 @@ use html5lib_tests::{
parse_tests, Error as TestError, InitialState, Output, Test, Token as TestToken,
};
use html5tokenizer::{
- offset::NoopOffset, reader::Reader, DefaultEmitter, InternalState, Token, Tokenizer,
+ offset::NoopOffset, reader::Reader, CdataAction, DefaultEmitter, Event, InternalState, Token,
+ Tokenizer,
};
use similar_asserts::assert_eq;
@@ -119,8 +120,14 @@ fn run_test_inner<R: Reader>(
tokens: Vec::new(),
};
- for token in tokenizer {
- let token = token.unwrap();
+ while let Some(event) = tokenizer.next() {
+ let token = match event.unwrap() {
+ Event::CdataOpen => {
+ tokenizer.handle_cdata_open(CdataAction::BogusComment);
+ continue;
+ }
+ Event::Token(token) => token,
+ };
match token {
Token::Error { error, .. } => actual.errors.push(TestError {