json-lexer: limit the maximum size of a given token
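The idea behind this change can be sketched as follows. This is an illustrative Python analogue only (the real QEMU lexer is written in C, and `MAX_TOKEN_SIZE` here is a hypothetical limit, not the value the patch uses): cap the size of the token being accumulated so unbounded input cannot make the lexer buffer grow without limit.

```python
# Illustrative sketch, not QEMU's implementation. The cap value is assumed.
MAX_TOKEN_SIZE = 64 * 1024


class TokenTooLarge(Exception):
    pass


class BoundedLexer:
    def __init__(self):
        self.token = []  # characters of the token currently being lexed

    def feed(self, ch):
        self.token.append(ch)
        if len(self.token) > MAX_TOKEN_SIZE:
            self.token.clear()  # drop the oversized token so we can recover
            raise TokenTooLarge("token exceeded MAX_TOKEN_SIZE")
```

The key point is that the check runs per character fed in, so the limit is enforced before the token is ever handed to later stages.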
Signed-off-by: Michael Roth <mdroth@linux.vnet.ibm.com>
Signed-off-by: Anthony Liguori <aliguori@us.ibm.com>
json-streamer: limit the maximum recursion depth and maximum token count
json-streamer: make sure to reset token_size after emitting a token list
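The two streamer fixes above can be sketched together. This is a hedged Python sketch with assumed names and limits (the real streamer is C code in QEMU): bound the nesting depth and token count while buffering a message, and make sure the accumulated `token_size` counter is reset whenever a completed token list is emitted, otherwise the limits would misfire on later messages.

```python
# Illustrative sketch only; MAX_NESTING / MAX_TOKEN_COUNT values are assumptions.
MAX_NESTING = 1024
MAX_TOKEN_COUNT = 128 * 1024


class Streamer:
    def __init__(self):
        self.tokens = []
        self.depth = 0
        self.token_size = 0  # bytes buffered for the current message

    def _reset(self):
        self.tokens = []
        self.depth = 0
        self.token_size = 0  # resetting this too is the point of the fix

    def push(self, tok):
        """Return a completed token list, None while still buffering,
        or raise OverflowError when a limit is exceeded."""
        if tok in "{[":
            self.depth += 1
        elif tok in "}]":
            self.depth -= 1
        self.tokens.append(tok)
        self.token_size += len(tok)
        if self.depth > MAX_NESTING or len(self.tokens) > MAX_TOKEN_COUNT:
            self._reset()
            raise OverflowError("message exceeded streamer limits")
        if self.depth == 0:
            out = self.tokens
            self._reset()  # emit the list and clear all counters
            return out
        return None
```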
json-parser: detect premature EOI
json-lexer: reset the lexer state on an invalid token
json-lexer: fix flushing logic to not always go to error state
Currently we flush the lexer by passing in a NULL character. This generally forces the lexer to go to the corresponding TERMINAL state for whatever token type it is currently parsing, emits the token to the...
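The flushing behaviour described above can be illustrated with a toy lexer. This is a sketch under assumptions, not the QEMU code: feeding a sentinel (here `None`, standing in for the NULL character) should emit whatever pending token is complete, rather than unconditionally driving the lexer into its error state.

```python
# Toy single-token-type lexer; names and structure are illustrative only.
class ToyLexer:
    def __init__(self):
        self.buf = ""       # characters of the in-progress token
        self.tokens = []    # emitted tokens
        self.error = False

    def feed(self, ch):
        if ch is None:          # flush request (stands in for the NULL char)
            if self.buf:        # emit the pending token...
                self.tokens.append(self.buf)
                self.buf = ""
            return              # ...without entering the error state
        if ch.isdigit():
            self.buf += ch
        elif ch.isspace():
            if self.buf:
                self.tokens.append(self.buf)
                self.buf = ""
        else:
            self.error = True   # genuinely invalid input
            self.buf = ""
```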
json-lexer: make lexer error-recovery more deterministic
Currently when we reach an error state we effectively flush everything fed to the lexer, which can put us in a state where we keep feeding tokens into the parser at arbitrary offsets in the stream. This makes it...
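The deterministic-recovery idea can be sketched as follows; the details here (the set of boundary characters, the index-based interface) are illustrative assumptions, not the patch's actual mechanism: after an invalid byte, skip input until an unambiguous token boundary so the lexer resumes at a known offset instead of an arbitrary one.

```python
# Characters at which it is safe to resume lexing (assumed set).
RECOVERY_CHARS = set("{}[],: \t\n")


def recover(stream, i):
    """Advance index i past the bad region to the next safe boundary."""
    while i < len(stream) and stream[i] not in RECOVERY_CHARS:
        i += 1
    return i
```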
json-streamer: add handling for JSON_ERROR token/state
This allows a JSON_ERROR state to be passed to the streamer to force a flush of the current tokens and pass a NULL token list to the parser rather than have it churn on bad data. (Alternatively we could just not...
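A minimal sketch of that behaviour, with assumed names and a callback-based interface (the real streamer/parser interface in QEMU differs): when the lexer reports `JSON_ERROR`, the streamer flushes its buffered tokens and hands the parser `None` in place of a token list, so the parser can report an error instead of chewing on bad data.

```python
JSON_ERROR = "JSON_ERROR"  # sentinel token type, name assumed


def stream_token(state, tok, deliver):
    """state: dict with a 'tokens' buffer; deliver: parser callback."""
    if tok == JSON_ERROR:
        state["tokens"] = []  # flush whatever was buffered
        deliver(None)         # None tells the parser: bad input, give up
        return
    state["tokens"].append(tok)
```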
json-parser: propagate error from parser
json-streamer: allow recovery after bad input
Once we detect a malformed message, make sure to reset our state.
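The end-to-end effect of resetting state after a malformed message can be shown with a toy splitter; this is an illustrative sketch only, not how QEMU's streamer is structured: after junk input, the internal counters come back to zero, so the next well-formed message still parses.

```python
def split_messages(text):
    """Toy splitter: yield brace-balanced chunks, skipping junk between them."""
    depth, start, out = 0, None, []
    for i, ch in enumerate(text):
        if ch == "{":
            if depth == 0:
                start = i   # fresh message begins here
            depth += 1
        elif ch == "}" and depth > 0:
            depth -= 1
            if depth == 0:  # message complete: emit and reset implicitly
                out.append(text[start:i + 1])
    return out
```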