Tokenization & Lexical Analysis

Stephan Kreutzer
21 Jul 2021

Loading (deserializing) structured input data into computer memory as an implicit chain of tokens, in order to prepare for subsequent processing: syntactical/semantical analysis, conversion, parsing, translation, or execution.
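The idea above can be sketched in a few lines: a lexer scans the raw character stream and emits a chain of classified tokens that later stages (parsing, translation, execution) consume. This is a minimal illustrative sketch in Python; the token grammar (numbers, identifiers, operators, parentheses) is an assumption for demonstration, not one prescribed by the video.

```python
import re

# Token specification: an assumed minimal grammar for illustration.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),            # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"),   # identifiers
    ("OP",     r"[+\-*/=]"),       # single-character operators
    ("LPAREN", r"\("),
    ("RPAREN", r"\)"),
    ("SKIP",   r"\s+"),            # whitespace, discarded
]
TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    """Deserialize a character stream into a chain (list) of (type, value) tokens."""
    tokens = []
    pos = 0
    while pos < len(text):
        m = TOKEN_RE.match(text, pos)
        if not m:
            raise ValueError(f"Unexpected character {text[pos]!r} at position {pos}")
        if m.lastgroup != "SKIP":
            tokens.append((m.lastgroup, m.group()))
        pos = m.end()
    return tokens

print(tokenize("x = 42 + y"))
# → [('IDENT', 'x'), ('OP', '='), ('NUMBER', '42'), ('OP', '+'), ('IDENT', 'y')]
```

The resulting token chain carries no syntactic structure yet; establishing that structure is the job of the subsequent analysis and parsing stages.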
