Tokens are the basic syntactically meaningful portions of an input.
For example, in
print 12+3;
the tokens are print, 12, +, 3, and ;.
Individual characters are not generally meaningful.
Tokenizing is the act of converting a character stream into a token stream.
Also called lexing.
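As a rough sketch (not code from the course), a statement like the one above can be tokenized with a single regex alternation; the token categories below (identifiers, numbers, operators, semicolons) are illustrative assumptions, not a complete lexer for any real language.

    use strict;
    use warnings;

    my $input = 'print 12+3;';

    # Match one token at a time: an identifier, a number, an operator, or a semicolon.
    # \G anchors each match where the previous one ended, so the regex walks the
    # character stream left to right, skipping whitespace between tokens.
    my @tokens;
    while ($input =~ /\G \s* ( [A-Za-z_]\w* | \d+ | [-+*\/] | ; )/gcx) {
        push @tokens, $1;
    }

    print "$_\n" for @tokens;   # print, 12, +, 3, ;

Each pass through the loop consumes exactly one token from the character stream, which is all "converting a character stream into a token stream" amounts to here.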