Lexical analysis

In computer science, lexical analysis is the process of converting a sequence of characters (such as in a computer program or web page) into a sequence of tokens (strings with an identified "meaning"). A program that performs lexical analysis may be called a lexer, tokenizer (www.cs.man.ac.uk), or scanner (though "scanner" is also used to refer to the first stage of a lexer). Such a lexer is generally combined with a parser, and together they analyze the syntax of programming languages, web pages, and so forth.

==Applications==

A lexer forms the first phase of a compiler frontend in modern processing, and is generally done in a single pass. Lexers and parsers are most often used for compilers, but can be applied to other computer language tools, such as prettyprinters or linters.

Lexing itself can be divided into two stages: ''scanning'', which segments the input sequence into groups and categorizes them into token classes; and ''evaluating'', which converts the raw input characters into a processed value.

Lexers are generally quite simple, with most of the complexity deferred to the parser or semantic analysis phases, and can often be generated by a lexer generator, notably lex or its derivatives. However, lexers can sometimes include some complexity, such as phrase structure processing to make input easier and to simplify the parser, and may be written partially or completely by hand, either to support additional features or for performance.
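To make the scanning and evaluating stages concrete, the following is a minimal hand-written lexer sketch in Python. The token classes (NUMBER, IDENT, OP, SKIP) and their regular expressions are illustrative assumptions for this sketch, not a standard set from the article:

<syntaxhighlight lang="python">
import re

# Token specification: each pair maps a token class to a regular expression.
# These classes and patterns are illustrative choices, not a standard set.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),           # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"),  # identifiers
    ("OP",     r"[+\-*/=]"),      # single-character operators
    ("SKIP",   r"\s+"),           # whitespace, discarded
]

# Combine all rules into one regex with named groups, one per token class.
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pattern})"
                                for name, pattern in TOKEN_SPEC))

def lex(text):
    """Yield (token_class, value) pairs from the input string."""
    pos = 0
    while pos < len(text):
        match = MASTER_RE.match(text, pos)
        if not match:
            raise SyntaxError(f"unexpected character {text[pos]!r} at position {pos}")
        kind = match.lastgroup    # scanning: the token class that matched
        lexeme = match.group()    # scanning: the matched segment of input
        pos = match.end()
        if kind == "SKIP":
            continue              # whitespace produces no token
        # Evaluating: convert the raw lexeme into a processed value.
        value = int(lexeme) if kind == "NUMBER" else lexeme
        yield (kind, value)

print(list(lex("count = count + 42")))
# [('IDENT', 'count'), ('OP', '='), ('IDENT', 'count'), ('OP', '+'), ('NUMBER', 42)]
</syntaxhighlight>

A generated lexer (for example, one produced by lex) would instead compile rules like these into a single deterministic finite automaton, but the division of labor is the same: scanning finds the token boundaries and classes, and evaluating produces the token values.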
Source: Wikipedia, the free encyclopedia, "Lexical analysis".