COMPILER CONSTRUCTION TOOLS:
The compiler writer, like any programmer, can profitably use software tools such as debuggers, version managers, profilers, and so on. The compiler construction tools are:
• Parser generators
• Scanner generators
• Syntax-directed translation engines
• Automatic code generators
• Dataflow engines
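As a taste of how such tools work, a scanner generator can be sketched in a few lines of Python: given a specification pairing token names with regular expressions, it builds a lexical analyser automatically. The token names and patterns below are illustrative assumptions, not a real tool's input format.

```python
import re

# Hypothetical token specification in the spirit of a scanner generator's
# input: (token name, regular expression), tried in order.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("ID",     r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"[ \t]+"),        # white space: matched but not emitted
]

def make_scanner(spec):
    """Generate a scanner function from the regex-based specification."""
    pattern = "|".join(f"(?P<{name}>{regex})" for name, regex in spec)
    compiled = re.compile(pattern)
    def scan(text):
        for m in compiled.finditer(text):
            if m.lastgroup != "SKIP":
                yield (m.lastgroup, m.group())
    return scan

scan = make_scanner(TOKEN_SPEC)
print(list(scan("x = 42 + y")))
# → [('ID', 'x'), ('OP', '='), ('NUMBER', '42'), ('OP', '+'), ('ID', 'y')]
```

The point of the generator is that changing the language's lexical structure only means editing the specification table, not the scanning code.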

PARSER GENERATORS:
These produce syntax analysers, normally from input that is based on a context-free grammar (CFG). Eg: PIC, EQM

SCANNER GENERATORS:
These automatically generate lexical analysers, normally from a specification based on regular expressions.

SYNTAX-DIRECTED TRANSLATION ENGINES:
These produce intermediate code in three-address format, normally from input that is based on the parse tree.

AUTOMATIC CODE GENERATORS:

These take a collection of rules that define the translation of each operation of the intermediate language into the machine language for the target machine. The input specification for these systems may contain:
1. A description of the lexical and syntactic structure of the source language.
2. A description of what output is to be generated for each source-language construct.
3. A description of the target machine.
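The rule-driven translation described above can be sketched as a table of instruction templates, one per intermediate-language operation. The quadruple format and the mnemonics ADD/SUB/MUL below are illustrative assumptions, not a real target machine.

```python
# Each rule maps one intermediate-language operator to a target-machine
# instruction template (hypothetical mnemonics for illustration).
RULES = {
    "+": "ADD {dst}, {a}, {b}",
    "-": "SUB {dst}, {a}, {b}",
    "*": "MUL {dst}, {a}, {b}",
}

def generate(three_address_code):
    """Translate (dst, op, a, b) quadruples into target instructions."""
    return [RULES[op].format(dst=dst, a=a, b=b)
            for dst, op, a, b in three_address_code]

# Intermediate code for: t1 = a + b ; t2 = t1 * c
code = generate([("t1", "+", "a", "b"), ("t2", "*", "t1", "c")])
print(code)  # → ['ADD t1, a, b', 'MUL t2, t1, c']
```

Retargeting the compiler to a different machine then amounts to supplying a different rule table.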

DATAFLOW ENGINES:

Much of the information needed to perform good code optimization involves "data-flow analysis", the gathering of information about how values are transmitted from one part of a program to each other part.

These systems have often been referred to as:
• Compiler-compilers
• Compiler-generators
• Translator-writing systems

ROLE OF LEXICAL ANALYSER:
To read the input characters and produce as output a sequence of tokens that the parser uses for syntax analysis. Upon receiving a "get next token" command from the parser, the lexical analyser reads input characters until it can identify the next token.

[Diagram: source program → lexical analyser → tokens → parser; the parser issues "get next token" requests, and both components consult the symbol table]

Its secondary tasks are:
1. One task is stripping out from the source program comments and white space in the form of blank, tab, and newline characters.
2. Another task is correlating error messages from the compiler with the source program; for this it keeps track of line numbers.
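The parser/lexer interaction described above can be sketched as a Python generator: each next() call plays the role of a "get next token" command, and white space is stripped as a secondary task. The token names are illustrative.

```python
# Minimal sketch of a lexical analyser that yields one token per
# "get next token" request from the parser.
def lexical_analyser(source):
    i = 0
    while i < len(source):
        ch = source[i]
        if ch in " \t\n":                      # secondary task: strip white space
            i += 1
        elif ch.isdigit():                     # read characters until the
            j = i                              # next token is identified
            while j < len(source) and source[j].isdigit():
                j += 1
            yield ("NUMBER", source[i:j])
            i = j
        elif ch.isalpha():
            j = i
            while j < len(source) and source[j].isalnum():
                j += 1
            yield ("ID", source[i:j])
            i = j
        else:
            yield ("OP", ch)
            i += 1

tokens = lexical_analyser("count + 10")
print(next(tokens))  # the parser's first "get next token" → ('ID', 'count')
print(next(tokens))  # → ('OP', '+')
```

Because the generator is lazy, the lexical analyser does no work until the parser actually asks for the next token, mirroring the command-driven interface in the diagram.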

Sometimes the lexical analyser is divided into two phases:
1. Scanning
2. Lexical analysis
The scanner is responsible for doing simple tasks, while the lexical analyser proper does the more complex operations.

FUNCTIONS:
1. It produces the stream of tokens.
2. It eliminates blanks and comments.
3. It generates the symbol table, which stores information about identifiers and constants encountered in the input.
4. It keeps track of line numbers.
5. It reports the errors encountered while generating the tokens.

ISSUES IN LEXICAL ANALYSIS:
There are several reasons for separating the analysis phase of compiling into lexical analysis and parsing:
• Simpler design.
• Compiler efficiency is improved.
• Compiler portability is enhanced.

TOKEN:
It is a sequence of characters that can be treated as a single logical entity. Typical tokens are:
1. Identifiers
2. Keywords
3. Operators
4. Special symbols
5. Constants

PATTERN:

A set of strings in the input for which the same token is produced as output. This set of strings is described by a rule called a pattern associated with the token.

LEXEME:
It is a sequence of characters in the source program that is matched by the pattern for a token.
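The distinction between token, pattern, and lexeme can be made concrete with Python's re module: the regular expression plays the role of the pattern for a token NUMBER, and the strings it matches in the source are the lexemes.

```python
import re

# The PATTERN is the rule describing the set of strings for a token;
# each matched string in the source is a LEXEME of that token.
pattern = re.compile(r"\d+")        # pattern for the token NUMBER
source = "x = 31 + 28"
lexemes = pattern.findall(source)   # lexemes matched by the pattern
print(lexemes)  # → ['31', '28']
```

Here NUMBER is the token, `\d+` is its pattern, and '31' and '28' are two distinct lexemes that both produce the same token as output.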