
Practical - 3

Aim :- Installation of LEX Tool

Lex is a lexical analyzer generator, a tool widely used in computer science to
create scanners for computer programs. A scanner is a program that reads its
input character by character and produces a sequence of tokens, the smallest
syntactic units of a programming language.
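
The original AT&T Lex is proprietary; on modern systems the open-source flex
package is the usual implementation. A minimal installation sketch for a
Debian-based Linux system might look like the following (package names and the
package manager differ on other distributions, e.g. dnf on Fedora or brew on
macOS):

```shell
# flex is the widely used open-source Lex implementation.
sudo apt-get update
sudo apt-get install -y flex

# Verify the installation by printing the installed version.
flex --version
```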

Features of LEX :-
● Lex is a tool that takes a file containing regular expressions and generates
a scanner program that can recognize these expressions in an input
stream. The input file is divided into three sections separated by %%: the
first contains declarations that define the tokens and other variables used
by the scanner, the second contains rules that specify how the scanner
should recognize the tokens, and the third contains user-defined code.
● Lex uses regular expressions to describe the patterns that it should
match. A regular expression is a pattern that specifies a set of strings. For
example, the regular expression [a-zA-Z]+ matches one or more
consecutive letters of the alphabet, while the regular expression [0-9]+
matches one or more consecutive digits. The rules in the input file use
regular expressions to specify the patterns that the scanner should
recognize and the actions that should be taken when a pattern is
matched.
● When a scanner is generated by Lex, it is typically written in C or another
programming language. The generated scanner reads input from a file
or a stream, and when a pattern is matched, it calls a user-defined
function that performs some action based on the token that was
recognized.
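
The section layout and rule/action pairing described above can be illustrated
with a small, hypothetical specification that counts words and numbers on
standard input (the patterns and variable names here are illustrative, not
taken from the original text):

```lex
%{
/* Declarations section: C code copied verbatim into the scanner. */
#include <stdio.h>
int words = 0, numbers = 0;
%}
%%
[a-zA-Z]+   { words++; }      /* one or more letters: count as a word   */
[0-9]+      { numbers++; }    /* one or more digits: count as a number  */
.|\n        { /* ignore everything else */ }
%%
/* User code section: a driver that runs the scanner on stdin. */
int main(void) {
    yylex();
    printf("words=%d numbers=%d\n", words, numbers);
    return 0;
}
int yywrap(void) { return 1; }  /* signal that there is no further input */
```

With flex installed, this would typically be built with
`flex count.l && cc lex.yy.c -o count`.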
History of LEX :-
● Lex was first developed in the mid-1970s by Mike Lesk and Eric Schmidt
at Bell Laboratories. It was designed to be used in conjunction with the
Yacc parser generator, which was also developed at Bell Labs. The
combination of Lex and Yacc became a standard part of the Unix
programming toolchain.
● Lex was developed to address the problem of scanning input streams in
programming languages. Prior to the development of Lex, scanning was
typically done using ad-hoc methods that were difficult to maintain and
error-prone. Lex provides a way to generate a scanner automatically
from a high-level description of the token patterns.

Applications of LEX :-
● Lex is used in many applications today, ranging from compilers and
interpreters to text processing utilities. In compilers and
interpreters, Lex is often used in conjunction with Yacc to create complete
language processors for programming languages. In text processing, Lex
is used to recognize and tokenize input data, which can then be
processed by other tools.
● Lex is also used in the field of natural language processing to recognize
and tokenize input text. For example, Lex can be used to recognize
patterns in text that indicate the presence of certain named entities,
such as people, places, and organizations.

Conclusion :-
Lex is a powerful tool for creating scanners that can recognize complex patterns
in input data. Its use of regular expressions makes it easy to describe the
patterns that should be matched, and its ability to generate code in target
programming languages makes it highly flexible. Lex has a long history of use in
the development of programming languages and software tools.
Practical - 4

Aim :- Installation of YACC Tool

Yacc (Yet Another Compiler Compiler) is a powerful tool used in computer
science to generate parsers for programming languages. It was developed in
the early 1970s by Stephen C. Johnson at Bell Labs as part of the
development of the Unix operating system.
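
As with Lex, the original Yacc is rarely installed directly today; the
open-source bison package is the usual Yacc-compatible replacement. A minimal
installation sketch for a Debian-based Linux system might look like the
following (package names differ on other distributions, and byacc is another
common alternative):

```shell
# bison is a widely used Yacc-compatible parser generator.
sudo apt-get update
sudo apt-get install -y bison

# Verify the installation by printing the installed version.
bison --version
```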

Features of YACC :-
● Yacc is a tool that takes as input a specification of the grammar of a
programming language and generates a parser that can recognize and
parse input conforming to that grammar. The input file is divided into
three sections: the first section contains declarations of variables and
other information used by the parser, the second section contains the
grammar rules, and the third section contains user-defined code that is
executed when a rule is matched.
● Yacc uses context-free grammar to describe the syntax of the language
being parsed. A context-free grammar consists of a set of productions,
each of which defines how a non-terminal symbol can be rewritten in
terms of other symbols. For example, a production might define how an
arithmetic expression can be rewritten in terms of operands and
operators.
● When a parser is generated by Yacc, it is typically written in C or another
programming language. The generated parser reads input from a file or a
stream, and when a grammar rule is matched, it calls a user-defined
function that performs some action based on the input that was
recognized.
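
As a sketch of the three-section layout described above, here is a minimal,
hypothetical grammar for one-line arithmetic expressions. It assumes a
companion Lex scanner that returns NUMBER tokens (with the value in yylval)
and single-character operator tokens; none of these names come from the
original text:

```yacc
%{
/* Declarations section: C code and supporting declarations. */
#include <stdio.h>
int yylex(void);
void yyerror(const char *s) { fprintf(stderr, "error: %s\n", s); }
%}
%token NUMBER
%left '+' '-'
%left '*' '/'
%%
/* Grammar rules section: each action runs when its rule is matched. */
input : expr '\n'        { printf("= %d\n", $1); }
      ;
expr  : expr '+' expr    { $$ = $1 + $3; }
      | expr '-' expr    { $$ = $1 - $3; }
      | expr '*' expr    { $$ = $1 * $3; }
      | expr '/' expr    { $$ = $1 / $3; }
      | NUMBER           { $$ = $1; }
      ;
%%
/* User code section: a driver that starts the parser. */
int main(void) { return yyparse(); }
```

The %left declarations resolve the ambiguity in the expression grammar by
giving '*' and '/' higher precedence than '+' and '-'.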
History of YACC :-
● Yacc was developed in the early 1970s by Stephen C. Johnson at Bell
Labs, as part of the development of the Unix operating system. It was
designed to be used in conjunction with the Lex lexical analyzer
generator, which was also developed at Bell Labs.
● The development of Yacc was motivated by the need for a tool that could
automatically generate parsers for programming languages. Prior to the
development of Yacc, parsers were typically written by hand, which was
a time-consuming and error-prone process. Yacc provided a way to
generate parsers automatically from a high-level description of the
grammar of the language being parsed.

Applications of YACC :-
● Yacc is used in many applications today, ranging from compilers and
interpreters to text processing utilities. In the area of compilers and
interpreters, Yacc is often used in conjunction with Lex to create
complete language processors for programming languages. In text
processing, Yacc is used to recognize and parse input data, which can
then be processed by other tools.
● Yacc is also used in the field of natural language processing to recognize
and parse input text. For example, Yacc can be used to recognize and
parse sentences in a natural language, such as English or French.

Conclusion :-
Yacc is a powerful tool for generating parsers for programming languages and
other formal languages. Its use of a context-free grammar makes it easy to
describe the syntax of the language being parsed, and its ability to generate
code in a target programming language makes it highly flexible.
