
Boolean algebra

From Wikipedia, the free encyclopedia


For other uses, see Boolean algebra (disambiguation).
In mathematics and mathematical logic, Boolean algebra is a branch of algebra. It
differs from elementary algebra in two ways. First, the values of the variables are
the truth values true and false, usually denoted 1 and 0, whereas in elementary
algebra the values of the variables are numbers. Second, Boolean algebra uses
logical operators such as conjunction (and), denoted ∧; disjunction (or), denoted
∨; and negation (not), denoted ¬. Elementary algebra, on the other hand,
uses arithmetic operators such as addition, multiplication, subtraction and
division. So Boolean algebra is a formal way of describing logical operations, in
the same way that elementary algebra describes numerical operations.

Boolean algebra was introduced by George Boole in his first book The Mathematical
Analysis of Logic[1] (1847), and set forth more fully in his An Investigation of
the Laws of Thought (1854).[2] According to Huntington, the term "Boolean algebra"
was first suggested by Sheffer in 1913,[3] although Charles Sanders Peirce gave the
title "A Boolean Algebra with One Constant" to the first chapter of his "The
Simplest Mathematics" in 1880.[4] Boolean algebra has been fundamental in the
development of digital electronics, and is provided for in all modern programming
languages. It is also used in set theory and statistics.[5]

Contents
1 History
2 Values
3 Operations
3.1 Basic operations
3.2 Secondary operations
4 Laws
4.1 Monotone laws
4.2 Nonmonotone laws
4.3 Completeness
4.4 Duality principle
5 Diagrammatic representations
5.1 Venn diagrams
5.2 Digital logic gates
6 Boolean algebras
6.1 Concrete Boolean algebras
6.2 Subsets as bit vectors
6.3 The prototypical Boolean algebra
6.4 Boolean algebras: the definition
6.5 Representable Boolean algebras
7 Axiomatizing Boolean algebra
8 Propositional logic
8.1 Applications
8.2 Deductive systems for propositional logic
8.2.1 Sequent calculus
9 Applications
9.1 Computers
9.2 Two-valued logic
9.3 Boolean operations
9.3.1 Natural language
9.3.2 Digital logic
9.3.3 Naive set theory
9.3.4 Video cards
9.3.5 Modeling and CAD
9.3.6 Boolean searches
10 See also
11 Notes
12 References
13 Further reading
13.1 Historical perspective
14 External links
History
A precursor of Boolean algebra was Gottfried Wilhelm Leibniz's algebra of concepts.
Leibniz's algebra of concepts is deductively equivalent to the Boolean algebra of
sets.[6]

Boole's algebra predated the modern developments in abstract algebra and
mathematical logic; it is however seen as connected to the origins of both fields.
[7] In an abstract setting, Boolean algebra was perfected in the late 19th century
by Jevons, Schröder, Huntington and others, until it reached the modern conception
of an (abstract) mathematical structure.[7] For example, the empirical observation
that one can manipulate expressions in the algebra of sets, by translating them
into expressions in Boole's algebra, is explained in modern terms by saying that
the algebra of sets is a Boolean algebra (note the indefinite article). In fact, M.
H. Stone proved in 1936 that every Boolean algebra is isomorphic to a field of
sets.

In the 1930s, while studying switching circuits, Claude Shannon observed that one
could also apply the rules of Boole's algebra in this setting,[8] and he introduced
switching algebra as a way to analyze and design circuits by algebraic means in
terms of logic gates. Shannon already had at his disposal the abstract mathematical
apparatus, thus he cast his switching algebra as the two-element Boolean algebra.
In modern circuit engineering settings, there is little need to consider other
Boolean algebras, thus "switching algebra" and "Boolean algebra" are often used
interchangeably.[9][10][11]

Efficient implementation of Boolean functions is a fundamental problem in the
design of combinational logic circuits. Modern electronic design automation tools
for VLSI circuits often rely on an efficient representation of Boolean functions
known as (reduced ordered) binary decision diagrams (BDD) for logic synthesis and
formal verification.[12]

Logic sentences that can be expressed in classical propositional calculus have an
equivalent expression in Boolean algebra. Thus, Boolean logic is sometimes used to
denote propositional calculus performed in this way.[13][14][15] Boolean algebra is
not sufficient to capture logic formulas using quantifiers, like those from first
order logic.

Although the development of mathematical logic did not follow Boole's program, the
connection between his algebra and logic was later put on firm ground in the
setting of algebraic logic, which also studies the algebraic systems of many other
logics.[7] The problem of determining whether the variables of a given Boolean
(propositional) formula can be assigned in such a way as to make the formula
evaluate to true is called the Boolean satisfiability problem (SAT), and is of
importance to theoretical computer science, being the first problem shown to be NP-
complete. The closely related model of computation known as a Boolean circuit
relates time complexity (of an algorithm) to circuit complexity.
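The satisfiability problem mentioned above can be illustrated with a minimal brute-force sketch in Python (the function name and formula encoding are illustrative, not part of any standard API); the exponential sweep over all assignments is exactly what makes SAT's NP-completeness significant.

```python
from itertools import product

# Sketch: brute-force Boolean satisfiability check. A formula is encoded as a
# Python function of its variables; we try every assignment of truth values.
def is_satisfiable(formula, nvars):
    return any(formula(*bits) for bits in product((False, True), repeat=nvars))

# (x or y) and (not x or not y) -- satisfied by any assignment with x != y
assert is_satisfiable(lambda x, y: (x or y) and (not x or not y), 2)
# x and not x -- a contradiction, so no satisfying assignment exists
assert not is_satisfiable(lambda x: x and not x, 1)
```

For n variables this checks all 2^n assignments, which quickly becomes infeasible; practical SAT solvers use far more sophisticated search.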

Values
Whereas expressions denote mainly numbers in elementary algebra, in Boolean
algebra, they denote the truth values false and true. These values are represented
with the bits (or binary digits), namely 0 and 1. They do not behave like the
integers 0 and 1, for which 1 + 1 = 2, but may be identified with the elements of
the two-element field GF(2), that is, integer arithmetic modulo 2, for which 1 + 1
= 0. Addition and multiplication then play the Boolean roles of XOR (exclusive-or)
and AND (conjunction), respectively, with disjunction x ∨ y (inclusive-or)
definable as x + y - xy and negation ¬x as 1 − x. In GF(2), − may be replaced by +,
since they denote the same operation; however this way of writing Boolean
operations allows applying the usual arithmetic operations of integers (this may be
useful when using a programming language in which GF(2) is not implemented).
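The arithmetic identities above can be sketched directly in a language without built-in GF(2) support; the following Python fragment (function names are illustrative) checks them exhaustively over the bits 0 and 1.

```python
# Sketch: Boolean operations expressed with ordinary integer arithmetic,
# following the identities in the text.
def b_and(x, y):
    return x * y          # conjunction is multiplication

def b_or(x, y):
    return x + y - x * y  # disjunction: x + y - xy

def b_not(x):
    return 1 - x          # negation: 1 - x

def b_xor(x, y):
    return (x + y) % 2    # addition in GF(2)

# Exhaustive check over both truth values
for x in (0, 1):
    for y in (0, 1):
        assert b_and(x, y) == (1 if x == 1 and y == 1 else 0)
        assert b_or(x, y) == (0 if x == 0 and y == 0 else 1)
        assert b_xor(x, y) == (1 if x != y else 0)
    assert b_not(x) == 1 - x
```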

Boolean algebra also deals with functions which have their values in the set {0,
1}. A sequence of bits is commonly used to represent such functions. Another common
example is the subsets of a set E: to a subset F of E corresponds the indicator
function that takes the value 1 on F and 0 outside F. The most general example is
the elements of a Boolean algebra, with all of the foregoing being instances
thereof.
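The correspondence between subsets and indicator functions can be sketched as follows (the sets E, F, G and the function name are illustrative examples, assumed here for concreteness).

```python
# Sketch: a subset F of a finite set E viewed as an indicator function,
# and equivalently as a bit vector.
E = ['a', 'b', 'c', 'd']
F = {'b', 'd'}
G = {'a', 'b'}

def indicator(S, e):
    return 1 if e in S else 0

bits = [indicator(F, e) for e in E]   # bit-vector view of F: [0, 1, 0, 1]

# Set operations correspond to pointwise Boolean operations on indicators:
# union is OR (max), intersection is AND (min).
assert all(indicator(F | G, e) == max(indicator(F, e), indicator(G, e)) for e in E)
assert all(indicator(F & G, e) == min(indicator(F, e), indicator(G, e)) for e in E)
```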

As with elementary algebra, the purely equational part of the theory may be
developed, without considering explicit values for the variables.[16]

Operations

Further information: Truth table
Basic operations
The basic operations of Boolean algebra are conjunction, disjunction, and negation.
These Boolean operations are expressed with the corresponding binary operators AND
and OR and the unary operator NOT, collectively referred to as Boolean operators.
[17]

The basic Boolean operations on variables x and y are defined as follows:

Logical operation  Operator  Notation  Alternative notations   Definition
Conjunction        AND       x ∧ y     x AND y, Kxy            x∧y = 1 if x = y = 1, x∧y = 0 otherwise
Disjunction        OR        x ∨ y     x OR y, Axy             x∨y = 0 if x = y = 0, x∨y = 1 otherwise
Negation           NOT       ¬x        NOT x, Nx, x̅, x', !x    ¬x = 0 if x = 1, ¬x = 1 if x = 0
Alternatively the values of x∧y, x∨y, and ¬x can be expressed by tabulating their
values with truth tables as follows:

x  y  x∧y  x∨y
0  0   0    0
1  0   0    1
0  1   0    1
1  1   1    1

x  ¬x
0   1
1   0
If the truth values 0 and 1 are interpreted as integers, these operations may be
expressed with the ordinary operations of arithmetic (where x + y uses addition and
xy uses multiplication), or by the minimum/maximum functions:

x ∧ y = xy = min(x, y)
x ∨ y = x + y − xy = x + y(1 − x) = max(x, y)
¬x = 1 − x
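These arithmetic and minimum/maximum forms can be checked exhaustively with a short sketch:

```python
# Sketch: verify that the arithmetic and min/max forms agree
# on the truth values 0 and 1.
for x in (0, 1):
    for y in (0, 1):
        assert x * y == min(x, y)           # x ∧ y
        assert x + y - x * y == max(x, y)   # x ∨ y
    assert 1 - x == (0 if x == 1 else 1)    # ¬x
```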
One might consider that only negation and one of the two other operations are
basic, because of the following identities, which allow one to define conjunction
in terms of negation and disjunction, and vice versa (De Morgan's laws):

x ∧ y = ¬(¬x ∨ ¬y)
x ∨ y = ¬(¬x ∧ ¬y)
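De Morgan's laws can be confirmed by exhausting all four input combinations; a minimal sketch (operator names are illustrative):

```python
# Sketch: exhaustive verification of De Morgan's laws over the two truth values.
NOT = lambda x: 1 - x
AND = lambda x, y: min(x, y)
OR  = lambda x, y: max(x, y)

for x in (0, 1):
    for y in (0, 1):
        assert AND(x, y) == NOT(OR(NOT(x), NOT(y)))  # x∧y = ¬(¬x∨¬y)
        assert OR(x, y) == NOT(AND(NOT(x), NOT(y)))  # x∨y = ¬(¬x∧¬y)
```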
Secondary operations
The three Boolean operations described above are referred to as basic, meaning that
they can be taken as a basis for other Boolean operations that can be built up from
them by composition, the manner in which operations are combined or compounded.
Operations composed from the basic operations include the following examples:

Material conditional:
x → y = ¬x ∨ y

Material biconditional:
x ↔ y = (x ∧ y) ∨ (¬x ∧ ¬y)

Exclusive OR (XOR):
x ⊕ y = ¬(x ≡ y) = (x ∨ y) ∧ ¬(x ∧ y) = (x ∨ y) ∧ (¬x ∨ ¬y) = (x ∧ ¬y) ∨ (¬x ∧ y)

Logical equivalence:
x ≡ y = ¬(x ⊕ y) = (x ∧ y) ∨ (¬x ∧ ¬y)
These definitions give rise to the following truth tables giving the values of
these operations for all four possible inputs.

Secondary operations. Table 1

x  y  x→y  x⊕y  x≡y
0  0   1    0    1
1  0   0    1    0
0  1   1    1    0
1  1   1    0    1
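The definitions of the secondary operations in terms of the basic ones, and the truth table above, can be cross-checked with a short sketch (operator names are illustrative):

```python
# Sketch: secondary operations built by composing the basic operations,
# verified against the truth table row by row.
NOT = lambda x: 1 - x
AND = lambda x, y: min(x, y)
OR  = lambda x, y: max(x, y)

IMPLIES = lambda x, y: OR(NOT(x), y)               # x → y = ¬x ∨ y
XOR     = lambda x, y: AND(OR(x, y), NOT(AND(x, y)))  # (x∨y) ∧ ¬(x∧y)
EQUIV   = lambda x, y: NOT(XOR(x, y))              # x ≡ y = ¬(x ⊕ y)

# rows of the table: (x, y, x→y, x⊕y, x≡y)
table = [(0, 0, 1, 0, 1), (1, 0, 0, 1, 0), (0, 1, 1, 1, 0), (1, 1, 1, 0, 1)]
for x, y, imp, xo, eq in table:
    assert IMPLIES(x, y) == imp
    assert XOR(x, y) == xo
    assert EQUIV(x, y) == eq
```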
Material conditional
The first operation, x → y, or Cxy, is called material implication. If x is true,
then the result of the expression x → y is taken to be that of y (e.g. if x is true
and y is false, then x → y is also false). But if x is false, then the value of y
can be ignored; however, the operation must return some Boolean value and there are
only two choices. So by definition, x → y is true when x is false. (Relevance logic
suggests this definition, by viewing an implication with a false premise as
something other than either true or false.)
Exclusive OR (XOR)
The second operation, x ⊕ y, or Jxy, is called exclusive or (often abbreviated as
XOR) to distinguish it from disjunction as the inclusive kind. It excludes the
possibility of both x and y being true (see the table): if both are true, the
result is false. Defined in terms of arithmetic, it is addition mod 2, where
1 + 1 = 0.
Logical equivalence
The third operation, the complement of exclusive or, is equivalence or Boolean
equality: x ≡ y, or Exy, is true just when x and y have the same value. Hence x ⊕ y
as its complement can be understood as x ≠ y, being true just when x and y are
different. Thus, its counterpart in arithmetic mod 2 is x + y. Equivalence's
counterpart in arithmetic mod 2 is x + y + 1.
Given two operands, each with two possible values, there are 2² = 4 possible
combinations of inputs. Because each output can have two possible values, there are
a total of 2⁴ = 16 possible binary Boolean operations. Any such operation or
function (as well as any Boolean function with more inputs) can be expressed with
the basic operations from above. Hence the basic operations are functionally
complete.
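Functional completeness can be demonstrated concretely: every one of the 16 binary operations is realized by a disjunctive normal form built only from AND, OR, and NOT. The following sketch (function names are illustrative) constructs and checks all 16.

```python
from itertools import product

NOT = lambda x: 1 - x
AND = lambda x, y: min(x, y)
OR  = lambda x, y: max(x, y)

def dnf(outputs):
    """Build a binary Boolean function from its truth-table outputs
    (for inputs 00, 01, 10, 11) using only AND, OR, NOT."""
    rows = list(zip(product((0, 1), repeat=2), outputs))
    def f(x, y):
        acc = 0
        for (a, b), out in rows:
            if out:  # one minterm per row whose output is 1
                term = AND(x if a else NOT(x), y if b else NOT(y))
                acc = OR(acc, term)
        return acc
    return f

# All 2**4 = 16 binary Boolean operations are reproduced exactly.
count = 0
for outputs in product((0, 1), repeat=4):
    f = dnf(outputs)
    assert all(f(x, y) == out
               for (x, y), out in zip(product((0, 1), repeat=2), outputs))
    count += 1
assert count == 16
```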

Laws
A law of Boolean algebra is an identity such as x ∨ (y ∨ z) = (x ∨ y) ∨ z between
two Boolean terms, where a Boolean term is defined as an expression built up from
variables and the constants 0 and 1 using the operations ∧, ∨, and ¬. The concept
can be extended to terms involving other Boolean operations such as ⊕, →, and ≡,
but such extensions are unnecessary for the purposes to which the laws are put.
Such purposes include the definition of a Boolean algebra as any model of the
Boolean laws, and as a means for deriving new laws from old as in the derivation of
x ∨ (y ∧ z) = x ∨ (z ∧ y) from y ∧ z = z ∧ y (as treated in § Axiomatizing Boolean
algebra).
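Because the variables range over only two values, any candidate law, such as the associativity identity quoted above, can be verified by exhausting all assignments; a minimal sketch:

```python
from itertools import product

# Sketch: checking Boolean laws by exhausting all variable assignments.
OR  = lambda a, b: max(a, b)
AND = lambda a, b: min(a, b)

for x, y, z in product((0, 1), repeat=3):
    assert OR(x, OR(y, z)) == OR(OR(x, y), z)    # x ∨ (y ∨ z) = (x ∨ y) ∨ z
    assert OR(x, AND(y, z)) == OR(x, AND(z, y))  # from y ∧ z = z ∧ y
```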
