
Lesson 2

Threshold Logic

Threshold Unit or Threshold Logic Unit (TLU)


Activation: a = w1 x1 + w2 x2 + ... + wn xn

Example: w = (0.5, 1.0, 1.2), x = (1, 1, 0), so a = (0.5 * 1) + (1.0 * 1) + (1.2 * 0) = 0.5 + 1.0 = 1.5.

To emulate the generation of action potentials, we need a threshold value θ. The output y is then:

y = 1 if a >= θ
y = 0 if a < θ

In the previous example, if θ = 1, then y would equal 1 (because a = 1.5 >= 1).
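A minimal Python sketch of such a TLU, reusing the weights, inputs, and threshold from the example above (the function name is illustrative):

```python
def tlu_output(weights, inputs, threshold):
    """Return 1 if the weighted sum of the inputs reaches the threshold, else 0."""
    activation = sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation >= threshold else 0

w = (0.5, 1.0, 1.2)
x = (1, 1, 0)
print(tlu_output(w, x, threshold=1.0))  # prints 1, since a = 1.5 >= 1
```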

Resilience To Noise and Hardware Failure

Consider a TLU with weights w = (0, 1) and threshold θ = 0.5, whose behavior is given in the table below. The TLU fires, i.e. outputs 1, only when x2 is 1:

x1   x2   Activation   Output
0    0    0            0
0    1    1            1
1    0    0            0
1    1    1            1

Now, suppose the hardware which implements the TLU is faulty, so that instead of the weights (0, 1) we have the weights encoded as (0.2, 0.8). The table shown above would then be revised as follows:

x1   x2   Activation   Output
0    0    0.0          0
0    1    0.8          1
1    0    0.2          0
1    1    1.0          1

In the revised table, we see that the activation has changed, but the output is the same: the activation never crosses the threshold of 0.5.

Similarly, suppose the input signals become degraded in some way, due to noise or a partial loss, so that instead of input value 1 we have 0.8, and instead of input 0 we have 0.2. Once again, correct outputs occur. Our revised table appears below:

x1    x2    Activation   Output
0.2   0.2   0.2          0
0.2   0.8   0.8          1
0.8   0.2   0.2          0
0.8   0.8   0.8          1

Thus:

- Changes in the activation, as long as they do not cross the threshold, produce no change in the output.
- TLU's are robust in the presence of noisy or corrupted input signals.
- Graceful degradation: in a large network, as the degree of hardware and/or signal degradation increases, the number of TLU's giving incorrect results will gradually increase as well. Contrast this with what happens in the case of conventional computers.
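The following small sketch, assuming the same weights, threshold, and perturbations as above, makes this concrete: the ideal TLU, the TLU with faulty weights, and the TLU fed degraded inputs all produce identical outputs, because the activation never crosses the threshold of 0.5.

```python
def tlu(weights, inputs, threshold=0.5):
    """Output 1 if the weighted sum of the inputs reaches the threshold, else 0."""
    a = sum(w * x for w, x in zip(weights, inputs))
    return 1 if a >= threshold else 0

patterns = [(0, 0), (0, 1), (1, 0), (1, 1)]

ideal    = [tlu((0.0, 1.0), p) for p in patterns]   # ideal weights, clean inputs
faulty   = [tlu((0.2, 0.8), p) for p in patterns]   # degraded weights
degraded = [tlu((0.0, 1.0), [0.8 if v else 0.2 for v in p]) for p in patterns]  # degraded inputs

print(ideal, faulty, degraded)  # all three are [0, 1, 0, 1]
```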

Non-Linear Systems

In a linear system, the output is proportionally related to the input: i.e. small/large changes in the input always produce corresponding small/large changes in the output. Non-linear systems do not obey this proportionality restraint, so the magnitude of the change in output does not necessarily reflect that of the input.

Example: In the above TLU, an activation change from 0.0 to 0.2 produces no change in the output, whereas an activation change from 0.49 to 0.51 produces a change in the output from 0 to 1, because 0.51 is larger than the threshold 0.5. This is the non-linear behavior of TLU's.

Geometric Interpretation of TLU action

A TLU separates its input patterns into two categories according to its binary response (0 or 1) to each pattern. Consider the TLU shown here, which classifies its input patterns into two groups: those that give output 0, and those that give output 1.

x1   x2   Activation   Output
0    0    0            0
0    1    1            0
1    0    1            0
1    1    2            1

The pattern space of this TLU consists of the four input patterns plotted in the (x1, x2) plane; the two output classes can be separated by a straight line, derived next.

The Linear Separation of Classes

Inputs: x1, x2; Activation: a; Threshold: θ.

The critical condition for classification occurs when the activation equals the threshold; then we have:

w1 x1 + w2 x2 = θ

Subtracting w1 x1 from both sides:

w2 x2 = θ - w1 x1

Dividing both sides by w2 gives:

x2 = -(w1 / w2) x1 + (θ / w2)

This is of the general form y = m x + b. Recall that this is the linear (line) equation, where m is the slope and b is the y-intercept.

Our previous example revisited: w1 = 1, w2 = 1, θ = 1.5, so m = -(w1 / w2) = -1 and b = θ / w2 = 1.5 / 1 = 1.5, giving the separating line y = -x + 1.5.
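As a quick check, the sketch below computes the slope and intercept of the decision line from w1, w2 and θ as just derived, and confirms that each pattern's output matches the side of the line on which it falls (variable names are illustrative).

```python
w1, w2, theta = 1.0, 1.0, 1.5
m = -(w1 / w2)   # slope of the decision line, here -1.0
b = theta / w2   # y-intercept, here 1.5

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    output = 1 if w1 * x1 + w2 * x2 >= theta else 0
    on_or_above = x2 >= m * x1 + b        # does the pattern lie on or above the line?
    print((x1, x2), output, on_or_above)  # output is 1 exactly when on_or_above is True
```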

The two classes of the TLU's output are separated by the red line x2 = -x1 + 1.5, which crosses the axes at (1.5, 0) and (0, 1.5). This will always be the case: TLU's are linear separators and their patterns are linearly separable. More generally, in Rn we will have separating hyperplanes.

Vectors

A vector has both a magnitude and a direction. We denote a vector v in boldface (or with an arrow over it), and its magnitude by ||v|| (sometimes by r). Note that a vector may be drawn anywhere in the space as long as its magnitude and direction are preserved.

A vector can be defined by a pair of numbers (||v||, φ), where φ is the angle the vector makes with some reference direction (e.g., the x-axis).

Alternate Representation (Cartesian co-ordinates): the vector can also be considered as an ordered pair, or in general an ordered list of numbers: v = (v1, v2), where v1 and v2 are the components of the vector. Note that the order is important: (v1, v2) and (v2, v1) are, in general, different vectors.

Scalar Multiplication of a vector: k v = (k v1, k v2); generalizing to n dimensions, k v = (k v1, k v2, ..., k vn).

Vector Addition

Two vectors may be added in 2D by appending one to the end of the other. In terms of components, if w1 = u1 + v1 and w2 = u2 + v2, then w = u + v = (u1 + v1, u2 + v2). Generalizing to n dimensions: u + v = (u1 + v1, u2 + v2, ..., un + vn). Note that addition is commutative: u + v = v + u.

Vector Subtraction

Subtraction works component by component in the same way: u - v = (u1 - v1, ..., un - vn).

The Length of a Vector

In 2D, ||v|| = sqrt(v1^2 + v2^2). In n dimensions, ||v|| = sqrt(Σ vi^2), with i running from 1 to n.

Comparing Vectors

The Inner Product (Geometric Form): it is useful to have some measure of how well aligned two vectors are, i.e. to know whether they point in the same or opposite directions. The inner product is defined as v · w = ||v|| ||w|| cos φ, where φ is the angle between the two vectors. (Note that if φ > 360° we have equivalence to φ - 360°.) This product is also known as the scalar product.
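A brief sketch of these component-wise operations, using plain Python lists (the helper names add, scale, and length are illustrative):

```python
import math

def add(u, v):
    """Component-wise vector addition, valid in any number of dimensions."""
    return [ui + vi for ui, vi in zip(u, v)]

def scale(k, v):
    """Scalar multiplication: k*v = (k*v1, ..., k*vn)."""
    return [k * vi for vi in v]

def length(v):
    """||v|| = sqrt(v1^2 + ... + vn^2)."""
    return math.sqrt(sum(vi * vi for vi in v))

print(add([1, 1], [2, 0]))  # [3, 1]
print(scale(2.0, [1, 1]))   # [2.0, 2.0]
print(length([1, 1]))       # 1.4142... = sqrt(2)
```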

The Inner Product (Algebraic Form)

In n dimensions, v · w = Σ vi wi, with i running from 1 to n. This is also called the dot product.

Example: given v = (1, 1) and w = (2, 0) (note the angle between them is 45°), we have v · w = v1 w1 + v2 w2 = 1*2 + 1*0 = 2. Alternatively, observe that ||v|| = sqrt(1^2 + 1^2) = sqrt(2) and ||w|| = sqrt(2^2 + 0^2) = 2, so that v · w = ||v|| ||w|| cos 45° = sqrt(2) * 2 * (1/sqrt(2)) = 2.

If the sum is positive, the vectors are lined up to some extent (they point in broadly the same direction). If the sum is zero, the vectors are at right angles. If the sum is negative, the vectors are pointing away from each other.

Vector Projection

How much of v lies in the direction of w? Drop a perpendicular from v onto w; the length of the resulting projection of v onto w is vw = ||v|| cos φ. Alternatively, we may write vw = (||v|| ||w|| cos φ) / ||w|| = (v · w) / ||w||.
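The sketch below verifies that the algebraic and geometric forms of the inner product agree for the example v = (1, 1), w = (2, 0), and computes the projection of v onto w (helper names are illustrative):

```python
import math

def dot(v, w):
    """Algebraic form of the inner product: sum of component-wise products."""
    return sum(vi * wi for vi, wi in zip(v, w))

def length(v):
    """||v|| = sqrt(v1^2 + ... + vn^2)."""
    return math.sqrt(sum(vi * vi for vi in v))

v, w = (1, 1), (2, 0)
algebraic  = dot(v, w)                                           # 1*2 + 1*0 = 2
geometric  = length(v) * length(w) * math.cos(math.radians(45))  # sqrt(2) * 2 * cos 45 = 2
projection = dot(v, w) / length(w)                               # v_w = (v . w) / ||w|| = 1.0
print(algebraic, round(geometric, 6), projection)                # 2 2.0 1.0
```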

TLU's and Linear Separability Revisited

The activation a of an n-input TLU is given by a = w · x. What happens when the activation equals the threshold? Let n = 2; then

w1 x1 + w2 x2 = θ, i.e. w · x = θ.  (*)

For an arbitrary input pattern x, the projection of x onto w is xw = (w · x) / ||w||. If the constraint in (*) is imposed, we have xw = θ / ||w||. So, assuming w and θ are constant, the projection xw is constant and, in the case of 2D, x must lie along the perpendicular to the weight vector. Therefore, in 2D, the relation w · x = θ defines a straight line; generalizing to n dimensions, the counterpart is a hyperplane. When x lies on the hyperplane, w · x = θ and hence y = 1.

Suppose xw > θ / ||w||. Then the projection is longer, w · x > θ and y = 1, and x must lie in region A, the side of the line toward which w points. (For example, the point x = (2, 3) gives w · x = 5 > 1.5 in our running example.) If xw < θ / ||w||, then the projection is shorter, w · x < θ, y = 0, and x must lie in region B, on the other side.

The TLU is therefore a linear classifier. If patterns cannot be separated by a hyperplane, then they cannot be classified with a single TLU.
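To make the projection argument concrete, here is a small sketch using the running example w = (1, 1), θ = 1.5: each pattern is classified by comparing the length of its projection onto w with θ / ||w||, which is the same decision as comparing w · x with θ (helper names are illustrative).

```python
import math

def dot(v, w):
    return sum(vi * wi for vi, wi in zip(v, w))

def length(v):
    return math.sqrt(sum(vi * vi for vi in v))

w, theta = (1.0, 1.0), 1.5
boundary = theta / length(w)   # every point on the separating line projects to this value

for x in [(0, 0), (0, 1), (1, 0), (1, 1), (2, 3)]:
    x_w = dot(w, x) / length(w)      # projection of x onto the weight vector
    y = 1 if x_w >= boundary else 0  # same decision as w . x >= theta
    region = "A (y = 1)" if y else "B (y = 0)"
    print(x, round(x_w, 3), region)
```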