A sensor is a device that produces an output signal for the purpose of sensing a physical phenomenon.

In the broadest definition, a sensor is a device, module, machine, or subsystem that detects events or changes in its environment and sends the information to other electronics, frequently a computer processor. Sensors are always used with other electronics.

Sensors are used in everyday objects such as touch-sensitive elevator buttons (tactile sensor) and lamps which dim or brighten by touching the base, and in innumerable applications of which most people are never aware. With advances in micromachinery and easy-to-use microcontroller platforms, the uses of sensors have expanded beyond the traditional fields of temperature, pressure and flow measurement.

Analog sensors such as potentiometers and force-sensing resistors are still widely used. Their applications include manufacturing and machinery, airplanes and aerospace, cars, medicine, robotics and many other aspects of our day-to-day life. There is a wide range of other sensors that measure chemical and physical properties of materials, including optical sensors for refractive index measurement, vibrational sensors for fluid viscosity measurement, and electro-chemical sensors for monitoring pH of fluids.

A sensor's sensitivity indicates how much its output changes when the input quantity it measures changes. For instance, if the mercury in a thermometer moves 1 cm when the temperature changes by 1 °C, its sensitivity is 1 cm/°C (it is basically the slope dy/dx assuming a linear characteristic). Some sensors can also affect what they measure; for instance, a room temperature thermometer inserted into a hot cup of liquid cools the liquid while the liquid heats the thermometer. Sensors are usually designed to have a small effect on what is measured; making the sensor smaller often improves this and may introduce other advantages.[3]

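To make the slope idea concrete, here is a minimal Python sketch that estimates sensitivity from two calibration points. The numbers are made up to match the mercury-thermometer example above, not measurements from any particular device.

```python
# Minimal sketch: sensitivity as the slope dy/dx of an (assumed linear)
# sensor characteristic, estimated from two calibration points.

def sensitivity(input_1, output_1, input_2, output_2):
    """Slope between two calibration points, in output units per input unit."""
    return (output_2 - output_1) / (input_2 - input_1)

# Thermometer column position (cm) at two known temperatures (°C) -- illustrative values
s = sensitivity(20.0, 5.0, 21.0, 6.0)   # column moves 1 cm per 1 °C
print(f"Sensitivity: {s} cm/°C")        # -> 1.0 cm/°C
```
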
Technological progress allows more and more sensors to be manufactured on a
microscopic scale as microsensors using MEMS technology. In most cases, a
microsensor reaches a significantly faster measurement time and higher sensitivity
compared with macroscopic approaches.[3][4] Due to the increasing demand for rapid,
affordable and reliable information in today's world, disposable sensors—low-cost
and easy‐to‐use devices for short‐term monitoring or single‐shot measurements—have
recently gained growing importance. Using this class of sensors, critical analytical information can be obtained by anyone, anywhere and at any time, without the need for recalibration or concerns about contamination.[5]

Most sensors have a linear transfer function. The sensitivity is then defined as
the ratio between the output signal and measured property. For example, if a sensor
measures temperature and has a voltage output, the sensitivity is constant with the
units [V/K]. The sensitivity is the slope of the transfer function. Converting the
sensor's electrical output (for example V) to the measured units (for example K)
requires dividing the electrical output by the slope (or multiplying by its
reciprocal). In addition, an offset is frequently added or subtracted. For example,
−40 must be added to the output if a 0 V output corresponds to a −40 °C input.
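
The following Python sketch shows this divide-by-slope-then-offset conversion. The sensitivity value (10 mV/°C) is an assumed example; the −40 °C offset follows the example above.

```python
# Minimal sketch of inverting a linear sensor's transfer function:
# measured quantity = electrical output / sensitivity + offset.
# The sensitivity below is an illustrative value, not a specific device's spec.

SENSITIVITY_V_PER_C = 0.01   # assumed slope: 10 mV per °C
OFFSET_C = -40.0             # 0 V output corresponds to -40 °C, as in the text

def to_celsius(v_out: float) -> float:
    """Convert the sensor's voltage output back to a temperature in °C."""
    return v_out / SENSITIVITY_V_PER_C + OFFSET_C

print(to_celsius(0.0))   # -> -40.0 °C
print(to_celsius(0.65))  # -> 25.0 °C
```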

For an analog sensor signal to be processed or used in digital equipment, it needs to be converted to a digital signal, using an analog-to-digital converter.
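
As a rough illustration of what arrives on the digital side, the sketch below scales a raw converter count back to the voltage it represents. The 12-bit resolution and 3.3 V reference are assumed example values, not a particular converter's datasheet figures.

```python
# Minimal sketch: converting a raw ADC count back into the voltage it represents.
# Bit depth and reference voltage are assumed example values.

ADC_BITS = 12         # assumed 12-bit converter -> counts 0..4095
V_REF = 3.3           # assumed reference voltage in volts

def counts_to_volts(raw: int) -> float:
    """Scale a raw count (0 .. 2**ADC_BITS - 1) to the corresponding input voltage."""
    return raw * V_REF / (2 ** ADC_BITS - 1)

print(counts_to_volts(0))     # -> 0.0 V
print(counts_to_volts(4095))  # -> 3.3 V (full scale)
```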

Sensor deviations
Since sensors cannot replicate an ideal transfer function, several types of
deviations can occur which limit sensor accuracy:
Since the range of the output signal is always limited, the output signal will eventually reach a minimum or maximum when the measured property exceeds the limits. The full scale range defines the maximum and minimum values of the measured property.

The sensitivity may in practice differ from the value specified. This is called a sensitivity error: an error in the slope of a linear transfer function.

If the output signal differs from the correct value by a constant, the sensor has an offset error or bias: an error in the y-intercept of a linear transfer function.

Nonlinearity is the deviation of a sensor's transfer function from a straight-line transfer function. It is usually quantified by how much the output differs from ideal behavior over the full range of the sensor, often noted as a percentage of the full range.

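One common way to express this is the worst-case deviation from a best-fit straight line as a percentage of the full output span, as in the Python sketch below. The calibration data is illustrative only.

```python
# Minimal sketch: nonlinearity as the worst-case deviation of measured outputs
# from a least-squares straight-line fit, as a percentage of the full output range.
# The calibration data below is made up for illustration.

inputs  = [0.0, 25.0, 50.0, 75.0, 100.0]   # measured property
outputs = [0.02, 1.27, 2.55, 3.71, 4.98]   # sensor output (e.g. volts)

n = len(inputs)
mean_x = sum(inputs) / n
mean_y = sum(outputs) / n
slope_num = sum((x - mean_x) * (y - mean_y) for x, y in zip(inputs, outputs))
slope_den = sum((x - mean_x) ** 2 for x in inputs)
slope = slope_num / slope_den
intercept = mean_y - slope * mean_x

full_scale = max(outputs) - min(outputs)
worst = max(abs(y - (slope * x + intercept)) for x, y in zip(inputs, outputs))
print(f"Nonlinearity: {100 * worst / full_scale:.2f}% of full scale")
```
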
Deviation caused by rapid changes of the measured property over time is a dynamic error. Often, this behavior is described with a Bode plot showing sensitivity error and phase shift as a function of the frequency of a periodic input signal.

If the output signal slowly changes independent of the measured property, this is defined as drift. Long term drift over months or years is caused by physical changes in the sensor.

Noise is a random deviation of the signal that varies in time.

A hysteresis error causes the output value to vary depending on the previous input values. If a sensor's output differs depending on whether a specific input value was reached by increasing or by decreasing the input, then the sensor has a hysteresis error.

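A simple way to quantify this is to compare outputs recorded at the same input values on a rising sweep and a falling sweep, as in the Python sketch below. The readings are illustrative only.

```python
# Minimal sketch: hysteresis error as the largest difference between outputs
# recorded at the same input value on a rising sweep vs. a falling sweep.
# The readings below are made up for illustration.

rising  = {0: 0.00, 25: 1.22, 50: 2.48, 75: 3.74, 100: 5.00}
falling = {0: 0.05, 25: 1.31, 50: 2.56, 75: 3.80, 100: 5.00}

hysteresis = max(abs(rising[x] - falling[x]) for x in rising)
print(f"Worst-case hysteresis: {hysteresis:.2f} output units")
```
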
If the sensor has a digital output, the output is essentially an approximation of the measured property. The resulting approximation error is also called quantization error.

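For an N-bit output over a known measurement span, the step size and worst-case quantization error follow directly, as sketched below. The bit depth and temperature span are assumed example values.

```python
# Minimal sketch: quantization step (one output count) and worst-case
# quantization error for a sensor with an N-bit digital output.
# Bit depth and measurement span are assumed example values.

BITS = 10                      # assumed 10-bit output -> 1024 levels
SPAN_C = 125.0 - (-40.0)       # assumed measurement span of 165 °C

lsb = SPAN_C / (2 ** BITS)     # size of one output step, in °C
print(f"Step size: {lsb:.3f} °C, worst-case quantization error: {lsb / 2:.3f} °C")
```
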
If the signal is monitored digitally, the sampling frequency can cause a dynamic error. If the input variable or added noise changes periodically at a frequency near a multiple of the sampling rate, aliasing errors may occur.

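The sketch below shows the effect: a signal component near a multiple of the sampling rate is folded down and appears at a false, much lower frequency. Frequencies are example values.

```python
# Minimal sketch: the apparent (aliased) frequency seen when a periodic signal
# is sampled too slowly. Frequencies are illustrative example values.

def aliased_frequency(f_signal: float, f_sample: float) -> float:
    """Fold the signal frequency into the 0 .. f_sample/2 band."""
    f = f_signal % f_sample
    return min(f, f_sample - f)

print(aliased_frequency(49.0, 50.0))     # 49 Hz sampled at 50 Hz appears as 1 Hz
print(aliased_frequency(998.0, 1000.0))  # 998 Hz sampled at 1 kHz appears as 2 Hz
```
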
The sensor may to some extent be sensitive to properties other than the property being measured. For example, most sensors are influenced by the temperature of their environment.

The sensor resolution or measurement resolution is the smallest change that can be detected in the quantity being measured. The resolution of a sensor with a digital output is usually the numerical resolution of the digital output. The resolution is related to the precision with which the measurement is made, but they are not the same thing. A sensor's accuracy may be considerably worse than its resolution.

For example, the distance resolution is the minimum distance that can be accurately measured by a distance-measuring device. In a time-of-flight camera, the distance resolution is usually equal to the standard deviation (total noise) of the signal, expressed in units of length.
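
Following that definition, a rough estimate of distance resolution can be obtained as the standard deviation of repeated readings of a stationary target, as in the Python sketch below. The sample values are illustrative only.

```python
# Minimal sketch: estimating distance resolution as the standard deviation of
# repeated range readings of a stationary target. The samples are illustrative.

import statistics

readings_m = [2.503, 2.497, 2.501, 2.499, 2.505, 2.495, 2.502, 2.498]

resolution_m = statistics.stdev(readings_m)   # total noise, in metres
print(f"Estimated distance resolution: {resolution_m * 1000:.1f} mm")
```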
