
COLLEGE OF ENGINEERING

CENTRAL PHILIPPINE UNIVERSITY


Jaro, Iloilo City, Philippines
Tel No: 63 (33) 3291971 loc 1084

Pressure and Temperature Measurement


Presented to
The Mechanical Engineering Department
College of Engineering
Central Philippine University

In Partial Fulfilment of the Requirements for Mechanical Engineering Laboratory 1

By:
Opao, Ian Dave E.

Alejandro R. Manderico
Lecturer

May 2023

Pressure and Temperature Measurement


Pressure and temperature are two essential physical quantities that are widely
used in various fields, including industrial manufacturing, energy production, and
scientific research. Pressure is defined as the force applied per unit area
perpendicular to a surface, and is commonly measured in units such as pounds per
square inch (psi), pascals (Pa), or bars. Temperature, on the other hand, is a measure
of the degree of hotness or coldness of an object or substance, and is usually
measured in degrees Celsius (°C), degrees Fahrenheit (°F), or kelvins (K).
Accurate measurement of pressure and temperature is crucial in many
applications. For example, in industrial manufacturing, pressure and temperature
measurement is essential for process control, ensuring that manufacturing processes
run smoothly and efficiently. In energy production, pressure and temperature
measurement is critical for the safe and efficient operation of power plants and oil
refineries. In scientific research, precise measurements of pressure and temperature
are necessary to understand the physical properties of materials, study chemical
reactions, and explore the behavior of gases.

Function of Pressure Measurement

The function of pressure measurement is to provide a quantitative and accurate
measurement of the pressure being exerted in a system or on a surface. Pressure
measurement plays a crucial role in various fields, including engineering, physics,
chemistry, and medicine.
The main functions of pressure measurement include:
1. Control and Monitoring: Pressure measurement is used to control and monitor
various industrial processes. In manufacturing, pressure measurement is essential
to ensure that machinery is running within acceptable limits, and to detect any
abnormal changes in pressure that may indicate equipment failure.

2. Safety: Pressure measurement is also used to ensure the safety of personnel and
equipment in various applications. For example, in oil and gas production,
pressure measurement is used to monitor the pressure of wellheads and pipelines
to prevent leaks and explosions.

3. Diagnosis: Pressure measurement is used in medical settings to diagnose and
monitor conditions such as hypertension, which is characterized by abnormally
high blood pressure.
4. Testing and Calibration: Pressure measurement is used to test and calibrate a wide
range of equipment, such as pressure gauges, valves, and pumps.
5. Research and Development: Pressure measurement is also used in research and
development to investigate and understand the behavior of fluids and gases under
different pressure conditions.
Overall, pressure measurement is critical for ensuring the safe and efficient operation
of systems and processes, and for advancing knowledge and understanding across
many fields.

Types of Pressure Measurement

There are several types of pressure measurement techniques used in various
applications. Some common types of pressure measurement include:
1. Gauge pressure measurement: This measures pressure relative to atmospheric
pressure as a reference. Gauge pressure sensors are commonly used in
applications such as tire pressure monitoring, HVAC systems, and fuel
pressure monitoring.

2. Absolute pressure measurement: This measures pressure relative to a perfect
vacuum. Absolute pressure sensors are commonly used in applications such as
weather monitoring, vacuum systems, and barometric pressure measurement.

3. Differential pressure measurement: This measures the difference in pressure
between two points. Differential pressure sensors are commonly used in flow
measurement, filter monitoring, and level measurement.

4. Hydrostatic pressure measurement: This measures the pressure exerted by a
fluid due to its weight. Hydrostatic pressure sensors are commonly used in
applications such as liquid level measurement and water pressure monitoring.

5. Capacitive pressure measurement: This measures the change in capacitance
between two electrodes due to changes in pressure. Capacitive pressure sensors
are commonly used in applications such as pressure measurement in harsh
environments.

6. Piezoelectric pressure measurement: This measures pressure by detecting the
change in electric charge generated by a piezoelectric crystal due to changes in
pressure. Piezoelectric pressure sensors are commonly used in applications
involving rapidly changing pressures, such as engine combustion monitoring.
Pressure Measurement Techniques

There are various techniques for measuring pressure, each with its advantages and
disadvantages depending on the application. Here are some common pressure
measurement techniques:

1. Bourdon Tube: This mechanical method is based on the deformation of a curved
tube, which is connected to a pressure source. As the pressure increases, the tube
deforms, and this deformation is converted into a pressure reading using a
mechanical linkage.

2. Diaphragm Gauge: Similar to the Bourdon tube, this method uses a flexible
diaphragm that deforms under pressure, which is then measured using a
mechanical linkage.

3. Manometer: A manometer is a simple device that uses a column of liquid to
measure pressure. The height of the liquid column is proportional to the pressure
being measured (see the sketch after this list).

4. Pressure Transducer: A pressure transducer is an electronic device that converts
pressure into an electrical signal. This can be done using various techniques such
as piezoelectric, capacitive, or strain gauge sensors.

5. Strain Gauge: A strain gauge is a device that measures the deformation of a
material under stress, which is then used to calculate the pressure.

6. Capacitive Sensor: A capacitive sensor measures the change in capacitance
between two electrodes as a result of pressure changes.

7. Piezoelectric Sensor: A piezoelectric sensor generates an electrical charge when
subjected to pressure, which can then be measured.

8. Optical Pressure Sensor: Optical pressure sensors use techniques such as
interferometry or fiber Bragg gratings to measure pressure changes.
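
The manometer technique (item 3 above) rests on the hydrostatic relation P = ρgh,
where ρ is the liquid density, g is the gravitational acceleration, and h is the column
height. Below is a minimal Python sketch assuming standard gravity and textbook
densities for water and mercury; the function name is illustrative.

# Densities and gravity used in the hydrostatic relation P = rho * g * h
RHO_WATER = 1000.0     # density of water, kg/m^3
RHO_MERCURY = 13595.1  # density of mercury at 0 degC, kg/m^3
G = 9.80665            # standard gravity, m/s^2

def column_pressure_pa(height_m, density=RHO_MERCURY):
    """Pressure (Pa) exerted by a liquid column of the given height."""
    return density * G * height_m

# Example: a 760 mm mercury column corresponds to about 101325 Pa (1 atm).
print(column_pressure_pa(0.760))
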

Temperature Measurement Techniques


There are various techniques for measuring temperature, each with its
advantages and disadvantages depending on the application. Here are some common
temperature measurement techniques:

1. Thermocouples: Thermocouples are the most commonly used temperature sensors.
They consist of two different metals joined together, and the voltage produced is
approximately proportional to the temperature difference between the measuring
junction and the reference junction.

2. Resistance Temperature Detectors (RTDs): RTDs are temperature sensors made of
materials whose electrical resistance changes as a function of temperature. They
are typically made of platinum, copper, or nickel and offer high accuracy (a worked
example appears after this list).

3. Thermistors: Thermistors are similar to RTDs but typically have a negative
temperature coefficient (NTC), meaning the electrical resistance decreases as the
temperature increases. They are often used in applications that require high
sensitivity.

4. Infrared Sensors: Infrared sensors detect the amount of infrared radiation emitted
by an object, which is proportional to its temperature. They are commonly used in
non-contact temperature measurements.

5. Bimetallic strips: Bimetallic strips are made of two different metals with different
coefficients of thermal expansion. As the temperature changes, the strip bends due
to the different expansion rates.

6. Liquid-in-glass thermometers: These are simple glass tubes filled with a liquid that
expands as the temperature rises. The temperature is read by observing the level of
the liquid against the graduated scale on the side of the thermometer.

7. Fiber optic temperature sensors: These sensors use a fiber optic cable to measure
temperature by sensing changes in the optical properties of the cable as a result of
temperature changes.

Each temperature measurement technique has its own advantages and
disadvantages, depending on factors such as accuracy, range, speed, and cost. The
choice of technique depends on the specific application and requirements.
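
As a worked example for RTDs, the sketch below inverts the standard IEC 60751
(Callendar-Van Dusen) resistance equation for a Pt100 element on the branch at or
above 0°C; the function name and the example resistance are illustrative.

import math

# IEC 60751 coefficients for platinum RTDs, valid for T >= 0 degC:
# R(T) = R0 * (1 + A*T + B*T^2)
A = 3.9083e-3
B = -5.775e-7
R0 = 100.0  # Pt100 resistance at 0 degC, ohms

def pt100_temperature(resistance_ohm):
    """Solve R0*(1 + A*T + B*T^2) = R for T (degC), T >= 0 branch."""
    c = 1.0 - resistance_ohm / R0
    # Quadratic formula on B*T^2 + A*T + c = 0, taking the physical root
    return (-A + math.sqrt(A * A - 4.0 * B * c)) / (2.0 * B)

# Example: 138.51 ohms corresponds to about 100 degC for a Pt100.
print(round(pt100_temperature(138.51), 2))
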
Pressure Units and Conversion

Pressure can be measured using various units depending on the country,
industry, or application. Some of the commonly used units of pressure are:

• Pascal (Pa): The SI unit of pressure. 1 Pa is equal to the force of one newton
acting on an area of one square meter.
• Kilopascal (kPa): 1 kPa is equal to 1000 Pa.
• Bar: 1 bar is equal to 100 kPa or approximately 14.5 psi.
• Pound per square inch (psi): This is the pressure unit commonly used in the
United States. 1 psi is equal to 0.0689476 bar.
• Atmosphere (atm): This is the average pressure at sea level on Earth. 1 atm is
equal to 101.325 kPa or approximately 14.7 psi.
• Torr: A non-SI unit of pressure, commonly used in vacuum measurements. 1
torr is equal to 1/760 of 1 atm (approximately 133.3 Pa).

To convert between different pressure units, conversion factors can be used. Here are
some common pressure unit conversions:
1 atm = 101.325 kPa
1 kPa = 0.145038 psi
1 bar = 100 kPa
1 psi = 6.89476 kPa
1 torr = 1/760 atm ≈ 0.133322 kPa
It is important to use the correct pressure unit in any calculations or measurements
to ensure accuracy.
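
These conversion factors can be collected into a small helper that routes every
conversion through pascals as a common intermediate unit. The following is a
minimal Python sketch; the dictionary and function names are illustrative.

# Conversion factors to pascals (Pa), the SI unit of pressure
PA_PER_UNIT = {
    "Pa": 1.0,
    "kPa": 1000.0,
    "bar": 100000.0,
    "psi": 6894.76,
    "atm": 101325.0,
    "torr": 101325.0 / 760.0,
}

def convert_pressure(value, from_unit, to_unit):
    """Convert a pressure value between any two supported units."""
    return value * PA_PER_UNIT[from_unit] / PA_PER_UNIT[to_unit]

# Example: 1 atm is approximately 14.7 psi.
print(round(convert_pressure(1.0, "atm", "psi"), 2))
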

Temperature Units and Conversion


Temperature is the measure of the degree of hotness or coldness of an object or
a system. There are different temperature units used in different countries, industries,
and applications. The most commonly used temperature units are:

• Celsius (°C): The Celsius scale is based on the freezing and boiling points of
water at sea level. The freezing point of water is 0°C and the boiling point of
water is 100°C at standard atmospheric pressure.
• Fahrenheit (°F): The Fahrenheit scale is also based on the freezing and boiling
points of water at sea level. The freezing point of water is 32°F and the boiling
point of water is 212°F at standard atmospheric pressure.
• Kelvin (K): The Kelvin scale is based on the thermodynamic temperature of an
object or system. The zero point of the Kelvin scale is absolute zero (-273.15°C),
which is the theoretical temperature at which all molecular motion stops.

To convert between different temperature units, conversion factors can be used. Here
are some common temperature unit conversions:

°C = (°F - 32) x 5/9
°F = (°C x 9/5) + 32
K = °C + 273.15
It's important to use the correct temperature unit in any calculations or
measurements to ensure accuracy.
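
The formulas above translate directly into code. A minimal Python sketch with
illustrative function names:

def fahrenheit_to_celsius(deg_f):
    """degC = (degF - 32) x 5/9"""
    return (deg_f - 32.0) * 5.0 / 9.0

def celsius_to_fahrenheit(deg_c):
    """degF = (degC x 9/5) + 32"""
    return deg_c * 9.0 / 5.0 + 32.0

def celsius_to_kelvin(deg_c):
    """K = degC + 273.15"""
    return deg_c + 273.15

# Example: the boiling point of water, 100 degC, is 212 degF or 373.15 K.
print(celsius_to_fahrenheit(100.0), celsius_to_kelvin(100.0))
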

Calibration and Accuracy

Calibration and accuracy are critical factors in ensuring that pressure and
temperature measurements are reliable and accurate. Calibration is the process of
comparing the measurement values of an instrument or sensor with a known
standard to determine any deviation or error. The aim of calibration is to ensure that
the instrument or sensor provides accurate and reliable readings.

Pressure and temperature measurement instruments should be calibrated
regularly to maintain accuracy. The calibration process involves comparing the
readings of the instrument under test with a known reference value, and adjustments
are made to the instrument if necessary.
The accuracy of pressure and temperature measurement instruments is defined
by their tolerance or uncertainty, which is the degree to which the readings of the
instrument may deviate from the true value. The tolerance is usually expressed as a
percentage of the full-scale range or as a specific value. For example, an instrument
with a tolerance of +/- 0.5% of full-scale range indicates that the reading may deviate
by up to 0.5% of the full-scale range.
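
The worst-case error implied by a full-scale tolerance is straightforward to compute,
as the minimal sketch below shows; the 0-1000 kPa range and the 0.5% figure are
illustrative.

def full_scale_error(full_scale, tolerance_percent):
    """Worst-case deviation for a tolerance quoted as +/- % of full scale."""
    return full_scale * tolerance_percent / 100.0

# Example: a 0-1000 kPa gauge with +/-0.5% of full-scale tolerance may
# read up to 5 kPa above or below the true pressure anywhere in its range.
print(full_scale_error(1000.0, 0.5))
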

The accuracy of pressure and temperature measurement instruments is affected by
various factors such as:

• Environmental factors such as temperature, humidity, and air pressure
• The condition and age of the instrument or sensor
• The method and frequency of calibration
• The quality and reliability of the calibration reference standards used

To ensure accurate and reliable pressure and temperature measurement, it is
essential to calibrate instruments regularly, maintain them in good condition, and use
high-quality calibration reference standards.

Applications of Pressure and Temperature Measurement

Pressure and temperature measurement has numerous applications in various
industries. Some of the common applications include:

1. HVAC (heating, ventilation, and air conditioning): Pressure and temperature
measurement is used to monitor and control temperature, pressure, and
humidity in HVAC systems.

2. Process Control: Pressure and temperature measurement is used in process
control systems to monitor and control the parameters of chemical reactions,
manufacturing processes, and other industrial processes.

3. Medical applications: Pressure measurement is used to monitor blood pressure,
intracranial pressure, and pulmonary pressure in medical applications.
Temperature measurement is used in various medical applications such as
monitoring body temperature and controlling the temperature of medical
equipment.
4. Automotive: Pressure and temperature measurement is used in automotive
applications to monitor and control engine performance, transmission
performance, and tire pressure.

5. Aerospace: Pressure and temperature measurement is used in aerospace
applications to monitor and control the performance of aircraft engines, air
conditioning systems, and other critical components.

6. Energy: Pressure and temperature measurement is used in the energy industry
to monitor and control the parameters of power generation, oil and gas
production, and refining processes.

7. Research and Development: Pressure and temperature measurement is used in
research and development to measure and analyze various physical parameters
such as pressure, temperature, and flow rate.

Overall, pressure and temperature measurement is an essential tool in many
industries and applications, ensuring safe and efficient operation and helping to
optimize processes and performance.

Challenges and Considerations in Pressure Measurement

Pressure measurement can present certain challenges and considerations that
need to be taken into account. One challenge is the measurement of very high or very
low pressures. High-pressure measurements require robust and specialized equipment
capable of withstanding extreme conditions, while low-pressure measurements may
require sensitive sensors and techniques to achieve accurate results.
Another consideration is the compatibility of pressure measurement devices
with the medium being measured. Certain fluids or gases can be corrosive, abrasive,
or prone to contamination, which can affect the performance and lifespan of pressure
sensors and gauges. In such cases, it is important to select materials and designs that
can withstand the specific properties of the medium being measured.
Environmental conditions, such as temperature and humidity, can also impact
pressure measurements. Changes in ambient conditions can introduce errors in the
readings and affect the accuracy of the measurements. Therefore, it is important to
account for temperature and humidity effects and, if necessary, use compensation
techniques or specialized sensors that are designed to minimize these influences.

Advancements and Future Trends in Pressure Measurement

Advancements in technology continue to drive improvements in pressure
measurement techniques. One notable trend is the miniaturization of pressure
sensors, which has led to the development of compact and integrated pressure
measurement devices. These advancements have enabled pressure measurement in
smaller and more constrained environments, expanding the range of applications.
Additionally, wireless and remote monitoring capabilities have emerged,
allowing real-time pressure data to be collected and transmitted for analysis and
control purposes. This enables efficient monitoring and control of pressure in remote
or inaccessible locations and enhances the overall system performance and safety.
Furthermore, the integration of pressure measurement with other sensing
technologies, such as temperature and flow, is gaining prominence. This integration
provides a more comprehensive understanding of system behavior and enables
enhanced process control and optimization.
In the future, advancements in materials, sensor technologies, and data
analytics are expected to further enhance the accuracy, reliability, and performance of
pressure measurement. These advancements will continue to push the boundaries of
pressure measurement capabilities and enable new applications in various industries.

Challenges and Considerations in Temperature Measurement

Temperature measurement can present certain challenges and considerations
that need to be taken into account. One challenge is the non-uniformity of
temperature within an object or system. In some cases, temperature variations may
occur across different regions of the object, making it challenging to obtain accurate
and representative temperature readings. Techniques such as multiple sensors or
averaging methods may be employed to address this challenge.
High temperature and pressure environments also pose challenges in
temperature measurement. Extreme temperatures can affect the performance and
lifespan of temperature sensors, leading to inaccuracies or sensor failure. Similarly,
high pressure can cause damage to temperature sensors and affect their accuracy. In
such cases, specialized sensors and materials that can withstand these conditions are
necessary for accurate temperature measurements.
Inaccessible locations can be another consideration in temperature
measurement. Some objects or systems may be located in areas that are difficult to
access, such as enclosed chambers or sealed containers. This makes it challenging to
place temperature sensors in direct contact with the object or accurately measure its
temperature. Techniques like non-contact temperature measurement using infrared
sensors can be used in such scenarios.
Electromagnetic interference (EMI) is another factor that can affect temperature
measurements. Electrical noise, electromagnetic fields, and radio frequency
interference can disrupt the signals from temperature sensors, leading to
inaccuracies. Shielding techniques and proper sensor placement can help minimize
the impact of EMI on temperature measurements.
Calibration is a critical consideration in temperature measurement.
Temperature sensors and instruments should be calibrated regularly to ensure
accurate and reliable readings. Improper or infrequent calibration can result in
measurement errors and compromise the accuracy of temperature readings.
Calibration should be performed using traceable reference standards and under
controlled conditions to achieve the desired level of accuracy.

Advancements and Future Trends in Temperature Measurement

Advancements in technology continue to drive improvements in temperature
measurement techniques and devices. One notable trend is the development of
compact and integrated temperature sensors and instruments. Miniaturization has
enabled temperature measurement in smaller and more complex systems, expanding
the range of applications.
Wireless and remote temperature monitoring capabilities have also gained
prominence. These advancements allow real-time temperature data to be collected and
transmitted for analysis and control purposes. Wireless temperature monitoring
systems offer convenience, flexibility, and the ability to monitor temperature in remote
or inaccessible locations.
Furthermore, the integration of temperature measurement with other sensing
technologies is an emerging trend. Combining temperature measurement with
parameters such as pressure, humidity, or flow provides a more comprehensive
understanding of system behavior and enables enhanced process control and
optimization.
In the future, advancements in sensor technologies, such as the use of
nanomaterials or advanced sensing principles, are expected to further improve the
accuracy, sensitivity, and response time of temperature measurement. The
development of smart and self-calibrating temperature sensors may also simplify the
calibration process and enhance measurement accuracy.
Calibration and accuracy are crucial aspects of temperature measurement
instruments. Calibration involves comparing the instrument to a known reference
temperature source and making adjustments for accurate readings. The accuracy is
typically expressed as a percentage of the instrument's full-scale range, indicating how
closely its readings correspond to the true temperature. Factors like instrument
quality, environmental conditions, and proper calibration affect accuracy.
Temperature measuring instruments are calibrated using reference standards
traceable to national standards. This involves comparing the instrument's readings to
the reference standard over a temperature range and making necessary adjustments.
Calibration should be performed by qualified technicians using appropriate
procedures.
Different methods exist for calibration, including comparison calibration, fixed
point calibration using melting or boiling points, and interpolation calibration with a
series of reference standards. A calibration certificate documents the process and
results.
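
As an illustration of comparison calibration, the sketch below fits a linear correction
from paired reference and instrument readings by least squares; the readings are
invented for illustration, and the helper name is an assumption.

import numpy as np

# Hypothetical comparison-calibration data: reference standard values
# versus the instrument's indicated readings at several points (degC).
reference = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
indicated = np.array([0.4, 25.6, 50.9, 76.1, 101.3])

# Least-squares fit of reference = a * indicated + b gives a linear
# correction that can be applied to future readings.
a, b = np.polyfit(indicated, reference, 1)

def corrected(reading):
    """Apply the fitted linear calibration correction."""
    return a * reading + b

# Corrected value for an indicated reading of 60 degC.
print(round(corrected(60.0), 2))
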
Design of experiments (DOE) can be employed to optimize temperature
measurement and calibration. Key steps include defining objectives, identifying
influential factors, determining factor levels, designing the experiment, conducting it,
analyzing results, and optimizing the calibration process based on significant factors.
Regular calibration is essential to maintain accuracy and reliability. The
calibration frequency varies depending on the instrument and intended use, typically
ranging from once per year to once per month. Using DOE in the measurement and
calibration of temperature instruments helps optimize the process and enhance
accuracy.
