
Chapter 1: The Role of Statistics in Engineering

Ngô Tuấn Kiệt – 20221808, Nguyễn Văn Hoàng - 20221797

1-1 THE ENGINEERING METHOD AND STATISTICAL THINKING

An engineer is a person who effectively applies scientific concepts to address societal
issues. Engineers achieve this by either improving an already-existing product or
process or by creating a brand-new product or process that satisfies the requirements of
consumers. The strategy used to formulate and resolve these issues is known as the engineering,
or scientific, method. The engineering method consists of the following steps:
1. Create a precise and succinct summary of the issue.
2. List the key elements that may impact this issue or contribute to a potential answer, at least in
part.
3. Using engineering or science understanding of the event being examined, propose a model for
the issue. Describe any model restrictions or underlying presumptions.
4. Conduct suitable experiments and gather data to test the model or the conclusions
drawn in steps 2 and 3.
5. Use the collected data to refine the model.
6. Manipulate the model to help develop a solution to the problem.
7. Confirm the effectiveness and efficiency of the proposed solution by conducting an
appropriate experiment.
8. Based on the resolution of the issue, draw inferences or offer suggestions.

The study of statistics focuses on the gathering, visualization, analysis, and application of
data for decision-making, problem-solving, and the creation of goods and procedures.
Put simply, statistics is the science of learning from data. Statistical knowledge is just
as crucial to an engineer as knowledge of the other engineering disciplines because many parts of
engineering practice require dealing with data. In particular, statistical methods can be an
extremely useful tool for creating new goods and systems, enhancing already existing designs,

and creating, developing, and optimizing manufacturing processes.

Statistical techniques are used to explain and comprehend variation. By variability, we
mean that repeated measurements of a system or event do not yield the same outcome. We
all face variability in our daily lives, and statistical reasoning can help us integrate this
variability into our decision-making processes.
In summary:
• Statistical techniques are useful for describing and understanding variability.
• By variability, we mean that successive observations of a system or phenomenon do not
produce exactly the same result.
• Statistics gives us a framework for describing this variability and for learning about its
potential sources.
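Variability can be made concrete with a short computation. The following Python sketch uses hypothetical repeated measurements of the pull-off force of a connector (all values are invented for illustration) and summarizes the spread with the sample mean and standard deviation:

```python
# Hypothetical repeated pull-off force measurements (lbs) of the same
# connector design; the values differ run to run -- that is variability.
measurements = [12.6, 12.9, 13.4, 12.3, 13.6, 13.5, 12.6, 13.1]

n = len(measurements)
mean = sum(measurements) / n
# Sample variance and standard deviation summarize the spread around the mean.
variance = sum((x - mean) ** 2 for x in measurements) / (n - 1)
std_dev = variance ** 0.5

print(f"mean = {mean:.2f}, std dev = {std_dev:.2f}")
```

The eight runs do not give the same number; the standard deviation quantifies how far a typical measurement strays from the mean.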

1-2 COLLECTING ENGINEERING DATA



1-2.1 Basic Principles

We presented some basic methods for summarizing data in the previous section. Sometimes
the data consist of all of the observations in the population; this results in a census.
However, in the engineering setting, the data are almost always a sample that has been
drawn from the population. There are three fundamental data-collection techniques:
- A retrospective study using historical data
- An observational study
- A designed experiment
An efficient data-collection method can significantly simplify analysis and contribute to a better
grasp of the population or process under study. We will now look at some instances of these
data-collection techniques.
1-2.2 Retrospective Study
Montgomery, Peck, and Vining (2006) describe an acetone-butyl alcohol distillation column
with a key variable being the percentage of acetone in the distillate or end product stream. The
reboil temperature, condensate temperature, and reflux rate are all variables that can influence
the product. Production personnel obtain and archive the following records:
+ The acetone concentration in an hourly test sample of the final product
+ The reboil temperature chart, a time-series plot of the reboil temperature
+ The temperature controller log for the condenser
+ The nominal reflux rate each hour
The reflux rate should be held constant for this process, so production personnel change it
very infrequently.

A retrospective study would use all or a sample of the historical process data archived over
some period of time. The objective of the study might be to discover the relationships among
the two temperatures, the reflux rate, and the acetone concentration in the output product
stream. However, this form of study has some drawbacks:
1. We may not be able to see a connection between the reflux rate and acetone content because
the reflux rate has not changed significantly over time.
2. The archived data on the two temperatures (which are recorded almost continuously) do not
correspond exactly to the hourly acetone concentration readings. It may not be obvious how to
construct an approximate correspondence.
3. Production keeps the two temperatures as close to the desired targets or set points as possible.
Because the temperatures change so little, determining their true effect on acetone concentration
may be challenging.
4. Within the narrow ranges where they do vary, the condensate temperature tends to rise with
the reboil temperature.

As a result, separating the impacts of these two process factors on acetone concentration
may be challenging. As you can see, a retrospective study can generate a large amount
of data, but that data may contain little helpful information about the issue.
Furthermore, some pertinent data may be lacking, transcription or recording mistakes may have
resulted in anomalies (or odd numbers), or data on other important variables may not have been
gathered and preserved. As a result of these problems, statistical analysis of historical data
occasionally finds fascinating phenomena, but firm and dependable answers for these
phenomena are frequently difficult to acquire.
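Drawback 1 above, a factor that has barely changed in the archived data, can be illustrated with a small simulation. In this hypothetical Python sketch (all numbers invented), the reflux rate stays within a tiny band around its target, so the least-squares slope relating acetone concentration to reflux rate is driven mostly by measurement noise:

```python
import random

random.seed(1)

# Hypothetical historical data: the reflux rate is held nearly constant,
# varying only within a tiny band around its target of 60.
reflux = [60.0 + random.uniform(-0.05, 0.05) for _ in range(50)]
# Acetone concentration responds to reflux rate (true slope = 2.0),
# but the noise dwarfs the tiny variation in the reflux rate.
acetone = [85.0 + 2.0 * (r - 60.0) + random.gauss(0, 0.5) for r in reflux]

def ls_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# Because x barely varies, the estimated slope is dominated by noise and
# can land far from the true value of 2.0.
print(f"estimated slope: {ls_slope(reflux, acetone):.2f}")
```

On deterministic data the same function recovers the slope exactly; it is the lack of variation in the factor, not the estimator, that causes the trouble.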

1-2.3 Observational Study

An observational study involves the engineer observing the process or population while
disturbing it as little as possible and recording the quantities of interest. Because
these studies are typically performed for a brief amount of time, variables that are not
routinely measured can sometimes be included. In the distillation column, the engineer would
design a form to record the two temperatures and the reflux rate whenever an acetone
concentration reading is taken. It may even be feasible to measure the concentration of the
incoming feed stream so that the impact of this factor can be examined. In general, an
observational study helps with problems 1 and 2 above and contributes significantly to the
collection of precise and dependable data. However, observational studies may not be helpful
for addressing problems 3 and 4.
1-2.4 Designed Experiments 

In a designed experiment, the engineer intentionally changes the controllable variables of the
system or process, observes the resulting output data, and then determines which variables are responsible for the
observed changes in output performance. The nylon connector example in Section 1-1 serves as an
example of a designed experiment; specifically, the wall thickness of the connector was purposefully
changed in order to determine whether or not a higher pull-off force could be attained. To establish
cause-and-effect linkages, experiments must be constructed using fundamental concepts like
randomization. 
In the engineering and physical-chemical sciences, a large portion of our knowledge has been generated
through testing or experimenting. Engineers frequently operate in issue domains where no scientific or
technical theory is totally or immediately applicable, making experimentation and observation of the
generated data the only ways to address the problem. It is almost always necessary to conduct tests or
experiments to confirm that the theory is in fact operational in the scenario or environment in which it is
being applied, even when there is a solid underlying scientific theory on which we may rely to explain the
phenomena of interest. Planning, carrying out, and analyzing the results of engineering experiments all
heavily rely on statistical reasoning and statistical tools. Designed experiments also play a major
role in engineering design and development and in the improvement of manufacturing processes. 
The distillation column is an example of a complex system that may be studied very effectively through
the use of designed experiments. We want to look into how the three variables in this process—the two
temperatures and the reflux rate—affect the output acetone concentration. We must be able to distinguish
between the effects of each of the three factors on the acetone concentration in order to have a successful
experimental design for this problem. The specified values of the factors used in the
experiment are called factor levels. 
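A full factorial design for the distillation column would run every combination of the chosen factor levels. The sketch below enumerates a two-level design for the three factors; the numeric level settings are assumed purely for illustration:

```python
from itertools import product

# Two assumed levels for each of the three factors in the distillation
# column example; the numeric settings are illustrative placeholders.
factors = {
    "reboil_temp": [120, 130],      # degrees C (assumed levels)
    "condensate_temp": [50, 60],    # degrees C (assumed levels)
    "reflux_rate": [55, 65],        # assumed levels
}

# A full 2^3 factorial design runs every combination of factor levels,
# which lets the effects of the three factors be separated.
runs = list(product(*factors.values()))
for i, run in enumerate(runs, start=1):
    print(i, dict(zip(factors, run)))

print(f"total runs: {len(runs)}")
```

With two levels per factor the design has 2^3 = 8 runs, and each factor changes independently of the others, which is what allows their separate effects on the acetone concentration to be estimated.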
1-2.5 Observing Processes Over Time 
Observing a process over time means recording a characteristic of interest at regular intervals
over a given period so that trends, cycles, or shifts that may affect the findings can be
identified. This type of study can take place over any period of time and fits naturally into
routine, scheduled data collection. 
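A simple way to watch a process over time is to set control limits from an initial baseline period and flag later observations that fall outside them. The following Python sketch uses invented hourly data and the common rule of flagging points beyond three standard deviations from the mean:

```python
# Hypothetical hourly measurements of a process characteristic. The first
# eight observations serve as a baseline for control limits set at
# mean +/- 3 standard deviations; every point is then checked against them.
values = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 9.7, 10.1, 13.0, 10.0]
baseline = values[:8]

n = len(baseline)
mean = sum(baseline) / n
std = (sum((v - mean) ** 2 for v in baseline) / (n - 1)) ** 0.5
lower, upper = mean - 3 * std, mean + 3 * std

# Any point outside the limits signals a possible change in the process.
flagged = [(i, v) for i, v in enumerate(values) if not lower <= v <= upper]
print("limits:", (round(lower, 2), round(upper, 2)))
print("flagged:", flagged)  # the jump at hour 8 lies outside the limits
```

The ninth observation lies well outside the baseline limits and would prompt a search for a cause, which is exactly the kind of change this type of study is meant to reveal.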

1-3 MECHANISTIC AND EMPIRICAL MODELS 

Nearly all engineering problems require analysis, and models are crucial. Learning about the
models pertinent to particular areas and the methods for utilizing these models in problem
formulation and solving constitutes a significant portion of an engineer's formal education.
Consider the straightforward scenario of measuring the current flow in a thin copper wire.
Ohm's law may be our model for this phenomenon: 

current = voltage / resistance 

or 

I = E/R 
We refer to this kind of model as a mechanistic model since it is based on our fundamental
understanding of the fundamental physical mechanism connecting these variables. However, if
we repeated this measurement procedure more than once, perhaps at various times or even on
various days, the measured current might vary slightly as a result of minute adjustments or
variations in factors that are not entirely under our control, such as changes in the surrounding
temperature, variations in the gauge's performance, minor impurities present at various points
along the wire, and drifts in the voltage source. As a result, a more accurate model of the
observed current could be 

I = E/R + ε 

where ε is a term added to the model to account for measurement error and the other
uncontrolled sources of variability described above. 

Empirical models arise in a similar way. Suppose, for example, that we wish to relate the
average molecular weight (Mn) of a material to three measurable process variables, say x1,
x2, and x3. A natural starting point is the linear model 

Mn = β0 + β1x1 + β2x2 + β3x3 

where the β's represent unknown parameters. As with Ohm's law, this model will not
completely capture the phenomenon, so we must add another component to the model to account
for the additional sources of variability that could alter the molecular weight: 

Mn = β0 + β1x1 + β2x2 + β3x3 + ε 

This is the model that we will use to relate molecular weight to the other three variables. This type 
of model is called an empirical model; that is, it uses our engineering and scientific knowledge
of the phenomenon, but it is not directly developed from our theoretical or first-principles 
understanding of the underlying mechanism. 
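The gap between the mechanistic prediction I = E/R and the variability of real measurements can be simulated directly. In this Python sketch the voltage, resistance, and noise level are assumed values chosen only for illustration:

```python
import random

random.seed(0)

E, R = 12.0, 4.0          # volts, ohms (illustrative values)
I_mechanistic = E / R     # Ohm's law prediction, in amperes

# Repeated measurements differ from the mechanistic prediction because of
# small uncontrolled disturbances, modeled here as additive noise epsilon.
measurements = [I_mechanistic + random.gauss(0, 0.02) for _ in range(10)]

mean_I = sum(measurements) / len(measurements)
print(f"mechanistic prediction:  {I_mechanistic:.3f} A")
print(f"average of measurements: {mean_I:.3f} A")
```

Each simulated measurement scatters around the mechanistic value, just as the repeated readings of the copper-wire current would scatter because of temperature changes, gauge performance, impurities, and voltage drift.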
1-4 PROBABILITY AND PROBABILITY MODELS 
A probability model is a mathematical representation of a chance occurrence. 
A model consists of a sample space, the set of all possible outcomes of an experiment, and a set
of probabilities assigned to each element of the sample space. These probabilities may or may
not be known. 
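As a concrete instance, here is a minimal Python sketch of a probability model for rolling a fair six-sided die: the sample space is listed explicitly, each outcome is assigned a probability, and the probability of an event is the sum over its outcomes:

```python
from fractions import Fraction

# A probability model for a fair six-sided die: the sample space is the
# set of all possible outcomes, each assigned a probability.
sample_space = [1, 2, 3, 4, 5, 6]
probabilities = {outcome: Fraction(1, 6) for outcome in sample_space}

# The assigned probabilities must sum to 1 over the whole sample space.
assert sum(probabilities.values()) == 1

# The probability of an event (a subset of the sample space), e.g. "even":
event = {2, 4, 6}
p_event = sum(probabilities[o] for o in event)
print("P(even) =", p_event)  # -> 1/2
```

Using exact fractions makes the requirement that the probabilities sum to 1 easy to verify without floating-point rounding.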
