PROJECT WORK FOR ADDITIONAL MATHEMATICS 2010
NAME : KHAIRUL SAKIRIN BIN AZMI

SUPERVISING TEACHER : EN. NOORSHAHRIMAN PEARL BIN HASHIM
I/C : 930131-05-5329
SCHOOL : SEKOLAH DATO' ABDUL RAZAK
CONTENT

No Contents Page
1 Introduction
2 Part 1
3 Part 2
4 Part 3
5 Part 4
6 Part 5
7 Conclusion
INTRODUCTION


Interpretations

The word probability does not have a consistent direct definition. In fact, there are two
broad categories of probability interpretations, whose adherents possess different (and
sometimes conflicting) views about the fundamental nature of probability:

1. Frequentists talk about probabilities only when dealing with experiments that are
random and well-defined. The probability of a random event denotes the relative
frequency of occurrence of an experiment's outcome, when repeating the
experiment. Frequentists consider probability to be the relative frequency "in the
long run" of outcomes.[1]
2. Bayesians, however, assign probabilities to any statement whatsoever, even when
no random process is involved. Probability, for a Bayesian, is a way to represent an
individual's degree of belief in a statement, or an objective degree of rational belief,
given the evidence.
Part 1

The theory of probability has been applied in various fields such as market research,
medical research, transportation, management and so on.

History

The word probability derives from probity, a measure of the authority of a witness in a legal
case in Europe, and often correlated with the witness's nobility. In a sense, this differs
much from the modern meaning of probability, which, in contrast, is used as a measure of
the weight of empirical evidence, and is arrived at from inductive reasoning and statistical
inference.[2][3]

The scientific study of probability is a modern development. Gambling shows that there has
been an interest in quantifying the ideas of probability for millennia, but exact
mathematical descriptions of use in those problems only arose much later.

According to Richard Jeffrey, "Before the middle of the seventeenth century, the term
'probable' (Latin probabilis) meant approvable, and was applied in that sense, univocally, to
opinion and to action. A probable action or opinion was one such as sensible people would
undertake or hold, in the circumstances."[4] However, in legal contexts especially, 'probable'
could also apply to propositions for which there was good evidence.[5]

Aside from some elementary considerations made by Girolamo Cardano in the 16th
century, the doctrine of probabilities dates to the correspondence of Pierre de Fermat and
Blaise Pascal (1654). Christiaan Huygens (1657) gave the earliest known scientific
treatment of the subject. Jakob Bernoulli's Ars Conjectandi (posthumous, 1713) and
Abraham de Moivre's Doctrine of Chances (1718) treated the subject as a branch of
mathematics. See Ian Hacking's The Emergence of Probability and James Franklin's The
Science of Conjecture for histories of the early development of the very concept of
mathematical probability.

The theory of errors may be traced back to Roger Cotes's Opera Miscellanea (posthumous,
1722), but a memoir prepared by Thomas Simpson in 1755 (printed 1756) first applied the
theory to the discussion of errors of observation. The reprint (1757) of this memoir lays
down the axioms that positive and negative errors are equally probable, and that there are
certain assignable limits within which all errors may be supposed to fall; continuous errors
are discussed and a probability curve is given.

Pierre-Simon Laplace (1774) made the first attempt to deduce a rule for the combination of
observations from the principles of the theory of probabilities. He represented the law of
probability of errors by a curve y = φ(x), x being any error and y its probability, and laid
down three properties of this curve:

1. it is symmetric as to the y-axis;
2. the x-axis is an asymptote, the probability of the error being 0;
3. the area enclosed is 1, it being certain that an error exists.

He also gave (1781) a formula for the law of facility of error (a term due to Lagrange,
1774), but one which led to unmanageable equations. Daniel Bernoulli (1778) introduced
the principle of the maximum product of the probabilities of a system of concurrent errors.

The method of least squares is due to Adrien-Marie Legendre (1805), who introduced it in
his Nouvelles méthodes pour la détermination des orbites des comètes (New Methods for
Determining the Orbits of Comets). In ignorance of Legendre's contribution, an Irish-
American writer, Robert Adrain, editor of "The Analyst" (1808), first deduced the law of
facility of error,

    φ(x) = c e^(−h²x²),

h being a constant depending on precision of observation, and c a scale factor ensuring that
the area under the curve equals 1. He gave two proofs, the second being essentially the
same as John Herschel's (1850). Gauss gave the first proof which seems to have been
known in Europe (the third after Adrain's) in 1809. Further proofs were given by Laplace
(1810, 1812), Gauss (1823), James Ivory (1825, 1826), Hagen (1837), Friedrich Bessel
(1838), W. F. Donkin (1844, 1856), and Morgan Crofton (1870). Other contributors were
Ellis (1844), De Morgan (1864), Glaisher (1872), and Giovanni Schiaparelli (1875). Peters's
(1856) formula for r, the probable error of a single observation, is well known.

In the nineteenth century authors on the general theory included Laplace, Sylvestre Lacroix
(1816), Littrow (1833), Adolphe Quetelet (1853), Richard Dedekind (1860), Helmert
(1872), Hermann Laurent (1873), Liagre, Didion, and Karl Pearson. Augustus De Morgan
and George Boole improved the exposition of the theory.

Andrey Markov introduced the notion of Markov chains (1906), which play an important
role in the theory of stochastic processes and its applications.

The modern theory of probability, based on measure theory, was developed by Andrey
Kolmogorov (1931).

On the geometric side (see integral geometry) contributors to The Educational Times were
influential (Miller, Crofton, McColl, Wolstenholme, Watson, and Artemas Martin).


Mathematical treatment

In mathematics, the probability of an event A is represented by a real number in the range
from 0 to 1 and written as P(A), p(A) or Pr(A).[6] An impossible event has a probability of 0,
and a certain event has a probability of 1. However, the converses are not always true:
probability 0 events are not always impossible, nor probability 1 events certain. The rather
subtle distinction between "certain" and "probability 1" is treated at greater length in the
article on "almost surely".

The opposite or complement of an event A is the event [not A] (that is, the event of A not
occurring); its probability is given by P(not A) = 1 − P(A).[7] As an example, the chance of not
rolling a six on a six-sided die is 1 − (chance of rolling a six) = 1 − 1/6 = 5/6. See
Complementary event for a more complete treatment.

If both the events A and B occur on a single performance of an experiment this is called the
intersection or joint probability of A and B, denoted as P(A ∩ B). If two events, A and B,
are independent then the joint probability is

    P(A and B) = P(A ∩ B) = P(A) P(B);

for example, if two coins are flipped the chance of both being heads is 1/2 × 1/2 = 1/4.[8]

If either event A or event B or both events occur on a single performance of an experiment
this is called the union of the events A and B, denoted as P(A ∪ B). If two events are
mutually exclusive then the probability of either occurring is

    P(A or B) = P(A ∪ B) = P(A) + P(B).

For example, the chance of rolling a 1 or 2 on a six-sided die is P(1) + P(2) = 1/6 + 1/6 = 1/3.

If the events are not mutually exclusive then

    P(A or B) = P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

For example, when drawing a single card at random from a regular deck of cards, the
chance of getting a heart or a face card (J, Q, K) (or one that is both) is
13/52 + 12/52 − 3/52 = 11/26, because of the 52 cards of a deck 13 are hearts, 12 are face
cards, and 3 are both: here the possibilities included in the "3 that are both" are included in
each of the "13 hearts" and the "12 face cards" but should only be counted once.
Conditional probability is the probability of some event A, given the occurrence of some
other event B. Conditional probability is written P(A|B), and is read "the probability of A,
given B". It is defined by

    P(A|B) = P(A ∩ B) / P(B).[9]

If P(B) = 0 then P(A|B) is undefined.
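As a quick sanity check, the rules above can be verified by direct counting. The following Python sketch (an illustration added here, not part of the original text) enumerates a 52-card deck and checks the addition rule and the conditional-probability definition on the heart/face-card example:

    from fractions import Fraction

    # Enumerate a 52-card deck as (rank, suit) pairs.
    ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
    suits = ['hearts', 'diamonds', 'clubs', 'spades']
    deck = [(r, s) for r in ranks for s in suits]

    hearts = {c for c in deck if c[1] == 'hearts'}        # 13 cards
    faces = {c for c in deck if c[0] in ('J', 'Q', 'K')}  # 12 cards

    def p(event):
        # Probability as favourable outcomes over all 52 equally likely cards.
        return Fraction(len(event), len(deck))

    # Addition rule: P(A or B) = P(A) + P(B) - P(A and B)
    assert p(hearts | faces) == p(hearts) + p(faces) - p(hearts & faces)
    print(p(hearts | faces))              # 11/26, i.e. 22/52

    # Conditional probability: P(A|B) = P(A and B) / P(B)
    print(p(faces & hearts) / p(hearts))  # 3/13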

Summary of probabilities

Event       Probability
A           P(A) ∈ [0, 1]
not A       P(not A) = 1 − P(A)
A or B      P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
A and B     P(A ∩ B) = P(A|B) P(B)
A given B   P(A|B) = P(A ∩ B) / P(B)

Theory

Main article: Probability theory

Like other theories, the theory of probability is a representation of probabilistic concepts in
formal terms, that is, in terms that can be considered separately from their meaning.
These formal terms are manipulated by the rules of mathematics and logic, and any results
are then interpreted or translated back into the problem domain.

There have been at least two successful attempts to formalize probability, namely the
Kolmogorov formulation and the Cox formulation. In Kolmogorov's formulation (see
probability space), sets are interpreted as events and probability itself as a measure on a
class of sets. In Cox's theorem, probability is taken as a primitive (that is, not further
analyzed) and the emphasis is on constructing a consistent assignment of probability
values to propositions. In both cases, the laws of probability are the same, except for
technical details.

There are other methods for quantifying uncertainty, such as the Dempster-Shafer theory
or possibility theory, but those are essentially different and not compatible with the laws of
probability as they are usually understood.
Applications

Two major applications of probability theory in everyday life are in risk assessment and in
trade on commodity markets. Governments typically apply probabilistic methods in
environmental regulation where it is called "pathway analysis", often measuring well-being
using methods that are stochastic in nature, and choosing projects to undertake based on
statistical analyses of their probable effect on the population as a whole.

A good example is the effect of the perceived probability of any widespread Middle East
conflict on oil prices - which have ripple effects in the economy as a whole. An assessment
by a commodity trader that a war is more likely vs. less likely sends prices up or down, and
signals other traders of that opinion. Accordingly, the probabilities are not assessed
independently nor necessarily very rationally. The theory of behavioral finance emerged to
describe the effect of such groupthink on pricing, on policy, and on peace and conflict.

It can reasonably be said that the discovery of rigorous methods to assess and combine
probability assessments has had a profound effect on modern society. Accordingly, it may
be of some importance to most citizens to understand how odds and probability
assessments are made, and how they contribute to reputations and to decisions, especially
in a democracy.

Another significant application of probability theory in everyday life is reliability. Many
consumer products, such as automobiles and consumer electronics, utilize reliability
theory in the design of the product in order to reduce the probability of failure. The
probability of failure may be closely associated with the product's warranty.
Part 2

A) I am playing Monopoly with two of my friends. To start the game, each player
has to toss the die once. The player who obtains the highest number will start
the game. All the possible outcomes when the die is tossed once are

: {1}, {2}, {3}, {4}, {5}, {6}

B) Instead of one die, two dice can also be tossed simultaneously by each player. The
player will move the token according to the sum of all dots on both turned-up
faces. For example, if the two dice are tossed simultaneously and "2" appears on
one die and "3" appears on the other, the outcome of the toss is (2,3). Hence the
player shall move the token 5 spaces. Note: The events (2,3) and (3,2) should be
treated as two different events.

List all the possible outcomes when two dice are tossed simultaneously. Organize
and present your list clearly. Consider the use of a table, a chart, or even a tree diagram.

: (1,1), (1,2), (1,3), (1,4), (1,5), (1,6),
  (2,1), (2,2), (2,3), (2,4), (2,5), (2,6),
  (3,1), (3,2), (3,3), (3,4), (3,5), (3,6),
  (4,1), (4,2), (4,3), (4,4), (4,5), (4,6),
  (5,1), (5,2), (5,3), (5,4), (5,5), (5,6),
  (6,1), (6,2), (6,3), (6,4), (6,5), (6,6)
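Since the 36 ordered pairs are simply the Cartesian product {1,...,6} × {1,...,6}, they can also be generated mechanically. A minimal Python sketch (added here as an illustration, not part of the original project):

    from itertools import product

    # All ordered outcomes of tossing two dice: the Cartesian product
    # {1..6} x {1..6}, so (2,3) and (3,2) count as distinct events.
    outcomes = list(product(range(1, 7), repeat=2))
    print(len(outcomes))  # 36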
Part 3

Table 1 shows the sum of all dots on both turned-up faces when two dice are tossed
simultaneously.

A) Complete Table 1 by listing all possible outcomes and their corresponding
probabilities.

Sum of dots (x)   Possible outcomes                          Probability P(x)
2                 (1,1)                                      1/36
3                 (1,2), (2,1)                               2/36
4                 (1,3), (2,2), (3,1)                        3/36
5                 (1,4), (2,3), (3,2), (4,1)                 4/36
6                 (1,5), (2,4), (3,3), (4,2), (5,1)          5/36
7                 (1,6), (2,5), (3,4), (4,3), (5,2), (6,1)   6/36
8                 (2,6), (3,5), (4,4), (5,3), (6,2)          5/36
9                 (3,6), (4,5), (5,4), (6,3)                 4/36
10                (4,6), (5,5), (6,4)                        3/36
11                (5,6), (6,5)                               2/36
12                (6,6)                                      1/36

Table 1
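The probabilities in Table 1 follow directly from counting the 36 equally likely pairs. A short Python check (illustrative only) tallies the sums:

    from collections import Counter
    from fractions import Fraction
    from itertools import product

    # Count how many of the 36 ordered pairs give each possible sum.
    sums = Counter(a + b for a, b in product(range(1, 7), repeat=2))
    for x in sorted(sums):
        print(x, Fraction(sums[x], 36))  # e.g. sum 7 -> 1/6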
B) Based on Table 1 that I have completed, the possible outcomes of the following
events and their corresponding probabilities are:

A ={}

B = {0}

C = {}

D = {2,2}
Part 4
A) Conduct an activity by tossing two dice simultaneously 50 times.
Observe the sum of all dots on both turned-up faces. Complete the
frequency table below.

Sum of dots (x)   Frequency (f)
2
3
4
5
6
7
8
9
10
11
12

Table 2
Based on Table 2 that I have completed, determine the value of the

(1) Mean

(2) Variance

(3) Standard deviation of the data
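For a frequency table, the grouped-data formulas are mean = Σfx / Σf, variance = Σfx² / Σf − mean², and standard deviation = √variance. A small Python sketch of the computation; the frequencies below are placeholders for illustration, not the project's actual recorded data:

    import math

    # Hypothetical frequencies for sums 2..12 over 50 tosses (placeholder
    # values only; substitute the frequencies actually observed).
    freq = {2: 1, 3: 3, 4: 4, 5: 6, 6: 7, 7: 9, 8: 7, 9: 5, 10: 4, 11: 3, 12: 1}

    n = sum(freq.values())                          # total tosses (50 here)
    mean = sum(x * f for x, f in freq.items()) / n  # mean = sum(fx) / sum(f)
    variance = sum(x * x * f for x, f in freq.items()) / n - mean ** 2
    std_dev = math.sqrt(variance)                   # standard deviation

    print(mean, variance, std_dev)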
(B) Predict the value of the mean if the number of tosses is increased to 100
times.

:

(C) Test your prediction in (B) by continuing Activity 3(A) until the total
number of tosses is 100 times. Then determine the value of the

(1) Mean

(2) Variance

(3) Standard deviation of the data

Was the prediction proven?

:
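By the law of large numbers, the experimental mean should drift toward the theoretical mean of the sum of two fair dice, Σ x · P(x) = 7, as the number of tosses grows. A quick simulation sketch (illustrative, not part of the original activity):

    import random

    random.seed(0)  # fixed seed so the illustration is reproducible

    def mean_of_tosses(n):
        # Average sum of two fair dice over n simulated tosses.
        total = sum(random.randint(1, 6) + random.randint(1, 6)
                    for _ in range(n))
        return total / n

    for n in (50, 100, 10000):
        print(n, mean_of_tosses(n))  # tends toward the theoretical mean 7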
Part 5
Conclusion

