Probability

Probability, or chance, is a way of expressing knowledge or belief that an event will occur or has occurred. In mathematics the concept has been given an exact meaning in probability theory, which is used extensively in such areas of study as mathematics, statistics, finance, gambling, science, and philosophy to draw conclusions about the likelihood of potential events and the underlying mechanics of complex systems.
Interpretations

The word probability does not have a consistent direct definition. In fact, there are two broad categories of probability interpretations, whose adherents hold different (and sometimes conflicting) views about the fundamental nature of probability:

1. Frequentists talk about probabilities only when dealing with experiments that are random and well-defined. The probability of a random event denotes the relative frequency of occurrence of an experiment's outcome when the experiment is repeated. Frequentists consider probability to be the relative frequency "in the long run" of outcomes.

2. Bayesians, however, assign probabilities to any statement whatsoever, even when no random process is involved. Probability, for a Bayesian, is a way to represent an individual's degree of belief in a statement, given the evidence.
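The frequentist notion of probability as a long-run relative frequency can be illustrated with a short simulation. This sketch is my own illustration, not part of the original article; the die, seed, and trial counts are arbitrary choices.

```python
# Illustrative sketch: estimating a probability as a long-run relative
# frequency, the frequentist view. Rolls a fair six-sided die many times
# and reports the fraction of sixes, which tends toward 1/6.
import random

random.seed(42)  # fixed seed so the run is repeatable

def relative_frequency(trials):
    """Relative frequency of rolling a six in `trials` rolls of a fair die."""
    sixes = sum(1 for _ in range(trials) if random.randint(1, 6) == 6)
    return sixes / trials

for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))  # approaches 1/6 as n grows
```

As the number of trials grows, the relative frequency settles near 1/6, which is exactly the "in the long run" behavior frequentists identify with probability.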
Etymology

The word probability derives from the Latin probabilitas, a measure of the authority of a witness in a legal case in Europe, often correlated with the witness's nobility. In this sense it differs greatly from the modern meaning of probability, which, in contrast, is used as a measure of the weight of empirical evidence and is arrived at from inductive reasoning and statistical inference.
History

Further information: History of statistics

The scientific study of probability is a modern development. Gambling shows that there has been an interest in quantifying the ideas of probability for millennia, but exact mathematical descriptions of use in those problems arose only much later.

According to Richard Jeffrey, "Before the middle of the seventeenth century, the term 'probable' (Latin probabilis) meant approvable, and was applied in that sense, univocally, to opinion and to action. A probable action or opinion was one such as sensible people would undertake or hold, in the circumstances." However, in legal contexts especially, 'probable' could also apply to propositions for which there was good evidence.
Aside from some elementary considerations made by Girolamo Cardano in the 16th century, the doctrine of probabilities dates to the correspondence of Pierre de Fermat and Blaise Pascal (1654). Christiaan Huygens (1657) gave the earliest known scientific treatment of the subject. Jakob Bernoulli's Ars Conjectandi (posthumous, 1713) and Abraham de Moivre's Doctrine of Chances (1718) treated the subject as a branch of mathematics. See Ian Hacking's The Emergence of Probability and James Franklin's The Science of Conjecture for histories of the early development of the very concept of mathematical probability.

The theory of errors may be traced back to Roger Cotes's Opera Miscellanea (posthumous, 1722), but a memoir prepared by Thomas Simpson in 1755 (printed 1756) first applied the theory to the discussion of errors of observation. The reprint (1757) of this memoir lays down the axioms that positive and negative errors are equally probable, and that there are certain assignable limits within which all errors may be supposed to fall; continuous errors are discussed and a probability curve is given.

Pierre-Simon Laplace (1774) made the first attempt to deduce a rule for the combination of observations from the principles of the theory of probabilities. He represented the law of probability of errors by a curve y = φ(x), x being any error and y its probability, and laid down three properties of this curve:
1. it is symmetric about the y-axis;

2. the x-axis is an asymptote, the probability of the error being 0 there;

3. the area enclosed is 1, it being certain that an error exists.

He also gave (1781) a formula for the law of facility of error (a term due to Lagrange, 1774), but one which led to unmanageable equations. Daniel Bernoulli (1778) introduced the principle of the maximum product of the probabilities of a system of concurrent errors.

The method of least squares is due to Adrien-Marie Legendre (1805), who introduced it in his Nouvelles méthodes pour la détermination des orbites des comètes (New Methods for Determining the Orbits of Comets). In ignorance of Legendre's contribution, an Irish-American writer, Robert Adrain, editor of "The Analyst" (1808), first deduced the law of facility of error, φ(x) = c·e^(−h²x²), h being a constant depending on precision of observation and c a scale factor ensuring that the area under the curve equals 1. He gave two proofs, the second being essentially the same as John Herschel's (1850). Gauss gave the first proof that seems to have been known in Europe (the third after Adrain's) in 1809. Further proofs were given by Laplace (1810, 1812), Gauss (1823), James Ivory (1825, 1826), Hagen (1837), Friedrich Bessel (1838), W. F. Donkin (1844, 1856), and Morgan Crofton (1870). Other contributors were Ellis (1844), De Morgan (1864), Glaisher (1872), and Giovanni Schiaparelli (1875). Peters's (1856) formula for r, the probable error of a single observation, is well known.

In the nineteenth century authors on the general theory included Laplace, Sylvestre Lacroix (1816), Littrow (1833), Adolphe Quetelet (1853), Richard Dedekind (1860), Helmert (1872), Hermann Laurent (1873), Liagre, Didion, and Karl Pearson. Augustus De Morgan and George Boole improved the exposition of the theory. On the geometric side (see integral geometry), contributors to The Educational Times were influential (Miller, Crofton, McColl, Wolstenholme, Watson, and Artemas Martin).
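Laplace's three properties of the error curve can be checked numerically against the Gaussian law of facility of error. This is my own illustrative sketch, not from the article; it assumes the usual form φ(x) = c·e^(−h²x²) with c = h/√π, and the value of h is an arbitrary choice.

```python
# Numeric sketch: the law of facility of error phi(x) = c * exp(-h^2 x^2)
# satisfies Laplace's three properties when c = h / sqrt(pi):
# symmetry about the y-axis, vanishing tails, and total area 1.
import math

h = 2.0                     # precision constant (arbitrary choice)
c = h / math.sqrt(math.pi)  # scale factor that makes the total area 1

def phi(x):
    return c * math.exp(-(h * x) ** 2)

# Property 1: symmetry about the y-axis.
assert math.isclose(phi(0.7), phi(-0.7))

# Property 2: the x-axis is an asymptote (tails vanish).
assert phi(10.0) < 1e-12

# Property 3: area under the curve is 1 (trapezoidal rule, wide interval).
n, a, b = 200_000, -10.0, 10.0
dx = (b - a) / n
area = sum(phi(a + i * dx) for i in range(n + 1)) * dx - 0.5 * dx * (phi(a) + phi(b))
print(area)  # close to 1
```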
 
Mathematical treatment
In mathematics, the probability of an event A is represented by a real number in the range from 0 to 1 and written as P(A), p(A) or Pr(A). An impossible event has a probability of 0, and a certain event has a probability of 1. However, the converses are not always true: probability-0 events are not always impossible, nor are probability-1 events certain. The rather subtle distinction between "certain" and "probability 1" is treated at greater length in the article on "almost surely".

The opposite or complement of an event A is the event [not A] (that is, the event of A not occurring); its probability is given by P(not A) = 1 − P(A). As an example, the chance of not rolling a six on a six-sided die is 1 − (chance of rolling a six) = 1 − 1/6 = 5/6. See Complementary event for a more complete treatment.

If both the events A and B occur on a single performance of an experiment, this is called the intersection or joint probability of A and B, denoted as P(A ∩ B). If two events, A and B, are independent, then the joint probability is P(A ∩ B) = P(A)P(B); for example, if two coins are flipped the chance of both being heads is 1/2 × 1/2 = 1/4.
If either event A or event B or both events occur on a single performance of an experiment, this is called the union of the events A and B, denoted as P(A ∪ B). If two events are mutually exclusive, then the probability of either occurring is P(A ∪ B) = P(A) + P(B). For example, the chance of rolling a 1 or 2 on a six-sided die is 1/6 + 1/6 = 1/3.

If the events are not mutually exclusive, then P(A ∪ B) = P(A) + P(B) − P(A ∩ B). For example, when drawing a single card at random from a regular deck of cards, the chance of getting a heart or a face card (J, Q, K) (or one that is both) is 13/52 + 12/52 − 3/52 = 22/52 = 11/26, because of the 52 cards of a deck 13 are hearts, 12 are face cards, and 3 are both; here the possibilities included in the "3 that are both" are included in each of the "13 hearts" and the "12 face cards" but should only be counted once.
Conditional probability is the probability of some event A, given the occurrence of some other event B. Conditional probability is written P(A|B), and is read "the probability of A, given B". It is defined by

P(A|B) = P(A ∩ B) / P(B).

If P(B) = 0, then P(A|B) is undefined.
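The definition, including the undefined case, can be sketched directly. This example is my own; the helper name `conditional` and the card numbers reused from the deck example above are illustrative choices.

```python
# Sketch of conditional probability P(A|B) = P(A and B) / P(B),
# raising an error for the undefined case P(B) = 0.
from fractions import Fraction

def conditional(p_a_and_b, p_b):
    """Return P(A|B); P(A|B) is undefined (here: an error) when P(B) = 0."""
    if p_b == 0:
        raise ValueError("P(A|B) is undefined when P(B) = 0")
    return Fraction(p_a_and_b) / Fraction(p_b)

# Drawing one card: P(face card | heart) = (3/52) / (13/52) = 3/13.
print(conditional(Fraction(3, 52), Fraction(13, 52)))  # 3/13
```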
Summary of probabilities

Event    Probability
A        P(A) ∈ [0, 1]
not A    P(not A) = 1 − P(A)
A or B   P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
         (= P(A) + P(B) if A and B are mutually exclusive)
