
PROBABILITY AND RANDOM

VARIABLES

Instructor: Dr. Aqsa Aslam


Email: aqsa.aslam@nu.edu.pk
WHY DO COMPUTER ENGINEERS NEED
TO STUDY PROBABILITY?
• Probability theory provides powerful tools to
• explain, model, analyze, and design the technology developed by computer engineers

• Applications
• Signal processing
• Computer memories
• Optical communication systems
• Computer network traffic
• Computer communication networks
APPLICATION: COMPUTER COMMUNICATION
• Computer communication networks are ubiquitous and have many configurations
• Including local area networks (LANs)
• Wireless networks
• Satellite networks
• Internet
• The probability p that a packet is damaged on a link
• Under different network models (a single link, multiple links, parallel links, etc.)
• Analyze the performance of the network based on the value of p
• We are interested in the probability of packet losses in the network
• and in the expected number of packet transmissions when a large number of packets are sent (see the sketch below)
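
As an illustration (a sketch in Python, not from the slides): under a single-link model where each transmission is damaged independently with probability p and damaged packets are retransmitted, the number of transmissions per packet is geometric with mean 1/(1 − p). The value p = 0.1 below is assumed for illustration.

    # Sketch (assumption): a single link damages each packet independently with
    # probability p; damaged packets are retransmitted until they get through,
    # so the number of transmissions per packet is geometric with mean 1/(1 - p).
    import random

    def transmissions_needed(p):
        """Count the attempts one packet needs to cross the link."""
        attempts = 1
        while random.random() < p:      # this attempt is damaged with probability p
            attempts += 1
        return attempts

    p, n = 0.1, 100_000                 # illustrative values
    avg = sum(transmissions_needed(p) for _ in range(n)) / n
    print("simulated average:", avg, " closed form 1/(1-p):", 1 / (1 - p))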
WHY PROBABILITY THEORY?
• Most signals we deal with in practice are random
• (unpredictable or erratic) and not deterministic

• Random signals are encountered in one form or another
• in every practical communication system
• both as the information-conveying signal and as unwanted noise

• In a computer network, information arrives in a random way

• Events modify the behavior of links and nodes
• and these events are random in nature

• Probability lets us reason in quantitative ways about the likelihood of events in a network

• and predict the behavior of network components
WHY PROBABILITY THEORY?
• Example 1
• Measure the time between two packet arrivals on the cable of a local area network.
• How likely is it that the inter-arrival time between any two packets is less than T sec?
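
The slide does not fix a model for the inter-arrival times. As an illustration only, a Python sketch assuming exponentially distributed inter-arrival times with an assumed rate lam and threshold T:

    # Sketch (assumption): exponential inter-arrival times with rate lam packets/sec;
    # the slide does not specify a distribution, rate, or threshold.
    import math, random

    lam, T = 100.0, 0.005               # assumed rate and threshold (illustrative)
    n = 100_000
    hits = sum(random.expovariate(lam) < T for _ in range(n))
    print("estimated P(X < T):", hits / n)
    print("closed form 1 - exp(-lam*T):", 1 - math.exp(-lam * T))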

Any other EXAMPLE?


EXAMPLES
• Consider an experiment that can result in M possible outcomes, O1,...,OM

• Examples:
1. Tossing a die: one of the six sides will land facing up (classic example)
• Let Oi denote the outcome that the ith side faces up, i = 1,...,6

2. Flipping a coin: the simplest example we consider (classic example)
• There are two possible outcomes, "heads" and "tails."

3. A computer with six processors (computer example)
• Oi could denote the outcome that a program or thread is assigned to the ith processor

4. Whether or not a bit was correctly received over a digital communication system.

• No matter what the experiment is,
• we perform it n times and note how many times each outcome occurred
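
A short Python sketch (not from the slides) of this procedure for the die-tossing experiment: repeat it n times and record the relative frequency of each outcome.

    # Sketch: repeat the die toss n times and count how often each of the
    # outcomes O1,...,O6 occurs; relative frequencies approach 1/6.
    import random
    from collections import Counter

    n = 60_000
    counts = Counter(random.randint(1, 6) for _ in range(n))
    for side in range(1, 7):
        print(f"O{side}: relative frequency {counts[side] / n:.4f}")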
PROBABILITY THEORY
• The theory of probabilities aims to study random phenomena
• Probability theory specifies a set of axioms
• For a well-defined mathematical model of physical experiments
whose outcomes exhibit random variability each time they are performed

• A mathematical model used to quantify the likelihood of events


• It consists of:
• Sample space: The set of all possible outcomes of a random experiment
• Set of events: Subsets of the sample space.
• Probability measure: Defined according to a probability law for all the events of the sample space.
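
A minimal Python sketch (not from the slides) of these three ingredients for the fair-coin experiment; the 0.5/0.5 assignment is an assumption (fair coin).

    # Sketch: sample space, events as subsets, and a probability measure
    # as a function on events, for a fair coin.
    omega = {"head", "tail"}                       # sample space
    prob = {"head": 0.5, "tail": 0.5}              # probability of each outcome (fair coin assumed)

    def P(event):
        """Probability measure: sum the probabilities of the outcomes in the event."""
        return sum(prob[w] for w in event)

    A = {"head"}                                   # an event (a subset of omega)
    print(P(A), P(omega), P(set()))                # 0.5  1.0  0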
SAMPLE SPACES
• A random phenomenon produces results that are, by definition, not predictable.
• The result of each trial is called an outcome.
• A sample space, generally denoted by Ω, is the set of all possible outcomes.

• Coin tossing (classic example)


• Each trial produces an outcome (head or tail) that cannot be known in advance
• Results are either head or tail and nothing else.
• Computer-based example
• If o is the occupancy of a buffer with a capacity of 100 packets, then Ω = {0,1,…,100}.
• Each observation on the buffer is an outcome o ∈ Ω
• giving the buffer’s current occupancy.
EXAMPLES CORRESPONDING TO THE
APPLICATIONS
• Computer network traffic:
• If a router has a buffer that can store up to 70 packets
• Model the actual number of packets waiting for transmission with the sample space Ω = {0,1,2,...,70}
• 0 is included to account for the possibility that there are no packets waiting to be sent

• Wireless communication systems:


• Non-coherent receivers measure the energy of the incoming waveform.
• Since energy is a nonnegative quantity, model it with the sample space Ω = [0,∞),
• consisting of the nonnegative real numbers.
EXAMPLES OF SAMPLE SPACE
• In the IEEE 802.11 protocol, the congestion window (CW) parameter is used as follows:
• Initially, a terminal waits for a random time period (called backoff) chosen in the range [1, 2CW]
before sending a packet. If an acknowledgement for the packet is not received in time, then CW is
doubled, and the process is repeated, until CW reaches the value CWMAX. The initial value of
CW is CWMIN.

• What is the sample space for:


1. the value of CW?
2. the value of the backoff?
EVENTS
• An event is one or more outcomes of an experiment
• Elements or points in the sample space Ω are called outcomes
• Collections of outcomes are called events.

• Classic Examples:
• With “coin tossing” (Ω = {head, tail}), we can focus only on the event A = {head}.

• Computer-based example: buffer occupancy (o).


• For instance, o < 25 may be considered as a weak load situation
• whereas o > 75 can be considered as a prelude to overflow.

• An event, often denoted by a capital letter, is a subset of Ω

• gathering one or more outcomes.
• Let A be an event: A ⊂ Ω, i.e. A ∈ P(Ω), where P(Ω) is the power set (the set of all subsets) of Ω.
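
A small Python sketch (not from the slides) of the buffer-occupancy events above as subsets of Ω:

    # Sketch: the buffer-occupancy events as subsets of omega = {0, 1, ..., 100}.
    omega = set(range(101))                        # buffer occupancies 0..100
    weak_load = {o for o in omega if o < 25}       # event "o < 25"
    near_overflow = {o for o in omega if o > 75}   # event "o > 75"

    print(weak_load.issubset(omega), near_overflow.issubset(omega))   # True True
    print(weak_load & near_overflow)               # set(): the two events are disjoint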
EXAMPLES CORRESPONDING TO THE
APPLICATIONS
• A router has three networking interfaces Ω = {if0, if1, if2}.
• One event could be A = {if0}
• i.e. the fact that a packet is routed to “if0”

• Some special cases:


• ∅ symbolizes an “event” which never takes place, i.e. an impossibility;
• let ω be a particular outcome of Ω (ω ∈ Ω);
• its occurrence is an event: Eω = {ω}
• Ω is also an "event"; it always occurs, i.e. it symbolizes a certainty.
PROBABILITY MEASURE
• Probability is a measure that gives a value (a real number ranging in [0,1]) to each possible event.
• Quantification of the occurrence frequency of the event.

• Probability of an event: A number assigned to the event reflecting the likelihood with which it can
occur or has been observed to occur.
• Probability Law: A rule that assigns probabilities to events in a way that reflects our intuition of
how likely the events are.

What we need then is a formal way of assigning these numbers to events and any combination of
events that makes sense in our experiments!
PROBABILITY MEASURE
• Given a nonempty set Ω, called the sample space,
• and a function P defined on the subsets of Ω,
• we say P is a probability measure if it satisfies the axioms given on the following slides.
PROBABILITY SPACE
• In summary, this intuitive introduction proposes the following three concepts:
• Identification of all of the possible outcomes
• Identification of all of the interesting events
• Quantification of these events

How to formalize these concepts on the basis of probability theory?


PROBABILITY LAW
• Let E be a random experiment
• Let A be an event in E
• The probability of A is denoted by P(A)
• Probability law for E is a rule that:
• assigns P(A) to A in a way that the following conditions, taken from our daily experience, are
satisfied:
• A may or may not take place; it has some likelihood (which may be 0 if it never occurs)
• Something must occur in our experiment.
• If one event precludes another (they are mutually exclusive), then the likelihood that either occurs is the likelihood that one occurs plus the
likelihood that the other occurs.
PROBABILITY LAW
Axioms of probability
• Axiom 1: The probability of an event is a real number greater than or equal to 0.
• Axiom 2: The probability that at least one of all the possible outcomes of a process (such as
rolling a die) will occur is 1.
• Axiom 3: If two events A and B are mutually exclusive, then the probability of
either A or B occurring is the probability of A occurring plus the probability of B occurring.
PROBABILITY LAW
• Axiom 1
• For any event A, the probability of A is non-negative,

P(A) ≥ 0

"For any event A, the probability of A is greater than or equal to 0."


PROBABILITY LAW
• Axiom 2
• The probability of the certain event Ω is equal to one,
P(Ω) = 1.
• When Ω is the sample space of an experiment
• The set of all possible outcomes:

• “The probability of any of the outcomes happening is one hundred percent”


or
• “anytime this experiment is performed, something happens”.
PROBABILITY LAW
• Axiom 3
• For any two mutually exclusive events, A and B, the probability of the union is the sum of the
probabilities of the individual events,

P(A ∪ B ) = P(A) + P(B)

• Here ∪ stands for ‘union’.


• Example
• Three optical packets A, B, and C are contending for an output port in a switch node. We
assume that one and only one packet can get through the port. The sample space may be taken as
the 3-element set O = {A, B, C}, where each element corresponds to the outcome that the
corresponding packet gets the output port. Suppose that A and B have the same chance of winning, and that
C has only 1/2 the chance of A or B.
Thus, we assign the following elementary probabilities:
P(A) = P(B) = 2/5 and P(C) = 1/5.
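
A small Python sketch (not from the slides) checking that this assignment satisfies the three axioms:

    # Sketch: verify the elementary probabilities above against the axioms.
    from itertools import combinations

    prob = {"A": 2/5, "B": 2/5, "C": 1/5}          # elementary probabilities from above

    def P(event):
        return sum(prob[o] for o in event)

    assert all(P({o}) >= 0 for o in prob)                        # Axiom 1
    assert abs(P(set(prob)) - 1.0) < 1e-12                       # Axiom 2
    for x, y in combinations(prob, 2):                           # Axiom 3 (disjoint events)
        assert abs(P({x, y}) - (P({x}) + P({y}))) < 1e-12
    print("the assignment satisfies all three axioms")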
CONDITIONAL PROBABILITY
• Conditional probability is defined as:
“the likelihood of an event or outcome occurring, based on the occurrence of a previous event or outcome”

• The dependence of event A on event B is measured by the conditional probability P(A|B), given by:

P(A|B) = P(B|A) P(A) / P(B)

• This formula is also known as Bayes's formula.

• P(A|B) the probability of event A occurring, given event B has occurred


• P(B|A) the probability of event B occurring, given event A has occurred
• P(A) the probability of event A
• P(B) the probability of event B
EXAMPLE OF CONDITIONAL
PROBABILITY
• Due to an Internet configuration error, packets sent from New York to Los Angeles are
routed through El Paso, Texas, with probability 3/4.
• Given that a packet is routed through El Paso,
• Suppose it has conditional probability 1/3 of being dropped.
• Given that a packet is not routed through El Paso
• Suppose it has conditional probability 1/4 of being dropped.

Find the conditional probability that a packet is routed through El Paso given that it is
not dropped?
EXAMPLE OF CONDITIONAL
PROBABILITY
• Solution
• The notation:
• E = {routed through El Paso}
• D = {packet is dropped}
• With this notation, it is easy to interpret the problem as telling us that

• P(D|E) = 1/3
• P(D|Ec) = 1/4 and P(E) = 3/4
• By Bayes's formula,
P(E|Dc) = P(Dc|E) P(E) / P(Dc) = (2/3)(3/4) / (11/16) = 8/11
where P(Dc) = P(Dc|E) P(E) + P(Dc|Ec) P(Ec) = (2/3)(3/4) + (3/4)(1/4) = 11/16 (total probability).
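
A quick check of this result in Python (a sketch, not from the slides), using exact fractions:

    # Sketch: reproduce the 8/11 answer from the given data.
    from fractions import Fraction as F

    P_E = F(3, 4)                      # routed through El Paso
    P_D_given_E = F(1, 3)              # dropped, given routed through El Paso
    P_D_given_Ec = F(1, 4)             # dropped, given not routed through El Paso

    # total probability: P(Dc) = P(Dc|E)P(E) + P(Dc|Ec)P(Ec)
    P_Dc = (1 - P_D_given_E) * P_E + (1 - P_D_given_Ec) * (1 - P_E)
    print(P_Dc)                              # 11/16
    print((1 - P_D_given_E) * P_E / P_Dc)    # Bayes's formula: 8/11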
EXAMPLE OF CONDITIONAL
PROBABILITY
• Consider a measurement device that measures the packet header types of every packet that
crosses a link.
• Suppose that during the course of a day the device samples 1,000,000 packets and of these
450,000 packets are UDP packets, 500,000 packets are TCP packets, and the rest are
from other transport protocols.
• Given the large number of underlying observations, to a first approximation,
• We can consider that the probability that a randomly selected packet uses the UDP protocol to be
450,000/1,000,000 = 0.45.
EXAMPLE OF CONDITIONAL
PROBABILITY
• More precisely, we state:

P(packet is UDP) ≈ UDPCount(t) / TotalPacketCount(t)

• where UDPCount(t) is the number of UDP packets seen during a measurement interval of
duration t, and TotalPacketCount(t) is the total number of packets seen during the same
measurement interval.
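
A small Python sketch (not from the slides) computing these relative frequencies from the measured counts:

    # Sketch: relative frequencies from the measured counts in the slide.
    counts = {"UDP": 450_000, "TCP": 500_000, "other": 50_000}
    total = sum(counts.values())
    for proto, c in counts.items():
        print(proto, c / total)        # UDP 0.45, TCP 0.5, other 0.05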
EXAMPLE OF CONDITIONAL
PROBABILITY
• The probability that a cell in a wireless system is overloaded is 1/3. Given that it is
overloaded, the probability of a blocked call is 0.3. Given that it is not overloaded, the
probability of a blocked call is 0.1.
Find the conditional probability that the system is overloaded given that your call is
blocked?
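
A sketch in Python (not from the slides) of how Bayes's formula answers the overload question, using the numbers given above:

    # Sketch: P(overloaded | blocked) via Bayes's formula and total probability.
    P_O = 1 / 3                        # cell overloaded
    P_B_given_O = 0.3                  # call blocked, given overloaded
    P_B_given_Oc = 0.1                 # call blocked, given not overloaded

    P_B = P_B_given_O * P_O + P_B_given_Oc * (1 - P_O)            # total probability
    print("P(overloaded | blocked) =", P_B_given_O * P_O / P_B)   # 0.6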
• Consider a device that samples packets on a link.
a) Suppose that measurements show that 20% of packets are UDP, and that 10% of all packets
are UDP packets with a packet size of 100 bytes.
1. What is the conditional probability that a UDP packet has size 100 bytes?

b) Suppose 50% of packets were UDP, and 50% of UDP packets were 100 bytes long.
1. What fraction of all packets are 100 byte UDP packets?
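
A sketch (not from the slides) applying the definition P(A|B) = P(A ∩ B) / P(B) to the two parts of this exercise:

    # Part (a): 20% of all packets are UDP; 10% of all packets are 100-byte UDP packets.
    P_udp = 0.20
    P_udp_and_100 = 0.10
    print("P(100 bytes | UDP) =", P_udp_and_100 / P_udp)          # 0.5

    # Part (b): 50% of packets are UDP and 50% of UDP packets are 100 bytes long.
    P_udp = 0.50
    P_100_given_udp = 0.50
    print("P(UDP and 100 bytes) =", P_100_given_udp * P_udp)      # 0.25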
EXAMPLE OF CONDITIONAL
PROBABILITY
• The binary channel shown in Figure 1.17 operates as follows. Given that a 0 is transmitted,
the conditional probability that a 1 is received is ε. Given that a 1 is transmitted, the
conditional probability that a 0 is received is δ. Assume that the probability of transmitting a
0 is the same as the probability of transmitting a 1. Given that a 1 is received,
find the conditional probability that a 1 was transmitted?
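
A sketch in Python (not from the slides) of the Bayes computation for this channel; the numeric values of ε and δ below are assumptions for illustration, the symbolic answer is (1 − δ)/(1 − δ + ε).

    # eps = P(receive 1 | send 0), delta = P(receive 0 | send 1); values assumed.
    eps, delta = 0.05, 0.02            # illustrative values, not given in the slide
    P_send1 = 0.5                      # 0 and 1 equally likely to be transmitted

    P_recv1 = (1 - delta) * P_send1 + eps * (1 - P_send1)         # total probability
    print("P(sent 1 | received 1) =", (1 - delta) * P_send1 / P_recv1)
    # symbolically, the answer is (1 - delta) / (1 - delta + eps)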
STATISTICAL INDEPENDENCE
• An event does not depend on prior events
• Example:
• Our probability model should not have to account for the entire history of a LAN.
• Independence of an event with respect to another means that its likelihood does not depend
on that other event.
• A is independent of B if P(A | B) = P(A)
• B is independent of A if P(B | A) = P(B)

So the likelihood of A does not change by knowing about B, and vice versa! This also means:
P( A ∩ B ) = P( A ) P ( B )
EXAMPLE OF INDEPENDENCE
PROBABILITY
• An Internet packet travels from its source to router 1, from router 1 to router 2, and from
router 2 to its destination.
• If routers drop packets independently with probability p, what is the probability that a packet is
successfully transmitted from its source to its destination?
• Solution:
• A packet is successfully transmitted if and only if neither router drops it. To put this into the
language of events, for i = 1,2,
• Let Di denote the event that the packet is dropped by router i.
• Let S denote the event that the packet is successfully transmitted.
• Then S occurs if and only if the packet is not dropped by router 1 and not dropped by router 2, i.e. S = D1c ∩ D2c. By independence, P(S) = P(D1c) P(D2c) = (1 − p)(1 − p) = (1 − p)^2.
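
A quick numerical check of the (1 − p)^2 result in Python (a sketch, not from the slides), using an assumed p = 0.1:

    # Sketch: Monte Carlo check that P(success) = (1 - p)**2 when the two
    # routers drop packets independently with probability p.
    import random

    p, n = 0.1, 100_000                # illustrative drop probability and number of trials
    ok = sum(random.random() >= p and random.random() >= p for _ in range(n))
    print("simulated P(S):", ok / n, " closed form (1-p)**2:", (1 - p) ** 2)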
EXAMPLE OF INDEPENDENCE
PROBABILITY
• A certain binary communication system has a bit-error rate of 0.1; i.e., in transmitting a
single bit, the probability of receiving the bit in error is 0.1.
• To transmit messages, a three-bit repetition code is used.
• In other words, to send the message 1, 111 is transmitted, and to send the message 0, 000 is
transmitted.
• At the receiver, if two or more 1s are received, the decoder decides that message 1 was sent;
otherwise, i.e., if two or more zeros are received, it decides that message 0 was sent.
• Assuming bit errors occur independently,
• find the probability that the decoder puts out the wrong message.
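
One way to compute this (a Python sketch, not part of the original slides): the decoder errs exactly when two or three of the three independently transmitted bits are flipped, which with bit-error rate p = 0.1 gives 3p^2(1 − p) + p^3 = 0.028.

    # Sketch: probability that the 3-bit repetition decoder outputs the wrong
    # message (two or more of the three bits in error), with bit-error rate 0.1.
    from math import comb

    p = 0.1
    p_wrong = sum(comb(3, k) * p**k * (1 - p)**(3 - k) for k in (2, 3))
    print(p_wrong)                     # 0.028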
REFERENCES
❑ Chen, K. (2015). Performance Evaluation by Simulation and Analysis with Applications to
Computer Networks. John Wiley & Sons. Part 3.
❑ Sadiku, M. N., & Musa, S. M. (2013). Performance Analysis of Computer Networks (Vol. 1).
Cham, Switzerland: Springer. Chapter 2.
