
1. QUANTITATIVE TECHNIQUES:
Since managerial activities have become complex, proper decisions must be made in
order to avoid heavy losses in business activities, and resources must be used as
efficiently as possible, whether in a manufacturing unit or a service organization.
Quantitative techniques can therefore be defined as the programming and statistical
techniques that help decision makers solve problems efficiently. They have proven to
be the most powerful tools for analysis and decision making in industries and big
firms, since goals can be achieved using quantitative data alone. Quantitative
techniques are further classified into statistical and programming techniques. In the
former, statistical analysis is performed on the given data; in the latter, the
quantitative methods of operations research (OR) are used to make wise decisions.
Statistical techniques cover the collection, classification and tabulation of data;
probability and sampling; correlation and regression; index numbers; time series
analysis; analysis of variance; statistical interpretation and inference; and surveying
techniques and methods. Programming techniques, on the other hand, include decision
theory, queuing theory, linear programming, game theory, integrated production
models, inventory planning, replacement theory, simulation and network analysis.

DIFFERENTIATION:
Differentiation is the process of finding the function that gives the rate of change of
one variable with respect to the rate of change of another variable.
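
As a small illustration, here is a minimal sketch using the sympy library (an
assumption; any computer algebra system would do) to differentiate a hypothetical
example function:

import sympy as sp

x = sp.symbols('x')
f = 3*x**2 + 5*x + 7      # hypothetical example function
dfdx = sp.diff(f, x)      # rate of change of f with respect to x
print(dfdx)               # prints 6*x + 5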

CONSTANT FUNCTIONS, POLYNOMIAL FUNCTIONS & RELATION FUNCTIONS:
There are seven types of functions: identity, constant, polynomial, rational, modulus,
signum and greatest integer functions. Functions give the graphical representation of
constants and variables. The function that associates each real number to itself is
called the identity function and is usually denoted by 'I'.
A constant function is defined as follows: if 'c' is a fixed real number, then the
function f given by f(x) = c for all x in R is called a constant function.
Polynomial function formula is f(x) = a0 + a1x + a2x^2 + a3x^3 + ... + anx^n.
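
As an illustration, the sketch below (in Python, with hypothetical coefficients)
evaluates such a polynomial using Horner's rule:

def poly_eval(coeffs, x):
    """coeffs = [a0, a1, ..., an]; returns a0 + a1*x + ... + an*x**n."""
    result = 0
    for a in reversed(coeffs):   # Horner's rule: (...(an*x + a(n-1))*x + ...) + a0
        result = result * x + a
    return result

print(poly_eval([1, 2, 3], 2))   # 1 + 2*2 + 3*2**2 = 17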

APPLICATION OF LINEAR PROGRAMMING IN INDUSTRIES:


Linear programming is an optimizing method. It gives us a deeper knowledge of how
to use the limited resources and capabilities of a business to attain a particular
objective, provided the available resources have alternative uses. The first part is the
objective function, which describes the primary purpose of the formulation: to
maximize return or minimize the cost of production. The second part is the constraint
set, the equalities and inequalities that define the restrictions under which the
optimization has to be accomplished. Calculus can handle only equality constraints,
but linear programming is not limited in this way. Its applications can be seen widely
in various fields, and it is a good tool to measure many aspects of any business:
industrial and management problems, and miscellaneous areas including airline
routing, dietetics and nutrition, education and politics, and farm planning. It is also
commonly applied in administrative areas, while non-industrial applications cover
agriculture, urban development, environmental protection and facilities location.
Of course, a few limitations come alongside, as with any other method: many
real-world problems are so complex, in terms of the number of variables and
relationships contained in them, that they tax the capacity of even the largest
computers.
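
As a minimal sketch of the two parts described above, the following Python code
(assuming the scipy library is available; the profit and machine-hour figures are
hypothetical) maximizes a return subject to resource constraints:

from scipy.optimize import linprog

# Objective function: maximize 3*x1 + 5*x2 (profit); linprog minimizes, so negate.
c = [-3, -5]
# Constraint set: machine-hours used per unit must not exceed capacity.
A_ub = [[1, 2],
        [3, 1]]
b_ub = [40, 60]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)   # optimal production plan and maximum return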

2. METHODS OF FINDING COEFFICIENT OF CORRELATION:


The term correlation means a relation between variables. It defines the degree of
linear relationship between two or more variables. There are five types of correlation:
positive, negative, simple, partial and multiple. If there are two variables x and y and,
when one variable increases, the other one also increases, it is positive correlation. If,
for the given variables x and y, y decreases when x increases (and vice versa), it is
negative correlation.
When only the two variables x and y are studied, it is simple correlation. When, out of
many given variables, only x and y are considered and the rest are held constant, it is
partial correlation.
When the relationship among all the given variables is studied together, it is multiple
correlation. There are four methods under simple linear correlation: the scatter
diagram, Karl Pearson's correlation coefficient, Spearman's rank correlation
coefficient and the correlation coefficient by the concurrent deviation method.
Let's see Karl Pearson's method:

r = (Σxy/n − x̄ȳ) / √[(Σx²/n − x̄²)(Σy²/n − ȳ²)]

Take x values 1, 3, 5, 7, 9 and y values 2, 4, 6, 8, 10. From the given data it is clear
that the correlation is positive. Substituting the given values of the variables x and y,
we calculate the value of r as 1.
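
The same calculation can be checked with a short Python sketch of the formula above:

from math import sqrt
from statistics import mean

x = [1, 3, 5, 7, 9]
y = [2, 4, 6, 8, 10]
n = len(x)

xbar, ybar = mean(x), mean(y)
cov = sum(xi * yi for xi, yi in zip(x, y)) / n - xbar * ybar   # Σxy/n − x̄ȳ
sx = sqrt(sum(xi * xi for xi in x) / n - xbar ** 2)            # √(Σx²/n − x̄²)
sy = sqrt(sum(yi * yi for yi in y) / n - ybar ** 2)            # √(Σy²/n − ȳ²)
print(cov / (sx * sy))   # 1.0: perfect positive correlation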

TRAVELLING SALESMAN PROBLEM:


It is a widely known mathematical and operations research problem, used quite often
to test the analytical thinking of decision makers in huge organizations. The question
is: in what order should the salesman visit all the cities, passing through each exactly
once and then returning to the starting point? It is much like a courier service
delivering products to many houses in the same area: they follow a pattern to travel.
The key things to look at here are heuristic algorithms, such as the Christofides
algorithm, which guarantees a tour no more than 1.5 times the optimal length; these
can be used to resolve this problem.
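
For very small instances the problem can even be solved exactly by brute force, as in
this Python sketch over a hypothetical four-city distance matrix (heuristics such as
Christofides are needed as the number of cities grows):

from itertools import permutations

dist = [[0, 10, 15, 20],      # hypothetical symmetric city-to-city distances
        [10, 0, 35, 25],
        [15, 35, 0, 30],
        [20, 25, 30, 0]]

def tsp_brute_force(dist):
    n = len(dist)
    best_tour, best_len = None, float('inf')
    for perm in permutations(range(1, n)):   # fix city 0 as the start
        tour = (0,) + perm + (0,)            # visit each city once, return home
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len

print(tsp_brute_force(dist))   # ((0, 1, 3, 2, 0), 80)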

A-PRIORI PROBABILITY:
If the microstates have the same energy, the same volume and the same number of
particles, then they occur with equal frequency in the ensemble. The probability of
finding the phase point of a given system in a given region of phase space is identical
to that for any other region of phase space. A fundamental postulate of statistical
mechanics is that a macroscopic system in equilibrium is equally likely to be found in
any of the microscopic states satisfying the macroscopic conditions of the system;
this is called the postulate of equal a priori probability. Consider a system composed
of N identical particles in a fixed volume. If the particles are non-interacting, then the
total energy will be constant.

POISSON PROCESS:
The Poisson process is the continuous-time version of the Bernoulli process.
In the Bernoulli process, time is divided into slots, and during each one of the slots
we may either have an arrival or no arrival. In the Poisson process, time is continuous,
and we may get arrivals at any time. We define the Poisson process by introducing
assumptions that in some ways parallel the assumptions made for the Bernoulli
process, like independence: in the Poisson process, if we consider two time intervals
that are disjoint and look at the random variables that stand for the numbers of
arrivals in them, those random variables are independent.
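
A minimal Python sketch of this (the rate and time horizon are hypothetical) generates
arrivals by summing exponential inter-arrival times:

import random

def poisson_arrivals(lam, horizon):
    """Arrival times of a Poisson process with rate lam, up to time horizon."""
    t, arrivals = 0.0, []
    while True:
        t += random.expovariate(lam)   # exponential inter-arrival time
        if t > horizon:
            return arrivals
        arrivals.append(t)

times = poisson_arrivals(lam=2.0, horizon=10.0)
print(len(times))   # on average about lam * horizon = 20 arrivals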

3. HEURISTIC PROGRAMMING:
Every day you make decisions and judgments. Sometimes you are able to think about
them carefully but other times you make them on the fly using little information. This
is where heuristics come in. Heuristics are straightforward rules of thumb that we
develop based on our past experiences. They are cognitive tools that help us make
quick decisions or judgments. Life would be exhausting if we had to deliberate over
every one of the hundreds of choices we make every day. So instead we use our
heuristics as shortcuts to make judgments about the world around us.
For example, rather than spending time deciding what to wear every day, you might
have some default outfits. Or when faced with a lunch menu with too many options,
you may opt for what you've enjoyed in the past. Heuristics aren't about making the
perfect decision or judgment, just about making one quickly.
Heuristics play a role in our reasoning about the broader world too.
As an example, consider the rate of violence in the world over the past century.
Has the world been more or less violent in the past 20 years than it was before?
Heuristic reasoning might lead us to think that the world is more violent today than it
has been in the past. Every day we're confronted with images of tragedy in the news
and on social media. We might reasonably assume that the world is more violent
today than ever before, using what's called the "availability heuristic".

OPTIMUM PATH:
It comes under dynamic programming. A problem has optimal substructure if an
optimal solution can be constructed efficiently from optimal solutions of its sub-
problems; in other words, we can solve larger problems given the solutions to their
smaller sub-problems. The idea comes from Richard E. Bellman's book Dynamic
Programming (1957). Let's take an example of travelling from A to G, where the
paths cross B, C, D, E and F in the middle, and
f(A)=0, f(B)=2, f(C)=4, f(D)=6, f(E)=8, f(F)=10; find f(G). This problem has an
optimal substructure, which can be proved by induction; the Fibonacci series is a
classic illustration.
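
A Python sketch of the stagewise computation follows; the edge costs are
hypothetical, chosen so that the intermediate values match the f values given above:

graph = {                        # node -> list of (predecessor, edge cost)
    'B': [('A', 2)], 'C': [('A', 4)],
    'D': [('B', 4), ('C', 2)], 'E': [('B', 6), ('C', 4)],
    'F': [('D', 4), ('E', 2)],
    'G': [('F', 3), ('E', 5)],
}

f = {'A': 0}                     # f(node) = cost of the best path from A
for node in ['B', 'C', 'D', 'E', 'F', 'G']:   # process nodes in topological order
    f[node] = min(f[prev] + cost for prev, cost in graph[node])

print(f)   # f(B)=2, f(C)=4, f(D)=6, f(E)=8, f(F)=10, f(G)=13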

MONTE CARLO SIMULATION AND APPLICATIONS:


You decide to go for a walk but the place is crowded; you can't go in a straight line or
as planned, so you go for a random walk. The Monte Carlo method works the same
way, and the concept takes its name from games of chance in casinos and lotteries.
In schedule simulation, activity durations are drawn from the beta distribution, further
classified into the PERT function, the symmetric triangular function and the
rectangular function; discrete distributions with multiple peaks are also included. It
takes significant computing power to calculate a random choice of duration for each
activity, determined by the likelihood of that duration. The Monte Carlo method is
highly mathematical and scientific, but using computers for high-accuracy calculation
makes it a simple task.
Preliminary schedule estimates are made sensible by assigning distributions and
randomly sampling the tasks over many runs. The sequence of flow of the activities
and the series of functions can be programmed using Python or any other
programming language. The method gives a set of protocols and a programmed part
of the system for arriving at logical solutions, and it is closely concerned with risk
management.
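
A minimal Python sketch of this kind of schedule simulation (the three activities and
their optimistic/likely/pessimistic estimates are hypothetical) draws triangular
durations and summarizes the runs:

import random

activities = [(2, 4, 8), (3, 5, 9), (1, 2, 4)]   # (optimistic, likely, pessimistic)

def simulate_once():
    # total duration of the activities performed in sequence
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in activities)

runs = sorted(simulate_once() for _ in range(10_000))
print(sum(runs) / len(runs))        # mean project duration
print(runs[int(0.9 * len(runs))])   # 90th-percentile duration, for risk management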

4. QUEUING MODELS:
It is nothing but waiting-line theory. In a typical real-life scenario, you go to shops or
service centers and find a long queue, waiting a long time as a customer to buy a
product or service. Most queues follow the FIFO or FCFS principle: First In First
Out, or First Come First Served.
Whoever enters first buys the product or gets the service first and leaves the place first.
Customer is the one who gets the service, service is the need of the customer and
service facility is the system that provides the service.
Lq is the average length of the queue. Ls is the average length of the system. Wq is the
average waiting time in the queue.
Ws is the average waiting time in the system.
Arrival rate is the number of arrivals per unit time.
Service rate is the number of services per unit time.
The above two follow the Poisson distribution and are probabilistic in nature.
The inter-arrival time is the time between two arrivals.
The inter-service time is the time between two services.
The mean inter-arrival time is the reciprocal of the arrival rate, and likewise the mean
inter-service time is the reciprocal of the service rate.
These two follow the exponential distribution.
Traffic intensity, or the utilization factor, is the ratio of the arrival rate to the service
rate. Keep in mind that the arrival rate must be less than the service rate.
Factors affecting arrivals are the size of the calling population (finite or infinite), the
nature of arrivals (random or constant) and the system capacity (limited or unlimited).
Customer behaviour includes balking, reneging, jockeying and collusion.
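
For the standard single-server (M/M/1) model these measures reduce to well-known
formulas; a short Python sketch, with hypothetical arrival and service rates, is:

def mm1_measures(lam, mu):
    assert lam < mu, "arrival rate must be less than service rate"
    rho = lam / mu                      # traffic intensity / utilization factor
    Ls = lam / (mu - lam)               # average number in the system
    Lq = lam ** 2 / (mu * (mu - lam))   # average length of the queue
    Ws = 1 / (mu - lam)                 # average waiting time in the system
    Wq = lam / (mu * (mu - lam))        # average waiting time in the queue
    return rho, Ls, Lq, Ws, Wq

print(mm1_measures(lam=4, mu=6))        # e.g. 4 arrivals and 6 services per hour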

MINIMAX CRITERION:
Minimizing the maximum possible regret is minimax. Regret is the opportunity lost
by not choosing the best option; it is calculated for every option under every scenario.
The regret for an option is the return from the best option minus the return from the
option opted for, so the regret for the best option is zero. Let us see an example.
Under the minimax regret rule we prepare a regret table, find the maximum regret of
each option, and choose the option whose maximum regret is smallest.
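
A small Python sketch of the rule, on a hypothetical payoff table (rows are options,
columns are scenarios), is:

payoff = [[50, 20, 10],
          [30, 40, 20],
          [10, 30, 60]]

# Regret = best payoff in the scenario (column) minus the payoff received.
cols = range(len(payoff[0]))
best = [max(row[j] for row in payoff) for j in cols]
regret = [[best[j] - row[j] for j in cols] for row in payoff]

max_regret = [max(row) for row in regret]    # worst-case regret of each option
choice = min(range(len(payoff)), key=lambda i: max_regret[i])
print(regret, max_regret, choice)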

OPPORTUNITY LOSS:
We can make decisions using the expected opportunity loss (EOL) approach.
There is also a relationship between EOL, the expected monetary value (EMV) and
the expected value of perfect information (EVPI). We use payoff tables where the
payoffs are profits. To begin, we construct a regret, or opportunity loss, table: regret
is the difference between the best payoff in a particular state of nature and the actual
payoff received.
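
Continuing the same hypothetical payoff table, a brief Python sketch of the EOL rule
(with assumed state-of-nature probabilities) is:

payoff = [[50, 20, 10],
          [30, 40, 20],
          [10, 30, 60]]
prob = [0.3, 0.5, 0.2]       # assumed probabilities of the states of nature

cols = range(len(prob))
best = [max(row[j] for row in payoff) for j in cols]
regret = [[best[j] - row[j] for j in cols] for row in payoff]

# EOL of an option = sum over states of regret * state probability;
# choose the option with the smallest EOL (this minimum also equals the EVPI).
eol = [sum(r * p for r, p in zip(row, prob)) for row in regret]
print(eol, eol.index(min(eol)))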

DETERMINISTIC MODEL:
It relates the results of the performance to the factors that produce those results.
As an example, take a running race, where the goal is to reach the target in time, so
the factors that influence the goal are speed and distance:
Speed = distance/time and Time = distance/speed.
For a fixed time, the speed of the athlete is directly proportional to the distance
travelled.
Overall, four rules are to be remembered:
The top of the model is the goal that is placed to determine the performance.
In order to impact the performance, the variables at the bottom can be altered.
The model is subjective.
If the mechanical connections go wrong, the model goes wrong.

5. COBB-DOUGLAS PRODUCTION FUNCTION:


In economics, a production function relates what comes out of production to what has
gone into it. The Cobb-Douglas (CD) production function was proposed by Charles
Cobb and Paul Douglas in 1928. This production function can be used for the
structure of a whole economy. It calculates the ratio of input to output, the
relationship between production input and production output, in order to improve the
efficiency of production and to measure changes in production methods. The standard
formula is Y = A·L^α·K^β. Here,
Y is the total production, which is the real value of all the goods produced annually.
L is the labour input, the total number of hours worked in a year. K is the capital
input, the measure of buildings, tools and machines: the capital input value divided
by the price of capital. A is total factor productivity, and α and β are the output
elasticities of labour and capital respectively.
These values are constants.
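
A minimal Python sketch of the formula, with hypothetical values of A, α and β, is:

def cobb_douglas(L, K, A=1.0, alpha=0.7, beta=0.3):
    """Total production Y = A * L**alpha * K**beta."""
    return A * L ** alpha * K ** beta

print(cobb_douglas(L=100, K=50))   # output for hypothetical labour and capital inputs

With α + β = 1, as in these assumed values, the function exhibits constant returns to
scale: doubling both L and K doubles Y.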
