Attribution Non-Commercial (BY-NC)

Sampling distributions

A sampling distribution, or finite-sample distribution, is the probability distribution of a given statistic based on a random sample. Sampling distributions are important in statistics because they provide a major simplification on the route to statistical inference: they allow analytical considerations to be based on the sampling distribution of a statistic rather than on the joint probability distribution of all the individual sample values.

The sampling distribution of a statistic is the distribution of that statistic, considered as a random variable, when derived from a random sample of size n. It may be considered as the distribution of the statistic over all possible samples of that size from the same population. The sampling distribution depends on the underlying distribution of the population, the statistic being considered, the sampling procedure employed, and the sample size used. There is often considerable interest in whether the sampling distribution can be approximated by an asymptotic distribution, which corresponds to the limiting case as n → ∞.

For example, consider a normal population with mean μ and variance σ². Assume we repeatedly take samples of a given size from this population and calculate the arithmetic mean for each sample; this statistic is called the sample mean. Each sample has its own average value, and the distribution of these averages is called the "sampling distribution of the sample mean". This distribution is normal since the underlying population is normal, although sampling distributions may also often be close to normal even when the population distribution is not (see the central limit theorem). An alternative to the sample mean is the sample median. When calculated from the same population, it has a different sampling distribution from that of the mean and is generally not normal (though it may be close for large sample sizes).

The mean of a sample from a population having a normal distribution is an example of a simple statistic taken from one of the simplest statistical populations. For other statistics and other populations the formulas are more complicated, and often they do not exist in closed form. In such cases the sampling distribution may be approximated through Monte Carlo simulation, bootstrap methods, or asymptotic distribution theory.
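As a rough illustration of the discussion above, the sampling distributions of the sample mean and the sample median can be approximated by Monte Carlo simulation. The population parameters, sample size, and number of repetitions below are assumptions chosen only for the sketch:

```python
# Monte Carlo sketch: draw repeated samples from a normal population and
# record the sample mean and sample median of each (illustrative parameters).
import random
import statistics

random.seed(42)
mu, sigma, n, reps = 50.0, 10.0, 25, 5000

means, medians = [], []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(statistics.mean(sample))
    medians.append(statistics.median(sample))

# Both sampling distributions centre on the population mean, but the
# median's sampling distribution is wider for a normal population.
print(round(statistics.mean(means), 1))                     # close to 50.0
print(statistics.stdev(means) < statistics.stdev(medians))  # True
```

The comparison at the end reflects the point made above: for a normal population, the sample median has a different (and more spread-out) sampling distribution than the sample mean.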

Standard error

The standard deviation of the sampling distribution of a statistic is referred to as the standard error of that quantity. For the case where the statistic is the sample mean, and the samples are uncorrelated, the standard error is

    \sigma_{\bar{x}} = \frac{\sigma}{\sqrt{n}}

where σ is the standard deviation of the population distribution of that quantity and n is the size (number of items) of the sample. An important implication of this formula is that the sample size must be quadrupled (multiplied by 4) to halve the standard error. When designing statistical studies where cost is a factor, this may have a role in understanding cost-benefit trade-offs.
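The quadrupling rule follows directly from the square root in the formula and can be checked numerically; the population standard deviation below is an assumed value for illustration:

```python
# Numerical check of the standard-error formula se = sigma / sqrt(n):
# quadrupling the sample size halves the standard error.
import math

sigma = 12.0          # assumed population standard deviation
for n in (25, 100):   # 100 = 4 * 25
    se = sigma / math.sqrt(n)
    print(n, se)      # 25 -> 2.4, then 100 -> 1.2 (half of 2.4)
```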

Statistical inference

In the theory of statistical inference, the idea of a sufficient statistic provides the basis of choosing a statistic (as a function of the sample data points) in such a way that no information is lost by replacing the full probabilistic description of the sample with the sampling distribution of the selected statistic. In frequentist inference, for example in the development of a statistical hypothesis test or a confidence interval, the availability of the sampling distribution of a statistic (or an approximation to this in the form of an asymptotic distribution) can allow the ready formulation of such procedures, whereas the development of procedures starting from the joint distribution of the sample would be less straightforward. In Bayesian inference, when the sampling distribution of a statistic is available, one can consider replacing the final outcome of such procedures, specifically the conditional distributions of any unknown quantities given the sample data, by the conditional distributions of any unknown quantities given selected sample statistics. Such a procedure would involve the sampling distribution of the statistics. The results would be identical provided the statistics chosen are jointly sufficient statistics.

A sample is a subset of a population. Typically, the population is very large, making a census or a complete enumeration of all the values in the population impractical or impossible. The sample represents a subset of manageable size. Samples are collected and statistics are calculated from the samples so that one can make inferences or extrapolations from the sample to the population. This process of collecting information from a sample is referred to as sampling.

Sampling Distribution of the Mean

If I wanted to form a sampling distribution of the mean, I would:

1. Sample repeatedly from the population.
2. Calculate the statistic of interest (the mean) for each sample.
3. Form a distribution based on the set of means obtained from the samples.

The set of means obtained forms a new distribution: a sampling distribution, in this case the sampling distribution of the mean. Take a look at the following demonstration for a visual representation.

In this example a small population of four values is represented. Every possible combination of values from the population is sampled to form a true sampling distribution of the mean. Note, however, that the sampling distribution of any real population would be much larger, which makes such distributions theoretical in nature rather than practically demonstrable.
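A tiny population like the one in the demonstration can be enumerated exhaustively. The four values below are hypothetical stand-ins, sampling two at a time with replacement:

```python
# Exhaustive "true" sampling distribution of the mean for a four-value
# population (hypothetical values), using samples of size 2 with replacement.
from itertools import product
from statistics import mean

population = [2, 4, 6, 8]
sample_means = [mean(s) for s in product(population, repeat=2)]

# The mean of the sampling distribution equals the population mean.
print(mean(population))      # 5
print(mean(sample_means))    # 5
```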

This demonstration illustrates Rule 1 of the Central Limit Theorem: the mean of the population and the mean of the sampling distribution of means always have the same value. This rule is important to hypothesis testing because, although any one sample will not be exactly like the population, the sample mean equals the population mean on average: repeated experiments will yield sample means that cluster around the population mean. Rule 2 of the Central Limit Theorem states that the sampling distribution of the mean will be approximately normal regardless of the shape of the population distribution, provided the sample size is reasonably large. Whether the population distribution is normal, positively or negatively skewed, unimodal or bimodal in shape, the sampling distribution of the mean will take on a normal shape. In the following example we start out with a uniform distribution. The sampling distribution of the mean will still contain variability in the mean values obtained from sample to sample, but it will have a normal shape even though the population distribution does not. Notice that because we are taking samples of values from all parts of the population, the means of the samples will be close to the center of the population distribution.
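Rule 2 can be sketched with a small simulation, under assumed sample sizes, starting from a uniform population on (0, 1):

```python
# Sampling distribution of the mean from a uniform (non-normal) population:
# it centres on the population centre and, consistent with a normal shape,
# roughly 95% of the sample means fall within two standard errors.
import random
import statistics

random.seed(0)
n, reps = 30, 4000
means = [statistics.mean(random.random() for _ in range(n)) for _ in range(reps)]

centre = statistics.mean(means)
spread = statistics.stdev(means)
inside = sum(abs(m - centre) <= 2 * spread for m in means) / reps
print(round(centre, 2))   # close to 0.5, the centre of the population
print(inside > 0.9)       # True: roughly 95% within two standard errors
```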

t Distribution

Probability Density Function

The formula for the probability density function of the t distribution is

    f(x) = \frac{\left(1 + \frac{x^2}{\nu}\right)^{-(\nu+1)/2}}{\sqrt{\nu}\, B\!\left(\frac{1}{2}, \frac{\nu}{2}\right)}

where B is the beta function and ν is a positive integer shape parameter (the degrees of freedom). The formula for the beta function is

    B(\alpha, \beta) = \int_0^1 t^{\alpha-1}(1-t)^{\beta-1}\, dt = \frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}
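The beta function can be evaluated through the gamma function via the identity B(α, β) = Γ(α)Γ(β)/Γ(α + β); a quick numerical check:

```python
# Beta function via the gamma function: B(a, b) = G(a) * G(b) / G(a + b).
import math

def beta(a, b):
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

print(beta(1.0, 1.0))            # 1.0, since B(1, 1) = 1
print(round(beta(2.0, 3.0), 6))  # B(2, 3) = 1/12, i.e. 0.083333
```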

In a testing context, the t distribution is treated as a "standardized distribution" (i.e., no location or scale parameters). However, in a distributional modeling context (as with other probability distributions), the t distribution itself can be transformed with a location parameter, μ, and a scale parameter, σ.

The following is the plot of the t probability density function for four different values of the shape parameter ν.

These plots all have a similar shape; the difference is in the heaviness of the tails. In fact, the t distribution with ν equal to 1 is a Cauchy distribution. The t distribution approaches a normal distribution as ν becomes large, and the approximation is quite good for ν > 30.

Cumulative Distribution Function

The formula for the cumulative distribution function of the t distribution is complicated and is not included here. It is given in the Evans, Hastings, and Peacock book. The following are the plots of the t cumulative distribution function with the same values of ν as the pdf plots above.
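Combining the pdf formula above with the gamma-function form of the beta function, one can verify numerically that the t density approaches the standard normal density as ν grows:

```python
# The t density at a fixed point approaches the standard normal density
# as the shape parameter nu (degrees of freedom) increases.
import math

def t_pdf(x, nu):
    coeff = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return coeff * (1 + x * x / nu) ** (-(nu + 1) / 2)

def normal_pdf(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

for nu in (1, 5, 30, 100):
    print(nu, round(abs(t_pdf(1.0, nu) - normal_pdf(1.0)), 4))
# the gap shrinks as nu increases; it is already small by nu = 30
```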

Percent Point Function

The formula for the percent point function of the t distribution does not exist in a simple closed form; it is computed numerically. The following are the plots of the t percent point function with the same values of ν as the pdf plots above.

Other Probability Functions

Since the t distribution is typically used to develop hypothesis tests and confidence intervals and rarely for modeling applications, we omit the formulas and plots for the hazard, cumulative hazard, survival, and inverse survival probability functions.

Common Statistics

Mean: 0 (undefined for ν equal to 1)
Median: 0
Mode: 0
Range: infinity in both directions
Standard Deviation: \sqrt{\nu/(\nu - 2)} (undefined for ν equal to 1 or 2)
Coefficient of Variation: undefined
Skewness: 0 (undefined for ν less than or equal to 3; however, the t distribution is symmetric in all cases)
Kurtosis: undefined for ν less than or equal to 4

Parameter Estimation

Since the t distribution is typically used to develop hypothesis tests and confidence intervals and rarely for modeling applications, we omit any discussion of parameter estimation.

Comments

The t distribution is used in many cases for the critical regions of hypothesis tests and in determining confidence intervals. The most common example is testing whether data are consistent with an assumed process mean.

Software

Most general-purpose statistical software programs, including Dataplot, support at least some of the probability functions for the t distribution.
