
March 28, 2013

1

Method of moments: Define the dummy variables dj = 1{x = xj} and the moment functions

mj = (1/N) Σn dnj − p̂j

where the fourth parameter follows from the adding-up condition p̂4 = 1 − Σ_{j≠4} p̂j. The MoM estimator equates the theoretical first moments to the empirical moments, which are just the empirical frequencies, so p̂j = (1/N) Σn 1{xn = xj}.

b.

The MoM asymptotic variance matrix is

Cov(p̂) = (1/N) (G′G)⁻¹ G′ Σ G (G′G)⁻¹

The gradient of the moments at the estimates, G, is just the identity matrix. The covariance matrix of the moments, Σ, has a particularly simple analytic form:

N σjj = (1 − pj) pj
N σij = −pi pj

so the MoM covariance estimator collapses to the multinomial variance matrix

Cov(p̂) = (1/N) Σ.
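Since the gradient matrix is the identity, the sandwich form collapses to Σ itself; a quick NumPy check of this step (Python used here for illustration, the exam's own scripts are MATLAB):

```python
import numpy as np

# The three estimated frequencies from part a (p4 omitted by adding-up).
p = np.array([1/6, 1/5, 2/15])

# Moment covariance: N*sigma_jj = p_j(1-p_j), N*sigma_ij = -p_i p_j.
Sigma = -np.outer(p, p)
np.fill_diagonal(Sigma, p * (1 - p))

# With G = I (gradient of the moments at the estimates), the sandwich
# (G'G)^{-1} G' Sigma G (G'G)^{-1} reduces to Sigma.
G = np.eye(3)
GtG_inv = np.linalg.inv(G.T @ G)
sandwich = GtG_inv @ G.T @ Sigma @ G @ GtG_inv
print(np.allclose(sandwich, Sigma))  # True
```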

c.

The estimates and standard errors are in Table (1). The Wald test gives a test statistic of 0.28; with one degree of freedom, the p-value is about 60%.

Table 1:

Parameter   Coefficient   Std. err.
p̂1          0.1667        0.0481
p̂2          0.2000        0.0516
p̂3          0.1333        0.0439
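The Table 1 standard errors and the Wald p-value follow directly from the variance formulas above; a short Python check (an illustration only, the solutions' own code is MATLAB):

```python
import numpy as np
from scipy.stats import chi2

N = 60
p = np.array([1/6, 1/5, 2/15])      # the estimates in Table 1
Sigma = -np.outer(p, p)
np.fill_diagonal(Sigma, p * (1 - p))
Cov = Sigma / N

se = np.sqrt(np.diag(Cov))          # standard-error column of Table 1
print(np.round(se, 4))              # [0.0481 0.0516 0.0439]
print(round(chi2.sf(0.28, df=1), 2))  # p-value of the Wald statistic: 0.6
```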


2

a.

Let y be the true weight and y∗ the measured weight:

y∗ < 100 if y ∈ [80, 100]
y∗ = y   if y ∈ [100, 240]
y∗ > 240 if y ∈ [240, 300]

Given E[y∗ | x = 0] = 180, the lower and upper bounds on E[y | y ∈ [100, 240], x = 0], call these Econd_l and Econd_u, follow from pushing the measurement errors to their extremes on either side. Multiplying the censoring extremes through by their probabilities, the probability-weighted contributions of the censored regions are

(Me_l, Me_u) = (80/20 + 240/10, 100/20 + 300/10) = (28, 35).

Solving

35 + (17/20) Econd_l = 180
28 + (17/20) Econd_u = 180

gives (Econd_l, Econd_u) = (170.59, 178.82). The true expectation for y ∈ [80, 300] must then lie in the range spanned by the maximum measurement errors:

E[y | x = 0] ∈ [(17/20) Econd_l + Me_l, (17/20) Econd_u + Me_u]
= [173, 187]
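The interval arithmetic can be verified in a few lines of Python (a check of the bounds above, not part of the original solutions):

```python
# Probabilities of the three regions: [80,100] -> 1/20, [100,240] -> 17/20,
# [240,300] -> 1/10, as implied by the probability-weighted terms above.
p_low, p_mid, p_high = 1/20, 17/20, 1/10

Me_l = 80 * p_low + 240 * p_high    # censored mass at its lowest true values
Me_u = 100 * p_low + 300 * p_high   # censored mass at its highest true values

# Solve 35 + (17/20) Econd_l = 180 and 28 + (17/20) Econd_u = 180.
Econd_l = (180 - Me_u) / p_mid
Econd_u = (180 - Me_l) / p_mid
print(round(Econd_l, 2), round(Econd_u, 2))   # 170.59 178.82

# Bounds on the expectation over the full support.
print(round(p_mid * Econd_l + Me_l, 2),
      round(p_mid * Econd_u + Me_u, 2))       # 173.0 187.0
```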

b.

The sample median lies in the correctly measured range, so the estimated median is unaffected by the measurement error: M(y | x = 0) = 160.

c.

The statement raises at least an issue of significance and an issue of causality. It refers to the difference in relative frequencies of vegetarian and non-vegetarian students in the upper tail of the distribution, P(y∗ > 240 | omnivore) − P(y∗ > 240 | veggie) = 1/10.

Firstly, significance: since there are measurement errors, in the extreme all the heavy omnivores' true weights could be only marginally above 240 pounds, in which case the difference would be neither economically nor statistically significant. It could also be that all of them weighed 300 pounds, which would certainly be economically significant, and possibly statistically significant, but we cannot tell from the given information.

Secondly, causality: is it the vegetarianism of the vegetarian students that holds their weight down, or are there other habits correlated with vegetarianism, say an increased propensity to exercise, eating disorders, or whatever else may bring weight down? As far as we know, the measurements don't tell.

3

a. & b.

The latent variable y∗ and the observed variable y are given by the process

y∗_it = αi + xit β + εit
yit = 1 if y∗_it > 0, and 0 otherwise

where εit is iid standard normal and xit is a scalar, for individuals i = 1, ..., 25 and periods t = 1, ..., T in b = 1, ..., 1000 simulations, with xbit ∼ iid U[0, 1] and υi an iid standard normal draw. The fixed effects were generated as

αbi = ln(1 + T/30) x̄bi + (1/5) max_t xbit + υi

Note that the random component υi of the fixed effect is kept constant within individual and across simulations. Some of you specified a process where each individual got a new draw υ for every period, and most of you made new draws υ for each simulation b. The effect is then no longer fixed, and the consequence is less individual persistence. Redrawing the "fixed" effect υi for every t = 1, ..., T ameliorates the estimation problem by averaging out the fixed effects across individuals and increasing the variation in y, because the probability of the occasional likelihood problems at the tails of the fixed-effect distribution decreases. Those who did so got results that looked more like consistency for T = 200 than I report below.

Since x̄i and max_t xit enter αi, there is a cross-sectional correlation between αi and xi that motivates a fixed-effects estimator. The fixed effects α and the shocks are unobserved, and the fixed effects are not iid, so neither is {yit} conditional on x. The fixed-effects estimation problem can however be expressed in terms of a concentrated likelihood function that transforms the problem to an iid structure and that also corresponds with the requested optimisation algorithm. The ML criterion function is

llT(β, α) = (1/IT) Σi Σt ln f(yit | β, αi)

where f(yit | β, α) = Φ(αi + xit β)^yit (1 − Φ(αi + xit β))^(1−yit) is the common likelihood function for all i, t. The ML estimator can equivalently (in the sense of leading to an estimator with the same asymptotic distribution) be represented by

α̂i(β) ≡ arg max_α (1/T) Σt ln f(yit | β, α)    (1)

β̂ ≡ arg max_β (1/IT) Σi Σt ln f(yit | β, α̂i(β))

corresponding to the requested algorithm. Holding I fixed, the log-likelihood function can then be written

ll(β) ≡ (1/I) Σi E[ln f(yi | β, α̂i(β))]    (2)

and the criterion function we use for estimation is

llT(β) = (1/IT) Σi Σt ln f(yit | β, α̂i(β))    (3)

A necessary condition for consistency is that we can interchange the limit of the maximum (lhs below) with the maximum of the limit (rhs below),

plim_{T→∞} (1/T) Σt ln f(yit | β, α̂i(β)) = E[ln f(yi | β, α̂i(β))]

for all i and all α, β ∈ Θ, where Θ is a compact parameter set. The iid error structure is useful for checking this: uniform convergence in probability of llT(β) to ll(β) is sufficient, and given the iid structure conditional on the fixed effects, and assuming boundedness of the log-likelihood function, we get uniform convergence for the probit model. We can then apply Theorem 4.1.1 of Amemiya (1985), and the fixed-effects estimator is consistent as T → ∞.

Some of you claimed that there is no consistent estimator of probit fixed effects (or of fixed-effects discrete choice models in general), referring to the incidental parameters problem. The incidental parameters problem refers to asymptotics where T stays fixed and I → ∞ in (3). Note that α̂i is estimated using only the data for individual i conditional on β. As with most non-linear estimators, estimation errors lead to finite-sample bias: α̂i is estimated with some error, and the error of any α̂i contaminates the estimate of β in the outer loop. Since T stays fixed, the estimation errors persist even if I → ∞; this is the incidental parameters problem. That is not the case when T → ∞ in (3): the estimation error in α̂(β) goes to zero, so the fixed-effects estimator is consistent as T → ∞, yet biased in small samples.

There are three important effects as T → ∞:

• Holding I fixed, the fixed effect αi increases in T < 1000. The fixed effects go from

T = 2: αi = ln(1 + 2/30) x̄i + (1/5) max_t xit + υi

to, in probability,

T → ∞: αi → (1/2) ln(1 + 1000/30) + 1/5 + υi ≈ 1.96 + υi

i.e. to a common constant plus an individual-specific component υi, as the common constant stops increasing at T = 1000. (The first component x̄i is 1/2 in expectation by the LLN; E[max_t xit] → 1, and for T = 2 the expectation of the maximum of two standard uniforms is 2/3.) The expected value of αi thus goes from E[αi | T = 2] = (1/2) ln(1 + 1/15) + (1/5)(2/3) ≈ 0.16 to about 1.96 for T > 1000.

• The correlation between the regressor and the fixed effect goes away. The heterogeneity approaches the form αi + uit = α + υi + εit, which is no longer correlated with xit: a familiar random-effects error structure uit = υi + εit. The endogeneity issue is therefore decreasing in T, and in the limit both random effects and fixed effects are consistent estimators. There is however still correlation in the fixed effects within individuals, and there are reasons why you may want to avoid random effects. Under random effects, the log-likelihood turns into

ll(β) = (1/I) Σi ln ∫ f(yi | β, α) dG(α)

for a known distribution function G(α), which might be computationally burdensome: either Maximum Simulated Likelihood or a Bayesian MCMC approach is generally required due to the inner integral.

• The variation in y changes with T < 1000 as the constant part of the fixed effect grows, changing the base rate E[Pr(yi = 1)]. As T increases, Pr(y = 0) becomes a rarer event for most individuals; on the other hand, individuals with a low, negative individual-specific effect υi increase the variation in y as the constant grows. Less variation in y may in turn frustrate the estimation of αi: if yit = 1 for all t, the ML estimator is not defined. Reduced variation in y for a given T makes the parameters more imprecisely estimated, and the change in the variation of the outcome may decrease the estimator precision by more than the precision gained from the extra observations.

To sum up a somewhat long discussion: though there is correlation between the regressor and the fixed effect for finite T, that correlation vanishes as T → ∞, and either random or fixed effects are then consistent.
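The concentrated-likelihood algorithm in (1)–(3) can be sketched compactly in Python/SciPy on a single simulated panel (I = 25, T = 200, β = 1, with the DGP above); this is an illustration only, not the exam's MATLAB routine, and the seed and tolerances are arbitrary:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

# One simulation draw of the panel; beta_true = 1 as in the exercise.
rng = np.random.default_rng(0)
I, T, beta_true = 25, 200, 1.0
x = rng.uniform(size=(I, T))
alpha = (np.log(1 + T / 30) * x.mean(axis=1) + 0.2 * x.max(axis=1)
         + rng.standard_normal(I))
y = (alpha[:, None] + beta_true * x + rng.standard_normal((I, T)) > 0)
y = y.astype(float)

def alpha_hat(beta):
    """Inner loop, eq. (1): concentrate out each individual's fixed effect."""
    a = np.empty(I)
    for i in range(I):
        def nll(ai):
            v = ai + beta * x[i]
            return -np.sum(y[i] * norm.logcdf(v)
                           + (1 - y[i]) * norm.logcdf(-v))
        a[i] = minimize_scalar(nll, bounds=(-10, 10), method="bounded").x
    return a

def ll_T(beta):
    """Outer criterion, eq. (3), evaluated at the concentrated alphas."""
    v = alpha_hat(beta)[:, None] + beta * x
    return -np.mean(y * norm.logcdf(v) + (1 - y) * norm.logcdf(-v))

res = minimize_scalar(ll_T, bounds=(0.0, 3.0), method="bounded",
                      options={"xatol": 1e-3})
print(res.x)  # close to 1 for T = 200
```

The bounded inner search mirrors the likelihood problem discussed above: individuals with yit = 1 for all t push their α̂i to the boundary.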

The distribution of β̂ for T = 2 and T = 30 is given in Figure (1). For T = 2, the distribution of β̂ has a large mass to the right of the true value 1, with a mean of 1.32, reflecting the fact that the fixed-effects estimates are biased and inconsistent in small-T samples. Increasing the time dimension to T = 30, the distribution of β̂ is seen to improve: the mean is 1.03, displaying convergence to a symmetric distribution centered around the true value. (Many of you reported trouble with both convergence and runtime, some reporting run times of more than a day for a set of simulations and even claiming that it was practically impossible to do the required simulations within the given exam time. When many iterations are required, streamlining the code even somewhat, eradicating the worst for-loops and following other best practices, pays off. The script for this part runs for about 10 seconds on my standard laptop.)

[Figure 1: Distribution of β for T = 2 and T = 30]

In Figure (2), I have plotted the true fixed effects against the mean estimated effect over all 1000 simulations. (The script for this part runs for about 30 seconds.) The true fixed effects have more mass on higher values for T = 30: though increasing T entails convergence to the true values, the true values themselves change. For T = 2, the estimated fixed effects are noticeably biased away from zero. Though the fixed effects are noticeably less biased for T = 30, the estimates α̂ still trail off as the variation in y decreases when αi gets far from 0, particularly above αi > 1. The distribution of β̂ is still fairly well centered, as the bias in α is driven by a few very large outliers with hardly any variation in y, where the optimiser is rewarded for moving α out of bounds to achieve a small increase in the fitted frequency. A substantial bias in α thus remains even for relatively large T, and this is a potential source of concern for fixed T, as the substantial bias of some of the α estimates contaminates the β estimates. The results show that for some individuals there are indeed computational problems due to little variation in y for large αi, even for large T; i.e., there may not be convergence as T grows while T < 1000.

[Figure 2: Estimated vs true fixed effects α for T = 2 and T = 30]

c.

For T = 200, the mean and the median of β̂ are 1 to more than eight decimal places. I also estimated an FE specification, where both the mean and the median were larger than 1 by some hundredths. (The script runs for about 4 minutes.) The resulting estimates yield the distributions of β̂ and α in Figures (3) and (4). Both distributions are centered around the true value. Predictably, the FE estimates have a tighter distribution than the I-by-I estimates, due to the increased flexibility of I-by-I. Estimation individual-by-individual is consistent and avoids cross-individual contamination from the finite-T bias that, as we saw, might persist even for T = 200 for the individuals who have their base rate sufficiently displaced by the increasing fixed effect. In fact, there is hardly any difference between the I-by-I specification and the FE specification, so there is little cross-sectional contamination in FE: by estimating the specification individual by individual, we avoided cross-sectional contamination at the cost of the efficiency associated with imposing a true restriction.

For most individuals, more observations reduce the bias: the fixed-effect estimates trail off later this time, at about α > 2, compared to about α > 1 for T = 30 and from about abs(α) = 0 for T = 2, an improvement on the bias for T = 30. For some outlying individuals, however, the extra observations do not counteract the reduction in variation in the outcome, as the rare-event effect dominates and causes estimation problems: some individuals get outcome realisations of a series of y with hardly any variance at all. The bias at large α is again driven by the mean being pushed out by some realisations of y with barely any variation; this pushes the estimates far out in the parameter space for some b, and we get substantial bias. The rarity of these outliers means they did not move the distribution of β̂ far from its true value, and the median, for instance, is hardly affected; but the problem shows up in the long tail on the positive side of the β̂ distribution in Figure (3) (capped at 3), where the estimation routine spins out. Eventually, since the common constant stops increasing at T = 1000, the impact of extra information will override the rare-event effect: all individuals yield variation in y, and the estimator is consistent as T → ∞ also for αi > 2.

Finally, note that there is a curious dip in the discrepancy between the true and the estimated α at the far right-hand side, seemingly repeated for T = 2 and T = 30 in Figure (2). Whether the apparent pattern is a fluke or reflects some real statistical effect, if any at all, I don't know, but running the simulations a few times from different seeds, it appears not to be a strong pattern.

[Figure 3: Distribution of β for T = 200, β < 3, individual vs FE]

d.

The discussion above suggests we might want a consistent estimator that is robust to small-sample bias. As T goes to infinity, you can consistently estimate β by either

1. individual-by-individual estimation,
2. fixed effects, or
3. random effects,

listed in decreasing order of efficiency and increasing level of robustness to cross-sectional bias contamination and distributional assumptions.

4

a. & b.

MSM can be used to estimate the distributions of offer and acceptance points. Both distributions are truncated normal on [0, 100]. There are then four parameters to estimate: µo, σo² for the offers and µa, σa² for the acceptances. I use two theoretical restrictions: since randomized matching both is true and common knowledge, we can restrict the covariance between the offer and acceptance points to zero, and I restrict the domain of µ, σ² over which I search to avoid convergence issues. I use five moments:

• The relative frequency of accepted offers.
• The mean of accepted offers.
• The standard error of accepted offers.
• The mean of rejected offers.
• The standard error of rejected offers.

For the simulation, I make a draw from a standard normal that I keep fixed between simulations and scale with the parameters µ, σ². For the candidate set of parameters, I equate the simulated moments to the data moments, where the empirical moments are given from simulating the data generating process with the given draws. I use an identity weighting matrix and calculate the standard errors by bootstrap (total computation time, including 1000 bootstraps, was about 23 minutes). The results are given in Table (2).

Table 2:

Parameter   Coeff.   Std. Err.
µ1          22.22    4.17
µ2          17.04    0.96
σ1²         10.53    0.55
σ2²         40.06    0.00

[Figure 4: True vs estimated α for T = 200]
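The moment-matching logic above can be sketched in a few lines of Python/NumPy; this is an illustration only, not the exam's MATLAB implementation, and the "true" parameters, sample size, and seed here are made up:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(101)
N = 200
# Hypothetical "true" means and variances of offer and acceptance points.
theta_true = np.array([22.0, 17.0, 10.0, 40.0])

def simulate(theta, z):
    # Scale fixed standard-normal draws, truncate to [0, 100], and match.
    mu_o, mu_a, v_o, v_a = theta
    offers = np.clip(mu_o + np.sqrt(v_o) * z[:, 0], 0, 100)
    cutoffs = np.clip(mu_a + np.sqrt(v_a) * z[:, 1], 0, 100)
    return offers, offers >= cutoffs

def moments(offers, accept):
    # The five moments: acceptance rate, mean/std of accepted and rejected.
    acc, rej = offers[accept], offers[~accept]
    return np.array([accept.mean(), acc.mean(), acc.std(),
                     rej.mean(), rej.std()])

z_data = rng.standard_normal((N, 2))
data_m = moments(*simulate(theta_true, z_data))

z_sim = rng.standard_normal((N, 2))          # draws held fixed across calls

def crit(theta):
    if min(theta) <= 0:                      # penalty instead of constraints
        return 1e8 + min(theta) ** 2
    offers, accept = simulate(theta, z_sim)
    if accept.all() or not accept.any():     # guard against empty moments
        return 1e8
    d = moments(offers, accept) - data_m     # identity weighting matrix
    return d @ d

res = minimize(crit, x0=np.array([18.0, 18.0, 20.0, 20.0]),
               method="Nelder-Mead", options={"maxfev": 20000})
```

The penalty in place of hard constraints mirrors the solutions' trick of avoiding a constrained minimization routine inside `fminsearch`.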

M604 Finals doit (published 31/03/2013 14:57)

Contents:
- Final M604
- 1. MOM
- 3. Probit fixed effects
- Plot density of estimates
- Plot betas
- Estimating individual per individual for T = 200
- Density beta
- Plot true vs estimated fixed effects
- M604 Q4 ultimatum
- Standard errors

%% Final M604
% Set up for local parallelization
matlabpool close force
matlabpool local 4

%% 1. MOM
% Set parameters
N = 60;
% Restriction value for the Wald test (R*p = r).
r = 0.4;
% Hard code the estimates.
p = [1/6 1/5 2/15];
% Dummies
m1 = [ones(10,1); zeros(50,1)];
m2 = [zeros(10,1); ones(12,1); zeros(38,1)];
m3 = [zeros(22,1); ones(8,1); zeros(30,1)];
% Moments
M = [m1 m2 m3] - repmat(p, N, 1);
% Moment covariance
Sigma = -p'*p;
Sigma(eye(3)==1) = p.*(1-p);
Cov = 1/N*Sigma
% Estimates and standard errors
disp('Estimates and standard errors')
[p' sqrt(diag(Cov))]
% Wald test
R = [1 1 0];
wald = (R*p' - r)*inv(R*Cov*R')*(R*p' - r)

disp('Wald test p-value')
1-chi2cdf(wald, 1)

%% 3. Probit fixed effects
clc
clear
rseed = 10;
% Set parameters
G = 25;
T = 200;
B = 1000;
[y, x, alphatrue] = m604finalprobitdata(rseed, G, T, B);
% Fixed effects variables
X = kron(eye(G), ones(T,1));
X = repmat(X, [1, 1, B]);
X = [X x];
% Estimation
options = optimset('Display', 'notify', 'GradObj', 'On', ...
    'LargeScale', 'Off', 'MaxFunEvals', 10000);
% Preallocate and set parameters
convergencetracker = zeros(B,1);
Theta = zeros(G+1, B);

tic
parfor b = 1:B
    % Reset start values and counter
    alphanew = alphatrue(:,:,b);
    betanew = 1;
    thetaold = zeros(G+1,1);
    thetanew = ones(G+1,1);
    k = 0;
    while max(abs(thetanew-thetaold)) > 10^(-6)
        % Update
        alphaold = alphanew;
        betaold = betanew;
        thetaold = [alphaold; betaold];
        % Set off iterated optimization run
        [alphanew, ~, exitalpha] = fminunc(@(alpha) ...
            m604finalprobit(alpha, betaold, X(:,:,b), y(:,:,b), 'alpha'), ...
            alphaold, options);
        [betanew, ~, exitbeta] = fminunc(@(beta) ...
            m604finalprobit(alphanew, beta, X(:,:,b), y(:,:,b), 'beta'), ...
            betaold, options);

        thetanew = [alphanew; betanew];
        % Break loop if it iterates for too long
        k = k+1;
        if k > 100
            break
        end
        % Keep track of undesirable exits
        convergencecheck = ismember([exitalpha, exitbeta], [0, -1, -3]);
        convergencetracker(b,1) = max(max(convergencecheck));
    end
    Theta(:,b) = thetanew;
end
toc
disp('Convergence tracker:')
if sum(convergencetracker) == 0
    disp('Proper exit')
else
    disp('Improper exit')
end
% Tuck away the results
datatitle = ['ThetaT' num2str(T)];
save(datatitle, 'Theta', 'alphatrue')

%% Plot density of estimates
load ThetaT2.mat
alphaT2 = Theta(1:end-1,:);
betaT2 = Theta(end,:);
alphatrueT2 = alphatrue;
clear Theta alphatrue
load ThetaT30.mat
alphaT30 = Theta(1:end-1,:);
betaT30 = Theta(end,:);
alphatrueT30 = alphatrue;
clear Theta alphatrue
disp('Mean and median of beta at T = 2 and T = 30')
[mean(betaT2,2) median(betaT2,2); mean(betaT30,2) median(betaT30,2)]
[dbetaT2, bT2xi] = ksdensity(betaT2);
[dbetaT30, bT30xi] = ksdensity(betaT30);

%% Plot betas
clf
hold on
plot(bT2xi, dbetaT2, '--k')
plot(bT30xi, dbetaT30, 'k')
title('Density \beta', 'fontsize', 12)
legend('T = 2', 'T = 30', 'Location', 'Best')
xlabel('\beta')

ylabel('Density')
print(gcf, '-dpdf', 'm604finalsbetadensity.pdf')
hold off

%% Plot true vs estimated fixed effects
clf
hold on
title('True vs estimated fixed effects', 'fontsize', 12)
scatter(mean(alphatrueT2,2), mean(alphaT2,2), 'x', 'k')
scatter(mean(alphatrueT30,2), mean(alphaT30,2), 'filled', 'k')
xlabel('True \alpha')
ylabel('Estimated \alpha')
plot([min(mean(alphatrueT30,2)), max(mean(alphatrueT30,2))], ...
     [min(mean(alphatrueT30,2)), max(mean(alphatrueT30,2))], 'k')
legend('T = 2', 'T = 30', 'Location', 'Best')
print(gcf, '-dpdf', 'm604finalsalphatruevsestimated.pdf')
hold off

%% Estimating individual per individual for T = 200
clc
clear
rseed = 10;
% Set parameters
G = 25;
T = 200;
B = 1000;
[y, x, Alphatrue] = m604finalprobitdata(rseed, G, T, B);
% Fixed effects variables
X = kron(eye(G), ones(T,1));
X = repmat(X, [1, 1, B]);
% Give each individual his own beta
x = repmat(x, [1, G, 1]);
x = X.*x;
X = [X x];
% Estimation
options = optimset('Display', 'notify', 'GradObj', 'On', ...
    'LargeScale', 'Off', 'DerivativeCheck', 'off', 'MaxFunEvals', 10000);
% Preallocate and set parameters
convergencetracker = zeros(B,1);
Alpha = zeros(G,B);
Beta = zeros(G,B);

tic
parfor b = 1:B
    % Reset start values and counter
    alphanew = Alphatrue(:,:,b);
    betanew = ones(G,1);
    thetaold = zeros(2*G,1);
    thetanew = ones(2*G,1);
    k = 0;
    while max(abs(thetanew-thetaold)) > 10^(-6)
        % Update
        alphaold = alphanew;
        betaold = betanew;
        thetaold = [alphaold; betaold];
        % Set off iterated optimization run
        [alphanew, ~, exitalpha] = fminunc(@(alpha) ...
            m604finalprobit(alpha, betaold, X(:,:,b), y(:,:,b), 'alpha'), ...
            alphaold, options);
        [betanew, ~, exitbeta] = fminunc(@(beta) ...
            m604finalprobit(alphanew, beta, X(:,:,b), y(:,:,b), 'beta'), ...
            betaold, options);
        thetanew = [alphanew; betanew];
        % Break loop if it iterates for too long
        k = k+1;
        if k > 100
            break
        end
        % Keep track of undesirable exits
        convergencecheck = ismember([exitalpha, exitbeta], [0, -1, -3]);
        convergencetracker(b,1) = max(max(convergencecheck));
    end
    Alpha(:,b) = alphanew;
    Beta(:,b) = betanew;
end
toc
disp('Convergence tracker:')
if sum(convergencetracker) == 0
    disp('Proper exit')
else
    disp('Improper exit')
end
save('AlphaBeta', 'Alpha', 'Beta', 'Alphatrue', 'G', 'T', 'B')

%% Density beta
load AlphaBeta.mat
load ThetaT2.mat
alphaT2 = Theta(1:end-1,:);
betaT2 = Theta(end,:);
alphatrueT2 = alphatrue;
clear Theta alphatrue
load ThetaT30.mat
alphaT30 = Theta(1:end-1,:);
betaT30 = Theta(end,:);

alphatrueT30 = alphatrue;
clear Theta alphatrue
load ThetaT200.mat
alphaT200 = Theta(1:end-1,:);
betaT200 = Theta(end,:);
alphatrueT200 = alphatrue;
clear Theta alphatrue
betas = reshape(Beta, [G*B, 1]);
disp('Mean and median of beta')
[mean(betas < 10) median(betas)]
% Plot the densities
clf
[betasIbI, betasIbIxi] = ksdensity(betas(betas<3));
[betasFE, betasFExi] = ksdensity(betaT200(betaT200<3));
hold on
plot(betasIbIxi, betasIbI, 'k')
plot(betasFExi, betasFE, '--k')
title(['Density \beta T= ' num2str(T) ' \beta < 3'], 'fontsize', 12)
xlabel('\beta')
ylabel('Density')
legend('Individual', 'FE', 'location', 'best')
densityname = ['m604finalsbetadensityT' num2str(T) '.pdf'];
print(gcf, '-dpdf', densityname)

%% Plot true vs estimated fixed effects
clf
hold on
title('True vs estimated fixed effects', 'fontsize', 12)
scatter(mean(Alphatrue,2), mean(Alpha,2), 'o', 'k')
scatter(mean(alphatrueT200,2), mean(alphaT200,2), 'filled', 'k')
scatter(mean(alphatrueT2,2), mean(alphaT2,2), 'x', 'k')
scatter(mean(alphatrueT30,2), mean(alphaT30,2), 'd', 'k')
xlabel('True \alpha')
ylabel('Estimated \alpha')
legend('T= 200, Individual', 'T= 200, FE', 'T= 2, FE', 'T = 30, FE', ...
    'location', 'best')
plot([min(mean(alphatrueT30,2)), max(mean(alphatrueT30,2))], ...
     [min(mean(alphatrueT30,2)), max(mean(alphatrueT30,2))], 'k')
feffectsname = ['m604finalsalphatruevsestimatedT' num2str(T) '.pdf'];
print(gcf, '-dpdf', feffectsname)
hold off

%% M604 Q4 ultimatum
rng(101);
% Get data
load ultimatum.csv
offer = ultimatum(:,1);

accept = ultimatum(:,2);
terms = offer.*(accept > 0);
data = [offer, accept, terms];
% Set parameters
N = size(offer,1);
thetastart = [18 18 20 20];
options = optimset('display', 'off', 'MaxFunEvals', 100000);
% Make prefixed standard normal draws and matchings.
paramdraws = mvnrnd([0 0], eye(2), N);
matchdraws = [randperm(N); 1:N]';
% Generate the data moments
data_moments(1) = mean(accept);
data_moments(2) = mean(terms(accept > 0));
data_moments(3) = std(terms(accept > 0));
data_moments(4) = mean(offer(accept == 0));
data_moments(5) = std(offer(accept == 0));

tic
[mom_estimates, ~, exitflag] = fminsearch(@(z) ...
    ultimatum_gmm(z, data_moments, paramdraws, matchdraws), ...
    thetastart, options);
toc
disp('Estimates')
mom_estimates
disp('Exit')
if ismember(exitflag, 1) == 1
    disp('Proper')
else
    disp('Improper')
end

%% Standard errors
% Nr of bootstraps
B = 1000;
% Pre-allocate
bestimates = zeros(B,4);
exitcheck = zeros(B,1);
% Sample with replacement
sampler = randi(N, [N, B]);

tic
parfor b = 1:B
    samplerb = sampler(:,b);
    % Resample pairs of the data; re-estimate, holding the simulation
    % draws fixed.
    data_momentsb = ultimatumdatamoments(data, samplerb);

    % Estimates
    [mom_estimatesb, ~, exitflag] = fminsearch(@(z) ...
        ultimatum_gmm(z, data_momentsb, paramdraws, matchdraws), ...
        thetastart, options);
    bestimates(b,:) = mom_estimatesb;
    % Keep track of convergence.
    exitcheck(b,1) = exitflag;
end
toc
% Results
disp('MSM estimates with bootstrapped errors')
[mom_estimates' std(bestimates)']
% Convergence check
disp('Exit')
if ne(sum(exitcheck), B) == 0
    disp('Proper')
else
    disp('Improper')
end

Published with MATLAB® R2012b

function [f, g] = m604finalprobit(alpha, beta, X, y, zig)
% M604 finals probit function:
% Argument theta = [alpha; beta].
theta = [alpha; beta];
% Set parameters
K = length(theta);
Kalpha = length(alpha);
v = (2*y-1).*(X*theta);
likh = normcdf(v);
% log-likelihood function
f = log(likh);
% Criterion
f = -sum(f);
% Add penalty for numerical issues.
if isnan(f) == 1
    f = 100000;
end
% Gradient, for the requested block of parameters.
if strcmp('alpha', zig) == 1
    % Set parameters
    X = X(:,1:Kalpha);
    % Gradient
    g = repmat((2*y-1).*normpdf(v)./likh, 1, Kalpha).*X;
    g = -sum(g);
elseif strcmp('beta', zig) == 1
    % Set parameters
    X = X(:,Kalpha+1:end);
    % Gradient
    g = repmat((2*y-1).*normpdf(v)./likh, 1, K-Kalpha).*X;
    g = -sum(g);
end
end

B]).1.1. % Latent variable with iid shock y = alpha + x + randn([G*T.[T.1]).1. [G*T.m604finalprobitdata 28/03/2013 14:53 function [y.[T.1. ups = repmat(ups. Published with MATLAB® R2012b file:///Users/Grace/Documents/MATLAB/M604%20CA/M604%20Final/html/m604finalprobitdata. G. xbar = reshape(xbar.B]). % Draw the elements of the fixed effects xmax = max(x.[]. x = reshape(x.B]). 2 dimension: individual. T. B) % M604 Finals: Returns data for the progibit simulation % Set seed rng(rseed) % 1st dimension: time. alphatrue = reshape(alpha.1. xbar = repmat(xbar. alphatrue] = m604finalprobitdata(rseed.[G*T.B]).[T.1.1).1]). [G*T.B]).G.1).B]).1.B]). x. alphatrue = mean(alphatrue.B]). [G*T.1]). ups = reshape(ups. alphatrue = reshape(alphatrue. 3 dimension: trial x = rand([T. [G*T.1. [T.G.1.[G. ups = randn([1.B]).B]).html Page 1 of 1 . % Choice variable y = y>0. xbar = mean(x). xmax = repmat(xmax. % Reshape to a panel. every bootstrap in the third dimension alpha = reshape(alpha. xmax = reshape(xmax.G. % Interact with the coefficients alpha = log(1 + T/30)*xbar + 1/5*xmax + ups .

function f = ultimatum_gmm(trial_mu_sig, data_moments, paramdraws, matchdraws)
% M604 Finals: Ultimatum game MSM criterion.
% Weight matrix and parameters.
W = eye(length(data_moments));
N = length(paramdraws);
if (min(trial_mu_sig) > 0)
    % Scale the fixed draws and interact with parameters.
    mu = trial_mu_sig(1:2);
    sig = diag(trial_mu_sig(3:4));
    params = paramdraws*(sig^.5) + ones(N,1)*mu;
    params = min(100, max(params, 0));
    % Run ultimatum simulation.
    offer = params(matchdraws(:,1), 1);
    accept = offer >= params(matchdraws(:,2), 2);
    terms = offer.*(accept > 0);
    % Simulated moments
    sim_moments(1) = mean(accept > 0);
    sim_moments(2) = mean(terms(accept > 0));
    sim_moments(3) = std(terms(accept > 0));
    sim_moments(4) = mean(offer(accept == 0));
    sim_moments(5) = std(offer(accept == 0));
    % Moment criterion
    pre_f = sim_moments - data_moments;
    f = pre_f*W*pre_f';
else
    % Add penalty on trial parameters venturing out of the domain to
    % circumvent a cumbersome constrained minimization routine.
    f = 10^8 + min(trial_mu_sig)^2;
end
end

function f = ultimatumdatamoments(data, sampler)
% M604 finals Ultimatum game: returns data moments for bootstrap purposes
% with parallelization.
% Unpack and sample
offer = data(:,1);
accept = data(:,2);
terms = data(:,3);
offerb = offer(sampler);
acceptb = accept(sampler);
termsb = terms(sampler);
% Bootstrap moments.
data_momentsb(1) = mean(acceptb > 0);
data_momentsb(2) = mean(termsb(acceptb > 0));
data_momentsb(3) = std(termsb(acceptb > 0));
data_momentsb(4) = mean(offerb(acceptb == 0));
data_momentsb(5) = std(offerb(acceptb == 0));
f = data_momentsb;
end
