
Simple Logistic Regression

a) Example: APACHE II Score and Mortality in Sepsis

The following figure shows 30-day mortality in a sample of septic patients as a function of their baseline APACHE II score. Patients are coded as 1 or 0 depending on whether they are dead or alive at 30 days, respectively.

[Figure: 30-day mortality in patients with sepsis (Died = 1, Survived = 0) plotted against baseline APACHE II score, 0 to 45.]

We wish to predict death from baseline APACHE II score in these patients. Let π(x) be the probability that a patient with score x will die. Note that linear regression would not work well here, since it could produce fitted probabilities less than zero or greater than one.


b) Sigmoidal family of logistic regression curves

Logistic regression fits probability functions of the following form:

π(x) = exp(α + βx) / (1 + exp(α + βx))

This equation describes a family of sigmoidal curves, three examples of which are given below.

[Figure: example sigmoidal curves π(x) = exp(α + βx)/(1 + exp(α + βx)) for several parameter values; curve labels include -4, -8, -12 and -20.]

c) Parameter values and the shape of the regression curve

For now assume that β > 0. As x → -∞, exp(α + βx) → 0, and hence π(x) → 0/(1 + 0) = 0. For very large values of x, exp(α + βx) → ∞, and hence π(x) → 1.


When x = -α/β, α + βx = 0, and hence π(x) = 1/(1 + 1) = 0.5. The slope of π(x) at π(x) = 0.5 is β/4. Thus β controls how fast π(x) rises from 0 to 1. For given β, α controls where the 50% survival point is located.

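A small numerical sketch of these two properties (the α and β below are illustrative, not estimates from the data):

```python
import math

def pi(x, a, b):
    """Logistic probability pi(x) = exp(a + b*x) / (1 + exp(a + b*x))."""
    return math.exp(a + b * x) / (1 + math.exp(a + b * x))

a, b = -8.0, 0.5          # illustrative values: 50% point at x = -a/b = 16
x_mid = -a / b

print(pi(x_mid, a, b))    # 0.5: the curve crosses one-half at x = -a/b
# the slope at the midpoint is b/4 = 0.125 (central-difference check)
h = 1e-6
slope = (pi(x_mid + h, a, b) - pi(x_mid - h, a, b)) / (2 * h)
print(round(slope, 4))    # 0.125
```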

We wish to choose the best curve to fit the data. Data with a sharp survival cutoff point between patients who live and die should have a large value of β.

[Figure: data with a sharp survival cutoff, fit by a steeply rising logistic curve.]

Data with a lengthy transition from survival to death should have a low value of β.

[Figure: data with a gradual transition from survival to death, fit by a slowly rising logistic curve.]

d) The odds of death

π(x) = exp(α + βx) / (1 + exp(α + βx))

Hence the probability of survival is

1 - π(x) = (1 + exp(α + βx) - exp(α + βx)) / (1 + exp(α + βx))
         = 1 / (1 + exp(α + βx)),

and the odds of death is

π(x) / (1 - π(x)) = exp(α + βx)        {3.1}

The log odds of death equals

log(π(x) / (1 - π(x))) = α + βx        {3.2}
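The identities {3.1} and {3.2} are easy to verify numerically; a quick sketch with illustrative parameter values (not the fitted estimates):

```python
import math

def pi(x, a, b):
    """Logistic probability pi(x) = exp(a + b*x) / (1 + exp(a + b*x))."""
    return math.exp(a + b * x) / (1 + math.exp(a + b * x))

# illustrative values, not estimates from the sepsis data
a, b, x = -2.0, 0.1, 7.0
p = pi(x, a, b)

print(math.isclose(p / (1 - p), math.exp(a + b * x)))    # True: odds = exp(a + bx)
print(math.isclose(math.log(p / (1 - p)), a + b * x))    # True: log odds = a + bx
```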

e) The logit function

For any number p between 0 and 1, the logit function is defined by

logit(p) = log(p / (1 - p))

Let

di = 1 if the ith patient dies, 0 if the ith patient survives.

Then E(di) = π(xi) = Pr[di = 1], and the simple logistic regression model is

logit(E(di)) = α + βxi        {3.3}

Compare this with the simple linear regression model

E(yi) = α + βxi for i = 1, 2, ..., n,

where yi has a normal distribution with standard deviation σ; it is the random component of the model. α + βxi is the linear predictor.

In logistic regression, the expected value of di given xi is E(di) = πi = π[xi], and

logit(E(di)) = α + βxi for i = 1, 2, ..., n.

Here di is dichotomous with probability of event πi = π[xi]; it is the random component of the model. logit is the link function that relates the expected value of the random component to the linear predictor.

3. Maximum Likelihood Estimation

In linear regression we used the method of least squares to estimate regression coefficients. In generalized linear models we use another approach called maximum likelihood estimation. The maximum likelihood estimate of a parameter is the value that maximizes the probability of the observed data. We estimate α and β by those values a and b that maximize the probability of the observed data under the logistic regression model.
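To illustrate the idea, here is a minimal sketch of maximum likelihood estimation for simple logistic regression via Newton-Raphson iteration, on a small made-up data set (not the sepsis data); in practice Stata's logit command or R's glm does this for us:

```python
import numpy as np

# made-up scores x and death indicators d for ten hypothetical patients
x = np.array([ 5.,  8., 10., 12., 15., 18., 20., 24., 28., 33.])
d = np.array([ 0.,  0.,  0.,  1.,  0.,  1.,  1.,  1.,  1.,  1.])

X = np.column_stack([np.ones_like(x), x])   # design matrix: intercept, score
beta = np.zeros(2)                          # start at (a, b) = (0, 0)

for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))         # fitted probabilities
    W = p * (1 - p)                         # weights for the Hessian
    score = X.T @ (d - p)                   # gradient of the log likelihood
    hessian = X.T @ (X * W[:, None])        # negative Hessian
    beta = beta + np.linalg.solve(hessian, score)   # Newton-Raphson step

a_hat, b_hat = beta
print(f"a = {a_hat:.3f}, b = {b_hat:.3f}")
```

At convergence the gradient of the log likelihood is zero, which is the defining property of the maximum likelihood estimate.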

. tabulate fate

 Death by 30 |
        days |      Freq.     Percent        Cum.
-------------+-----------------------------------
       alive |        279       61.45       61.45
        dead |        175       38.55      100.00
-------------+-----------------------------------
       Total |        454      100.00

[Figure: histogram (density) of APACHE II scores, and observed proportion of deaths at each score.]

. logit fate apache

------------------------------------------------------------------------------
        fate |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
      apache |   .1156272   .0159997     7.23   0.000     .0842684     .146986
       _cons |  -2.290327   .2765283    -8.28   0.000    -2.832313   -1.748342
------------------------------------------------------------------------------

The estimated coefficients are

b = .1156272
a = -2.290327

giving the fitted probability function

p(x) = exp(a + bx) / (1 + exp(a + bx))
     = exp(-2.290327 + .1156272x) / (1 + exp(-2.290327 + .1156272x))

In particular, at an APACHE II score of 20,

p(20) = exp(-2.290327 + .1156272 × 20) / (1 + exp(-2.290327 + .1156272 × 20)) = 0.506

[Figure: observed proportion of deaths and fitted probability Pr(fate) plotted against APACHE II score.]
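As a check, the fitted curve can be evaluated at a score of 20 using the estimates from the Stata output above:

```python
import math

a, b = -2.290327, 0.1156272   # estimates from the logit output above

def p_hat(x):
    """Fitted probability of death at APACHE II score x."""
    z = a + b * x
    return math.exp(z) / (1 + math.exp(z))

print(round(p_hat(20), 3))    # 0.506: roughly even odds of death at a score of 20
```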

4. Odds Ratios and the Logistic Regression Model

a) Odds ratio associated with a unit increase in x

The log odds that patients with APACHE II scores of x and x + 1 will die are

logit(π(x)) = α + βx        {3.5}

and

logit(π(x + 1)) = α + β(x + 1) = α + βx + β        {3.6}

respectively. Subtracting {3.5} from {3.6} gives

β = logit(π(x + 1)) - logit(π(x))
  = log[ (π(x + 1)/(1 - π(x + 1))) / (π(x)/(1 - π(x))) ]

and hence exp(β) is the odds ratio for death associated with a unit increase in x.

A property of logistic regression is that this ratio remains constant for all values of x.

5.

In our sepsis example the parameter estimate for apache (β) was .1156272 with a standard error of .0159997. Therefore, the odds ratio for death associated with a unit rise in APACHE II score is exp(.1156272) = 1.123, with a 95% confidence interval of (exp(0.1156 - 1.96 × 0.0160), exp(0.1156 + 1.96 × 0.0160)) = (1.09, 1.16).
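These numbers are easy to reproduce from the coefficient table:

```python
import math

b, se = 0.1156272, 0.0159997   # apache coefficient and its standard error

or_unit = math.exp(b)          # odds ratio per unit increase in score
lo = math.exp(b - 1.96 * se)   # lower 95% confidence limit
hi = math.exp(b + 1.96 * se)   # upper 95% confidence limit
print(f"{or_unit:.3f} ({lo:.2f}, {hi:.2f})")   # 1.123 (1.09, 1.16)
```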

[Figure: observed log odds of death (obs_logodds) and the linear prediction a + bx plotted against APACHE II score.]

7.

Let s_a² and s_b² denote the variances of a and b, and let s_ab denote the covariance between a and b. Then it can be shown that the standard error of a + bx is

se(a + bx) = sqrt( s_a² + x² s_b² + 2x s_ab )

A 95% confidence interval for α + βx is

a + bx ± 1.96 se(a + bx)

[Figure: observed log odds (obs_logodds) with the linear prediction and its 95% confidence band, plotted against APACHE II score.]

Applying the logistic transform to the endpoints a + bx ± 1.96 se(a + bx) gives a 95% confidence interval (L[x], U[x]) for π(x), where

L[x] = exp(a + bx - 1.96 se(a + bx)) / (1 + exp(a + bx - 1.96 se(a + bx)))

and

U[x] = exp(a + bx + 1.96 se(a + bx)) / (1 + exp(a + bx + 1.96 se(a + bx)))
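A sketch of this band. The Stata output reports the standard errors (so s_a² and s_b²) but not the covariance s_ab, which in practice would be read from the estimated variance-covariance matrix e(V); the value below is assumed for illustration (intercept and slope estimates are typically strongly negatively correlated):

```python
import math

a, b = -2.290327, 0.1156272
s2_a = 0.2765283 ** 2          # variance of a, from the reported std. err.
s2_b = 0.0159997 ** 2          # variance of b
s_ab = -0.004                  # covariance of a and b: ASSUMED for illustration

def expit(z):
    return math.exp(z) / (1 + math.exp(z))

def band(x):
    """95% confidence interval (L[x], U[x]) for pi(x)."""
    se = math.sqrt(s2_a + x * x * s2_b + 2 * x * s_ab)
    return expit(a + b * x - 1.96 * se), expit(a + b * x + 1.96 * se)

lo, hi = band(20.0)
print(f"({lo:.3f}, {hi:.3f})")   # an interval bracketing the fitted p(20)
```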

[Figure: observed proportion of deaths, fitted probability Pr(fate), and 95% confidence band (lb_prob/ub_prob), plotted against APACHE II score.]

It is common to recode continuous variables into categorical variables in order to calculate odds ratios for, say, the highest quartile compared to the lowest.

. centile apache, centile(25 50 75)

                                                       -- Binom. Interp. --
    Variable |       Obs  Percentile    Centile        [95% Conf. Interval]
-------------+-------------------------------------------------------------
      apache |       454         25         10               9          11
             |                   50         14        13.60845          15
             |                   75         20              19          21

. generate float upper_q = apache >= 20
. tabulate upper_q

    upper_q |      Freq.     Percent        Cum.
------------+-----------------------------------
          0 |        334       73.57       73.57
          1 |        120       26.43      100.00
------------+-----------------------------------
      Total |        454      100.00


. cc fate upper_q if apache >= 20 | apache <= 10

                                                         Proportion
                 |   Exposed   Unexposed  |      Total     Exposed
-----------------+------------------------+----------------------
           Cases |        77          25  |        102       0.7549
        Controls |        43         101  |        144       0.2986
-----------------+------------------------+----------------------
           Total |       120         126  |        246       0.4878
                 |                        |
                 |      Point estimate    |    [95% Conf. Interval]
                 |------------------------+----------------------
      Odds ratio |         7.234419       |    3.924571    13.44099 (exact)
 Attr. frac. ex. |         .8617719       |    .7451951    .9256007 (exact)
 Attr. frac. pop |         .6505533       |
                 +-----------------------------------------------
                               chi2(1) =    49.75  Pr>chi2 = 0.0000

This approach discards potentially valuable information and may not be as clinically relevant as an odds ratio at two specific values.

Alternatively, we can calculate the odds ratio for death for patients at the 75th percentile of APACHE II scores compared to patients at the 25th percentile:

logit(p(20)) = a + b × 20
logit(p(10)) = a + b × 10

Subtracting gives

log[ (p(20)/(1 - p(20))) / (p(10)/(1 - p(10))) ] = b × 10 = 0.1156 × 10 = 1.156

Hence, the odds ratio equals exp(1.156) = 3.18. A problem with this estimate is that it is strongly dependent on the accuracy of the logistic regression model.


With Stata we can calculate the 95% confidence interval for this odds ratio as follows:

. lincom 10*apache, eform ( 1) 10 apache = 0

------------------------------------------------------------------------------
        fate |     exp(b)   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         (1) |   3.178064   .5084803     7.23   0.000     2.322593    4.348628
------------------------------------------------------------------------------
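The same number falls out of the coefficient directly:

```python
import math

b = 0.1156272              # apache coefficient from the simple model
or_10 = math.exp(10 * b)   # odds ratio for a 10-point increase in score
print(round(or_10, 2))     # 3.18
```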


8. Multiple Logistic Regression

Simple logistic regression generalizes to allow multiple covariates:

logit(E(di)) = α + β1 xi1 + β2 xi2 + ... + βk xik

where xi1, xi2, ..., xik are covariates from the ith patient, α and β1, ..., βk are unknown parameters, and di = 1 if the ith patient suffers the event of interest and 0 otherwise.

Multiple logistic regression can be used for many purposes. One of these is to weaken the logit-linear assumption of simple logistic regression using restricted cubic splines.


These curves have k knots located at t1, t2, ..., tk. They are:

- linear before t1 and after tk;
- piecewise cubic polynomials between adjacent knots (i.e. of the form ax³ + bx² + cx + d);
- continuous and smooth.

[Figure: a restricted cubic spline with three knots t1, t2, t3.]

Given x and k knots, a restricted cubic spline can be defined by

y = α + β1 x1 + β2 x2 + ... + β(k-1) x(k-1)

for suitably defined covariates x1, ..., x(k-1). These covariates are functions of x and the knots but are independent of y. x1 = x, and hence the hypothesis

β2 = β3 = ... = β(k-1) = 0

tests the linear hypothesis. The corresponding logistic regression model is

logit(E(di)) = α + β1 x1 + β2 x2 + ... + β(k-1) x(k-1)

Programs to calculate x1, ..., x(k-1) are available in Stata, R and other statistical software packages.
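As an illustration, here is a sketch of the standard truncated-power restricted cubic spline basis (Harrell's form). The normalization by (tk - t1)² is inferred from the _Sapache1/_Sapache2 values this document reports for knots at 7, 14 and 25, not from rc_spline's documentation:

```python
def rcs_basis(x, knots):
    """Restricted cubic spline covariates [x1, ..., x_{k-1}] for one value x.

    x1 = x; for j = 1, ..., k-2,
    x_{j+1} = [(x-tj)+^3 - (x-t_{k-1})+^3 (tk-tj)/(tk-t_{k-1})
               + (x-tk)+^3 (t_{k-1}-tj)/(tk-t_{k-1})] / (tk-t1)^2
    """
    t = knots
    k = len(t)
    plus3 = lambda u: max(u, 0.0) ** 3         # truncated cubic (u)+^3
    scale = (t[-1] - t[0]) ** 2
    cols = [x]
    for j in range(k - 2):
        term = (plus3(x - t[j])
                - plus3(x - t[k - 2]) * (t[k - 1] - t[j]) / (t[k - 1] - t[k - 2])
                + plus3(x - t[k - 1]) * (t[k - 2] - t[j]) / (t[k - 1] - t[k - 2]))
        cols.append(term / scale)
    return cols

print(rcs_basis(10.0, [7.0, 14.0, 25.0]))   # [10.0, 0.0833...] as tabulated below
print(rcs_basis(20.0, [7.0, 14.0, 25.0]))   # [20.0, 5.6899...]
```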


We fit a logistic regression model using a three-knot restricted cubic spline, with knots at the default locations: the 10th, 50th, and 90th percentiles.

. rc_spline apache, nknots(3)
number of knots = 3
value of knot 1 = 7
value of knot 2 = 14
value of knot 3 = 25

. logit fate _Sapache1 _Sapache2

Logit estimates                                   Number of obs   =        454
                                                  LR chi2(2)      =      62.05
                                                  Prob > chi2     =     0.0000
Log likelihood = -271.64615                       Pseudo R2       =     0.1025

------------------------------------------------------------------------------
        fate |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
   _Sapache1 |   .1237794   .0447174     2.77   0.006      .036135    .2114238
   _Sapache2 |  -.0116944   .0596984    -0.20   0.845     -.128701    .1053123
       _cons |  -2.375381   .5171971    -4.59   0.000    -3.389069   -1.361694
------------------------------------------------------------------------------

Note that the coefficient for _Sapache2 is small and not significant, indicating an excellent fit for the simple logistic regression model.

Entering commands analogous to those used for simple logistic regression gives the following plot of predicted mortality as a function of baseline APACHE II score.

. drop prob logodds se lb_logodds ub_logodds ub_prob lb_prob
. rename prob_rcs prob
. rename logodds_rcs logodds

. predict se, stdp . generate float lb_logodds= logodds-1.96* se . generate float ub_logodds= logodds+1.96* se . generate float ub_prob= exp( ub_logodds)/(1+exp( ub_logodds)) . generate float lb_prob= exp( lb_logodds)/(1+exp( lb_logodds)) . twoway (rarea lb_prob ub_prob apache, blcolor(yellow) bfcolor(yellow)) > (scatter proportion apache) > (line prob apache, clcolor(red) clwidth(medthick))

This plot is very similar to the one for simple logistic regression except the 95% confidence band is a little wider.


[Figure: observed proportion of deaths, spline-model fitted probability Pr(fate), and 95% confidence band (lb_prob/ub_prob), plotted against APACHE II score.]



This regression gives the following table of values:

Percentile   Apache   _Sapache1   _Sapache2
    25         10         10       0.083333
    75         20         20       5.689955

We calculate the odds ratio of death for patients at the 75th vs. 25th percentile of APACHE II score as follows. The log odds at the 75th percentile equals

logit(p(20)) = a + b1 × 20 + b2 × 5.689955

The log odds at the 25th percentile equals

logit(p(10)) = a + b1 × 10 + b2 × 0.083333

Subtracting the second equation from the first gives the log odds ratio for patients at the 75th vs. 25th percentile of APACHE II score:

log[ (p(20)/(1 - p(20))) / (p(10)/(1 - p(10))) ] = b1 × 10 + b2 × 5.606622

Stata calculates this odds ratio to be 3.22 with a 95% confidence interval of 2.3 -- 4.6

. lincom _Sapache1*10 + 5.606622* _Sapache2, eform ( 1) 10 _Sapache1 + 5.606622 _Sapache2 = 0

------------------------------------------------------------------------------
        fate |     exp(b)   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         (1) |    3.22918   .5815912     6.51   0.000      2.26875    4.596188
------------------------------------------------------------------------------
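The lincom result can be checked by hand from the spline coefficients:

```python
import math

b1, b2 = 0.1237794, -0.0116944                  # _Sapache1, _Sapache2 coefficients
log_or = 10 * b1 + (5.689955 - 0.083333) * b2   # difference in linear predictors
print(round(math.exp(log_or), 3))               # 3.229, matching lincom
```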

Recall that for the simple model we had the following odds ratio and confidence interval.

------------------------------------------------------------------------------
        fate |     exp(b)   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         (1) |   3.178064   .5084803     7.23   0.000     2.322593    4.348628
------------------------------------------------------------------------------

The close agreement of these results supports the use of the simple logistic regression model for these data.


9.

Breslow & Day, Vol. I give the following results from the Ille-et-Vilaine case-control study of esophageal cancer and alcohol. Cases were 200 men diagnosed with esophageal cancer in regional hospitals between 1/1/1972 and 4/30/1974. Controls were 775 men drawn from electoral lists in each commune.

Daily Alcohol       Cases    Controls    Total
Consumption
≥ 80g                  96         109      205
< 80g                 104         666      770
Total                 200         775      975

Let

xi = 1 for cases (i = 1) and 0 for controls (i = 0),
mi = number of cases (i = 1) or controls (i = 0),
di = number of cases (i = 1) or controls (i = 0) who are heavy drinkers.

Then the observed prevalence of heavy drinking is d0/m0 = 109/775 for controls and d1/m1 = 96/200 for cases. The observed prevalence of moderate or non-drinking is (m0 - d0)/m0 = 666/775 for controls and (m1 - d1)/m1 = 104/200 for cases.


The observed odds of heavy drinking among cases (i = 1) and controls (i = 0) is

(di / mi) / [(mi - di) / mi] = di / (mi - di)

The observed odds ratio for heavy drinking in cases relative to controls is

ψ̂ = [d1 / (m1 - d1)] / [d0 / (m0 - d0)] = (96/104) / (109/666) = 5.64

If the cases and controls are representative samples from their respective underlying populations, then:

1. ψ̂ is an unbiased estimate of the true odds ratio for heavy drinking in cases relative to controls in the underlying population.

2. This true odds ratio also equals the true odds ratio for esophageal cancer in heavy drinkers relative to moderate drinkers. Since esophageal cancer is rare, ψ̂ also estimates the relative risk of esophageal cancer in heavy drinkers.
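The observed odds ratio is simple arithmetic on the 2×2 table:

```python
d1, m1 = 96, 200     # heavy-drinking cases, total cases
d0, m0 = 109, 775    # heavy-drinking controls, total controls

psi_hat = (d1 / (m1 - d1)) / (d0 / (m0 - d0))
print(round(psi_hat, 2))    # 5.64
```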

The standard error of log(ψ̂) is

se_log(ψ̂) = sqrt( 1/d0 + 1/(m0 - d0) + 1/d1 + 1/(m1 - d1) )

Hence, if we let

ψ̂_L = ψ̂ exp(-1.96 se_log(ψ̂))  and  ψ̂_U = ψ̂ exp(1.96 se_log(ψ̂)),

then (ψ̂_L, ψ̂_U) is a 95% confidence interval for ψ.
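These formulas give Woolf's confidence interval for the table above:

```python
import math

d1, m1, d0, m0 = 96, 200, 109, 775   # Ille-et-Vilaine 2x2 table counts

psi = (d1 / (m1 - d1)) / (d0 / (m0 - d0))
se_log = math.sqrt(1 / d0 + 1 / (m0 - d0) + 1 / d1 + 1 / (m1 - d1))
lo = psi * math.exp(-1.96 * se_log)
hi = psi * math.exp(1.96 * se_log)
print(f"{psi:.2f} ({lo:.2f}, {hi:.2f})")   # 5.64 (4.00, 7.95)
```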


10. Logistic Regression Models for 2×2 Contingency Tables

Consider the logistic regression model

logit(E(di / mi)) = α + βxi        {3.9}

where E(di / mi) = πi = probability of being a heavy drinker for cases (i = 1) and controls (i = 0). Then {3.9} can be rewritten

logit(πi) = log(πi / (1 - πi)) = α + βxi

Hence

log(π1 / (1 - π1)) = α + βx1 = α + β
log(π0 / (1 - π0)) = α + βx0 = α

since x1 = 1 and x0 = 0. Subtracting these two equations gives

log(π1 / (1 - π1)) - log(π0 / (1 - π0)) = β

and

log[ (π1 / (1 - π1)) / (π0 / (1 - π0)) ] = log(ψ) = β

b) Nuisance parameters

α is called a nuisance parameter. This is one that is required by the model but is not used to calculate interesting statistics.

11. Analyzing Case-Control Data with Stata

cancer    alcohol    patients
No        < 80g          666
Yes       < 80g          104
No        >= 80g         109
Yes       >= 80g          96


. cc cancer alcohol [freq=patients], woolf

                 |        alcohol         |      Proportion
                 |   Exposed   Unexposed  |      Total     Exposed
-----------------+------------------------+----------------------
           Cases |        96         104  |        200       0.4800
        Controls |       109         666  |        775       0.1406
-----------------+------------------------+----------------------
           Total |       205         770  |        975       0.2103
                 |                        |
                 |      Point estimate    |    [95% Conf. Interval]
                 |------------------------+----------------------
      Odds ratio |         5.640085       |    4.000589    7.951467 (Woolf)
 Attr. frac. ex. |         .8226977       |    .7500368    .8742371 (Woolf)
 Attr. frac. pop |         .3948949       |
                 +-----------------------------------------------
                               chi2(1) =   110.26  Pr>chi2 = 0.0000

* Now calculate the same odds ratio using logistic regression

ψ̂ = 5.64

. logistic alcohol cancer [freq=patients]

                                                  No. of obs      =        975
                                                  LR chi2(1)      =      96.43
                                                  Prob > chi2     =     0.0000
                                                  Pseudo R2       =     0.0962

------------------------------------------------------------------------------
     alcohol | Odds Ratio   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
      cancer |   5.640085   .9883491     9.87   0.000     4.000589    7.951467
------------------------------------------------------------------------------

This is the analogous logistic command for simple logistic regression. If we had entered the data as

cancer   heavy   patients
   0       109        775
   1        96        200

These commands fit the model logit(E(alcohol)) = α + cancer × β, giving b = 1.73, the log odds ratio of being a heavy drinker in cancer patients relative to controls. The odds ratio is exp(1.73) = 5.64.


The 95% confidence interval is (5.64 exp(-1.96 × 0.1752), 5.64 exp(1.96 × 0.1752)) = (4.00, 7.95). The classical limits using Woolf's method are (5.64 exp(-1.96 s), 5.64 exp(1.96 s)) = (4.00, 7.95), where s² = 1/96 + 1/109 + 1/104 + 1/666 = 0.0307 = (0.1752)². Hence logistic regression is in exact agreement with classical methods in this simple case. In Stata the command

cc cancer alcohol [freq=patients], woolf

gives us Woolf's 95% confidence interval for the odds ratio. We will cover how to calculate confidence intervals using glm in the next chapter.

