Data was gathered from the most respected disc golf directory on the internet (DGCourseReview.com), and courses were filtered so that only permanent courses in the United States with at least five reviews were taken into account. These three criteria (permanent, US, 5+ reviews) were applied for the following reasons. Only permanent courses (not temporary or practice courses) were chosen, because temporary courses could skew the ratings. Only US courses were chosen so that the reviewers share the same demographic, since European/Asian reviewers are not directly comparable with American reviewers. Lastly, as we want credible ratings, we only used courses with a minimum of five reviews. This filtering resulted in about 2,500 courses. Since it would take too long to gather data on all of them, the list was alphabetized by course name and data was pulled for every 20th course, resulting in 125 observations. Each of these observations records a rating, year designed, number of holes, distance to the next course, multiple tees, tee type, number of players, and whether the baskets were DISCatchers or not.
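The every-20th-course scheme described above is plain systematic sampling. A minimal Python sketch of that procedure might look like the following; the course names are made up for illustration and do not come from the DGCourseReview.com listing:

```python
# Sketch of the sampling scheme: alphabetize ~2,500 courses by name,
# then keep every 20th one, yielding 125 observations.
# The course records here are illustrative placeholders.

def systematic_sample(courses, step=20):
    """Sort courses alphabetically by name, then take every `step`-th one."""
    ordered = sorted(courses, key=lambda c: c["name"])
    return ordered[::step]

# Toy example with made-up course names:
courses = [{"name": f"Course {i:04d}"} for i in range(2500)]
sample = systematic_sample(courses)
print(len(sample))  # 2500 / 20 = 125 observations
```

One caveat of this design choice: sampling by alphabetical position is effectively random with respect to course quality, but any variable correlated with course names (e.g. region naming conventions) would not be perfectly randomized.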
A Look at the Data
The rating variable that we are regressing on is continuous from 0 to 5. The distribution of the gathered ratings is pictured to the right. It appears to be normally distributed around 3, which allows us to safely proceed with the regression. A Shapiro-Wilk test (p = .537) fails to reject normality, supporting this assumption.

1) The first variable regressed is year designed. The histogram (right) shows the tremendous exponential growth that disc golf is currently experiencing. Below, rating and year designed were graphed and a linear trend line was added, but there seems to be no clear relationship. The finding that newer courses are apparently not improved over the older ones is rather discouraging. It underscores that systematic studies like the present one are lacking and should play an important role in the design of future courses.
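The trend line described above can be sketched with a small ordinary-least-squares fit. The (year, rating) pairs below are illustrative stand-ins, not the 125 sampled observations:

```python
# Minimal sketch of the rating-vs-year trend line: fit y = a + b*x
# by ordinary least squares. The data points are made up to mimic
# the "no clear relationship" pattern described in the text.

def ols_slope_intercept(xs, ys):
    """Fit y = a + b*x by ordinary least squares; return (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

# Toy data: ratings show no clear trend with design year.
years = [1985, 1992, 2000, 2005, 2010, 2015]
ratings = [3.1, 2.8, 3.4, 2.9, 3.2, 3.0]
a, b = ols_slope_intercept(years, ratings)
print(f"slope = {b:.4f}")  # a near-zero slope indicates no clear relationship
```

A slope estimate close to zero (relative to its spread) is the numerical counterpart of the flat trend line in the graph.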