Saturday, November 5, 2011

Why does the confidence interval get wider as you get further from the mean?

I'm in a statistics class, and I'm having trouble understanding something about confidence intervals. For a normal distribution, the interval is essentially x̄ ± z·(stdev/√N). z, stdev, and N all stay the same regardless of the value of x, and yet in class we were told that confidence intervals and prediction intervals for a single value widen in what looked like a parabolic, symmetric shape as you move away from the mean. My professor didn't have time to explain, so I've been quite curious. Any assistance would be appreciated!
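(For anyone curious, here is a minimal sketch of the effect being asked about, under the assumption that the class was discussing confidence/prediction bands around a simple linear regression line rather than the interval for a single sample mean. The data, variable names, and the `x0` values below are made up purely for illustration. In that setting the half-width of the confidence band for the mean response at a point x0 involves the term (x0 − x̄)², which is what produces the parabolic widening away from the mean.)

```python
# Sketch: how the confidence-band half-width in simple linear regression
# grows as x0 moves away from x-bar. Example data is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=30)            # predictor values (made up)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 30)   # response with noise (made up)

n = len(x)
xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)
b1 = np.sum((x - xbar) * (y - y.mean())) / Sxx   # slope estimate
b0 = y.mean() - b1 * xbar                        # intercept estimate
resid = y - (b0 + b1 * x)
s = np.sqrt(np.sum(resid ** 2) / (n - 2))        # residual standard error
t = stats.t.ppf(0.975, df=n - 2)                 # 95% two-sided t critical value

# Half-width of the CI for the mean response at x0:
#   t * s * sqrt(1/n + (x0 - xbar)^2 / Sxx)
# The (x0 - xbar)^2 term is why the band widens away from the mean.
for x0 in [xbar, xbar + 2.0, xbar + 4.0]:
    half_width = t * s * np.sqrt(1.0 / n + (x0 - xbar) ** 2 / Sxx)
    print(f"x0 = {x0:5.2f}   CI half-width = {half_width:.3f}")
```

Running this prints a half-width that is smallest at x0 = x̄ and grows as x0 moves away from it, which matches the widening the question describes.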
