Can any of you explain to me the significance of a standard deviation? I know it has to do with bell curves and statistics but not the specifics. Lots of you know math. Help Azhrei please.
Avoid technical speak if at all possible. No math in 5 years.
It's a measure of the spread or dispersion of a population. Basically, the larger the standard deviation, the more spread out (or less concentrated) the population is.
Hope that helps.
So what does that MEAN in relation to percentages or a bell curve? What's a normal standard deviation? What about one of, say, 15?
It really depends on what you're measuring and how. If you're measuring something where all the samples are supposed to be similar, a standard deviation of 15 percent would be high. To give you an example: in my management class this quarter the average score on the midterm was in the mid-sixties, and the standard deviation was around 8, I think. I'm assuming you've been to college. Picture the spread of all the scores (percentages) in one of your classes, and that standard deviation would probably be a more or less accurate measure of it. In that example, a standard deviation of 15 percent would mean the scores were all over the place.
If you told me more about what you were trying to apply this to, I could probably give you more useful information.
In terms of a bell curve, a higher standard deviation would mean that the curve was flatter (which is the same as being less concentrated around the mean).
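To make that concrete, here's a tiny Python sketch (the exam scores below are made up, just for illustration): two classes with the same average but very different spread.

import statistics

# Two made-up sets of exam scores; both average exactly 65.
clustered = [60, 62, 64, 65, 66, 68, 70]   # everyone scored near the average
scattered = [35, 50, 58, 65, 72, 80, 95]   # scores all over the place

print(statistics.mean(clustered), statistics.stdev(clustered))  # 65, about 3.4
print(statistics.mean(scattered), statistics.stdev(scattered))  # 65, about 19.8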
I'm trying to figure out what an SD of 15 means for an IQ test.
In very, very general terms, if you score within ±15 of a given score, you're somewhere around the level that score represents.
Something like that.
Yeah, that meant a ton too.
It means it's a crappy IQ test.
I have the formula. I'll try to put it into very plain layman's terms. You really have to understand how the standard deviation is calculated to understand what it means.
Here goes: (1) find the average of all the values, (2) subtract the average from each value, (3) square each difference (this is done to get a positive value), (4) add all these squared differences up, (5) divide that sum by the number of original values minus one (this has to do with degrees of freedom), (6) take the square root of the number you just got, and (7) that's your standard deviation. In a weird way it's similar to, but different from, the average of each number's distance from the mean.
Here is a sample of the above process, using the values 1, 2, 3, 4, 5, 6, 7, 8, 9:
Sum of these values: 45
Number of values in the population: 9
Step 1: Average (or mean) for the population: 5
Step 2: Difference of each value from the mean: -4, -3, -2, -1, 0, 1, 2, 3, 4
Step 3: Each difference squared: 16, 9, 4, 1, 0, 1, 4, 9, 16
Step 4: Sum of all the squared differences: 60
Step 5: Sum of the squared differences divided by the number of values minus 1: 60/(9-1) = 7.5
Step 6: Square root of that result: √7.5 ≈ 2.7386
That's the standard deviation of the values 1 through 9 (strictly speaking, the sample standard deviation, since we divided by n-1 rather than n). A long process, but it shows you the mechanics behind the magic phrase "standard deviation". Just imagine that same process performed on the scores of everyone who took that particular IQ test.
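If it helps to see it spelled out, here's the same calculation as a short Python sketch (the values 1 through 9 are just the example from above):

import math

values = [1, 2, 3, 4, 5, 6, 7, 8, 9]

mean = sum(values) / len(values)             # Step 1: average = 5.0
diffs = [x - mean for x in values]           # Step 2: differences from the mean
squared = [d ** 2 for d in diffs]            # Step 3: square each difference
total = sum(squared)                         # Step 4: sum = 60.0
variance = total / (len(values) - 1)         # Step 5: 60 / 8 = 7.5
std_dev = math.sqrt(variance)                # Step 6: about 2.7386

print(std_dev)  # 2.7386127875258306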
Hope that wasn't too confusing.
If that isn't the formula, I'm gonna feel really stupid.
I'm relatively sure that it is, though. And if it isn't, someone will correct me.
It doesn't mean that it's a crappy IQ test. Since the scores in this example are determined by both the test and the test-takers, there really could be that much variability in intelligence within the population. Also, one must take into account that IQ tests aren't usually scored on a percentage basis. If my memory is correct, the average score is one hundred, and everyone else's score is supposed to be based on how intelligent they are compared to the general population. This means that a person with a score of 120 is 1.2 times as intelligent as someone of the same age (IQ tests are usually administered during the standard schooling years in the U.S.). Since we are dealing with such large numbers (presumably), a standard deviation of 15 doesn't seem all that large to me.
"This means that a person with a score of 120 is 1.2 times as intelligent as someone of the same age"
I should have said that they are 1.2 times as intelligent as the average person of the same age.
>I'm trying to figure out what a SD of 15 means for an IQ test.
Assume that the average IQ is 100. An SD of 15 would mean that about 68% of people have an IQ between 85 and 115. Approximately 95% of people would be within 2 SDs, so if your IQ is 130 you are at roughly the 98th percentile. Within 3 SDs you encompass about 99.7% of the test takers, so if you have a 145, you are at roughly the 99.9th percentile in terms of IQ.
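For anyone curious, here's a rough Python sketch of where those percentages come from. It assumes IQ scores follow a normal (bell-curve) distribution with mean 100 and SD 15; the numbers are illustrative, not something pulled from an actual test manual.

import math

MEAN, SD = 100, 15

def percentile(iq):
    # Fraction of the population scoring at or below this IQ (normal CDF).
    return 0.5 * (1 + math.erf((iq - MEAN) / (SD * math.sqrt(2))))

for iq in (85, 100, 115, 130, 145):
    print(iq, round(percentile(iq) * 100, 1))
# 85   15.9  -> about 68% of people fall between 85 and 115
# 100  50.0
# 115  84.1
# 130  97.7  -> roughly the 98th percentile
# 145  99.9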
Thank you SES! and everyone else too.
And an IQ of 85 should be sufficient to rule out Lotus Petal and Arcane Denial in 99% of all decks
You would think so...
SES - I'm not sure that's what it means. I don't know for sure, but I would guess they are giving a sort of margin of error for the test, rather than describing the spread of the population. Without any further knowledge of the context than Azhrei's description, my impression was that 15 is the standard deviation for ONE person taking the test multiple times, so it's a measure of how close the test will come to measuring your true intelligence on the first try.
Your interpretation is also possible though. I think you'd have to ask the test people to find out which was meant.
No, it's for the population as a whole.