What's so special about standard deviation?

Equivalently, about variance? I realize it measures the spread of a distribution, but many other metrics could do the same (e.g., the average absolute deviation). What is its deeper significance? Does it have
- a particular geometric interpretation (in the sense, e.g., that the mean is the balancing point of a distribution)?
- any other intuitive interpretation that differentiates it from other possible measures of spread?

What's so special about it that makes it act as a normalizing factor in all sorts of situations (e.g., converting covariance to correlation)?

– blue_note, asked 6 hours ago
Have you heard the term "moment"? The variance is the second moment about the mean. See HERE.
– Mark Viola, 6 hours ago
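For reference, the $k$-th central moment of a random variable $X$ with mean $\mu$ is

$$\mu_k = \mathbb{E}\big[(X-\mu)^k\big],$$

so the variance is exactly the second central moment, $\mu_2 = \mathbb{E}\big[(X-\mu)^2\big]$, and the standard deviation is its square root.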
2 Answers
There's a very nice geometric interpretation.

Random variables of finite variance form a vector space. Covariance is a useful inner product on that space. Oh, wait, that's not quite right: constant variables are orthogonal to themselves in this product, so it's only positive semi-definite. So, let me be more precise: on the quotient space formed by the equivalence relation "differs by a constant", covariance is a true inner product. (If quotient spaces are an unfamiliar concept, just focus on the vector space of zero-mean variables; it gives you the same outcome in this context.)

Right, let's carry on. In the norm this inner product induces, standard deviation is a variable's length, while the correlation coefficient between two variables (their covariance divided by the product of their standard deviations) is the cosine of the "angle" between them. That the correlation coefficient lies in $[-1,1]$ is then a restatement of the Cauchy-Schwarz inequality for this inner product.

– J.G., answered 6 hours ago
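The "correlation is a cosine" claim is easy to check numerically. A minimal sketch (the sample data and sizes here are invented for illustration): center two samples, treat them as vectors, and compare the cosine of the angle between them with the Pearson correlation.

```python
import numpy as np

# Two correlated samples (arbitrary illustrative construction).
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 0.6 * x + 0.8 * rng.normal(size=1000)

# Pass to zero-mean representatives, i.e. work in the quotient space.
xc, yc = x - x.mean(), y - y.mean()

# Vector lengths are proportional to the standard deviations,
# so the cosine of the angle is covariance / (sd_x * sd_y).
cos_angle = np.dot(xc, yc) / (np.linalg.norm(xc) * np.linalg.norm(yc))

# Pearson correlation computed the usual way.
corr = np.corrcoef(x, y)[0, 1]
```

The two numbers agree exactly (up to floating point), and Cauchy-Schwarz guarantees the value stays in $[-1,1]$.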
Interesting approach. Is it a personal interpretation or a standard one? If it's standard, are there any resources you can provide? I haven't seen it in any book...
– blue_note, 6 hours ago

@blue_note You're most likely to encounter it in a discussion of regression, since regressing $Y$ against $X$ writes $Y$ as a multiple of $X$ plus a variable orthogonal to $X$ in this sense. In fact, the coefficients involved in such an expression square to the proportion of variance explained. This has a well-understood connection to probability in quantum mechanics. But really, any source that explains why there's a $^2$ in $R^2$ will at least hint at these ideas.
– J.G., 6 hours ago
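The regression decomposition mentioned in the comment above can be sketched numerically (all data here is invented for illustration): least squares writes the centered $y$ as $\beta x$ plus a residual orthogonal to $x$ under the covariance inner product, and the fraction of variance explained equals the squared correlation.

```python
import numpy as np

# Arbitrary linear-plus-noise data.
rng = np.random.default_rng(1)
x = rng.normal(size=2000)
y = 2.0 * x + rng.normal(size=2000)

xc, yc = x - x.mean(), y - y.mean()

beta = np.dot(xc, yc) / np.dot(xc, xc)   # least-squares slope
resid = yc - beta * xc                   # residual component of yc

# The residual is orthogonal to x in the covariance inner product.
orth = np.dot(resid, xc)                 # ~0 up to floating point

# Variance explained: 1 - residual variance / total variance.
r2 = 1 - np.dot(resid, resid) / np.dot(yc, yc)
corr = np.corrcoef(x, y)[0, 1]           # r2 equals corr**2
```

By Pythagoras in this inner-product space, $\lVert y_c\rVert^2 = \beta^2\lVert x_c\rVert^2 + \lVert\text{resid}\rVert^2$, which is exactly why $R^2$ carries a square.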
When defining "standard deviation", we want some way to take a bunch of deviations from a mean and quantify how big they typically are, using a single number in the same units as the deviations themselves. But any definition of "standard deviation" induces a corresponding definition of "mean", because we want our choice of "mean" to always minimize the value of our "standard deviation" (intuitively, we want "mean" to be the "middlemost" point as measured by "standard deviation"). Only by defining "standard deviation" in the usual way do we recover the arithmetic mean while still having a measure in the right units. (Without getting into details, the key point is that the quadratic becomes linear when we take the derivative to find its critical point.)

If we want to use some other mean, we can of course find a different "standard deviation" to match it (the process is somewhat analogous to integration), but in practice it's just easier to transform the data so that the arithmetic mean is appropriate.

– Qwerty (new contributor), answered 1 hour ago
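The "quadratic becomes linear" step can be written out: minimizing the mean squared deviation over the choice of center $c$ gives

$$\frac{d}{dc}\,\mathbb{E}\big[(X-c)^2\big] \;=\; -2\,\mathbb{E}[X-c] \;=\; -2\big(\mathbb{E}[X]-c\big) \;=\; 0 \quad\Longrightarrow\quad c = \mathbb{E}[X],$$

so the arithmetic mean is recovered. (By contrast, minimizing the mean absolute deviation $\mathbb{E}\lvert X-c\rvert$ is solved by the median, not the mean.)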
If all you want is minimization at the mean and the right units, why not sum/integrate the magnitudes of the deviations?
– mephistolotl, 53 mins ago