What's so special about standard deviation?












12












$begingroup$


Equivalently, what's so special about variance?



I realize it measures the spread of a distribution, but many other metrics could do the same (e.g., the average absolute deviation). What is its deeper significance? Does it have




  • a particular geometric interpretation (in the sense, e.g., that the mean is the balancing point of a distribution)?

  • any other intuitive interpretation that differentiates it from other possible measures of spread?


What's so special about it that makes it act as a normalizing factor in all sorts of situations (e.g., converting covariance to correlation)?



















$endgroup$








  • 3




    $begingroup$
    Have you heard the term "moment"? The variance is the second moment about the mean. See HERE
    $endgroup$
    – Mark Viola
    6 hours ago


































statistics






asked 6 hours ago









blue_note

2417
























2 Answers


















15












$begingroup$

There's a very nice geometric interpretation.



Random variables of finite mean form a vector space. Covariance is a useful inner product on that space. Oh, wait, that's not quite right: constant variables are orthogonal to themselves in this product, so it's only positive semi-definite. So, let me be more precise: on the quotient space formed by the equivalence relation "differs by a constant", covariance is a true inner product. (If quotient spaces are an unfamiliar concept, just focus on the vector space of zero-mean variables; it gets you the same outcome in this context.)



Right, let's carry on. In the norm this inner product induces, standard deviation is a variable's length, while the correlation coefficient between two variables (their covariance divided by the product of their standard deviations) is the cosine of the "angle" between them. That the correlation coefficient is in $[-1,\,1]$ is then a restatement of the vector space's Cauchy-Schwarz inequality.
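None of the code below is part of the answer; it is just a small self-contained check of the finite-sample version of this picture (all names are my own). Subtracting the mean turns covariance into an ordinary dot product, so the correlation coefficient comes out as exactly the cosine of the angle between the centered vectors:

```python
# Sketch: correlation as the cosine of the angle between centered samples.
import math

def centered(xs):
    m = sum(xs) / len(xs)
    return [x - m for x in xs]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def correlation(xs, ys):
    u, v = centered(xs), centered(ys)
    # cov(X, Y) / (sd(X) * sd(Y))  ==  u.v / (|u| |v|)  ==  cos(angle)
    return dot(u, v) / math.sqrt(dot(u, u) * dot(v, v))

# A sample lying exactly on the line y = 2x gives cosine +1 (angle zero).
r = correlation([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```

Cauchy-Schwarz guarantees the returned ratio is always in $[-1,\,1]$, which is the boundedness of the correlation coefficient mentioned above.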






answered 6 hours ago

J.G.









$endgroup$









  • 3




    $begingroup$
    Interesting approach. Is it a personal interpretation or a standard one? If it's standard, are there any resources you can provide? I haven't seen it in any book...
    $endgroup$
    – blue_note
    6 hours ago










  • $begingroup$
    @blue_note You're most likely to encounter it in a discussion of regression, since regressing $Y$ against $X$ writes $Y$ as a multiple of $X$, plus a variable orthogonal to $X$ in this sense. In fact, the coefficients involved in such an expression square to the proportion of variance explained. This has a well-understood connection to probability in quantum mechanics. But really, any source that explains why there's a $^2$ in $R^2$ will at least hint at these ideas.
    $endgroup$
    – J.G.
    6 hours ago





















0












$begingroup$

When defining "standard deviation", we want some way to take a bunch of deviations from a mean and quantify how big they typically are using a single number in the same units as the deviations themselves. But any definition of "standard deviation" induces a corresponding definition of "mean" because we want our choice of "mean" to always minimize the value of our "standard deviation" (intuitively, we want to define "mean" to be the "middlemost" point as measured by "standard deviation"). Only by defining "standard deviation" in the usual way do we recover the arithmetic mean while still having a measure in the right units. (Without getting into details, the key point is that the quadratic becomes linear when we take the derivative to find its critical point.)



If we want to use some other mean, we can of course find a different "standard deviation" that will match that mean (the process is somewhat analogous to integration), but in practice it's just easier to transform the data so that the arithmetic mean is appropriate.
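The minimization claim above is easy to verify numerically. The sketch below (not part of the answer; data and names are illustrative) scans candidate "centers" on a grid: the root-mean-square deviation bottoms out at the arithmetic mean, while the mean absolute deviation bottoms out at the median.

```python
# Check which center minimizes each notion of "standard deviation".
data = [1.0, 2.0, 2.0, 3.0, 10.0]   # mean 3.6, median 2.0

def rms_dev(c):
    return (sum((x - c) ** 2 for x in data) / len(data)) ** 0.5

def mean_abs_dev(c):
    return sum(abs(x - c) for x in data) / len(data)

grid = [i / 100 for i in range(0, 1201)]   # candidate centers in [0, 12]
best_rms = min(grid, key=rms_dev)          # lands at the mean, 3.6
best_abs = min(grid, key=mean_abs_dev)     # lands at the median, 2.0
```

This is the pairing the answer describes: each deviation measure induces its own "mean" as the point it is smallest around.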






answered 1 hour ago

Qwerty








New contributor




Qwerty is a new contributor to this site.






$endgroup$













  • $begingroup$
    If all you want is minimization at the mean and the right units, why not sum/integrate the magnitude of the deviations?
    $endgroup$
    – mephistolotl
    53 mins ago

















































