Okay, so we already saw in a previous video that, now that we talk about observables and the real outcomes of each measurement, we can also talk about the expected value of a measurement. Of course, a measurement outcome is probabilistic, and so we can talk about the statistics of what we expect to see. So in this video we are going to look at this question a little more precisely and closely.

Let's say that we have an observable M, so M is a Hermitian matrix, M = M†. It's a k-by-k Hermitian matrix, and let's say that we have a quantum state of our system, which we now write in the eigenbasis of M: |ψ⟩ = Σ_j α_j |φ_j⟩. Now if you measure this observable you get a random outcome, and let the random variable X denote the outcome of this measurement. Okay. So what we want to do is understand: what's the distribution of X?

Well, of course, this is fairly easy to do, because we know what the outcomes are going to be: they are supported on the eigenvalues of M. So if the eigenvalues are λ_0, λ_1, ..., λ_{k-1}, then the probability that X equals λ_j is just |α_j|². Why? Because our state ψ, the state of our quantum system, we have decomposed in terms of the eigenbasis of M, and the amplitude on |φ_j⟩ is exactly α_j. So we get a probability distribution of this kind on our k different outcomes λ_0 through λ_{k-1}; every other outcome has probability zero.

Now, ideally we would want to know the entire probability distribution, but usually we collect some statistics of this distribution, and that gives us a rough idea of what the distribution looks like. The two most useful first-order statistics are the mean of the distribution, the expected value of X, which we might denote by μ, and the variance of X, σ², where σ is the standard deviation. The variance of X is the expected square of the deviation from the mean; it's E[(X − μ)²].
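To make these definitions concrete, here is a minimal numerical sketch, assuming a hypothetical three-outcome observable whose eigenvalues and amplitudes are chosen purely for illustration; it computes the outcome probabilities |α_j|², the mean μ, and the variance σ² directly from the formulas above.

```python
import numpy as np

# Hypothetical example: eigenvalues of M and the amplitudes of |psi>
# written in the eigenbasis of M (chosen so the probabilities sum to 1).
lam = np.array([0.0, 1.0, 3.0])               # lambda_0, lambda_1, lambda_2
alpha = np.array([0.5, 0.5, 1 / np.sqrt(2)])  # alpha_0, alpha_1, alpha_2

probs = np.abs(alpha) ** 2             # Pr[X = lambda_j] = |alpha_j|^2
mu = np.sum(lam * probs)               # mean: E[X]
var = np.sum((lam - mu) ** 2 * probs)  # variance: E[(X - mu)^2]

print(probs)    # [0.25 0.25 0.5 ]
print(mu, var)  # approximately 1.75 and 1.69
```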
I'm sure you all know about these quantities, but let's just make sure that you have a feel for what they are. Suppose we are given some probability distribution, and imagine that you make a wooden cutout of this distribution, a wooden shape that looks just like it. So then what's the meaning of the mean of the distribution? Well, it's the point at which you can balance this shape. If you think of it as a see-saw, the mean is exactly the point at which the weight on the right and the weight on the left exactly balance.

And what about the standard deviation, the square root of the variance? How do you interpret that? There's also a very nice physical interpretation of that: suppose you were to rotate this shape about the mean, about a vertical axis through the mean, and you try to feel what the moment of inertia is, how difficult it is to rotate the shape. Well, the standard deviation is exactly this: if you were to replace this complicated shape by a simple one, where you put half the total weight of the shape at distance σ to the left of the mean and half at distance σ to the right, at −σ and +σ, then this simple shape would have exactly the same resistance to rotation. The reason is that the moment of inertia comes from the expected square of the distance from the mean, from the center of gravity, and this simple shape achieves the same average squared distance from the center.

Okay, so now let's approach this from a more quantitative point of view. We have an observable M and a state |ψ⟩, and we want to understand what the mean value of the outcome of this measurement is. I claim that it's given by ⟨ψ|M|ψ⟩. So let's try to prove this. Say M has eigenvectors |φ_j⟩ with eigenvalues λ_j, and say, as before, that we wrote |ψ⟩ = Σ_j α_j |φ_j⟩; so we wrote the quantum state out in the eigenbasis of M, and the α_j are the amplitudes.

So now, what's the expected value of X? Well, μ, which is the expected value of X, is what? We get the outcome λ_j if we happen to project onto the j-th eigenvector, which happens with probability |α_j|². And so the expected value is the sum over all j of this: μ = Σ_j λ_j |α_j|².

Okay, so now what we want to show is that this is exactly the same as the quantity ⟨ψ|M|ψ⟩. So let's write out that quantity. What does |ψ⟩ look like? Well, it depends on which basis we write it out in, so let's write |ψ⟩ in the basis of eigenvectors of M. In that basis it's just the column vector (α_0, α_1, ..., α_{k-1}). What does M look like in the same basis? If you write M out in its own eigenbasis, it's a diagonal matrix with λ_0, λ_1, ..., λ_{k-1} on the diagonal and zeros everywhere else. And what's ⟨ψ|? It's the row vector corresponding to |ψ⟩, with the entries complex conjugated: (α_0*, α_1*, ..., α_{k-1}*).

So you multiply this out. What do you get? Well, you follow the rules of matrix multiplication and, writing it in the proper order, you get Σ_j α_j* λ_j α_j. These are all scalars, the α_j are complex numbers and the λ_j are real, so they commute with each other, and you can just write this as Σ_j λ_j α_j* α_j. Now, what's α_j* α_j? It's just |α_j|², the square of the magnitude of α_j. So the two expressions are equal: ⟨ψ|M|ψ⟩ = Σ_j λ_j |α_j|² = μ.
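Before moving on to the variance, here is a small sketch, again with hypothetical values, checking numerically that the bilinear form ⟨ψ|M|ψ⟩ (with M written in its own eigenbasis, so it is diagonal) agrees with Σ_j λ_j |α_j|².

```python
import numpy as np

# Same hypothetical eigenvalues and amplitudes as in the earlier sketch.
lam = np.array([0.0, 1.0, 3.0])
alpha = np.array([0.5, 0.5, 1 / np.sqrt(2)], dtype=complex)

M = np.diag(lam)  # M in its own eigenbasis is diagonal
psi = alpha       # |psi> as a column of amplitudes in that basis

mean_bilinear = np.real(psi.conj() @ M @ psi)   # <psi|M|psi>
mean_direct = np.sum(lam * np.abs(alpha) ** 2)  # sum_j lambda_j |alpha_j|^2

print(np.isclose(mean_bilinear, mean_direct))   # True
```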
Okay, what about the variance of X? Well, I'm sure you know that the variance of X, σ², can be written as the expected value of X² minus the square of the expected value of X, which is just E[X²] − μ². And what are we claiming about this? We're claiming it's the bilinear form ⟨ψ|M²|ψ⟩ − ⟨ψ|M|ψ⟩². Well, you can see where the second piece comes from: it's just μ². So what about the first piece?

What's the expected value of X²? Well, what is X² going to be equal to? If we get the j-th outcome, then the square of the outcome is just λ_j². And with what probability does that happen? Well, it's still the same probability: the probability that we project onto |φ_j⟩, which is |α_j|². So, summing over all j, E[X²] = Σ_j λ_j² |α_j|².

Now, what's this bilinear form ⟨ψ|M²|ψ⟩? Again, it's the row vector (α_0*, α_1*, ..., α_{k-1}*) times M² times the column vector of amplitudes. And what does M² look like, written out in its eigenbasis? Well, you can square this diagonal matrix, and it's again a diagonal matrix, with λ_0², λ_1², ..., λ_{k-1}² on the diagonal: when you square a matrix, its eigenvectors remain unchanged and its eigenvalues get squared. And so, of course, you can now see that ⟨ψ|M²|ψ⟩ = Σ_j λ_j² |α_j|², which is exactly E[X²], and the claim follows.
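Finally, here is the corresponding sketch for the variance, with the same hypothetical values, checking that ⟨ψ|M²|ψ⟩ − ⟨ψ|M|ψ⟩² agrees with E[X²] − μ².

```python
import numpy as np

# Same hypothetical eigenvalues and amplitudes as in the earlier sketches.
lam = np.array([0.0, 1.0, 3.0])
alpha = np.array([0.5, 0.5, 1 / np.sqrt(2)], dtype=complex)

M = np.diag(lam)  # M in its eigenbasis
psi = alpha
probs = np.abs(alpha) ** 2

mu = np.real(psi.conj() @ M @ psi)                              # <psi|M|psi>
var_bilinear = np.real(psi.conj() @ (M @ M) @ psi) - mu**2      # <psi|M^2|psi> - mu^2
var_direct = np.sum(lam**2 * probs) - np.sum(lam * probs)**2    # E[X^2] - mu^2

print(np.isclose(var_bilinear, var_direct))   # True
```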