Okay, so we already saw in a previous video that, now that we talk about observables and their real outcomes for each measurement, we could also talk about the expected value of a measurement. Of course, a measurement outcome is probabilistic, and so we can talk about the statistics of what we expect to see. So in this video we are going to look at this question a little more precisely and closely. So let's say that we have an observable M, so M is a Hermitian matrix, M = M-dagger; it's a k by k Hermitian matrix. And let's say that we have a quantum state of our system, which we now write in the eigenbasis of M. So we write it as psi equals the sum over j of alpha sub j times phi sub j. So now, if you measure this observable, you get a random outcome, and let's let the random variable X denote the outcome of this measurement. Okay. So what we want to do is, we want to understand: what's the distribution of X? Well, of course, this is sort of easy to do, because we know what the outcomes are going to be. They are supported on the eigenvalues of M. So if the eigenvalues are lambda naught, lambda one through lambda k minus one, then we know the probability that X equals lambda sub j is just what? Well, the j-th outcome will occur with probability alpha sub j magnitude squared.
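As a quick sketch of the setup just described, here is how you could compute that outcome distribution numerically. The matrix M and state psi below are my own example values, not from the lecture; the point is only the recipe: diagonalize M, take the amplitudes alpha sub j in the eigenbasis, and square their magnitudes.

```python
import numpy as np

# Example observable and state (illustrative values, chosen here arbitrarily).
M = np.array([[1.0, 1.0],
              [1.0, -1.0]])          # Hermitian: M equals its conjugate transpose
psi = np.array([3.0, 4.0]) / 5.0     # normalized quantum state

# Diagonalize M: columns of V are the eigenvectors phi_j, eigvals are lambda_j.
eigvals, V = np.linalg.eigh(M)

# Amplitudes in the eigenbasis: alpha_j = <phi_j | psi>.
alpha = V.conj().T @ psi

# Outcome distribution: P(X = lambda_j) = |alpha_j|^2.
probs = np.abs(alpha) ** 2
print(eigvals)            # the possible measurement outcomes
print(probs, probs.sum()) # their probabilities, which sum to 1
```

Because psi is normalized, the squared magnitudes of the alphas always sum to one, so this really is a probability distribution over the eigenvalues.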
Because our state psi, which is the state of our quantum system, we have decomposed in terms of the eigenbasis of M, and the amplitude of phi sub j is exactly alpha sub j. So we get a probability distribution of this kind on our k different outcomes, lambda naught through lambda k minus one; every other outcome has probability zero. So now, ideally we would want to know the entire probability distribution, but usually we collect some statistics about this distribution, and that gives us a rough idea of what the distribution looks like. And so the two most useful, the first-order statistics, are the mean of the distribution, the expected value of X, which we might denote by mu, and the variance of X, sigma squared, where sigma is the standard deviation. The variance of X is the expected square of the deviation from the mean: it's the expected value of X minus the mean, the whole thing squared. Okay, so I'm sure you all know about these quantities, but let's just try to make sure that you have a feel for what these two quantities are. So we are given some probability distribution, and you might imagine that you actually make a wooden cutout of this distribution.
So you make a wooden shape which looks like this. Okay, so then what's the mean of the distribution? Well, it's the point at which you can balance this distribution. Right, if you think of this as a see-saw, then this is exactly the balance point: if you hold it at this point, the weight on the right and the weight on the left exactly balance. And what about the standard deviation, the square root of the variance? How do you interpret that? Well, there's also a very nice physical interpretation of that, which is this: if you were to rotate this shape about this mean position, so you rotate it about this vertical axis, and then you try to feel how much the moment of inertia is, how difficult is it to rotate this shape? Right? Well, then the standard deviation is exactly this: if you were to replace this complicated shape by a simple shape, where you put half the total weight of this shape on the left here and half on the right here, at distance exactly sigma on each side, at plus sigma and minus sigma from the mean, then this would be a shape which has exactly the same resistance to rotation.
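The moment-of-inertia picture can be checked with a small computation. The discrete distribution below is an arbitrary example of mine; it shows that the variance of the distribution equals the moment of inertia (about the mean, for unit total mass) of the two-point-mass replacement with half the mass at plus sigma and half at minus sigma.

```python
import numpy as np

# A toy discrete distribution: values x_i with probabilities p_i (arbitrary example).
x = np.array([0.0, 1.0, 2.0, 5.0])
p = np.array([0.1, 0.4, 0.3, 0.2])

mu = np.sum(p * x)                    # the balance point of the see-saw
var = np.sum(p * (x - mu) ** 2)       # moment of inertia about mu, for unit mass
sigma = np.sqrt(var)

# Replace the shape by two point masses of 1/2 each, at mu - sigma and mu + sigma.
# Moment of inertia about mu: 0.5 * sigma^2 + 0.5 * sigma^2 = sigma^2 = var.
two_mass_inertia = 0.5 * sigma**2 + 0.5 * sigma**2
print(var, two_mass_inertia)          # the same resistance to rotation
```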
Right, and the reason is that the moment of inertia comes from the expected square of the distance from the mean, from the center of gravity, and in this case we are achieving the same square distance on average from the center. Okay, so now let's try to approach this from a more quantitative point of view. So we have an observable M, a state psi, and we want to understand what the mean value of the outcome of this measurement is. And I claim that it's given by bra psi, M, ket psi. Okay, so let's try to prove this. So let's say that M has eigenvectors phi sub j with eigenvalue lambda sub j. And let's say, as before, that we wrote psi as the sum over j of alpha sub j times phi sub j. So we wrote the quantum state out in the eigenbasis of M, and these alpha sub j were the amplitudes. So now, what's the expected value of X? So mu, which is the expected value of X, is what? Well, we get outcome lambda sub j with what probability? Well, the outcome is lambda sub j if we happen to project onto the j-th eigenvector, which happens with probability alpha sub j magnitude squared. And so the expected value is the sum over all j of lambda sub j times alpha sub j magnitude squared. Okay, so now what we want to show is that this is exactly the same as that quantity. So let's write out that quantity. Okay, so what's ket psi?
Well, it depends upon which basis we write it out in. So let's write out ket psi in the basis of eigenvectors of M. So, what does ket psi look like? Well, it looks like exactly this: the column vector alpha naught, alpha one, through alpha k minus one. What does M look like in the same basis? If you write out M in its own eigenbasis, well, then it's a diagonal matrix, with lambda naught, lambda one, through lambda k minus one on the diagonal and zeros everywhere else. And what's bra psi? Well, bra psi is the row vector corresponding to this: alpha naught, alpha one, through alpha k minus one, complex conjugated. Okay, so you multiply this out. What do you get? Well, you follow your rules of multiplication, and, let's write it out in the proper order, you get the sum over j of alpha sub j conjugate, times lambda sub j, times alpha sub j. The alphas are complex numbers, and lambda sub j is real, so they all commute with each other, and you can just write it as the sum over j of alpha sub j conjugate, times alpha sub j, times lambda sub j. Now, what's alpha sub j conjugate times alpha sub j? It's just the square of the magnitude of alpha sub j. Okay, so this is exactly equal to mu. Okay, what about the variance of X?
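The computation just carried out can be mirrored numerically: the mean computed from the outcome probabilities matches the bilinear form bra psi, M, ket psi. The observable and state below are example values I chose for illustration.

```python
import numpy as np

# Example Hermitian observable and normalized state (illustrative values).
M = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
psi = np.array([1.0, 1.0j]) / np.sqrt(2)

# Eigendecomposition: lambda_j and the eigenvectors phi_j (columns of V).
eigvals, V = np.linalg.eigh(M)
alpha = V.conj().T @ psi              # amplitudes alpha_j = <phi_j | psi>

# Mean two ways: from the distribution, and as <psi| M |psi>.
mu_from_probs = np.sum(eigvals * np.abs(alpha) ** 2)
mu_from_form = (psi.conj() @ M @ psi).real
print(mu_from_probs, mu_from_form)    # the two values agree
```

The imaginary part of psi-dagger M psi vanishes for Hermitian M, which is why taking `.real` is safe here.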
So, sigma squared: well, I'm sure you know that the variance of X can be written as the expected value of X squared, minus the expected value of X, the whole thing squared, which is just the expected value of X squared minus mu squared. Okay. So what are we claiming about this? We're claiming it's this bilinear form: it's bra psi, M squared, ket psi, minus bra psi, M, ket psi, the whole thing squared. Well, you can see where the second piece comes from: it's just mu squared. So what about the first piece? Well, okay, so what's the expected value of X squared? What's X squared going to be equal to? Well, if we get the j-th outcome, then the square of the outcome is just lambda sub j squared. And with what probability does that happen? Well, it's still the same probability: it's the probability that we project onto phi sub j, which is alpha sub j magnitude squared. Okay, so the expected value of X squared is the sum over all j of lambda sub j squared times alpha sub j magnitude squared. Now, what's this bilinear form? Well, this bilinear form is again bra psi, which is alpha naught, alpha one, through alpha k minus one, complex conjugated, times, what's M squared written out in its eigenbasis? Well, you can square this matrix; what will it look like? It's a diagonal matrix: it looks like lambda naught squared, lambda one squared, through lambda k minus one squared on the diagonal. So when you square a matrix, its eigenvectors remain unchanged, and its eigenvalues get squared.
And so, you know, of course now you can see that this bilinear form is clearly equal to that expected value of X squared, which proves the claim.
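The variance identity can be checked the same way as the mean: compute it once from the outcome distribution and once as bra psi, M squared, ket psi minus the square of bra psi, M, ket psi. Again, the particular M and psi below are my own example values.

```python
import numpy as np

# Example Hermitian observable and normalized state (illustrative values).
M = np.array([[0.0, 1.0],
              [1.0, 0.0]])
psi = np.array([np.cos(0.3), np.sin(0.3)])

eigvals, V = np.linalg.eigh(M)
alpha = V.conj().T @ psi
probs = np.abs(alpha) ** 2            # P(X = lambda_j) = |alpha_j|^2

# Variance from the distribution: E[X^2] - (E[X])^2.
mu = np.sum(eigvals * probs)
var_from_probs = np.sum(eigvals**2 * probs) - mu**2

# Variance from the bilinear forms: <psi|M^2|psi> - <psi|M|psi>^2.
mu_form = (psi.conj() @ M @ psi).real
var_from_form = (psi.conj() @ (M @ M) @ psi).real - mu_form**2
print(var_from_probs, var_from_form)  # the two values agree
```

Note that M squared is diagonal in the same eigenbasis with eigenvalues lambda sub j squared, which is exactly the fact used at the end of the derivation above.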