We now turn to the use of the cosmic microwave background as a cosmological tool. This turns out to be the most powerful tool in our arsenal today, and it is what really led us into the era of precision cosmology. The basic idea is as follows: if we had something sufficiently large in the early universe that could be measured, and we knew what redshift it was at, then we could apply the angular diameter test to it and constrain cosmological parameters from that. The largest possible thing we could think of would be the particle horizon at the time, which is the distance out to which causal connection can be established when the universe was that old. If we can infer the size of the particle horizon from physical reasoning, and if we can somehow find its signature in the microwave background, then we can perform the test.

There will be density fluctuations in the early universe, left over from quantum fluctuations at even earlier times in the history of the universe, and they would manifest themselves as slight variations in density. You can decompose those into a series of overlapping waves. Matter would fall toward the densest parts, and because matter and radiation are tightly coupled before the microwave background is released, the radiation would follow, inducing slight Doppler shifts in its thermal emission. So when the microwave background is released, the coupling stops, and whatever pattern of fluctuations was there remains frozen and observable today. Theory predicts that those fluctuations have amplitudes of a few parts per million of the present-day temperature, which is 2.7 Kelvin, so this is a very, very subtle effect to measure.

For a long time cosmologists tried to measure these fluctuations, and finally they succeeded. There were many experiments leading toward this, but the first really successful one was a balloon-borne microwave background measurement from Antarctica called BOOMERanG. It was led by Andrew Lange and Paolo de Bernardis, and they were the first to have a convincing measurement of these fluctuations at the scales that are relevant here, and the first to actually infer cosmological parameters from them. This was probably the single most important measurement at the time. It came at about the same time as the supernovae yielded the evidence for dark energy, and together they really opened this era of precision cosmology. Subsequently the WMAP satellite, the Wilkinson Microwave Anisotropy Probe, was launched. Its purpose was to make this measurement even more precisely, and today the best results we have come from the analysis of the WMAP data.

So here is how this works. If you were to take an image of the cosmic microwave background, it would look pretty uniform, so you have to turn up the contrast, say by a factor of a thousand. Then you see that there is a dipole, due to the motion of the Milky Way relative to the microwave background. If we then subtract the dipole and turn the contrast knob even higher, to a factor of a million, we see fluctuations all over the sky, a lot of them associated with the Milky Way galaxy: the foreground emission from synchrotron radiation, dust, and so on. After careful modeling, which is very difficult and very delicate, the emission from the Galaxy has to be removed, and what is left is a pattern of density fluctuations in the early universe that has a cosmological origin. That is the signal we are looking for.
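To make the idea of the test concrete, here is a minimal numerical sketch (not from the lecture) of the angle such a standard scale would subtend on the sky today, using the astropy library. The acoustic scale of roughly 147 comoving megaparsecs, the decoupling redshift of about 1100, and the cosmological parameters are all assumed fiducial values, not anything derived here:

    # Sketch: angular size of a known physical scale at decoupling,
    # using astropy's standard flat Lambda-CDM cosmology.
    # The acoustic scale (~147 comoving Mpc) and decoupling redshift
    # (~1100) are assumed fiducial numbers, not measurements.
    from astropy.cosmology import FlatLambdaCDM
    import astropy.units as u

    cosmo = FlatLambdaCDM(H0=70, Om0=0.3)   # assumed parameters
    z_dec = 1100.0                          # redshift of decoupling
    r_comoving = 147.0 * u.Mpc              # comoving acoustic scale
    r_physical = r_comoving / (1 + z_dec)   # physical size at decoupling

    d_A = cosmo.angular_diameter_distance(z_dec)
    theta = (r_physical / d_A) * u.rad      # small-angle approximation
    print(theta.to(u.deg))                  # of order a degree on the sky

With these assumed inputs the answer comes out a bit over half a degree, i.e. of the order of a degree, which is the scale the measurements discussed below are after.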
So here is an image from the BOOMERanG measurement; it is a false-color representation of the temperature fluctuations in the sky. You can see that the blobs have a characteristic size, which turns out to be about 1 degree. That is actually the size of the particle horizon at the time of decoupling.

So here, again, is how this works. In the early universe there will be density fluctuations, residual from quantum processes. Matter will fall toward the densest parts; the radiation will follow and acquire slight Doppler shifts, which is why these features are later called Doppler peaks. And when decoupling comes, the pattern is frozen. Now recall Fourier decomposition of a density field. In any number of dimensions, whether it is a one-dimensional signal like an ordinary acoustic wave or a three-dimensional density field, the density can be decomposed into a set of overlapping waves. The largest of those is the largest wave that can be accommodated, which is the size of the particle horizon. The pattern then stays imprinted on the microwave background radiation after it has been decoupled from the matter, and by measuring it we can infer the size of the horizon and the expansion rate at the time the radiation was released. It depends, in fact, on all manner of cosmological parameters, which can be computed nicely from theory.

The way we quantify this is through spherical harmonics, which are essentially the equivalent of Fourier decomposition on a sphere. Schematically, it works just like Fourier analysis: fluctuations of a certain size are represented as a peak in the power spectrum corresponding to that spatial wavelength. The big ones have a very low spatial frequency; the small ones have a very high spatial frequency. Here is a simple mathematical simulation of this from Ned Wright, putting different numbers of waves on a sphere, corresponding to different values of the wave number, which is called ℓ. Mathematically this is, again, equivalent to Fourier decomposition, but on a sphere; instead of Fourier components we are talking about spherical harmonics. Any signal on the surface of a sphere can be expressed as a sum of spherical harmonics, weighted properly component by component, and formulas for each one of them exist and can be computed. So what is measured, really, is a combination of many different waves, and the different wavelengths on the sphere have different weights; their distribution is the power spectrum. Again, this is completely equivalent to power spectra in Fourier analysis, except that now the waves are on a sphere, not on a line or in a plane.

From the measurements we can infer the power spectrum, and from that we can infer something about the initial density fluctuations. The characteristic angular size that corresponds to the particle horizon at that time is given approximately by θ ≈ 180° / ℓ, where ℓ is the spherical harmonic index; if we can find out what that is, then we can say something about the size and expansion rate of the universe at that time. So here are the WMAP results. These are interim results; they got slightly better later as more data were taken and reduced. There is a very prominent and obvious peak at about ℓ of 200, or an angular scale of a little less than one degree, and that corresponds to the size of the horizon. The other bumps you see are harmonics of the base frequency.
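As a small illustration of the spherical-harmonic machinery just described, here is a hedged Python sketch: it evaluates a single harmonic Y_ℓm on a grid (the particular ℓ and m are arbitrary choices, not anything measured) and prints the rough angular scale θ ≈ 180°/ℓ that such a mode corresponds to:

    # Sketch of the spherical-harmonic analogue of Fourier analysis.
    # A single Y_lm evaluated on a grid shows that the multipole l sets
    # the angular scale of the pattern, roughly theta ~ 180 deg / l.
    import numpy as np
    from scipy.special import sph_harm

    l, m = 200, 100                        # example multipole (assumed)
    phi = np.linspace(0, 2 * np.pi, 400)   # azimuthal angle
    theta = np.linspace(0, np.pi, 200)     # polar angle
    PHI, THETA = np.meshgrid(phi, theta)

    # scipy convention: sph_harm(m, l, azimuthal, polar)
    Y = sph_harm(m, l, PHI, THETA).real    # one mode of the decomposition

    print("characteristic angular scale ~", 180.0 / l, "degrees")

A full analysis would sum many such modes with measured weights and plot the distribution of those weights against ℓ; that distribution is exactly the power spectrum in which the peak at ℓ ≈ 200 appears.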
So different cosmological models, with different combinations of parameters, make a prediction for a curve that should go through these data points, and shown here is a particularly well-fitting model, the model that is now pretty much generally accepted. The exact positions, amplitudes, relative amplitudes, and widths of these peaks depend, again, on complicated mixtures of cosmological parameters: total matter density, dark energy density, baryonic density, expansion rate, and so on. In principle they could be used to constrain all of them. However, this is a very complex process, and the analysis involves creating large ensembles of model universes, finding out which ones fit best, and determining the likelihood distributions of each of the parameters.

So the first question was: what is the basic geometry of the universe? Is it open, closed, or just critical? In fact, it turned out to be flat within the errors of that initial measurement, so that Ω_total is within 1 sigma of unity. Even the original measurement implied that the universe is flat within measurement errors, which are of the order of a couple of percent; since then, this has become even more precise. That in itself is a very important result: the universe is spatially flat. But remember, that can be achieved with many different combinations of the density of matter and the density of dark energy, or the cosmological constant. So by itself this measurement does not tell you much about dark energy, but combined with others it can be used as a very powerful constraint. For example, if we can deduce from dynamical measurements that Ω of matter is about 0.3, then this immediately implies that Ω of dark energy is about 0.7. Or we can combine it with another measurement, like that of the supernovae, which will do the same thing. It is important to note that since so many parameters are mixed together in producing these patterns of density fluctuations, there is some degeneracy, which essentially means that a lot of them are coupled together. In this particular case the error ellipses are highly elongated along a line that is almost parallel to the flat-universe line in the familiar diagram of Ω_matter versus Ω_vacuum, which is why we think the universe is very close to flat. That degeneracy can be broken by a different measurement whose error ellipses are not oriented in the same way, and you may recall that those from the supernova measurements looked pretty much orthogonal to this.

Another important measurement that comes out of this is how many baryons there are in the universe. The more matter there is, the stronger the fluctuations, and therefore the higher the amplitudes of the peaks. So here are examples of two models with different amounts of baryonic matter, and by fitting to the actual data we can infer Ω_baryons. The result, shown here, is expressed in units of h, the Hubble constant in units of 100 kilometers per second per megaparsec. Since we know h is roughly 0.7, this really means that Ω_baryons is about 4.5%. And this is in remarkably good agreement with the measurement from cosmic nucleosynthesis, which we will come to later, and which is completely independent: it assumes different physics, different measurements, different everything. That is why we believe this result is probably correct; when you have different methods leading to the same result, that gives it much more credence.
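Here is a minimal sketch of the parameter bookkeeping just described; the value of Ω_b h² below is an assumed fiducial number of the right order of magnitude, not the exact WMAP fit:

    # Sketch: simple parameter bookkeeping implied by the text above.
    # Flatness: Omega_total = Omega_matter + Omega_lambda = 1, so one
    # measurement fixes the other. Baryons: the CMB constrains the
    # combination Omega_b * h**2; dividing by h**2 gives Omega_b.
    omega_total = 1.0                 # spatially flat, from the CMB
    omega_matter = 0.3                # from dynamical measurements
    omega_lambda = omega_total - omega_matter
    print(omega_lambda)               # -> 0.7

    h = 0.7                           # H0 / (100 km/s/Mpc)
    omega_b_h2 = 0.022                # assumed fiducial CMB value
    omega_b = omega_b_h2 / h**2
    print(round(omega_b, 3))          # -> ~0.045, i.e. about 4.5%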
So the question then arises: all right, if there were these characteristic fluctuations at the time of decoupling, will they be observable later on? And the answer is yes. There will be a corresponding imprint on the very large scale structure in the universe, which could be observable, in principle, in the clustering of galaxies at comparable scales, roughly a hundred megaparsecs for an h of one, or about two hundred megaparsecs for realistic values of the Hubble constant. Hints of this were seen first in a redshift survey from Australia, the 2dF redshift survey, which we will address later, and then confirmed with the Sloan Digital Sky Survey: there is a slight excess of power in the clustering of galaxies on the scales that correspond to that first Doppler peak, except much later in the history of the universe. So the same standard ruler is now observed at different redshifts, and that makes the test far more powerful. Essentially, what that means is that the error ellipses now rotate, and from multiple measurements you can deduce a lot about the geometry, even without resorting to the other measurements like supernovae; the sketch below illustrates the idea. There are now many efforts aimed at doing precisely this: observing the slight excess of clustering corresponding to the first Doppler peak in the microwave background over a range of redshifts, which can constrain cosmological parameters even better.

Here is a table of some of the cosmological parameters, many of which we have not introduced yet, having to do with structure formation, just as an illustration of the precision that was obtained from WMAP. It actually got even better: this was after 3 years of data, and we now have the final results, after 9 years of data, which are only very slightly different and more precise. Even better data are coming. ESA has a satellite called Planck, which is essentially like WMAP with even higher precision, and we expect to see results from it within a year or thereabouts.
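To see why a standard ruler observed at several redshifts is so constraining, here is a hedged sketch, again using astropy, of the angle that an assumed ~150 Mpc comoving acoustic scale would subtend at a few redshifts; the cosmological parameters and the list of redshifts are illustrative choices, not actual survey values:

    # Sketch: the same comoving "standard ruler" seen at different
    # redshifts subtends different angles, and how that angle changes
    # with redshift depends on the geometry and expansion history.
    # Assumed: flat Lambda-CDM and a ~150 Mpc comoving ruler.
    import astropy.units as u
    from astropy.cosmology import FlatLambdaCDM

    cosmo = FlatLambdaCDM(H0=70, Om0=0.3)
    ruler = 150.0 * u.Mpc                  # comoving acoustic scale (assumed)

    for z in [0.35, 0.57, 1.0, 2.0]:
        d_M = cosmo.comoving_transverse_distance(z)
        theta = (ruler / d_M) * u.rad      # small-angle approximation
        print(f"z = {z}: ruler subtends ~{theta.to(u.deg):.2f}")

Comparing the measured angles against such predictions over a range of redshifts is what rotates the error ellipses and tightens the constraints. Next time, we will address source counts as a cosmological test.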