Radius of convergence. Let's think about a not-too-mysterious function: f of x equals 1 over 1 plus x squared. I can write down a power series for this function around zero. I'll begin with the power series for 1 over 1 minus x. That's the sum, n goes from 0 to infinity, of x to the n. Now I'll replace x by negative x, and I'll find that 1 over 1 plus x is the sum, n goes from 0 to infinity, of negative x to the n, which I could write as the sum, n goes from 0 to infinity, of negative 1 to the n times x to the n. And to get from 1 over 1 plus x to 1 over 1 plus x squared, I'll just replace x by x squared. So 1 over 1 plus x squared is the sum, n goes from 0 to infinity, of negative 1 to the n times x squared to the n, which I could also write as the sum, n goes from 0 to infinity, of negative 1 to the n times x to the 2n power.

What's the radius of convergence of that power series? I was being sloppy when I wrote this out. I wrote 1 over 1 minus x equals this power series, but that's not true unless I include this statement: the absolute value of x is less than 1. If that's true, then these are equal. The radius of convergence of this series for 1 over 1 plus x squared is also 1, and we can see that in the graph. Here I've graphed the function 1 over 1 plus x squared; take a look at its Taylor Series expansion around the point zero. Here is just the first term, which is a constant term. Here's through the quadratic term. Here is through the x to the fourth term, through the x to the sixth term, and so on. You can see that it's doing an increasingly good job of approximating the function between minus 1 and 1, but over here it's actually doing an increasingly bad job.

1 over 1 plus x squared looks like a really nice function. So if it's such a nice function, why isn't the Taylor Series centered around zero doing a better job of approximating the function far away? Well, here I've graphed y equals sine of x. And honestly, qualitatively at least, this isn't so different looking from the graph of 1 over 1 plus x squared. The graph of 1 over 1 plus x squared had kind of a big bump in the middle, and this thing's just got lots of bumps. And yet the Taylor Series is a totally different experience. Let's start writing down the Taylor Series around the origin. So here is the point around which I'm expanding. Here's just the constant term in red. Here's through the linear term. Here's through the cubic term. And I'm going to keep on going. As I add more and more terms to my Taylor Series expansion, I'm getting increasingly good approximations to sine of x everywhere. Sine of x is equal to its Taylor Series centered around zero for all values of x, whereas the Taylor Series for 1 over 1 plus x squared around zero is only equal to 1 over 1 plus x squared if the absolute value of x is less than 1.

And we know other examples like that. This is the graph of y equals 1 over 1 minus x, and I can start writing down the Taylor Series around this point: here's the constant term, the linear term, take more and more terms, and yeah, the radius of convergence here is 1. But maybe that isn't so surprising. I mean, 1 over 1 minus x has a bad point at 1: I don't want to plug in x equals 1, because then I'd be dividing by 0. This function's not defined at 1. So if I start a Taylor Series around zero, how big do I really expect that radius of convergence to be?
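If you'd like to see this contrast numerically, here's a minimal sketch in Python, assuming SymPy is available. It compares the Taylor polynomials of 1 over 1 plus x squared and sine of x centered at zero: inside the interval of radius 1 both errors shrink as the degree grows, but at x equals 2 the error for 1 over 1 plus x squared blows up while the error for sine of x keeps shrinking.

    import sympy as sp

    x = sp.symbols('x')
    f = 1 / (1 + x**2)
    g = sp.sin(x)

    for degree in (4, 8, 16):
        # Taylor polynomial of each function centered at 0, truncated after the x**degree term.
        pf = sp.series(f, x, 0, degree + 1).removeO()
        pg = sp.series(g, x, 0, degree + 1).removeO()
        for point in (0.5, 2.0):
            err_f = abs(f.subs(x, point) - pf.subs(x, point))
            err_g = abs(g.subs(x, point) - pg.subs(x, point))
            print(f"degree {degree:2d}, x = {point}: "
                  f"error for 1/(1+x^2) = {float(err_f):.3e}, "
                  f"error for sin(x) = {float(err_g):.3e}")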
Well, you know, an interval of radius 1 centered around zero bumps into the problem point. So maybe I don't really expect the radius of convergence at 0 to be more than 1. So maybe 1 over 1 minus x has a problem when x equals 1, but 1 over 1 plus x squared doesn't have any problem at 1. It doesn't have any problem anywhere; this function is defined for all x. So if 1 over 1 plus x squared looks just as nice as sine of x, why is the Taylor Series for this function centered around zero so much worse than the Taylor Series for sine of x centered around zero? The Taylor Series for sine of x converges to sine of x everywhere, and the Taylor Series for 1 over 1 plus x squared is only good in an interval of radius 1.

Well, let's see what happens if we write down a Taylor Series expansion for 1 over 1 plus x squared centered not around zero but around some other point. Let's write down the Taylor Series expansion centered around x equals 1 and see what happens. So here we go: there's the constant term, there's the linear term, and we keep on going. We add more and more terms to the Taylor Series expansion centered around 1, and we're doing an increasingly good job, but again not over the whole real line, right? But at least the radius of convergence is bigger now. Well, how much bigger? The Taylor Series for 1 over 1 plus x squared centered at zero has radius 1. The Taylor Series centered at 1 turns out to have radius the square root of 2.

And we had this idea that maybe places where the function's undefined, these bad points where the function doesn't exist, are somehow to blame for the radius of convergence. That's what happened in the case of 1 over 1 minus x: the radius of convergence around x equals 0 was 1 because this function has a bad point at x equals 1; the function's not defined if I plug in x equals 1. Maybe I'm bumping into a bad point here too.

Well, let's diagram the situation this way. Here's the real line. Around zero I've got, well, it's an interval, but I've drawn it as a circle. It's a circle of radius 1, representing the radius of convergence of my Taylor Series around the point zero for the function 1 over 1 plus x squared. Now around the point 1, I've got a different radius of convergence; it's bigger, it's the square root of 2, so here is a circle of radius the square root of 2, and I'm placing it so that its center is at 1. The idea here is to try to get a sense of whether or not maybe I'm bumping into a bad point. Is there some point on the real line which is distance 1 from 0 and distance the square root of 2 from 1? There isn't such a point on the real line. Okay, but when I draw them as circles, the circles are touching at these two points. So where are those points? Well, those points are in the complex plane. That point there is i and that point there is minus i. Those are square roots of minus 1; those are imaginary numbers.

Well, what happens if I evaluate 1 over 1 plus x squared when x equals i or x equals minus i? i squared is minus 1. So what's f of i? Well, f is 1 over 1 plus its input squared, so that would be 1 over 1 plus i squared, which is 1 over 1 plus minus 1. That's not defined; that's dividing by 0. So there is a bad point. There is a point where the function is undefined. It's just that the bad point for this function isn't a real point; it's an imaginary input.
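As a quick sanity check on that "distance to the nearest bad point" idea, here's a tiny Python sketch using SymPy: the function really does blow up at i, and the distance from each real center to i matches the radius of convergence we observed (1 around the center 0, the square root of 2 around the center 1). The line for the center 2 is then the prediction this idea makes: the square root of 5.

    import sympy as sp

    x = sp.symbols('x')
    f = 1 / (1 + x**2)

    # Substituting x = i gives 1/(1 + i^2) = 1/0, which SymPy reports as complex infinity (zoo).
    print(f.subs(x, sp.I))

    # Distance in the complex plane from each real center to the bad point i.
    for center in (0, 1, 2):
        print(f"center {center}: distance to i is {sp.Abs(center - sp.I)}")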
All right: if I evaluate this function at i or minus i, it's undefined. And that bad point in the complex plane is messing up my radius of convergence, even along the real line. We're beginning to get a glimpse of the important role that complex numbers play even in the theory of real-valued Taylor Series.

Additional evidence comes, for example, from this equation: e to the i x equals cosine x plus i sine x, where again i is a square root of minus 1. You can interpret this as a statement about power series. Here I've written down the Taylor Series for e to the x, but with x replaced by i x. Here I've written down cosine x, and here I've written down sine x, but multiplied by i. And you can expand this out, using the fact that i squared is minus 1, to get this equality. You can do kind of fun things, like substitute pi in for x and conclude that e to the i pi is cosine pi plus i sine pi. Cosine pi is minus 1 and sine pi is 0, so e to the i pi is negative 1.

Taylor Series aren't just a jumping-off point for calculus and for numerical approximations. Taylor Series are also our first step into the theory of complex analysis. And that theory is more natural than it might seem at first. I mean, if the complex numbers are affecting the radius of convergence of my real power series, then they must really be there. They're not as imaginary as people might think.
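To close the loop on the power-series reading of that equation, here's a small Python sketch using only the standard library. It sums the first terms of the exponential series at the imaginary input i x and compares the result with cosine x plus i sine x computed directly; at x equals pi the partial sum comes out essentially to minus 1.

    import cmath
    from math import factorial

    def exp_partial_sum(z, terms=30):
        # Partial sum of the Taylor series of e^z about 0: z^n / n! for n up to terms - 1.
        return sum(z**n / factorial(n) for n in range(terms))

    for x in (1.0, cmath.pi):
        series_value = exp_partial_sum(1j * x)
        direct_value = cmath.cos(x) + 1j * cmath.sin(x)
        print(f"x = {x:.5f}: series gives {series_value}, cos x + i sin x gives {direct_value}")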