Complex analysis

 Lecture 1

We define $\mathbb{C}\equiv \mathbb{R}[x]/(x^2+1)$. Since $x^2+1$ is irreducible over $\mathbb{R}$, $\mathbb{C}$ is a field.

Conjugation: It's an automorphism from $\mathbb{C}$ to $\mathbb{C}$ where $x\mapsto -x$ (that is, $i\mapsto -i$), which changes the sign of the imaginary part.

We have an isomorphism of real vector spaces from $\mathbb{C}$ to $\mathbb{R}^2$ given by $a+ib\mapsto (a,b)$. Via this identification, we can equip $\mathbb{C}$ with the usual Euclidean norm, which is
$|a+ib|=\sqrt{a^2+b^2}$
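As a quick sanity check (a numerical sketch, not part of the notes), Python's built-in `complex` type computes exactly this modulus:

```python
# Check that the complex modulus agrees with the Euclidean norm on R^2.
# (Illustrative values; any a, b would do.)
import math

a, b = 3.0, 4.0
z = complex(a, b)
assert abs(z) == math.sqrt(a**2 + b**2)  # |3+4i| = 5
print(abs(z))  # 5.0
```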

Definition (Limit): Suppose we have an open set $U\subseteq \mathbb{C}$, and a function $f:U\to \mathbb{C}$. We say that the limit of $f$ at $a$ is $A$ if $\forall \epsilon>0$, $\exists \delta>0$ such that $|f(z)-A|<\epsilon$ whenever $0<|z-a|<\delta$.

Continuity: We call a function $f$ continuous at $a$ if $f(z)\to f(a)$ as $z\to a$.

Derivative: Suppose we have a function $f:U\to \mathbb{C}$; we say the derivative of $f$ at $a$ exists if the limit $\lim\limits_{z\to a}\dfrac{f(z)-f(a)}{z-a}$ exists. The derivative is denoted $f'(a)$.

Remark: If $f$ and $g$ are differentiable functions, then $f\pm g$, $fg$ are differentiable as well, and $\dfrac{f}{g}$ is differentiable wherever $g(z)\neq 0$. Will prove it a bit later, I am lazy :)

Another important fact: If a function is differentiable, then it is continuous as well.

Analytic function: We call a function $f:U\to \mathbb{C}$ analytic if it is differentiable at $z$, $\forall z \in U$.

Cauchy Riemann Equations

Suppose we have an analytic function, $f:U\to\mathbb{C}$. We will calculate its derivative in two different ways: along $x$ (the real direction) and then along $y$ (the imaginary direction).

1. Along $x$ axis:

$f'(z)=\lim\limits_{h\to 0,h\in \mathbb{R}}\dfrac{f(z+h)-f(z)}{h}$
Say $f(z)=u(z)+iv(z)$.
We will now have, 
$f'(z)=\lim\limits_{h\to 0}\dfrac{u(z+h)-u(z)}{h}+i\lim\limits_{h\to 0}\dfrac{v(z+h)-v(z)}{h}$
$\implies f'(z)=\dfrac{\partial u}{\partial x}+i\dfrac{\partial v}{\partial x}$

2. Along $y$ axis:

$f'(z)=\lim\limits_{h\to 0,h\in \mathbb{R}}\dfrac{f(z+ih)-f(z)}{ih}$
$\implies f'(z)=\lim\limits_{h\to 0}\dfrac{u(z+ih)-u(z)}{ih}+i\lim\limits_{h\to 0}\dfrac{v(z+ih)-v(z)}{ih}$
$\implies f'(z)=-i\dfrac{\partial u}{\partial y}+\dfrac{\partial v}{\partial y}$

Comparing both equations, we have 
$\dfrac{\partial v}{\partial x}=-\dfrac{\partial u}{\partial y}$
$\dfrac{\partial u}{\partial x}=\dfrac{\partial v}{\partial y}$
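The equations can be checked numerically; the sketch below uses central finite differences on $f(z)=z^2$, whose components are $u=x^2-y^2$ and $v=2xy$ (the step size and sample point are arbitrary choices):

```python
# Numerically verify the Cauchy-Riemann equations for f(z) = z^2,
# with u = x^2 - y^2 and v = 2xy. (Illustrative sketch.)
h = 1e-6
x, y = 0.7, -1.3

def u(x, y): return x*x - y*y
def v(x, y): return 2*x*y

# Central finite-difference approximations to the first partials.
ux = (u(x + h, y) - u(x - h, y)) / (2*h)
uy = (u(x, y + h) - u(x, y - h)) / (2*h)
vx = (v(x + h, y) - v(x - h, y)) / (2*h)
vy = (v(x, y + h) - v(x, y - h)) / (2*h)

assert abs(ux - vy) < 1e-6   # du/dx = dv/dy
assert abs(uy + vx) < 1e-6   # du/dy = -dv/dx
```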

Proposition: If we have a function whose component functions have continuous first partials and satisfy CR equations, then the function is analytic and has a continuous derivative and the converse holds as well. I will only prove the first part, for now. I will do the second part later.

Proof (First direction): Consider the component functions of the function $f(z)$, $u(x,y)$ and $v(x,y)$. In other words, $f(z)=u(x,y)+iv(x,y)$. We will have 
$u(x+h,y+k)-u(x,y)=\dfrac{\partial u}{\partial x}h+\dfrac{\partial u}{\partial y}k + \epsilon_1$
$v(x+h,y+k)-v(x,y)=\dfrac{\partial v}{\partial x}h+\dfrac{\partial v}{\partial y}k + \epsilon_2$
where $\dfrac{\epsilon_j}{|h+ik|}\to 0$ as $h+ik\to 0$, for $j\in\{1,2\}$. Combining these equations with the fact that $u$ and $v$ satisfy the CR equations gives
$f(z+h+ik)-f(z)=\left(\dfrac{\partial u}{\partial x}+i\dfrac{\partial v}{\partial x}\right)(h+ik) + \epsilon_1 + i\epsilon_2$
Dividing by $h+ik$ and taking the limit,
$f'(z)=\lim\limits_{h+ik\to 0}\dfrac{f(z+h+ik)-f(z)}{h+ik}=\dfrac{\partial u}{\partial x}+i\dfrac{\partial v}{\partial x}$
Therefore the function is analytic.

Lecture 2

Fundamental Theorem of Algebra: Every non-constant polynomial $p(z)\in\mathbb{C}[z]$ has a root.
As a corollary, we have the fact that every complex polynomial can be written as a product of linear factors. 

Apart from this, we say that the order of a zero $a$ of a polynomial $p(z)$ is $m$ if $p(z)=(z-a)^mq(z)$ where $q(a)\neq 0$.
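The factorization corollary can be spot-checked numerically; the polynomial and the sample points below are arbitrary choices:

```python
# Spot-check that z^2 - 2z + 2 factors over C as (z-(1+i))(z-(1-i)),
# illustrating the corollary of the Fundamental Theorem of Algebra.
import cmath

def p(z): return z*z - 2*z + 2
def factored(z): return (z - (1 + 1j)) * (z - (1 - 1j))

for z in [0, 1j, 2 - 3j, cmath.exp(1j)]:   # arbitrary test points
    assert abs(p(z) - factored(z)) < 1e-12
```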

Lucas's Theorem: For any polynomial $p(z)$, if its zeros lie in a half-plane, then the zeros of $p'(z)$ lie in the same half-plane.
Let's try proving this.
Proof: Consider the polynomial $p(z)=(z-\alpha_1)(z-\alpha_2)\ldots(z-\alpha_n)$. Now we can write
$\dfrac{p'(z)}{p(z)}=\dfrac{1}{z-\alpha_1}+\dfrac{1}{z-\alpha_2}+\ldots+\dfrac{1}{z-\alpha_n}$. Say the roots $\alpha_k$ all lie in the half-plane $Im\left(\dfrac{z-a}{b}\right)<0$. (It's interesting to note that any half-plane can be written in this way.) Now suppose $z$ does not lie in this half-plane, so $Im\left(\dfrac{z-a}{b}\right)\geq 0$, and $p(z)\neq 0$. For each $k$, since $Im\left(\dfrac{\alpha_k-a}{b}\right)<0$, we can write
$Im\left(\dfrac{z-\alpha_k}{b}\right)=Im\left(\dfrac{z-a}{b}\right)+Im\left(\dfrac{a-\alpha_k}{b}\right)>0$. Reciprocating a complex number changes the sign of the imaginary part. Therefore $Im\left(\dfrac{b}{z-\alpha_k}\right)<0$, which means $Im\left(b\,\dfrac{p'(z)}{p(z)}\right)=\sum\limits_k Im\left(\dfrac{b}{z-\alpha_k}\right)<0$, so $p'(z)\neq 0$ and $z$ is not a root of $p'(z)$.
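A quick numerical spot-check of the theorem (the cubic's roots are an arbitrary choice, all in the upper half-plane; the quadratic $p'$ is solved by the quadratic formula):

```python
# Spot-check Lucas's theorem: take p with all zeros in the upper
# half-plane Im z > 0 and verify the zeros of p' lie there too.
import cmath

roots = [1 + 1j, -2 + 0.5j, 3j]          # all with Im > 0
s1 = sum(roots)                           # elementary symmetric sums
s2 = roots[0]*roots[1] + roots[0]*roots[2] + roots[1]*roots[2]

# p(z) = z^3 - s1 z^2 + s2 z - s3, so p'(z) = 3z^2 - 2 s1 z + s2.
disc = cmath.sqrt((2*s1)**2 - 12*s2)
crit = [(2*s1 + disc) / 6, (2*s1 - disc) / 6]

assert all(w.imag > 0 for w in crit)      # critical points stay in Im z > 0
```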

Rational functions

There's a concept of defining infinity in the complex plane: we can extend the complex plane to include the point $\infty$. For this, we have the concept of the Riemann sphere.

Here every complex number is a point on the sphere. The way Riemann did this was: he took a sphere and a point on the complex plane, and joined the north pole of the sphere to the point. The connecting line intersects the sphere at exactly one point other than the north pole. This creates a bijection between the points on the complex plane and the points on the sphere minus the north pole. The only point that does not arise as such an intersection is the north pole itself (a line through it parallel to the plane never meets the plane), so the north pole is identified with $\infty$.
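The correspondence can be written in coordinates. The formulas below are the standard stereographic-projection formulas (not derived in the notes): a point $(x_1,x_2,x_3)$ on the unit sphere, other than the north pole $(0,0,1)$, corresponds to $z=\dfrac{x_1+ix_2}{1-x_3}$.

```python
# Stereographic projection between the unit sphere and the plane
# (standard formulas; a numerical round-trip sketch).

def to_plane(x1, x2, x3):
    # Sphere point (other than the north pole) -> complex number.
    return complex(x1, x2) / (1 - x3)

def to_sphere(z):
    # Complex number -> point on the unit sphere.
    d = abs(z)**2 + 1
    return (2*z.real/d, 2*z.imag/d, (abs(z)**2 - 1)/d)

z = 2 - 1j
x = to_sphere(z)
assert abs(sum(c*c for c in x) - 1) < 1e-12      # lands on the sphere
assert abs(to_plane(*x) - z) < 1e-12             # round trip recovers z
```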

Now let's come back to business.


Let $R(z)=\dfrac{P(z)}{Q(z)}$ where the numerator and denominator are reduced to their lowest terms. Whenever $Q(z)=0$ (so $P(z)\neq 0$), we set $R(z)=\infty$, and such a point is called a pole. The order of the pole is simply the order of the corresponding zero of $Q(z)$.
Now if we consider $R'(z)$, it exists wherever $Q(z)\neq 0$. Hence $R'(z)$ has the same poles as $R(z)$, with the order of each pole increased by $1$. (The formula $R'=\dfrac{P'Q-PQ'}{Q^2}$ is not in reduced form when $Q$ has multiple zeros.)

Greater unity is achieved if we consider an extension
$R:S\to S$
to the Riemann sphere $S$. We set $R_1(z)=R(1/z)$; then $R(\infty)$ is defined as $R_1(0)$, and $R_1(\infty)=R(0)$.

Say $R(z)=\dfrac{a_0+a_1z+\ldots+a_nz^n}{b_0+b_1z+\ldots+b_mz^m}$ with $a_n,b_m\neq 0$. Here if $m>n$, $R$ has a zero at $\infty$ of order $m-n$; if $n>m$, $R$ has a pole at $\infty$ of order $n-m$.

So how many poles/zeros are there of $R(z)$? Counting multiplicities and including $\infty$, the number of zeros is equal to $\max(m,n)$, and so is the number of poles. This number is called the order of the rational function.

Lecture 3

We note that the equation $R(z)=a$ has exactly $\operatorname{ord}(R)$ roots (counted with multiplicity) for every $a\in \mathbb{C}$. Also, note that if the order of a rational function is $1$, we get $R(z)=\dfrac{az+b}{cz+d}$ with $ad-bc\neq 0$, and hence a bijective function.
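The bijectivity in the order-$1$ case can be made explicit: solving $w=\dfrac{az+b}{cz+d}$ for $z$ gives the inverse $z=\dfrac{dw-b}{-cw+a}$, valid when $ad-bc\neq 0$. A sketch with arbitrary coefficients:

```python
# An order-1 rational map is invertible: for R(z) = (az+b)/(cz+d)
# with ad - bc != 0, the inverse is (dw - b)/(-cw + a).
a, b, c, d = 2 + 1j, -1, 3, 1 - 2j        # arbitrary choice
assert a*d - b*c != 0

def R(z):    return (a*z + b) / (c*z + d)
def Rinv(w): return (d*w - b) / (-c*w + a)

for z in [0.5, 1j, -2 + 3j]:              # arbitrary test points
    assert abs(Rinv(R(z)) - z) < 1e-12
```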

Power series: Just like the reals, we have the concepts of liminf, limsup, and convergence for sequences of complex numbers.
Say we have a sequence of complex numbers $\{a_n\}$, which converges to $A$. Then for any $\epsilon>0$, we will have $N\in\mathbb{N}$ such that $\forall \;\;n>N$, $|a_n-A|<\epsilon$.

Series: To a sum $\sum\limits_{i=1}^{\infty}a_i$, we can associate a sequence of partial sums $s_n=\sum\limits_{i=1}^{n}a_i$. Now if $s_n$ converges, the series $\sum\limits_{i=1}^{\infty}a_i$ converges. 

We say that a sequence $\{a_n\}$ is absolutely convergent if $\{|a_n|\}$ is convergent. In a similar fashion, we say $\sum\limits_n a_n$ is absolutely convergent if $\sum\limits_n |a_n|$ is convergent.

Note: Absolute convergence implies convergence, but the converse does not hold.
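A standard counterexample for the converse is the alternating harmonic series, which converges (to $\ln 2$) while $\sum 1/n$ diverges. A numerical sketch:

```python
# sum (-1)^(n+1)/n converges to ln 2, but sum 1/n diverges,
# so convergence does not imply absolute convergence.
import math

N = 200000
s = sum((-1)**(n + 1) / n for n in range(1, N + 1))
assert abs(s - math.log(2)) < 1e-4                  # partial sums near ln 2
assert sum(1.0 / n for n in range(1, N + 1)) > 12   # harmonic sum keeps growing
```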

Uniform convergence: Say a sequence of functions $\{f_n:E\to \mathbb{C}\}$ converges pointwise to $f$. It converges uniformly if for all $\epsilon>0$, there exists $N\in \mathbb{N}$ such that for all $n>N$, $|f_n(z)-f(z)|<\epsilon$ for all $z\in E$.

Proposition: The limit function of a uniformly convergent sequence of continuous functions, is also continuous.

Cauchy criterion: A sequence of functions $\{f_n\}$ on $E\subset \mathbb{C}$ converges uniformly iff $\forall \epsilon >0$, $\exists\;N$ such that $\forall m,n>N$, we have $|f_m(z)-f_n(z)|<\epsilon$ for all $z\in E$.

Corollary (M test): Suppose we are interested in $\sum\limits_{n}f_n$, and there are constants $M_n$ with $|f_n(z)|\leq M_n$ for all $z\in E$, where $\sum\limits_n M_n$ converges. Then $\sum\limits_n f_n$ converges uniformly (and absolutely) on $E$. The proof can be done using the Cauchy criterion.
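As a sketch of the M test, take $f_n(z)=z^n/n^2$ on the closed unit disc: $|f_n(z)|\leq 1/n^2=M_n$ and $\sum M_n$ converges, so the series converges uniformly. The tail bound $\sum_{n>N}1/n^2<1/N$, which is independent of $z$, can be checked numerically (the sample points below are an arbitrary choice):

```python
# Weierstrass M-test in action: f_n(z) = z^n / n^2 on |z| <= 1,
# dominated by M_n = 1/n^2. The tail is uniformly small in z.
import cmath

def partial(z, N):
    return sum(z**n / n**2 for n in range(1, N + 1))

N = 1000
for k in range(8):                        # sample points on |z| = 1
    z = cmath.exp(2j * cmath.pi * k / 8)
    tail = abs(partial(z, 5 * N) - partial(z, N))
    assert tail <= 1.0 / N                # tail bound sum_{n>N} 1/n^2 < 1/N
```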

Lecture 4

Power series

A power series with center $z_0\in \mathbb{C}$ is the series

$\sum\limits_{n\geq 0}a_n(z-z_0)^n$

Associated with every power series of the form $\sum\limits_{n\geq 0}a_n z^n$ (we will call this series $f(z)$), we have a radius of convergence $R\in[0,\infty]$ such that

1. If $|z|<R$, the series converges absolutely.

2. If $|z|>R$, it diverges.

3. If $0\leq \phi < R$ and $|z|\leq \phi$, then the series converges uniformly.

4. For $|z|<R$, $f(z)$ is analytic.

5. The derivative of $f(z)$ is given by the power series whose terms are the derivatives of the corresponding terms of the actual series, $\sum\limits_{n\geq 1}n a_n z^{n-1}$, which has the same radius of convergence.
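Points 1 and 5 can be illustrated with the geometric series $\sum z^n=\dfrac{1}{1-z}$ (radius $R=1$), whose term-by-term derivative is $\sum n z^{n-1}=\dfrac{1}{(1-z)^2}$ for $|z|<1$. A numerical sketch at an arbitrary point inside the disc:

```python
# Term-by-term differentiation checked on the geometric series:
# sum z^n = 1/(1-z) and sum n z^(n-1) = 1/(1-z)^2 for |z| < 1.
z = 0.3 + 0.2j                      # |z| < 1, inside the radius R = 1
N = 200                             # enough terms for the tail to vanish

f  = sum(z**n for n in range(N))
df = sum(n * z**(n - 1) for n in range(1, N))

assert abs(f - 1 / (1 - z)) < 1e-12
assert abs(df - 1 / (1 - z)**2) < 1e-12
```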








