MA1112: Linear Algebra II
Dr. Vladimir Dotsenko (Vlad), Lecture 17
Today we shall prove various theorems on quadratic forms stated without proof last week.
Theorem 1. Let $q$ be a quadratic form on a vector space $V$. There exists a basis $f_1, \ldots, f_n$ of $V$ for which the quadratic form $q$ becomes a signed sum of squares:
\[
q(x_1 f_1 + \cdots + x_n f_n) = \sum_{i=1}^{n} \varepsilon_i x_i^2,
\]
where each $\varepsilon_i$ is equal to $1$, $-1$, or $0$.
Proof. Informally, the slogan behind the proof we shall present is "imitate the Gram–Schmidt procedure". Let us make it precise. We shall argue by induction on $\dim V$, the basis of induction being $\dim V = 0$, when the basis is empty, so there is nothing to prove.
Suppose $\dim V = n > 0$. There are two cases to consider. First, it might be the case that $q(v) = 0$ for all $v$. In this case, any basis works, with $\varepsilon_i = 0$ for all $i$.
Otherwise, there exists a vector $v$ such that $q(v) \ne 0$. Let us extend it to a basis $e_1 = v, e_2, \ldots, e_n$, and look at the symmetric bilinear form $b(v, w) = \frac12\bigl(q(v+w) - q(v) - q(w)\bigr)$ associated to the quadratic form $q$. We claim that we can change the basis $e_1, \ldots, e_n$ into a basis $e_1', e_2', \ldots, e_n'$ with $e_1' = e_1 = v$ and $b(e_i', e_1') = 0$ for all $i = 2, \ldots, n$. Indeed, we can put
\[
e_i' = e_i - \frac{b(e_i, e_1)}{b(e_1, e_1)}\, e_1 \quad \text{for } i > 1;
\]
here division by $b(e_1, e_1)$ is possible since $b(e_1, e_1) = q(e_1) = q(v) \ne 0$. We can now consider the linear span of $e_2', \ldots, e_n'$ with the restriction of the quadratic form $q$ to it, and proceed by induction on dimension.
Therefore, we can find a basis $f_2, \ldots, f_n$ of that span with the required property. It remains to note that if we take $f_1 = \frac{1}{\sqrt{|q(v)|}}\, v$, then $q(f_1) = \pm 1$, and also $b(f_1, f_i) = 0$ for $i > 1$, which proves that our quadratic form becomes a signed sum of squares in the basis $f_1, \ldots, f_n$.
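The inductive construction above is effectively an algorithm: repeatedly pick a vector with $q(v) \ne 0$ and subtract its $b$-component from the remaining basis vectors. The following Python sketch (an illustration, not part of the lecture) runs these row-and-column operations on the Gram matrix of $b$; the one case the proof handles by "any vector with $q(v) \ne 0$" and the code must handle explicitly is when all diagonal entries vanish, where replacing $e_i$ by $e_i + e_j$ works because then $q(e_i + e_j) = 2\,b(e_i, e_j) \ne 0$.

```python
import numpy as np

def signature(A, tol=1e-9):
    """Diagonalize the symmetric matrix A by congruence, imitating the
    Gram-Schmidt-style procedure from the proof, and count the signs.
    Returns (n_plus, n_minus, n_zero)."""
    A = np.array(A, dtype=float).copy()
    n = A.shape[0]
    n_plus = n_minus = n_zero = 0
    for k in range(n):
        # Look for a remaining basis vector with q(e_i) != 0 (diagonal entry).
        piv = next((i for i in range(k, n) if abs(A[i, i]) > tol), None)
        if piv is None:
            # All q(e_i) = 0; try v = e_i + e_j, usable when b(e_i, e_j) != 0.
            pair = next(((i, j) for i in range(k, n) for j in range(i + 1, n)
                         if abs(A[i, j]) > tol), None)
            if pair is None:              # the remaining form is identically zero
                n_zero += n - k
                break
            i, j = pair
            A[i, :] += A[j, :]            # replace e_i by e_i + e_j ...
            A[:, i] += A[:, j]            # ... acting on rows and columns alike
            piv = i
        if piv != k:                      # move the chosen vector into position k
            A[[k, piv], :] = A[[piv, k], :]
            A[:, [k, piv]] = A[:, [piv, k]]
        # e_i' = e_i - b(e_i, e_k)/b(e_k, e_k) * e_k, exactly as in the proof
        for i in range(k + 1, n):
            f = A[i, k] / A[k, k]
            A[i, :] -= f * A[k, :]
            A[:, i] -= f * A[:, k]
        if A[k, k] > 0:
            n_plus += 1
        else:
            n_minus += 1
    return n_plus, n_minus, n_zero
```

For instance, $q(x, y) = 2xy$ has Gram matrix $\begin{pmatrix}0&1\\1&0\end{pmatrix}$ with zero diagonal, and the sketch correctly reports signature $(1, 1, 0)$, matching $2xy = \frac12\bigl((x+y)^2 - (x-y)^2\bigr)$.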
Theorem 2 (Law of inertia). In the previous theorem, the triple $(n_+, n_-, n_0)$, where $n_\pm$ is the number of $\varepsilon_i$ equal to $\pm 1$, and $n_0$ is the number of $\varepsilon_i$ equal to $0$, does not depend on the choice of the basis $f_1, \ldots, f_n$. This triple is often referred to as the signature of the quadratic form $q$.
Proof. Suppose that we have a basis
\[
e_1, \ldots, e_{n_+},\; f_1, \ldots, f_{n_-},\; g_1, \ldots, g_{n_0}
\]
which produces a system of coordinates where $q$ becomes a signed sum of squares with $n_+$ of the $\varepsilon_i$ equal to $1$, $n_-$ of the $\varepsilon_i$ equal to $-1$, and $n_0$ of the $\varepsilon_i$ equal to $0$. Let us look at the corresponding symmetric bilinear form $b$. The reconstruction formula $b(v, w) = \frac12\bigl(q(v+w) - q(v) - q(w)\bigr)$ implies that
\[
b(x_1 e_1 + \cdots + z_{n_0} g_{n_0},\; x_1' e_1 + \cdots + z_{n_0}' g_{n_0}) = x_1 x_1' + \cdots + x_{n_+} x_{n_+}' - y_1 y_1' - \cdots - y_{n_-} y_{n_-}'.
\]
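As a quick numerical sanity check of the reconstruction formula, one can verify that polarization recovers $b(v, w)$ from $q(v) = v^{T} B v$ for a symmetric Gram matrix $B$ (the matrix and vectors below are made-up examples, not from the lecture):

```python
import numpy as np

B = np.array([[1., 2.],
              [2., -1.]])          # Gram matrix of some symmetric bilinear form b
q = lambda u: u @ B @ u           # the associated quadratic form q(u) = b(u, u)

v = np.array([1., 3.])
w = np.array([-2., 5.])

# Reconstruction (polarization): b(v, w) = (q(v + w) - q(v) - q(w)) / 2
b_vw = (q(v + w) - q(v) - q(w)) / 2
```

Here `b_vw` agrees with the direct computation `v @ B @ w`, as the formula predicts.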
By definition, the kernel of a symmetric bilinear form is the space of all vectors $v$ such that $b(v, w) = 0$ for all $w \in V$. We see that the kernel is defined by the system of equations $b(v, e_i) = b(v, f_j) = b(v, g_k) = 0$ for all $i, j, k$, and by direct inspection this system implies that $v$ is a linear combination of the vectors $g_k$. This implies that $n_0$ is the dimension of the kernel of $b$, and so is independent of any choices.
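The basis-independence of $n_0$ can also be seen concretely: the kernel of $b$ is the null space of its Gram matrix, and a change of basis replaces the Gram matrix $B$ by $P^{T} B P$ with $P$ invertible, which does not change the rank. A small sketch (the matrices are illustrative assumptions):

```python
import numpy as np

# Gram matrix of b for q(x, y, z) = x^2 - y^2, so n0 should be 1:
B = np.array([[1., 0., 0.],
              [0., -1., 0.],
              [0., 0., 0.]])

# n0 = dim V - rank(b), computed from the Gram matrix
n0 = B.shape[0] - np.linalg.matrix_rank(B)

# An invertible change of basis (upper triangular with unit diagonal):
P = np.array([[1., 1., 0.],
              [0., 1., 2.],
              [0., 0., 1.]])

# In the new basis the Gram matrix is P^T B P; the rank, hence n0, is unchanged.
n0_new_basis = B.shape[0] - np.linalg.matrix_rank(P.T @ B @ P)
```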
Suppose now that there are two different bases
\[
e_1, \ldots, e_{n_+},\; f_1, \ldots, f_{n_-},\; g_1, \ldots, g_{n_0}
\]
and
\[
e_1', \ldots, e_{n_+'}',\; f_1', \ldots, f_{n_-'}',\; g_1', \ldots, g_{n_0'}'
\]
where $q$ is a signed sum of squares, and $n_+ \ne n_+'$; without loss of generality $n_+ > n_+'$. Note that this implies $n_- < n_-'$, since $n_0 = n_0'$ by the previous paragraph and both triples of counts add up to $\dim V$. Consider the vectors
\[
e_1, \ldots, e_{n_+},\; f_1', \ldots, f_{n_-'}',\; g_1, \ldots, g_{n_0}.
\]
The total number of these vectors is $n_+ + n_-' + n_0 > n_+ + n_- + n_0 = \dim V$, so they must be linearly dependent; that is,
\[
a_1 e_1 + \cdots + a_{n_+} e_{n_+} + b_1 f_1' + \cdots + b_{n_-'} f_{n_-'}' + c_1 g_1 + \cdots + c_{n_0} g_{n_0} = 0
\]
for some scalars $a_i, b_j, c_k$, not all equal to zero. Let us rewrite this as
\[
a_1 e_1 + \cdots + a_{n_+} e_{n_+} + c_1 g_1 + \cdots + c_{n_0} g_{n_0} = -(b_1 f_1' + \cdots + b_{n_-'} f_{n_-'}'),
\]
and denote by $v$ the vector to which both the left-hand side and the right-hand side are equal. Evaluating $q(v)$ in each of the two coordinate systems, we get
\[
a_1^2 + \cdots + a_{n_+}^2 = q(v) = -b_1^2 - \cdots - b_{n_-'}^2.
\]
The left-hand side is nonnegative and the right-hand side is nonpositive, so both are zero, which implies $a_1 = \cdots = a_{n_+} = b_1 = \cdots = b_{n_-'} = 0$. Substituting this into the dependence relation above, we get
\[
c_1 g_1 + \cdots + c_{n_0} g_{n_0} = 0,
\]
implying of course $c_1 = \cdots = c_{n_0} = 0$, which altogether shows that these vectors cannot be linearly dependent, a contradiction.
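The law of inertia can also be observed numerically. By the spectral theorem, the signature of $q$ equals the sign counts of the eigenvalues of its Gram matrix, and any change of basis $B \mapsto P^{T} B P$ with $P$ invertible leaves these counts unchanged. A short sketch (the particular matrices are illustrative assumptions):

```python
import numpy as np

def inertia(A, tol=1e-9):
    """Count positive, negative, and zero eigenvalues of a symmetric matrix;
    by the spectral theorem this triple is the signature (n+, n-, n0)."""
    w = np.linalg.eigvalsh(A)
    return (int((w > tol).sum()), int((w < -tol).sum()),
            int((np.abs(w) <= tol).sum()))

A = np.array([[2., 1., 0.],
              [1., -3., 1.],
              [0., 1., 0.]])       # some symmetric Gram matrix

P = np.array([[1., 2., 0.],
              [0., 1., 1.],
              [3., 0., 1.]])       # an invertible change of basis (det = 7)

# inertia(A) and inertia(P.T @ A @ P) agree, as Theorem 2 guarantees,
# even though the eigenvalues themselves change under congruence.
```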