MA1112: Linear Algebra II

Academic year: 2022

Dr. Vladimir Dotsenko (Vlad), Lecture 17

Today we shall prove various theorems on quadratic forms stated without proof last week.

Theorem 1. Let $q$ be a quadratic form on a vector space $V$. There exists a basis $f_1, \dots, f_n$ of $V$ for which the quadratic form $q$ becomes a signed sum of squares:
$$q(x_1 f_1 + \cdots + x_n f_n) = \sum_{i=1}^{n} \varepsilon_i x_i^2,$$
where all numbers $\varepsilon_i$ are either $1$ or $-1$ or $0$.

Proof. Informally, the slogan behind the proof we shall present is "imitate the Gram–Schmidt procedure". Let us make it precise. We shall argue by induction on $\dim V$, the basis of induction being $\dim V = 0$, when the basis is empty, so there is nothing to prove.

Suppose $\dim V = n > 0$. There are two cases to consider. First, it might be the case that $q(v) = 0$ for all $v$. In this case, any basis would work, with $\varepsilon_i = 0$ for all $i$.

Otherwise, there exists a vector $v$ such that $q(v) \neq 0$. Let us extend it to a basis $e_1 = v, e_2, \dots, e_n$, and look at the symmetric bilinear form $b(v, w) = \frac{1}{2}(q(v+w) - q(v) - q(w))$ associated to the quadratic form $q$. We claim that we can change the basis $e_1, \dots, e_n$ into a basis $e_1', e_2', \dots, e_n'$ with $e_1' = e_1 = v$ and $b(e_i', e_1') = 0$ for all $i = 2, \dots, n$. Indeed, we can put $e_i' = e_i - \frac{b(e_i, e_1)}{b(e_1, e_1)} e_1$ for $i > 1$; here division by $b(e_1, e_1)$ is possible since $b(e_1, e_1) = q(e_1) = q(v) \neq 0$. We can now consider the linear span of $e_2', \dots, e_n'$ with the restriction of the quadratic form $q$, and proceed by induction on dimension.

Therefore, we can find a basis $f_2, \dots, f_n$ of that space with the required property. It remains to note that if we take $f_1 = \frac{1}{\sqrt{|q(v)|}}\, v$, then we have $q(f_1) = \pm 1$, and also $b(f_1, f_i) = 0$ for $i > 1$, which proves that our quadratic form becomes a signed sum of squares in this basis.
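In matrix terms, the proof is an algorithm: write $q(x) = x^T A x$ for a symmetric matrix $A$, and note that a change of basis acts on $A$ by a congruence transformation $A \mapsto S^T A S$. The sketch below (NumPy; not part of the lecture notes, and the function name is my own) diagonalizes $A$ by such transformations. When every remaining diagonal entry vanishes but the form is nonzero, it uses the vector $e_k + e_j$, for which $q(e_k + e_j) = 2b(e_k, e_j) \neq 0$.

```python
import numpy as np

def signed_squares(A, tol=1e-9):
    """Bring the quadratic form q(x) = x^T A x to a signed sum of
    squares by congruence transformations A -> S^T A S (the matrix
    form of the basis changes in the proof).  Returns the counts
    (n_plus, n_minus, n_zero) of +1, -1 and 0 coefficients."""
    A = np.asarray(A, dtype=float).copy()
    n = A.shape[0]
    for k in range(n):
        if abs(A[k, k]) < tol:
            # q(e_k) = 0: swap in a later basis vector with q != 0, ...
            j = next((j for j in range(k + 1, n) if abs(A[j, j]) > tol), None)
            if j is not None:
                A[[k, j]] = A[[j, k]]        # swap rows k and j ...
                A[:, [k, j]] = A[:, [j, k]]  # ... and columns k and j
            else:
                # ... or replace e_k by e_k + e_j, for which
                # q(e_k + e_j) = 2 b(e_k, e_j) != 0
                j = next((j for j in range(k + 1, n) if abs(A[k, j]) > tol), None)
                if j is None:
                    continue  # b vanishes on e_k entirely: a zero square
                A[k] += A[j]
                A[:, k] += A[:, j]
        # make e_j b-orthogonal to e_k for all j > k, as in the proof
        for j in range(k + 1, n):
            c = A[k, j] / A[k, k]
            A[j] -= c * A[k]
            A[:, j] -= c * A[:, k]
    d = np.diag(A)
    return (int((d > tol).sum()), int((d < -tol).sum()),
            int((np.abs(d) <= tol).sum()))

print(signed_squares([[0, 1], [1, 0]]))  # q = 2 x1 x2 -> (1, 1, 0)
```

For example, $q(x_1, x_2) = 2x_1x_2$ has Gram matrix $\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$, all of whose diagonal entries are zero, so the $e_k + e_j$ step is needed; the result $(1, 1, 0)$ matches the substitution $2x_1x_2 = \frac{1}{2}((x_1+x_2)^2 - (x_1-x_2)^2)$.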

Theorem 2 (Law of inertia). In the previous theorem, the triple $(n_+, n_-, n_0)$, where $n_\pm$ is the number of $\varepsilon_i$ equal to $\pm 1$, and $n_0$ is the number of $\varepsilon_i$ equal to $0$, does not depend on the choice of the basis $f_1, \dots, f_n$. This triple is often referred to as the signature of the quadratic form $q$.

Proof. Suppose that we have a basis
$$e_1, \dots, e_{n_+},\ f_1, \dots, f_{n_-},\ g_1, \dots, g_{n_0}$$
which produces a system of coordinates where $q$ becomes a signed sum of squares with $n_+$ of the $\varepsilon_i$ equal to $1$, $n_-$ of the $\varepsilon_i$ equal to $-1$, and $n_0$ of the $\varepsilon_i$ equal to $0$. Let us look at the corresponding symmetric bilinear form $b$. The reconstruction formula $b(v, w) = \frac{1}{2}(q(v+w) - q(v) - q(w))$ implies that
$$b(x_1 e_1 + \cdots + z_{n_0} g_{n_0},\ x_1' e_1 + \cdots + z_{n_0}' g_{n_0}) = x_1 x_1' + \cdots + x_{n_+} x_{n_+}' - y_1 y_1' - \cdots - y_{n_-} y_{n_-}'.$$

By definition, the kernel of a symmetric bilinear form is the space of all vectors $v$ such that $b(v, w) = 0$ for all $w \in V$. We see that the kernel is defined by the system of equations $b(v, e_i) = b(v, f_j) = b(v, g_k) = 0$ for all $i, j, k$, and by direct inspection this system implies that $v$ is a linear combination of the vectors $g_k$. This implies that $n_0$ is the dimension of the kernel of $b$, and so is independent of any choices.
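In coordinates, $b(v, w) = v^T A w$ for the Gram matrix $A$, so the kernel of $b$ is exactly the null space of $A$ and $n_0 = \dim V - \operatorname{rank}(A)$. A quick numerical check (NumPy; the example form is my own):

```python
import numpy as np

# The kernel of b is the null space of the Gram matrix A,
# so n0 = dim V - rank(A).  For q = x1^2 - x2^2 on R^3:
A = np.diag([1.0, -1.0, 0.0])
n0 = A.shape[0] - np.linalg.matrix_rank(A)
print(n0)  # -> 1: the kernel is spanned by the third basis vector
```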

Suppose now that there are two different bases
$$e_1, \dots, e_{n_+},\ f_1, \dots, f_{n_-},\ g_1, \dots, g_{n_0}$$
and
$$e_1', \dots, e_{n_+'}',\ f_1', \dots, f_{n_-'}',\ g_1', \dots, g_{n_0'}'$$
where $q$ is a signed sum of squares, and $n_+ \neq n_+'$, so without loss of generality $n_+ > n_+'$. Note that this implies that $n_- < n_-'$ (we already know $n_0 = n_0'$, and both bases have $n$ elements). Consider the vectors
$$e_1, \dots, e_{n_+},\ f_1', \dots, f_{n_-'}',\ g_1, \dots, g_{n_0}.$$
The total number of these vectors is $n_+ + n_-' + n_0 > n_+ + n_- + n_0 = \dim V$, so they must be linearly dependent, that is,
$$a_1 e_1 + \cdots + a_{n_+} e_{n_+} + b_1 f_1' + \cdots + b_{n_-'} f_{n_-'}' + c_1 g_1 + \cdots + c_{n_0} g_{n_0} = 0$$
for some scalars $a_i, b_j, c_k$, not all equal to zero. Let us rewrite this as
$$a_1 e_1 + \cdots + a_{n_+} e_{n_+} + c_1 g_1 + \cdots + c_{n_0} g_{n_0} = -(b_1 f_1' + \cdots + b_{n_-'} f_{n_-'}'),$$

and denote by $v$ the vector to which both the left hand side and the right hand side are equal. Then
$$a_1^2 + \cdots + a_{n_+}^2 = q(v) = -b_1^2 - \cdots - b_{n_-'}^2,$$
and since a sum of squares equals the negative of a sum of squares only when both are zero, this implies $a_1 = \cdots = a_{n_+} = b_1 = \cdots = b_{n_-'} = 0$. Substituting this into
$$a_1 e_1 + \cdots + a_{n_+} e_{n_+} + b_1 f_1' + \cdots + b_{n_-'} f_{n_-'}' + c_1 g_1 + \cdots + c_{n_0} g_{n_0} = 0,$$
we get
$$c_1 g_1 + \cdots + c_{n_0} g_{n_0} = 0,$$
implying of course $c_1 = \cdots = c_{n_0} = 0$, which altogether shows that these vectors cannot be linearly dependent, a contradiction.
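The law of inertia can also be checked numerically: for any invertible matrix $S$, the Gram matrices $A$ and $S^T A S$ represent the same form in different bases, so they must produce the same triple $(n_+, n_-, n_0)$. A small NumPy illustration (not from the lecture notes; here the signature is read off the eigenvalue signs, which works because an orthonormal eigenbasis diagonalizes the form and rescaling its vectors turns each nonzero eigenvalue into $\pm 1$):

```python
import numpy as np

def inertia(A, tol=1e-9):
    """Signature (n_plus, n_minus, n_zero) of the symmetric matrix A,
    read off the signs of its eigenvalues."""
    w = np.linalg.eigvalsh(np.asarray(A, dtype=float))
    return (int((w > tol).sum()), int((w < -tol).sum()),
            int((np.abs(w) <= tol).sum()))

A = np.diag([5.0, 1.0, -2.0, 0.0])      # signature (2, 1, 1)
S = np.array([[1.0, 2.0, 0.0, 0.0],     # some invertible change of basis
              [0.0, 1.0, 3.0, 0.0],
              [0.0, 0.0, 1.0, 4.0],
              [1.0, 0.0, 0.0, 1.0]])
B = S.T @ A @ S                         # the same form in the new basis
print(inertia(A), inertia(B))           # -> (2, 1, 1) (2, 1, 1)
```

Note that the individual eigenvalues of $B$ differ from those of $A$; only the counts of positive, negative and zero ones agree, exactly as the theorem asserts.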

