In the document Algorithms and Computation in Mathematics (pages 118-123)

Spanning Trees

4.2 Incidence Matrices

In this section we consider a further matrix associated with a given digraph.

This will be used for yet another characterization of trees and for finding a formula for the number of spanning trees of an arbitrary connected graph.

Definition 4.2.1 Let G be a digraph with vertex set V = {1, . . . , n} and edge set E = {e1, . . . , em}. Then the n × m matrix M = (m_ij), where

m_ij = 1 if i is the tail of e_j,
m_ij = −1 if i is the head of e_j,
m_ij = 0 otherwise,

is called the incidence matrix of G.

Of course, M depends on the labelling of the vertices and edges of G; thus it is essentially only determined up to permutations of its rows and columns.


For example, the digraph of Fig. 2.1 has the following incidence matrix, if we number the vertices and edges as in Definition 2.2.1:

Note that each column of an incidence matrix contains exactly two non-zero entries, namely one entry 1 and one entry −1; counting the entries 1 in row i gives d_out(i), whereas counting the entries −1 yields d_in(i). The entries 0, 1 and −1 are often considered as integers, and the matrix M is considered as a matrix over Z, Q or R. We could also use any other ring R as long as 1 ≠ −1, that is, R should have characteristic ≠ 2.
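These bookkeeping rules are easy to check by machine. The following sketch (mine, not from the text; it assumes NumPy is available) builds the incidence matrix of a small example digraph and verifies the column and row counts:

```python
import numpy as np

def incidence_matrix(n, edges):
    """Incidence matrix of a digraph on vertices 1..n (Definition 4.2.1):
    entry 1 at the tail and -1 at the head of each edge."""
    M = np.zeros((n, len(edges)), dtype=int)
    for j, (tail, head) in enumerate(edges):
        M[tail - 1, j] = 1
        M[head - 1, j] = -1
    return M

# An illustrative digraph (not the one of Fig. 2.1):
M = incidence_matrix(3, [(1, 2), (2, 3), (3, 1), (1, 3)])

# Each column holds exactly one 1 and one -1, so all column sums vanish:
assert (M.sum(axis=0) == 0).all()

# Counting 1s (resp. -1s) per row recovers the out- and in-degrees:
d_out = (M == 1).sum(axis=1)
d_in = (M == -1).sum(axis=1)
```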

Lemma 4.2.2 Let G be a digraph with n vertices. Then the incidence matrix of G has rank at most n − 1.

Proof Adding all the rows of the incidence matrix gives a row for which all entries equal 0, since each column contains exactly one entry 1 and one entry −1; hence the rows are linearly dependent.
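This bound can be confirmed numerically. A small sketch (mine, assuming NumPy), using a connected example digraph on three vertices:

```python
import numpy as np

# Incidence matrix of the digraph with edges 1->2, 2->3, 3->1, 1->3
# (rows = vertices, columns = edges; 1 at the tail, -1 at the head):
M = np.array([[ 1,  0, -1,  1],
              [-1,  1,  0,  0],
              [ 0, -1,  1, -1]])

# The rows sum to the zero row, so the rank is at most n - 1 = 2:
assert (M.sum(axis=0) == 0).all()
assert np.linalg.matrix_rank(M) <= M.shape[0] - 1
# This digraph is connected, so the bound is attained here.
```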

We will soon determine the precise rank of the incidence matrix. To this end, we first characterize the forests among the class of all digraphs; of course, a digraph G is called a forest if the undirected version |G| is a forest, as in the special case of trees.

Theorem 4.2.3 A digraph G with incidence matrix M is a forest if and only if the columns of M are linearly independent.

Proof We have to show that G contains a cycle if and only if the columns of M are linearly dependent. Suppose first that G contains a cycle, say

C: v0, e1, v1, e2, . . . , ek, vk = v0.

For i = 1, . . . , k put x_i = 1 if e_i is a forward edge of C (that is, e_i has tail v_{i−1} and head v_i) and x_i = −1 otherwise. Then each step of C contributes an entry 1 at v_{i−1} and an entry −1 at v_i to the sum of the columns corresponding to e1, . . . , ek, weighted with x_1, . . . , x_k; as C is closed, these contributions cancel at every vertex, so the weighted sum is the zero vector. Hence the columns of M are linearly dependent.

Conversely, let the columns of M be linearly dependent. Then there are columns s1, . . . , sk and integers x1, . . . , xk ≠ 0 such that x1 s1 + · · · + xk sk = 0.

Let E′ be the set of edges corresponding to the columns s1, . . . , sk, let V′ be the set of vertices of G incident with the edges contained in E′, and write G′ = (V′, E′). Note that every vertex of the associated graph |G′| has degree at least 2: a vertex incident with just one edge of E′ would contribute a non-zero entry ±x_j to the sum x1 s1 + · · · + xk sk. Now Exercise 1.2.5 shows that no connected component of |G′| is a tree. Hence all components of |G′| contain cycles, so that |G′| cannot be a forest.
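Both directions of Theorem 4.2.3 can be illustrated on tiny examples; the sketch below is mine (assuming NumPy) and mirrors the coefficients used in the proof:

```python
import numpy as np

# A forest (path 1->2->3) versus a digraph containing a cycle (1->2->3->1):
M_forest = np.array([[ 1,  0],
                     [-1,  1],
                     [ 0, -1]])
M_cycle = np.array([[ 1,  0, -1],
                    [-1,  1,  0],
                    [ 0, -1,  1]])

# Forest: the columns are linearly independent (rank = number of columns):
assert np.linalg.matrix_rank(M_forest) == 2

# Cycle: all three edges are forward edges, so the coefficients (1, 1, 1)
# already give a vanishing combination of the columns:
assert (M_cycle @ np.array([1, 1, 1]) == 0).all()
```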

Theorem 4.2.4 Let G be a digraph with n vertices and p connected components. Then the incidence matrix M of G has rank n − p.

Proof According to Theorem 4.2.3, the rank of M is the number of edges of a maximal forest T contained in |G|. If p = 1, T is a tree and has exactly n − 1 edges; thus M has rank n − 1 = n − p in this case.

Now suppose p ≠ 1. Then G can be partitioned into its p connected components; that is, T is the disjoint union of p trees. Suppose that these trees have n1, . . . , np vertices, respectively. Then the incidence matrix of G has rank (n1 − 1) + · · · + (np − 1) = n − p.
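A quick numeric check of Theorem 4.2.4 (mine, assuming NumPy) on a digraph with two components:

```python
import numpy as np

# n = 5 vertices, p = 2 components:
# component 1 has edges 1->2, 2->3; component 2 has the edge 4->5.
M = np.array([[ 1,  0,  0],
              [-1,  1,  0],
              [ 0, -1,  0],
              [ 0,  0,  1],
              [ 0,  0, -1]])

assert np.linalg.matrix_rank(M) == 5 - 2  # rank n - p
```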

Next we want to show that the incidence matrix of a digraph has a very special structure. We require a definition. A matrix over Z is called totally unimodular if each square submatrix has determinant 0, 1 or −1. These matrices are particularly important in combinatorial optimization; for example, the famous theorem about integral flows in networks¹ is a consequence of the following result; see also [Law76], §4.12.

Theorem 4.2.5 Let M be the incidence matrix of a digraph G. Then M is totally unimodular.

Proof Let M′ be any square submatrix of M, say with k rows and columns. We shall use induction on k. Trivially, M′ has determinant 0, 1 or −1 if k = 1.

So let k ≠ 1. If M′ contains a 0-column, det M′ = 0. Next let us assume that each column of M′ contains two non-zero entries. Then the rows and columns of M′ define a digraph G′ with k vertices and k edges. By Theorem 1.2.7, |G′| cannot be acyclic, so that G′ is not a forest. By Theorem 4.2.3, the columns of M′ are linearly dependent, and again det M′ = 0. Finally assume that there is a column of M′ with exactly one entry ≠ 0. We may calculate the determinant of M′ by expanding it with respect to such a column. Then we obtain a factor ±1 multiplied with the determinant of a square ((k − 1) × (k − 1))-submatrix M′′, and the assertion follows by induction.
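On a small incidence matrix, total unimodularity can even be checked by brute force over all square submatrices; the following sketch is mine (assuming NumPy):

```python
import numpy as np
from itertools import combinations

# Incidence matrix of the digraph with edges 1->2, 2->3, 3->1, 1->3:
M = np.array([[ 1,  0, -1,  1],
              [-1,  1,  0,  0],
              [ 0, -1,  1, -1]])

# Every square submatrix must have determinant 0, 1 or -1:
n, m = M.shape
for k in range(1, min(n, m) + 1):
    for rows in combinations(range(n), k):
        for cols in combinations(range(m), k):
            d = round(np.linalg.det(M[np.ix_(rows, cols)]))
            assert d in (-1, 0, 1)
```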

Corollary 4.2.6 Let G be a digraph with n vertices and n − 1 edges. Let B be the matrix which arises from the incidence matrix M of G by deleting an arbitrary row. If G is a tree, then det B = 1 or det B = −1, and otherwise det B = 0.

¹We will treat this result in Chap. 6. Actually we shall use a different proof which is not based on Theorem 4.2.5.


Proof Note that the row deleted from M is a linear combination of the remaining rows. By Theorem 4.2.4, B has rank n − 1 if and only if G is a tree. Now the assertion is an immediate consequence of Theorem 4.2.5.
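Corollary 4.2.6 in action, on two digraphs with n = 3 vertices and n − 1 = 2 edges (my sketch, assuming NumPy):

```python
import numpy as np

# Tree with edges 1->2, 1->3; incidence matrix with row 3 deleted:
B_tree = np.array([[ 1,  1],
                   [-1,  0]])

# Not a tree: edges 1->2 and 2->1 form a 2-cycle and leave vertex 3
# isolated; again row 3 of the incidence matrix is deleted:
B_no = np.array([[ 1, -1],
                 [-1,  1]])

assert round(np.linalg.det(B_tree)) in (1, -1)
assert round(np.linalg.det(B_no)) == 0
```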

Next we use the incidence matrix to determine the number of spanning trees of a digraph G. Of course, a spanning tree of G is just a directed subgraph T of G such that |T| is a spanning tree for |G|.

Theorem 4.2.7 (Matrix tree theorem) Let B be the matrix arising from the incidence matrix of a digraph G by deleting an arbitrary row. Then the number of spanning trees of G is det BB^T.

Proof Let n be the number of vertices of G. For any set S of n − 1 column indices, we denote the matrix consisting of the n − 1 columns of B corresponding to S by B_S. Now the theorem of Cauchy and Binet (see, for instance, [Had61]) implies

det BB^T = Σ_S det B_S B_S^T = Σ_S (det B_S)².

By Corollary 4.2.6, det B_S ≠ 0 if and only if the edges of G corresponding to S form a tree; moreover, in this case, (det B_S)² = 1. This proves the assertion.
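For instance, the triangle K_3 has exactly three spanning trees, which the matrix tree theorem reproduces; the sketch below is mine (assuming NumPy):

```python
import numpy as np

# An orientation of K_3 with edges 1->2, 2->3, 1->3:
M = np.array([[ 1,  0,  1],
              [-1,  1,  0],
              [ 0, -1, -1]])
B = M[:-1, :]                      # delete an arbitrary row (here the last)

t = round(np.linalg.det(B @ B.T))  # det BB^T
assert t == 3                      # each pair of edges forms a spanning tree
```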

Theorem 4.2.7 is contained implicitly in [Kirh47]. Not surprisingly, this result may also be used to determine the number of spanning trees of a graph G by considering the incidence matrix of any orientation of G. We need the following simple lemma; then the desired result is an immediate consequence of this lemma and Theorem 4.2.7.

Lemma 4.2.8 Let A be the adjacency matrix of a graph G and M the incidence matrix of an arbitrary orientation H of G, where the same ordering (v1, . . . , vn) of the vertices is used for numbering the rows and columns of both matrices. Then MM^T = diag(deg v1, . . . , deg vn) − A.

Proof The (i, j)-entry of MM^T is the inner product of the i-th and the j-th row of M. For i ≠ j, this entry is −1 if ij or ji is an edge of H and 0 otherwise. For i = j, we get the degree deg vi.
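Lemma 4.2.8 is easy to verify on K_3 with a fixed orientation (my sketch, assuming NumPy):

```python
import numpy as np

# K_3 with the orientation 1->2, 2->3, 1->3:
M = np.array([[ 1,  0,  1],
              [-1,  1,  0],
              [ 0, -1, -1]])
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])          # adjacency matrix of K_3
D = np.diag([2, 2, 2])             # deg v_1 = deg v_2 = deg v_3 = 2

assert (M @ M.T == D - A).all()    # MM^T = diag(deg v_i) - A
```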

Theorem 4.2.9 Let A be the adjacency matrix of a graph G with respect to the ordering (v1, . . . , vn) of the vertices, and put

A′ = −A + diag(deg v1, . . . , deg vn).

Then the number of spanning trees of G is the common value of all minors of A′ which arise by deleting a row and the corresponding column from A′.
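The recipe of Theorem 4.2.9 takes only a few lines to apply; the following sketch (mine, assuming NumPy) counts the spanning trees of the 4-cycle, which has exactly four:

```python
import numpy as np

# Adjacency matrix of the 4-cycle v1-v2-v3-v4-v1:
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])
L = np.diag(A.sum(axis=1)) - A     # the matrix A' of Theorem 4.2.9

# Delete the first row and the first column and take the determinant:
t = round(np.linalg.det(L[1:, 1:]))
assert t == 4                      # omitting any one edge leaves a tree
```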

In Sect. 4.8, we will give a different proof for Theorem 4.2.9 which avoids using the theorem of Cauchy and Binet. The matrix A′ is called the degree matrix or the Laplacian matrix of G. As an example, let us consider the case of complete graphs and thus give a third proof for Corollary 1.2.11.

Example 4.2.10 In Corollary 1.2.11, we have encountered a formula for the number Tn of all trees on n vertices; note that Tn counts the different trees, not the isomorphism classes of trees. Subsequently, we presented a constructive proof for this result using the Prüfer code. We now use Theorem 4.2.9 to give a third proof. Obviously, the degree matrix of Kn is A′ = nI − J, where J is the n × n matrix with all entries equal to 1. By Theorem 4.2.9, the number of trees on n vertices is the value of a minor of A′, that is, the determinant of the matrix nI − J, where the determinant has size (n − 1) × (n − 1) and I and J now denote the identity matrix and the all-ones matrix of that size. Using elementary row and column transformations, this determinant is easily evaluated as

Tn = n^{n−2}.
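A minimal numeric check (mine, assuming NumPy) of Cayley's formula Tn = n^{n−2} via the minor of nI − J:

```python
import numpy as np

# For small n, the (n-1)x(n-1) minor of A' = nI - J equals n^(n-2):
for n in range(2, 8):
    minor = n * np.eye(n - 1) - np.ones((n - 1, n - 1))
    assert round(np.linalg.det(minor)) == n ** (n - 2)
```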

The following exercise concerns a similar application of the matrix tree theorem; see [FieSe58]. A simple direct proof can be found in [Abu90] where this result is also used to give yet another proof for Corollary 1.2.11.

Exercise 4.2.11 Use Theorem 4.2.9 to show that the number of spanning trees of the complete bipartite graph K_{m,n} is m^{n−1} n^{m−1}.
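Before proving the formula, one can test it numerically via Theorem 4.2.9; the helper below is mine (its name is an assumption, and NumPy is assumed to be available):

```python
import numpy as np

def spanning_trees_K_mn(m, n):
    """Count spanning trees of K_{m,n} via a minor of the Laplacian."""
    A = np.zeros((m + n, m + n))
    A[:m, m:] = 1                  # every left vertex meets every right one
    A[m:, :m] = 1
    L = np.diag(A.sum(axis=1)) - A
    return round(np.linalg.det(L[1:, 1:]))

for m, n in [(2, 2), (2, 3), (3, 3), (2, 4)]:
    assert spanning_trees_K_mn(m, n) == m ** (n - 1) * n ** (m - 1)
```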

Note that we can also define incidence matrices for graphs: the matrix M has entry m_ij = 1 if vertex i is incident with edge e_j, and m_ij = 0 otherwise. But the statements analogous to Lemma 4.2.2 and Theorem 4.2.3 do not hold; for example, the three columns of the incidence matrix of a cycle of length 3 are linearly independent over Z. However, the situation changes if we consider the incidence matrix M as a matrix over Z_2.

Exercise 4.2.12 Prove the analogues of Lemma 4.2.2 through Theorem 4.2.4 for graphs, where M is considered as a binary matrix.
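The contrast between the two ground fields can be seen directly on the triangle; the GF(2) elimination routine below is my own illustration (assuming NumPy), not part of the text:

```python
import numpy as np

def rank_gf2(M):
    """Rank of a 0/1 matrix over Z_2, by Gaussian elimination mod 2."""
    M = M.copy() % 2
    rank, col = 0, 0
    rows, cols = M.shape
    while rank < rows and col < cols:
        pivot = next((r for r in range(rank, rows) if M[r, col]), None)
        if pivot is None:
            col += 1
            continue
        M[[rank, pivot]] = M[[pivot, rank]]   # swap the pivot row up
        for r in range(rows):
            if r != rank and M[r, col]:
                M[r] = (M[r] + M[rank]) % 2   # clear the column
        rank, col = rank + 1, col + 1
    return rank

# Undirected incidence matrix of the cycle of length 3:
M = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1]])

assert round(np.linalg.det(M)) != 0   # columns independent over Z
assert rank_gf2(M) == 2               # but dependent over Z_2
```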

The incidence matrix M of a graph, considered as a matrix over the integers, is not totally unimodular in general, as the following exercise shows. Moreover, that exercise provides a further important characterization of bipartite graphs.


Exercise 4.2.13 Let G be a graph with incidence matrix M. Show that G is bipartite if and only if M is totally unimodular as a matrix over Z.

Hint: The proof that unimodularity of M is necessary is similar to the proof of Theorem 4.2.5. The converse can be proved indirectly.

Exercise 4.2.14 Let e be an edge of Kn. Determine the number of spanning trees of Kn \ e.

Exercise 4.2.15 Let G be a forest with n vertices and m edges. How many connected components does G have?

The final exercise of this section is a somewhat more demanding application of Theorem 4.2.9.

Exercise 4.2.16 Let F be a perfect matching of G = K2n. Determine the number of spanning trees of G \ F.

Hint: A direct evaluation of the appropriate determinant is rather unpleasant. A more elegant argument can be given by determining the eigenvalues of the matrix in question, that is, by finding a suitable set of linearly independent eigenvectors. Most of these are in fact rather obvious; only the final two eigenvectors are more difficult to find.

Sometimes, a list of all spanning trees of a given graph is needed, or an arbitrary choice of some spanning tree of G (a random spanning tree). These tasks are treated in [ColDN89]; in particular, it is shown that the latter problem can be solved with complexity O(|V|³).
