Let $E_3$ be a Euclidean vector space. The mapping
$$T : E_3 \to E_3 \qquad (1.39)$$
is \emph{linear} if, using the notation $T(\mathbf{u}) = T\mathbf{u}$, we have
$$T(a\mathbf{u} + b\mathbf{v}) = aT\mathbf{u} + bT\mathbf{v}, \qquad (1.40)$$
for all $a, b \in \mathbb{R}$ and for all $\mathbf{u}, \mathbf{v} \in E_3$.
We will let $\mathrm{Lin}(E_3)$ denote the set of linear mappings of $E_3$ into $E_3$. These mappings are also called \emph{endomorphisms}, \emph{Euclidean double tensors}, or \emph{2-tensors} of $E_3$.
The \emph{product} $ST$ of two tensors $S$ and $T$ is the composition of $S$ and $T$; that is, for all $\mathbf{u} \in E_3$ we have
$$(ST)\mathbf{u} = S(T\mathbf{u}). \qquad (1.41)$$
Generally, $ST \neq TS$; if $ST = TS$, then the two tensors are said to \emph{commute}. The \emph{transpose} $T^T$ of $T$ is the new tensor defined by the condition
$$\mathbf{u} \cdot T^T\mathbf{v} = \mathbf{v} \cdot (T\mathbf{u}), \quad \forall\, \mathbf{u}, \mathbf{v} \in E_3. \qquad (1.42)$$
1.7 Elements of Tensor Algebra

The following properties can be easily established:
$$(T + S)^T = T^T + S^T, \qquad (TS)^T = S^T T^T, \qquad (T^T)^T = T.$$
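As a quick numerical sanity check of these rules, one can identify each tensor with its $3 \times 3$ component matrix in an orthonormal basis (an assumption made here for simplicity, under which the tensor transpose becomes the ordinary matrix transpose); this sketch uses NumPy, which is not part of the text:

```python
# Numerical spot-check of (T+S)^T = T^T + S^T, (TS)^T = S^T T^T, (T^T)^T = T,
# with tensors identified with their component matrices in an orthonormal basis.
import numpy as np

rng = np.random.default_rng(0)
S = rng.standard_normal((3, 3))   # an arbitrary tensor S
T = rng.standard_normal((3, 3))   # an arbitrary tensor T

assert np.allclose((T + S).T, T.T + S.T)   # transpose of a sum
assert np.allclose((T @ S).T, S.T @ T.T)   # transpose of a product (order reverses)
assert np.allclose(T.T.T, T)               # transpose is an involution
```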
A tensor $T$ is \emph{symmetric} if
$$T = T^T, \qquad (1.43)$$
and \emph{skew-symmetric} if
$$T = -T^T. \qquad (1.44)$$
The \emph{tensor product} of vectors $\mathbf{u}$ and $\mathbf{v}$ is the tensor $\mathbf{u} \otimes \mathbf{v}$ such that
$$(\mathbf{u} \otimes \mathbf{v})\mathbf{w} = (\mathbf{v} \cdot \mathbf{w})\,\mathbf{u}, \quad \forall\, \mathbf{w} \in E_3. \qquad (1.45)$$
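In an orthonormal basis (again an assumption made only for this sketch), the component matrix of $\mathbf{u} \otimes \mathbf{v}$ is the outer product of the component columns, and (1.45) can be checked directly:

```python
# Sketch of (1.45): the matrix of u⊗v is np.outer(u, v), and
# (u⊗v)w = (v·w)u for every vector w.
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.0, 2.0])
w = np.array([0.5, 1.0, -2.0])

uv = np.outer(u, v)                           # component matrix of u⊗v
assert np.allclose(uv @ w, np.dot(v, w) * u)  # (u⊗v)w = (v·w)u
```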
The set $\mathrm{Lin}(E_3)$ of tensors defined on $E_3$ is itself a vector space with respect to the ordinary operations of addition and multiplication of a mapping by a real number.
For this space the following theorem holds:
\textbf{Theorem 1.3.} The vector space $\mathrm{Lin}(E_3)$ of Euclidean double tensors is 9-dimensional. The tensor systems $(\mathbf{e}_i \otimes \mathbf{e}_j)$, $(\mathbf{e}_i \otimes \mathbf{e}^j)$, and $(\mathbf{e}^i \otimes \mathbf{e}^j)$ are bases of $\mathrm{Lin}(E_3)$, so that the following representations of a Euclidean double tensor $T$ hold:
$$T = T^{ij}\,\mathbf{e}_i \otimes \mathbf{e}_j = T_j^{\;i}\,\mathbf{e}_i \otimes \mathbf{e}^j = T_{ij}\,\mathbf{e}^i \otimes \mathbf{e}^j. \qquad (1.46)$$
The matrices $(T^{ij})$, $(T_j^{\;i})$, and $(T_{ij})$ are called the \emph{contravariant}, \emph{mixed}, and \emph{covariant} components of $T$, respectively. Finally, the relations among the different components are given by the equations
$$T_j^{\;i} = g_{jh}T^{ih}, \qquad T_{ij} = g_{ih}g_{jk}T^{hk}. \qquad (1.47)$$
\emph{Proof.} To verify that the system $(\mathbf{e}_i \otimes \mathbf{e}_j)$ is a basis of $\mathrm{Lin}(E_3)$, it is sufficient to prove that it is a linearly independent system and that any tensor $T$ can be expressed as a linear combination of the elements of $(\mathbf{e}_i \otimes \mathbf{e}_j)$. First, from
$$\lambda^{ij}\,\mathbf{e}_i \otimes \mathbf{e}_j = \mathbf{0},$$
and from (1.45), (1.12), we derive
$$\lambda^{ij}(\mathbf{e}_i \otimes \mathbf{e}_j)(\mathbf{e}_h) = \lambda^{ij}(\mathbf{e}_j \cdot \mathbf{e}_h)\,\mathbf{e}_i = \lambda^{ij}g_{jh}\,\mathbf{e}_i = \mathbf{0}$$
for any $h = 1, 2, 3$. But the vectors $(\mathbf{e}_i)$ form a basis of $E_3$, so that the previous relation implies, for any index $i$, the homogeneous system
$$\lambda^{ij}g_{jh} = 0$$
of three equations in the three unknowns $\lambda^{ij}$, where $i$ is fixed. Since the determinant $\det(g_{jh})$ of this system is different from zero (see (1.14)), all the unknowns $\lambda^{ij}$ with fixed $i$ vanish. From the arbitrariness of $i$, the linear independence of the system $(\mathbf{e}_i \otimes \mathbf{e}_j)$ follows.
Similarly, from the linear combination
$$\lambda_j^{\;i}\,\mathbf{e}_i \otimes \mathbf{e}^j = \mathbf{0},$$
when (1.19) is taken into account, we have
$$\lambda_j^{\;i}(\mathbf{e}_i \otimes \mathbf{e}^j)(\mathbf{e}_h) = \lambda_j^{\;i}(\mathbf{e}^j \cdot \mathbf{e}_h)\,\mathbf{e}_i = \lambda_j^{\;i}\delta_h^j\,\mathbf{e}_i = \lambda_h^{\;i}\,\mathbf{e}_i = \mathbf{0},$$
so that $\lambda_h^{\;i} = 0$ for any choice of $i$ and $h$. In the same way, the independence of the tensors $(\mathbf{e}^i \otimes \mathbf{e}^j)$ can be proved.
To show that any tensor $T$ can be written as a linear combination of any one of these systems, we start by noting that, from (1.21) and the linearity of $T$, we have
$$T\mathbf{u} = u_j\,T\mathbf{e}^j.$$
On the other hand, $T\mathbf{e}^j$ is a vector of $E_3$ and therefore can be represented in the following form (see (1.21)):
$$T\mathbf{e}^j = T^{ij}\,\mathbf{e}_i.$$
If the definition of covariant components (1.23) is recalled together with (1.45), we have
$$T\mathbf{u} = T^{ij}u_j\,\mathbf{e}_i = T^{ij}(\mathbf{e}_i \otimes \mathbf{e}_j)(\mathbf{u}).$$
Finally, owing to the arbitrariness of $\mathbf{u}$, we obtain
$$T = T^{ij}\,\mathbf{e}_i \otimes \mathbf{e}_j.$$
To show $(1.46)_2$, it is sufficient to remark that
$$T\mathbf{u} = u^j\,T\mathbf{e}_j,$$
so that, introducing the notation $T\mathbf{e}_j = T_j^{\;i}\,\mathbf{e}_i$, we derive
$$T\mathbf{u} = T_j^{\;i}u^j\,\mathbf{e}_i.$$
Noting that (1.23) and (1.22) hold, we have $u^j = g^{ji}u_i = g^{ji}\,\mathbf{e}_i \cdot \mathbf{u} = \mathbf{e}^j \cdot \mathbf{u}$, and the previous relation gives
$$T\mathbf{u} = T_j^{\;i}(\mathbf{e}_i \otimes \mathbf{e}^j)(\mathbf{u}).$$
From this expression, due to the arbitrariness of $\mathbf{u}$, $(1.46)_2$ is derived. In a similar way, $(1.46)_3$ can be proved.
Relation (1.47) is easily verified using (1.46) and the definition of the reciprocal basis (1.18), so that the theorem is proved. $\square$
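Theorem 1.3 can be illustrated numerically. The sketch below (using NumPy; the particular non-orthonormal basis is an arbitrary choice, not taken from the text) computes covariant and contravariant components of a linear map and checks the lowering relation in (1.47):

```python
# Components of a tensor T in a non-orthonormal basis, and a check of
# T_ij = g_ih g_jk T^hk from (1.47).
import numpy as np

E = np.array([[1.0, 0.0, 0.0],        # rows: basis vectors e_1, e_2, e_3
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 2.0]])
g = E @ E.T                            # metric g_ij = e_i · e_j
g_inv = np.linalg.inv(g)               # contravariant metric g^ij
E_rec = g_inv @ E                      # reciprocal basis e^i = g^ij e_j

M = np.array([[2.0, 0.0, 1.0],        # T as a matrix in the standard
              [0.0, 1.0, 0.0],        # orthonormal frame
              [1.0, 0.0, 3.0]])

T_cov = np.einsum("ia,ab,jb->ij", E, M, E)            # T_ij  = e_i · (T e_j)
T_con = np.einsum("ia,ab,jb->ij", E_rec, M, E_rec)    # T^ij  = e^i · (T e^j)

# (1.47): lowering both indices of T^hk with the metric recovers T_ij
assert np.allclose(T_cov, np.einsum("ih,jk,hk->ij", g, g, T_con))
```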
\textbf{Example 1.1.} Verify that the matrices of contravariant or covariant components of a symmetric (skew-symmetric) tensor are symmetric (skew-symmetric), using the definition of transpose and symmetry.

The symmetry of $T$ ($T = T^T$) and (1.42) imply that
$$\mathbf{v} \cdot T\mathbf{u} = \mathbf{u} \cdot T\mathbf{v},$$
so that
$$v_i T^{ij} u_j = u_i T^{ij} v_j = u_j T^{ji} v_i.$$
The arbitrariness of $\mathbf{u}$ and $\mathbf{v}$ leads to $T^{ij} = T^{ji}$. In a similar way, it can be proved that $T^{ij} = -T^{ji}$ when $T$ is skew-symmetric.
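Example 1.1 can also be checked numerically. In the sketch below, tensors are again identified with their component matrices in an orthonormal basis (an assumption made only here), and an arbitrary matrix is split into symmetric and skew-symmetric parts:

```python
# Symmetric part satisfies v·(Su) = u·(Sv); skew part satisfies v·(Wu) = -u·(Wv).
import numpy as np

A = np.random.default_rng(1).standard_normal((3, 3))
sym = 0.5 * (A + A.T)    # symmetric part: matrix equals its transpose
skw = 0.5 * (A - A.T)    # skew-symmetric part: matrix equals minus its transpose

u = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 3.0, -1.0])

assert np.isclose(v @ sym @ u, u @ sym @ v)      # defining identity (1.42) for T = T^T
assert np.isclose(v @ skw @ u, -(u @ skw @ v))   # and for T = -T^T
```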
\textbf{Example 1.2.} Using the different representations (1.46), verify that the contravariant, mixed, and covariant components of the unit or identity tensor $I$,
$$I\mathbf{u} = \mathbf{u}, \quad \forall\, \mathbf{u} \in E_3,$$
are given by the following matrices, respectively:
$$(g^{ij}), \qquad (\delta_j^i), \qquad (g_{ij}).$$
Starting from the condition defining $I$, written in terms of components,
$$(I^{ij}\,\mathbf{e}_i \otimes \mathbf{e}_j)(u^h\mathbf{e}_h) = u^i\mathbf{e}_i,$$
we derive the equation
$$I^{ih}u_h\,\mathbf{e}_i = g^{hi}u_h\,\mathbf{e}_i,$$
which implies that $I^{ih} = g^{ih}$. In a similar way the other results follow.
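A numerical companion to Example 1.2 (the non-orthonormal basis below is an arbitrary choice for illustration): the covariant components of $I$ form the metric $(g_{ij})$, the contravariant ones its inverse $(g^{ij})$, and contracting the two gives the mixed components $(\delta_j^i)$:

```python
# Components of the identity tensor in a non-orthonormal basis.
import numpy as np

E = np.array([[1.0, 0.0, 0.0],   # rows: basis vectors e_1, e_2, e_3
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 2.0]])
g = E @ E.T                       # covariant components of I: g_ij = e_i · e_j
g_inv = np.linalg.inv(g)          # contravariant components of I: g^ij

# mixed components: g^ih g_hj = delta^i_j
assert np.allclose(g_inv @ g, np.eye(3))
```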
The \emph{image} of $T$, denoted by $\mathrm{Im}(T)$, is the subset of $E_3$ such that
$$\mathrm{Im}(T) = \{\mathbf{v} \in E_3 \mid \exists\, \mathbf{u} \in E_3 : \mathbf{v} = T\mathbf{u}\}, \qquad (1.48)$$
whereas the \emph{kernel} of $T$, denoted by $\mathrm{Ker}(T)$, is the subset of $E_3$ such that
$$\mathrm{Ker}(T) = \{\mathbf{u} \in E_3 \mid T\mathbf{u} = \mathbf{0}\}. \qquad (1.49)$$
\textbf{Theorem 1.4.} $\mathrm{Im}(T)$ and $\mathrm{Ker}(T)$ are vector subspaces of $E_3$. Moreover, $T$ has an inverse if and only if $\mathrm{Ker}(T) = \{\mathbf{0}\}$.

\emph{Proof.} In fact, if $\mathbf{v}_1 = T\mathbf{u}_1 \in \mathrm{Im}(T)$ and $\mathbf{v}_2 = T\mathbf{u}_2 \in \mathrm{Im}(T)$, we have
$$a\mathbf{v}_1 + b\mathbf{v}_2 = aT\mathbf{u}_1 + bT\mathbf{u}_2 = T(a\mathbf{u}_1 + b\mathbf{u}_2),$$
so that $a\mathbf{v}_1 + b\mathbf{v}_2 \in \mathrm{Im}(T)$. Moreover, if $\mathbf{u}_1, \mathbf{u}_2 \in \mathrm{Ker}(T)$, we have $T\mathbf{u}_1 = T\mathbf{u}_2 = \mathbf{0}$ and therefore
$$\mathbf{0} = aT\mathbf{u}_1 + bT\mathbf{u}_2 = T(a\mathbf{u}_1 + b\mathbf{u}_2),$$
so that $a\mathbf{u}_1 + b\mathbf{u}_2 \in \mathrm{Ker}(T)$. Finally, if $T$ has an inverse, the condition $T(\mathbf{0}) = \mathbf{0}$ implies that the inverse image of the zero vector of $E_3$ contains only the zero vector, i.e., $\mathrm{Ker}(T) = \{\mathbf{0}\}$. Conversely, suppose that $\mathrm{Ker}(T) = \{\mathbf{0}\}$ and that there exist two vectors $\mathbf{u}', \mathbf{u}'' \in E_3$ such that $T\mathbf{u}' = T\mathbf{u}''$. Then $T(\mathbf{u}' - \mathbf{u}'') = \mathbf{0}$, so that $\mathbf{u}' - \mathbf{u}'' \in \mathrm{Ker}(T)$ and therefore $\mathbf{u}' - \mathbf{u}'' = \mathbf{0}$; that is, $T$ is injective and, $E_3$ being finite-dimensional, invertible. $\square$

As a consequence of the previous results, we can say that $T$ is an isomorphism if and only if
$$\mathrm{Im}(T) = E_3, \qquad \mathrm{Ker}(T) = \{\mathbf{0}\}. \qquad (1.50)$$
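The relationship between kernel, image, and invertibility can be illustrated numerically (a NumPy sketch with arbitrarily chosen matrices; the rank of the component matrix equals the dimension of $\mathrm{Im}(T)$):

```python
# A singular map has a nontrivial kernel; a full-rank map is invertible.
import numpy as np

T_sing = np.array([[1.0, 2.0, 3.0],
                   [2.0, 4.0, 6.0],   # row 2 = 2 * row 1, so T is singular
                   [0.0, 1.0, 1.0]])
assert np.linalg.matrix_rank(T_sing) == 2   # dim Im(T) = 2, hence dim Ker(T) = 1

# a kernel vector: right-singular vector for the zero singular value
_, s, Vt = np.linalg.svd(T_sing)
k = Vt[-1]
assert np.allclose(T_sing @ k, 0.0)         # k lies in Ker(T)

T_reg = np.eye(3) + 0.1 * np.ones((3, 3))   # full rank, so Ker(T) = {0}
assert np.linalg.matrix_rank(T_reg) == 3
```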
\textbf{Theorem 1.5.} Let us consider a tensor $T \in \mathrm{Lin}(E_3)$ and a basis $(\mathbf{e}_i)$ of $E_3$. Then the following conditions are equivalent:

1. $T$ is an isomorphism;
2. the vectors $T\mathbf{e}_i$ represent a basis of $E_3$;
3. the representative matrix $(T_j^{\;i})$ of $T$ with respect to the basis $(\mathbf{e}_i)$ is not singular.
\emph{Proof.} $1 \Longrightarrow 2$. In fact, from the condition $\lambda^i\,T\mathbf{e}_i = T(\lambda^i\mathbf{e}_i) = \mathbf{0}$ and $(1.50)_2$, it follows that $\lambda^i\mathbf{e}_i = \mathbf{0}$; consequently, $\lambda^i = 0$, since $(\mathbf{e}_i)$ is a basis, and the vectors $T\mathbf{e}_i$ are independent. Moreover, if $T$ is an isomorphism, then for any $\mathbf{v} \in E_3$ there exists a unique $\mathbf{u} \in E_3$ such that $\mathbf{v} = T\mathbf{u} = u^i\,T\mathbf{e}_i$, so that the vectors $T\mathbf{e}_i$ represent a basis of $E_3$.

$2 \Longrightarrow 3$. In fact, the condition $\lambda^i\,T\mathbf{e}_i = \mathbf{0}$ can be written $\lambda^i T_i^{\;h}\mathbf{e}_h = \mathbf{0}$, so that, due to condition 2, the homogeneous system $\lambda^i T_i^{\;h} = 0$ must admit only the vanishing solution, and therefore the matrix $(T_j^{\;i})$ is not singular.
$3 \Longrightarrow 1$. In terms of components, $\mathbf{v} = T\mathbf{u}$ is expressed by the system
$$v^i = T_h^{\;i}u^h. \qquad (1.51)$$
For any choice of the vector $\mathbf{v} \in E_3$, this system can be considered as a linear system of three equations in the three unknowns $u^h$, which admits one and only one solution when $(T_j^{\;i})$ is not singular. Consequently, $T$ is an isomorphism. $\square$

To conclude this section, the transformation rules of the components of a Euclidean second-order tensor under a base change $(\mathbf{e}_i) \to (\mathbf{e}'_i)$ will be derived. First, from (1.46), we have
$$T = T^{ij}\,\mathbf{e}_i \otimes \mathbf{e}_j = T'^{hk}\,\mathbf{e}'_h \otimes \mathbf{e}'_k,$$
so that, recalling (1.25), we find
$$(A^{-1})^h_i (A^{-1})^k_j\,T^{ij}\,\mathbf{e}'_h \otimes \mathbf{e}'_k = T'^{hk}\,\mathbf{e}'_h \otimes \mathbf{e}'_k.$$
Since the tensors $(\mathbf{e}'_h \otimes \mathbf{e}'_k)$ form a basis of $\mathrm{Lin}(E_3)$, we have
$$T'^{hk} = (A^{-1})^h_i (A^{-1})^k_j\,T^{ij}. \qquad (1.52)$$
Similarly, starting from $(1.46)_{2,3}$ and taking into account (1.25), we can derive the following transformation formulae for mixed and covariant components:
$$T'^{\;h}_k = (A^{-1})^h_i A^j_k\,T_j^{\;i}, \qquad (1.53)$$
$$T'_{hk} = A^i_h A^j_k\,T_{ij}. \qquad (1.54)$$

\textbf{Remark 1.1.} In the following sections we will often use the notation $\mathbf{u}T$. It denotes the linear mapping $E_3 \to E_3$ that, in terms of components, is written
$$v^i = u_j T^{ji}. \qquad (1.55)$$
When the tensor $T$ is symmetric, we have
$$\mathbf{u}T = T\mathbf{u}.$$
\textbf{Remark 1.2.} It is worthwhile noting that the relations (1.51) to (1.54) can be written adopting matrix notation. For instance, in agreement with the convention that the product of two matrices is calculated by rows times columns, the following form can be given to (1.54):
$$\mathsf{T}' = A^T\,\mathsf{T}\,A, \qquad (1.56)$$
where $\mathsf{T}$ and $\mathsf{T}'$ are the matrices formed with the components of $T$ with respect to the bases $(\mathbf{e}_i)$ and $(\mathbf{e}'_i)$, respectively, and $A$ denotes the matrix of the base change $\mathbf{e}_i \to \mathbf{e}'_i$ (see (1.25)). It is also important to note that in the literature, usually, the same symbol denotes both the tensor $T$ and its representative matrix $\mathsf{T}$ in a fixed basis. Consequently, relation (1.56) is also written as
$$T' = A^T\,T\,A.$$
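The agreement between the index form (1.54) and the matrix form (1.56) can be verified numerically (a NumPy sketch with arbitrary component and base-change matrices):

```python
# Check that T'_hk = A^i_h A^j_k T_ij (1.54) coincides with T' = A^T T A (1.56).
import numpy as np

rng = np.random.default_rng(2)
T = rng.standard_normal((3, 3))     # covariant components in the old basis
A = rng.standard_normal((3, 3))     # base-change matrix (arbitrary choice)

T_new = A.T @ T @ A                               # matrix form (1.56)
T_idx = np.einsum("ih,jk,ij->hk", A, A, T)        # index form (1.54)
assert np.allclose(T_new, T_idx)
```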