

Tensors

It is now convenient to briefly review the mathematics of tensors. Tensors are of primary importance in connection with coordinate transformations. They serve to isolate intrinsic geometric and physical properties from those that merely depend on the choice of coordinates.

A tensor of rank $ r$ in an $ n$ -dimensional space possesses $ n^r$ components which are, in general, functions of position in that space. A tensor of rank zero has one component, $ A$ , and is called a scalar. A tensor of rank one has $ n$ components, $ (A_1, A_2, \cdots, A_n)$ , and is called a vector. A tensor of rank two has $ n^2$ components, which can be exhibited in matrix format. Unfortunately, there is no convenient way of exhibiting a higher rank tensor. Consequently, tensors are usually represented by a typical component: for instance, the tensor $ A_{ijk}$ (rank 3), or the tensor $ A_{ijkl}$ (rank 4), et cetera. The suffixes $ i,j,k,\cdots$ are always understood to range from 1 to $ n$ .

For reasons that will become apparent later on, we shall represent tensor components using both superscripts and subscripts. Thus, a typical tensor might look like $ A^{ij}$ (rank 2), or $ B_j^i$ (rank 2), et cetera. It is convenient to adopt the Einstein summation convention. Namely, if any suffix appears twice in a given term, once as a subscript and once as a superscript, a summation over that suffix (from 1 to $ n$ ) is implied.
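
For instance, with this convention the expression $ A^i\, B_i$ stands for the sum

$\displaystyle A^i\, B_i = A^1\, B_1 + A^2\, B_2 + \cdots + A^n\, B_n,$

whereas a suffix that appears only once in a term (a free suffix) simply ranges over its values without being summed.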

To distinguish between various different coordinate systems, we shall use primed and multiply primed suffixes. A first system of coordinates $ (x^{\,1}, x^{\,2}, \cdots, x^{\,n})$ can then be denoted by $ x^{\,i}$ , a second system $ (x^{\,1'}, x^{\,2'}, \cdots, x^{\,n'})$ by $ x^{\,i'}$ , et cetera. Similarly, the general components of a tensor in various coordinate systems are distinguished by their suffixes. Thus, the components of some third-rank tensor are denoted by $ A_{ijk}$ in the $ x^{\,i}$ system, by $ A_{i'j'k'}$ in the $ x^{\,i'}$ system, et cetera.

When making a coordinate transformation from one set of coordinates, $ x^{\,i}$ , to another, $ x^{\,i'}$ , it is assumed that the transformation is non-singular. In other words, the equations that express the $ x^{\,i'}$ in terms of the $ x^{\,i}$ can be inverted to express the $ x^{\,i}$ in terms of the $ x^{\,i'}$ . It is also assumed that the functions specifying a transformation are differentiable. It is convenient to write

$\displaystyle \frac{\partial x^{\,i'}}{\partial x^{\,i}}$ $\displaystyle = p_i^{\,i'},$ (1664)
$\displaystyle \frac{\partial x^{\,i}}{\partial x^{\,i'}}$ $\displaystyle = p_{i'}^{\,i}.$ (1665)

Note that

$\displaystyle p_{i'}^{\,i}\, p_{j}^{\,i'} = \delta_j^{\,i},$ (1666)

by the chain rule, where $ \delta_j^{\,i}$ (the Kronecker delta) equals 1 or 0 when $ i=j$ or $ i\neq j$ , respectively.
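
Written out explicitly, the product in (1666) is

$\displaystyle p_{i'}^{\,i}\, p_{j}^{\,i'} = \frac{\partial x^{\,i}}{\partial x^{\,i'}}\, \frac{\partial x^{\,i'}}{\partial x^{\,j}} = \frac{\partial x^{\,i}}{\partial x^{\,j}} = \delta_j^{\,i},$

the summation over $ i'$ being implied by the summation convention.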

The formal definition of a tensor is as follows:

  1. An entity having components $ A_{ij\cdots k}$ in the $ x^{\,i}$ system and $ A_{i'j'\cdots k'}$ in the $ x^{\,i'}$ system is said to behave as a covariant tensor under the transformation $ x^i\rightarrow x^{i'}$ if

    $\displaystyle A_{\,i'j'\cdots k'} = A_{ij\cdots k} \,p_{i'}^{\,i} \,p_{j'}^{\,j} \cdots p_{k'}^{\,k}.$ (1667)

  2. Similarly, $ A^{ij\cdots k}$ is said to behave as a contravariant tensor under $ x^{\,i}\rightarrow x^{\,i'}$ if

    $\displaystyle A^{i'j'\cdots k'} = A^{ij\cdots k} p_i^{\,i'} \,p_j^{\,j'} \cdots p_k^{\,k'}.$ (1668)

  3. Finally, $ A^{i\cdots j}_{k\cdots l}$ is said to behave as a mixed tensor (contravariant in $ i\cdots j$ and covariant in $ k\cdots l$ ) under $ x^{\,i}\rightarrow x^{\,i'}$ if

    $\displaystyle A_{k'\cdots l'}^{i'\cdots j'} = A_{k\cdots l}^{i\cdots j}\, p_i^{\,i'} \cdots p_j^{\,j'} \,p^{\,k}_{k'}\cdots p^{\,l}_{l'}.$ (1669)

When an entity is described as a tensor it is generally understood that it behaves as a tensor under all non-singular differentiable transformations of the relevant coordinates. An entity that only behaves as a tensor under a certain subgroup of non-singular differentiable coordinate transformations is called a qualified tensor, because its name is conventionally qualified by an adjective recalling the subgroup in question. For instance, an entity that only exhibits tensor behavior under Lorentz transformations is called a Lorentz tensor, or, more commonly, a 4-tensor.

When applied to a tensor of rank zero (a scalar), the previous definitions imply that $ A'= A$ . Thus, a scalar is a function of position only, and is independent of the coordinate system. A scalar is often termed an invariant.

The main theorem of tensor calculus is as follows:

If two tensors of the same type are equal in one coordinate system then they are equal in all coordinate systems.
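
This follows because the transformation laws (1667)-(1669) are linear in the components: if the components of the difference of the two tensors all vanish in one coordinate system, then they vanish in every other system. For a covariant tensor, for instance,

$\displaystyle A_{i'j'\cdots k'} - B_{i'j'\cdots k'} = (A_{ij\cdots k} - B_{ij\cdots k})\, p_{i'}^{\,i}\, p_{j'}^{\,j} \cdots p_{k'}^{\,k} = 0.$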

The simplest example of a contravariant vector (tensor of rank one) is provided by the differentials of the coordinates, $ dx^{\,i}$ , because

$\displaystyle dx^{\,i'} = \frac{\partial x^{\,i'}}{\partial x^{\,i}} \,dx^{\,i} = dx^{\,i}\, p_i^{\,i'}.$ (1670)

The coordinates themselves do not behave as tensors under all coordinate transformations. However, because they transform like their differentials under linear homogeneous coordinate transformations, they do behave as tensors under such transformations.
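
Indeed, a linear homogeneous transformation takes the form

$\displaystyle x^{\,i'} = p_i^{\,i'}\, x^{\,i},$

with the $ p_i^{\,i'}$ constant, which is precisely the contravariant transformation law (1668) applied to the coordinates themselves.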

The simplest example of a covariant vector is provided by the gradient of a function of position $ \phi=\phi(x^{\,1}, \cdots, x^{\,n})$ , because if we write

$\displaystyle \phi_i = \frac{\partial\phi}{\partial x^{\,i}},$ (1671)

then we have

$\displaystyle \phi_{i'} = \frac{\partial\phi}{\partial x^{\,i'}} = \frac{\partial\phi}{\partial x^{\,i}}\, \frac{\partial x^{\,i}}{\partial x^{\,i'}} =\phi_i \,p_{i'}^{\,i}.$ (1672)

An important example of a mixed second-rank tensor is provided by the Kronecker delta introduced previously, because

$\displaystyle \delta_j^{\,i} \,p_i^{\,i'}\,p_{j'}^{\,j} = p_{j}^{\,i'} \,p_{j'}^{\,j} = \delta_{j'}^{\,i'}.$ (1673)

Tensors of the same type can be added or subtracted to form new tensors. Thus, if $ A_{ij}$ and $ B_{ij}$ are tensors, then $ C_{ij} = A_{ij} \pm B_{ij}$ is a tensor of the same type. Note that the sum of tensors at different points in space is not a tensor if the $ p$ 's are position dependent. However, under linear coordinate transformations the $ p$ 's are constant, so the sum of tensors at different points behaves as a tensor under this particular type of coordinate transformation.
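
As a quick check, for two covariant tensors of rank two, (1667) gives

$\displaystyle C_{i'j'} = A_{i'j'} \pm B_{i'j'} = (A_{ij} \pm B_{ij})\, p_{i'}^{\,i}\, p_{j'}^{\,j} = C_{ij}\, p_{i'}^{\,i}\, p_{j'}^{\,j},$

so $ C_{ij}$ transforms in the required manner, provided that both tensors are evaluated at the same point (so that the same $ p$ 's appear in each term).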

If $ A^{ij}$ and $ B_{ijk}$ are tensors, then $ C^{\,ij}_{klm} = A^{ij}\, B_{klm}$ is a tensor of the type indicated by the suffixes. The process illustrated by this example is called outer multiplication of tensors.

Tensors can also be combined by inner multiplication, which implies at least one dummy suffix link. Thus, $ C_{\,kl}^j = A^{ij} \,B_{ikl}$ and $ C_k = A^{ij} \,B_{ijk}$ are tensors of the type indicated by the suffixes.

Finally, tensors can be formed by contraction from tensors of higher rank. Thus, if $ A_{klm}^{ij}$ is a tensor then $ C_{kl}^{\,j} = A_{ikl}^{ij}$ and $ C_k = A_{kij}^{ij}$ are tensors of the type indicated by the suffixes. The most important type of contraction occurs when no free suffixes remain: the result is a scalar. Thus, $ A_i^i$ is a scalar provided that $ A_i^j$ is a tensor.
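
The scalar case is easily verified: combining (1669) with (1666),

$\displaystyle A_{i'}^{i'} = A_j^i\, p_i^{\,i'}\, p_{i'}^{\,j} = A_j^i\, \delta_i^{\,j} = A_i^i,$

so that the contracted quantity takes the same value in every coordinate system.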

Although we cannot usefully divide tensors, one by another, an entity like $ C^{\,ij}$ in the equation $ A^i = C^{\,ij} \,B_j$ , where $ A^i$ and $ B_j$ are tensors, can be formally regarded as the quotient of $ A^i$ and $ B_j$ . This provides the name for a particularly useful rule for recognizing tensors, the quotient rule. This rule states that if a set of components, when combined by a given type of multiplication with all tensors of a given type, yields a tensor, then the set is itself a tensor. In other words, if the product $ A^i = C^{\,ij} \,B_j$ transforms like a tensor for all tensors $ B_j$ then it follows that $ C^{\,ij}$ is a tensor.

Let

$\displaystyle \frac{\partial A_{k\cdots l}^{i\cdots j} }{\partial x^{\,m}} = A^{i \cdots j}_{k\cdots l,m}.$ (1674)

Then if $ A_{k\cdots l}^{i\cdots j}$ is a tensor, differentiation of the general tensor transformation (1669) yields

$\displaystyle A_{k'\cdots l', m'}^{i'\cdots j'} = A_{k\cdots l,m}^{i\cdots j}\, p_i^{\,i'} \cdots p_j^{\,j'}\, p_{k'}^{\,k} \cdots p_{l'}^{\,l}\, p_{m'}^{\,m} + P_1 + P_2 + \cdots,$ (1675)

where $ P_1, P_2$ , et cetera, are terms involving derivatives of the $ p$ 's. Clearly, $ A_{k\cdots l,m}^{i\cdots j}$ is not a tensor under a general coordinate transformation. However, under a linear coordinate transformation ($ p$ 's constant) $ A_{k'\cdots l', m'}^{i'\cdots j'}$ behaves as a tensor of the type indicated by the suffixes, because the terms $ P_1, P_2$ , et cetera, all vanish. Similarly, all higher partial derivatives,

$\displaystyle A_{k\cdots l,mn}^{i\cdots j} =\frac{\partial^{\,2} A_{k\cdots l}^{i\cdots j} } {\partial x^{\,m}\, \partial x^{\,n}},$ (1676)

et cetera, also behave as tensors under linear transformations. Each partial differentiation has the effect of adding a new covariant suffix.

So far, the space to which the coordinates $ x^{\,i}$ refer has been without structure. We can impose a structure on it by defining the distance between all pairs of neighboring points by means of a metric,

$\displaystyle ds^{\,2} = g_{ij}\, dx^{\,i} \,dx^{\,j},$ (1677)

where the $ g_{ij}$ are functions of position. We can assume that $ g_{ij} = g_{ji}$ without loss of generality. The previous metric is analogous to, but more general than, the metric of Euclidean $ n$ -space, $ ds^{\,2} = (dx^{\,1})^2 +(dx^{\,2})^2 + \cdots + (dx^{\,n})^2$ . A space whose structure is determined by a metric of the type (1677) is called Riemannian. Because $ ds^{\,2}$ is invariant, it follows from a simple extension of the quotient rule that $ g_{ij}$ must be a tensor. It is called the metric tensor.
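
As a simple illustration, the Euclidean plane referred to plane polar coordinates, $ (x^{\,1}, x^{\,2}) = (r,\theta)$ , has the metric

$\displaystyle ds^{\,2} = dr^{\,2} + r^{\,2}\, d\theta^{\,2},$

so that $ g_{11} = 1$ , $ g_{22} = r^{\,2}$ , and $ g_{12} = g_{21} = 0$ . The $ g_{ij}$ are position dependent even though the space itself is flat.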

The elements of the inverse of the matrix $ g_{ij}$ are denoted by $ g^{\,ij}$ . These elements are uniquely defined by the equations

$\displaystyle g^{\,ij} g_{jk} = \delta_k^{\,i}.$ (1678)

It is easily seen that the $ g^{\,ij}$ constitute the elements of a contravariant tensor. This tensor is said to be conjugate to $ g_{ij}$ . The conjugate metric tensor is symmetric (i.e., $ g^{\,ij} = g^{\,ji}$ ) just like the metric tensor itself.
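
For a diagonal metric, the elements of the conjugate tensor are simply the reciprocals of the corresponding diagonal elements. In the plane polar example given previously, $ g^{\,11} = 1$ , $ g^{\,22} = 1/r^{\,2}$ , and $ g^{\,12} = g^{\,21} = 0$ .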

The tensors $ g_{ij}$ and $ g^{\,ij}$ allow us to introduce the important operations of raising and lowering suffixes. These operations consist of forming inner products of a given tensor with $ g_{ij}$ or $ g^{\,ij}$ . For example, given a contravariant vector $ A^i$ , we define its covariant components $ A_i$ by the equation

$\displaystyle A_i = g_{ij}\, A^j.$ (1679)

Conversely, given a covariant vector $ B_i$ , we can define its contravariant components $ B^{\,i}$ by the equation

$\displaystyle B^{\,i} = g^{\,ij}\, B_j.$ (1680)

More generally, we can raise or lower any or all of the free suffixes of any given tensor. Thus, if $ A_{ij}$ is a tensor we define $ {A^i}_j$ by the equation

$\displaystyle {A^i}_j = g^{\,ip} A_{pj}.$ (1681)

Note that, once the operations of raising and lowering suffixes have been defined, the order of raised suffixes relative to lowered suffixes becomes significant.
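
In the plane polar example, for instance, lowering the suffix of a contravariant vector $ A^i$ gives $ A_1 = g_{11}\, A^1 = A^1$ and $ A_2 = g_{22}\, A^2 = r^{\,2} A^2$ , whereas raising the suffix again with $ g^{\,ij}$ recovers the original components, by virtue of (1678).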

By analogy with Euclidean space, we define the squared magnitude $ (A)^2$ of a vector $ A^i$ with respect to the metric $ g_{ij}\,dx^{\,i}\, dx^{\,j}$ by the equation

$\displaystyle (A)^2 = g_{ij}\, A^i\, A^j = A_i\, A^i.$ (1682)
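
In the plane polar example, for instance, $ (A)^2 = (A^1)^2 + r^{\,2}\, (A^2)^2$ .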

A vector $ A^i$ is termed a null vector if $ (A)^2=0$ . Two vectors $ A^i$ and $ B^i$ are said to be orthogonal if their inner product vanishes: that is, if

$\displaystyle g_{ij} \,A^i\, B^{\,j} = A_i \,B^{\,i} = A^i\, B_i = 0.$ (1683)

Finally, let us consider differentiation with respect to an invariant distance, $ s$ . The vector $ dx^{\,i}/ds$ is a contravariant tensor, because

$\displaystyle \frac{dx^{\,i'}}{ds} = \frac{\partial x^{\,i'}}{\partial x^{\,i}} \frac{d x^{\,i}}{ds} = \frac{dx^{\,i}}{ds}\, p_i^{\,i'}.$ (1684)

The derivative $ d({A^{i\cdots j}}_{k\cdots l})/ds$ of some tensor with respect to $ s$ is not, in general, a tensor, because

$\displaystyle \frac{d({A^{i\cdots j}}_{k\cdots l})}{ds} = {A^{i\cdots j}}_{k\cdots l,m} \frac{d x^{\,m}}{ds},$ (1685)

and, as we have seen, the first factor on the right-hand side is not generally a tensor. However, under linear transformations that factor does behave as a tensor, so, for such transformations, the derivative of a tensor with respect to an invariant distance behaves as a tensor of the same type.


Richard Fitzpatrick 2014-06-27