
Tensor (intrinsic definition)


In mathematics, the modern component-free approach to the theory of tensors views tensors initially as abstract objects, expressing some definite type of multilinear concept. Their well-known properties can be derived from their definitions as linear maps or, more generally, multilinear maps; the rules for manipulating tensors then arise as an extension of linear algebra to multilinear algebra.

In differential geometry an intrinsic geometric statement may be described by a tensor field on a manifold, and then need not make any reference to coordinates at all. The same is true in general relativity of tensor fields describing a physical property. The component-free approach is also used heavily in abstract algebra and homological algebra, where tensors arise naturally.

Note: This is only a definition of the tensor product between vector spaces with a chosen basis. The notion of tensor product can be generalised to vector spaces without a chosen basis, and even further, to modules. But the article is still fairly abstract. If you are baffled by this, try reading the main tensor article and the classical treatment first.


Definition

Let V and W be two vector spaces over a common field F with bases $\{ \mathbf{v}_i \}$ and $\{ \mathbf{w}_j \}$. Their tensor product

$V \otimes W$

is a vector space over the same field F together with a bilinear map

$\otimes: V \times W \to V \otimes W$

with the basis

$\{ \mathbf{v}_i \} \otimes \{ \mathbf{w}_j \} = \{ \mathbf{v}_i \otimes \mathbf{w}_j \}$

Note that here the same symbol $\otimes$ has been used in two different (albeit related) senses: one between vector spaces, and one as the bilinear map.

If V and W are both finite-dimensional, then the dimension of $V \otimes W$ is the product of the dimensions of V and W. By iteration, the tensor product can be applied to more than two vector spaces.
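As a concrete sketch (the vectors and numbers here are illustrative, not from the text), the tensor product of two vectors with respect to fixed bases is the outer product of their component arrays, and the dimensions multiply as stated:

```python
import numpy as np

# Illustrative sketch: with bases fixed, the tensor product of two vectors
# v in V and w in W has components v_i * w_j (the outer product).
v = np.array([1.0, 2.0, 3.0])    # dim V = 3
w = np.array([4.0, 5.0])         # dim W = 2

t = np.tensordot(v, w, axes=0)   # an element of V ⊗ W, shape (3, 2)
assert t.shape == (3, 2)         # dim(V ⊗ W) = dim V * dim W = 6
assert t[1, 0] == 2.0 * 4.0      # component (v ⊗ w)_{ij} = v_i w_j

# Not every element of V ⊗ W is such a "pure" product, but the products
# of basis vectors v_i ⊗ w_j form a basis of the whole 6-dimensional space.
```

A general element of $V \otimes W$ is a linear combination of such pure products, which is why the dimensions multiply rather than add.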

A tensor on the vector space V is then defined to be an element of (i.e. a vector in) the following vector space:

$V \otimes ... \otimes V \otimes V^* \otimes ... \otimes V^*$

where V* is the dual space of V.

If there are m copies of V and n copies of V* in our product, the tensor is said to be of type (m, n), with contravariant rank m and covariant rank n. The tensors of rank zero are just the scalars (elements of the field F), those of contravariant rank 1 are the vectors in V, and those of covariant rank 1 are the one-forms in V* (for this reason the last two spaces are often called the contravariant and covariant vectors).

Note that the (1,1) tensors

$V \otimes V^*$

are isomorphic in a natural way to the space of linear transformations (i.e. matrices) from V to V. An inner product on a real vector space V, that is, a bilinear map V × V → R, corresponds in a natural way to a (0,2) tensor in

$V^* \otimes V^*$

called the associated metric and usually denoted g.
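Both identifications can be checked in components. The sketch below (NumPy, with arbitrarily chosen sample values) applies a (1,1) tensor to a vector as a matrix would, and evaluates a (0,2) tensor as a bilinear form:

```python
import numpy as np

# A (1,1) tensor on R^3, written in components A^i_j: it acts as a
# linear map by contracting its covariant index with a vector's index.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 1.0]])
v = np.array([1.0, 1.0, 1.0])

Av = np.einsum('ij,j->i', A, v)   # (A v)^i = A^i_j v^j, the matrix product

# A (0,2) tensor g takes two vectors to a scalar; with identity
# components this is the standard inner product (the associated metric).
g = np.eye(3)
gvv = np.einsum('ij,i,j->', g, v, v)   # g(v, v) = v · v = 3
```

The point of the isomorphism is that the same array of components serves either as a map or as a tensor, depending only on how its indices are contracted.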

Alternate notation

Rather than writing out the full tensor product to denote the space of tensors of type (m,n), the literature often uses the abbreviation

$T^m_n(V) = V\otimes ... \otimes V\otimes V^* \otimes ... \otimes V^*$

An alternative notation for this space is in terms of linear maps from a vector space V to a vector space W. Let

L(V,W)

denote the space of all linear maps from V to W. Thus, for example, the dual space (the space of 1-forms) may be written as

$V^* \approx L(V,\mathbb{R})$

The set of (m,n)-tensors can then be written as

$T^m_n(V) \approx L(V^*\otimes ... \otimes V^*\otimes V \otimes ... \otimes V, \mathbb{R}) \approx L^{m+n}(V^*,...,V^*,V,...,V,\mathbb{R})$

Note that in the formula above, the roles of V and V* are reversed. In particular, one has

$T^1_0(V) \approx L(V^*,\mathbb{R}) \approx V$

and

$T^0_1(V) \approx L(V,\mathbb{R}) \approx V^*$

and

$T^1_1(V) \approx L(V,V)$

The notation

GL(V,W)

is often used to denote the space of invertible linear transformations from V to W; however, there is no analogous notation for tensor spaces.

Tensor fields

In differential geometry, physics and engineering, we usually deal with tensor fields on differentiable manifolds. (The term "tensor" is sometimes used as a shorthand for "tensor field".) For instance, the curvature tensor is discussed in differential geometry and the stress-energy tensor is important in physics and engineering. Both of these are related by Einstein's theory of general relativity. In engineering, the underlying manifold will often be Euclidean 3-space. A tensor field assigns to any given point of the manifold a tensor in the space

$V \otimes ... \otimes V \otimes V^* \otimes ... \otimes V^*$

where V is the tangent space at that point and V* is the cotangent space. See also tangent bundle and cotangent bundle.

The notation for tensor fields can sometimes be confusingly similar to the notation for tensor spaces. Thus, the tangent bundle TM = T(M) might sometimes be written as

$T_0^1(M)=T(M) =TM$

to emphasize that the tangent bundle is a tensor field of (1,0) tensors on the manifold M. Do not confuse this with the very similar-looking notation $T_0^1(V)$; in the latter case, we just have one tensor space, whereas in the former, we have a tensor space defined for each point in the manifold M. Curly (script) letters are sometimes used to denote the set of infinitely-differentiable tensor fields on M. Thus,

$\mathcal{T}^m_n(M)$

denotes the space of infinitely-differentiable (m, n) tensor fields on M; a tensor field is an element of this space.
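To make the distinction between a single tensor and a tensor field concrete, here is a minimal sketch: the flat metric of the plane written in polar coordinates is a standard example of a (0,2) tensor field (the function name below is illustrative). A tensor field is simply a rule assigning tensor components to each point:

```python
import numpy as np

# Sketch of a (0,2) tensor field: the flat metric on the plane minus the
# origin, in polar coordinates (r, theta), has components diag(1, r^2).
def metric(r, theta):
    """Return the components g_{ij} of the metric at the point (r, theta)."""
    return np.array([[1.0, 0.0],
                     [0.0, r**2]])

# Unlike a single tensor, the value depends on the base point: the squared
# length g_{ij} u^i u^j of the coordinate vector u = (0, 1) grows with r.
u = np.array([0.0, 1.0])
len_sq = np.einsum('ij,i,j->', metric(2.0, 0.0), u, u)   # equals r^2 = 4
```

Evaluating the field at a point yields an ordinary tensor in $T^0_2(V)$ for the tangent space V at that point, which is exactly the distinction the notation above is meant to keep straight.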

Basis

For any given coordinate system we have a basis $\{ \mathbf{e}_i \}$ for the tangent space V (note that this may vary from point to point if the manifold is not linear), and a corresponding dual basis $\{ \mathbf{e}^i \}$ for the cotangent space V* (see dual space). The difference between the raised and lowered indices is there to remind us of the way the components transform.

For example purposes, then, take a tensor A in the space

$V \otimes V \otimes V^*$

The components relative to our coordinate system can be written

$\mathbf{A} = A^{ij}_k (\mathbf{e}_i \otimes \mathbf{e}_j \otimes \mathbf{e}^k)$

Here we used the Einstein notation, a convention useful when dealing with coordinate equations: when an index variable appears both raised and lowered on the same side of an equation, we are summing over all its possible values. In physics we often use the expression

$A^{ij}_k$

to represent the tensor, just as vectors are usually treated in terms of their components. This can be visualized as an n × n × n array of numbers. In a different coordinate system, say given to us as a basis $\{ \mathbf{e}_{i'} \}$, the components will be different. If $(x^{i'}_i)$ is our transformation matrix (note it is not a tensor, since it represents a change of basis rather than a geometrical entity) and if $(y^i_{i'})$ is its inverse, then the components transform according to

$A^{i'j'}_{k'} = x^{i'}_i x^{j'}_j y^k_{k'} A^{ij}_k$

In older texts this transformation rule often serves as the definition of a tensor. Formally, this means that tensors were introduced as specific representations of the group of all changes of coordinate systems.
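The transformation rule can be verified numerically. In the sketch below (NumPy, with random sample components), a (2,1) tensor's components are transformed with a change-of-basis matrix and then transformed back with its inverse, recovering the original array:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n, n))   # components A^{ij}_k in the basis {e_i}
x = rng.standard_normal((n, n))      # change-of-basis matrix (x^{i'}_i)
y = np.linalg.inv(x)                 # its inverse (y^i_{i'})

# A'^{i'j'}_{k'} = x^{i'}_i x^{j'}_j y^k_{k'} A^{ij}_k  (Einstein summation;
# capital einsum labels play the role of the primed indices)
A_primed = np.einsum('Ii,Jj,kK,ijk->IJK', x, x, y, A)

# Transforming back with the inverse matrices recovers the original
# components, as it must for a geometrical (basis-independent) object.
A_back = np.einsum('iI,jJ,Kk,IJK->ijk', y, y, x, A_primed)
assert np.allclose(A_back, A)
```

Note that each contravariant index picks up a factor of x and each covariant index a factor of y, matching the formula above term by term.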
