What is a tensor?

A (rank 2 contravariant) tensor is a vector of vectors. A vector is 3 numbers which point in a certain direction, meaning that they rotate into each other when you do a rotation of coordinates, so that the 3 vector components $V^i$ transform into

$$V'^i = A^i_j V^j$$

under a linear transformation of coordinates.
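To make the transformation law concrete, here is a minimal numpy sketch; the rotation matrix and the components are made up for illustration:

```python
import numpy as np

# A made-up example: a rotation about the z axis plays the role of A^i_j.
theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

V = np.array([1.0, 2.0, 3.0])   # made-up components V^j

# V'^i = A^i_j V^j -- the repeated index j is summed over
V_prime = np.einsum('ij,j->i', A, V)   # same as A @ V
```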

A tensor is a vector of 3 vectors that rotate into each other under rotation (and each also rotates as a vector; the order of the two rotation operations is irrelevant). If a vector is $V^i$, where $i$ runs from 1 to 3 (or 1 to 4, or from whatever to whatever), the tensor is $T^{ij}$, where the first index labels the vector and the second index labels the vector component (or vice versa). When you rotate coordinates, $T$ transforms as

$$ T'^{ij} = A^i_k A^j_l T^{kl} = \sum_{kl} A^i_k A^j_l T^{kl} $$

Where I use the Einstein summation convention that a repeated index is summed over, so that the middle expression really means the sum on the far right.
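A numerical sketch of the rank-2 law, one $A$ per index (again with made-up components):

```python
import numpy as np

theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

T = np.arange(9.0).reshape(3, 3)   # made-up components T^{kl}

# T'^{ij} = A^i_k A^j_l T^{kl}
T_prime = np.einsum('ik,jl,kl->ij', A, A, T)

# In matrix language this particular case is A T A^T
assert np.allclose(T_prime, A @ T @ A.T)
```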

A rank 3 tensor is a vector of rank 2 tensors, a rank 4 tensor is a vector of rank 3 tensors, and so on to arbitrary rank. The notation is $T^{ijkl}$ and so on, with as many upper indices as the rank. The transformation law is one A for each index, meaning each index transforms separately as a vector.
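The statement that each index transforms separately can be checked directly. In this sketch (all components made up), transforming a rank 3 tensor one index at a time gives the same answer as doing all three at once:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 3))        # any linear transformation A^i_j
T = rng.random((3, 3, 3))     # made-up components T^{ijk}

# All three indices at once: one A for each index
all_at_once = np.einsum('ia,jb,kc,abc->ijk', A, A, A, T)

# One index at a time, in any order
step = np.einsum('ia,ajk->ijk', A, T)      # first index
step = np.einsum('jb,ibk->ijk', A, step)   # second index
step = np.einsum('kc,ijc->ijk', A, step)   # third index

assert np.allclose(all_at_once, step)
```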

A covariant vector, or covector, is a linear function from vectors to numbers. This is described completely by the coefficients, $U_i$, and the linear function is

$$ U_i V^i = \sum_i U_i V^i = U_1 V^1 + U_2 V^2 + U_3 V^3 $$

where the Einstein convention is employed in the first expression, which just means that if the same index name occurs twice, once lower and once upper, you understand that you are supposed to sum over the index, and you say the index is contracted. The most general linear function is some linear combination of the three components with some coefficients, so this is the general covector.
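As a small sketch, the contraction is just the component-by-component sum of products (components made up):

```python
import numpy as np

U = np.array([1.0, -2.0, 0.5])   # covector components U_i
V = np.array([3.0,  1.0, 4.0])   # vector components V^i

# U_i V^i = U_1 V^1 + U_2 V^2 + U_3 V^3
print(np.einsum('i,i->', U, V))  # same as np.dot(U, V); here 3.0
```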

The transformation law for a covector must use the inverse matrix

$$ U'_i = \bar{A}_i^j U_j $$

Matrix multiplication is simple in the Einstein convention:

$$ M^i_j N^j_k = (MN)^i_k $$

And the definition of $\bar{A}$ (the inverse matrix) ensures that the inner product $U_i V^i$ stays the same under a coordinate transformation (you should check this).
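Here is that check done numerically, a minimal sketch with a random invertible transformation and made-up components:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 3))          # any invertible linear transformation A^i_j
A_bar = np.linalg.inv(A)        # the inverse matrix

U = rng.random(3)               # covector components U_i
V = rng.random(3)               # vector components V^i

V_new = np.einsum('ij,j->i', A, V)       # V'^i = A^i_j V^j
U_new = np.einsum('ji,j->i', A_bar, U)   # U'_i = Abar_i^j U_j (inverse matrix)

# The contraction U_i V^i is unchanged by the coordinate transformation
assert np.isclose(np.dot(U, V), np.dot(U_new, V_new))
```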

A rank-2 covariant tensor is a covector of covectors, and so on to arbitrarily high rank.

You can also make a rank m,n tensor $T^{i_1 i_2 ... i_m}_{j_1j_2 ... j_n}$, with m upper and n lower indices. Each index transforms separately as a vector or covector according to whether it is up or down. Any lower index may be contracted with any upper index in a tensor product, since this is an invariant operation. This means that the rank m,n tensors can be viewed in many ways:

- as a multilinear function of m covectors and n vectors that produces a number,
- as a linear function of n vectors that produces a rank m contravariant tensor,
- as a linear function of m covectors that produces a rank n covariant tensor,

and so on, for a number of interpretations that grows exponentially with the rank. This is the mathematician's preferred definition, which does not emphasize the transformation properties but rather the linear maps involved. The two definitions are equivalent, but I am happy I learned the physicist definition first.
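A small sketch of a few of these viewpoints for a rank (1,1) tensor (components made up); contracting the upper index with the lower one gives an invariant number, the trace:

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.random((3, 3))   # a rank (1,1) tensor T^i_j
V = rng.random(3)        # a vector V^j
U = rng.random(3)        # a covector U_i

W = np.einsum('ij,j->i', T, V)       # as a map from vectors to vectors: W^i = T^i_j V^j
s = np.einsum('i,ij,j->', U, T, V)   # as a map from (covector, vector) to numbers: U_i T^i_j V^j
tr = np.einsum('ii->', T)            # the contraction T^i_i, an invariant
```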

In ordinary Euclidean space in rectangular coordinates, you don't need to distinguish between vectors and covectors, because a rotation matrix's inverse is its transpose, which means that covectors and vectors transform the same way under rotations. This means that you can use only up indices, or only down ones; it doesn't matter. You can replace an upper index with a lower index while keeping the components unchanged.
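This hinges on the inverse of a rotation matrix being its transpose, which is easy to verify numerically (a sketch with a made-up rotation):

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# For a rotation the inverse is the transpose ...
assert np.allclose(np.linalg.inv(R), R.T)

# ... so the covector rule (inverse matrix) gives the same result as the vector rule
U = np.array([1.0, 2.0, 3.0])
assert np.allclose(np.linalg.inv(R).T @ U, R @ U)
```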

In a more general situation, the map between vectors and covectors is called a metric tensor $g_{ij}$. This tensor takes a vector V and produces a covector (traditionally written with the same name but with a lower index)

$$ V_i = g_{ij} V^j$$

And this allows you to define a notion of length

$$ |V|^2 = V_i V^i = g_{ij}V^i V^j $$
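A minimal sketch of lowering an index and computing the length; the metric here is just some made-up symmetric positive-definite matrix:

```python
import numpy as np

g = np.array([[2.0, 0.5, 0.0],   # a made-up metric g_{ij}
              [0.5, 1.0, 0.0],
              [0.0, 0.0, 3.0]])

V = np.array([1.0, -1.0, 2.0])   # vector components V^i

V_lower = np.einsum('ij,j->i', g, V)         # V_i = g_{ij} V^j
length_sq = np.einsum('i,i->', V_lower, V)   # |V|^2 = V_i V^i
assert np.isclose(length_sq, np.einsum('ij,i,j->', g, V, V))   # = g_{ij} V^i V^j
```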

This is also a notion of dot product, which can be extracted from the notion of length as follows:

$$ 2 V\cdot U = |V+U|^2 - |V|^2 - |U|^2 = 2 g_{ij} V^i U^j $$
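This polarization identity is easy to check numerically with the same made-up metric:

```python
import numpy as np

g = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.0],
              [0.0, 0.0, 3.0]])

def norm_sq(X):
    """|X|^2 = g_{ij} X^i X^j"""
    return np.einsum('ij,i,j->', g, X, X)

V = np.array([1.0, -1.0, 2.0])
U = np.array([0.5,  2.0, 1.0])

lhs = norm_sq(V + U) - norm_sq(V) - norm_sq(U)
rhs = 2 * np.einsum('ij,i,j->', g, V, U)   # 2 g_{ij} V^i U^j
assert np.isclose(lhs, rhs)
```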

In Euclidean space, the metric tensor is $g_{ij}= \delta_{ij}$, the Kronecker delta. It's like the identity matrix, except that it's a tensor, not a matrix: a matrix takes vectors to vectors, so it has one upper and one lower index. Note that this means a matrix automatically takes covectors to covectors as well; in matrix notation that is multiplication of the covector by the transpose matrix, but Einstein notation subsumes and extends matrix notation, so it is best to think of all matrix operations as shorthand for some index contractions.

The calculus of tensors is important, because many physical quantities, like the stress or the moment of inertia, are naturally vectors of vectors.

In general, tensors are the founding tool for group representations, and you need them for all aspects of physics, since symmetry is so central to physics.