Basic tensor calculus

Standard (orthonormal) basis and components

We adopt the standard basis, i.e.,

(1)
\begin{align} \boldsymbol{e}_i\cdot\boldsymbol{e}_j= \delta_{ij}. \end{align}

Then, an arbitrary linear combination of rank-2 dyadic products can be written as

(2)
\begin{align} \boldsymbol{T}=\lambda\boldsymbol{a}\otimes\boldsymbol{b}+\nu\boldsymbol{c}\otimes\boldsymbol{d}+\cdots. \end{align}

It is sufficient to discuss dyadic products between basis vectors only, i.e.,

(3)
\begin{align} \boldsymbol{T}=T_{ij}\boldsymbol{e}_i\otimes\boldsymbol{e}_j. \end{align}

Eq. (3) is called a rank-2 tensor. More precisely, a rank-2 tensor is a bilinear mapping that maps two vectors to a scalar; Eq. (3) is one representation of such a bilinear mapping. In practice, however, it is convenient to regard Eq. (3) itself as representing a rank-2 tensor.
If one wants complete information about $\boldsymbol{T}$, one needs to know $n\times n$ values of the bilinear mapping:

(4)
\begin{align} \boldsymbol{a}\cdot\boldsymbol{T}\cdot\boldsymbol{b},\ \ \boldsymbol{c}\cdot\boldsymbol{T}\cdot\boldsymbol{d}\cdots. \end{align}

These values are called components.
It is natural to choose the basis vectors as the independent vectors that act on the tensor from the left and the right.
Thus, the components of $\boldsymbol{T}$ are defined by

(5)
\begin{align} T_{ij}\equiv\boldsymbol{e}_i\cdot\boldsymbol{T}\cdot\boldsymbol{e}_j. \end{align}

The $T_{ij}$ in Eq. (3) and the newly defined $T_{ij}$ are substantially the same because, from

(6)
\begin{align} \boldsymbol{a}\otimes\boldsymbol{b}\cdot\boldsymbol{c}=\boldsymbol{a}\left(\boldsymbol{b}\cdot\boldsymbol{c}\right) \end{align}

and

(7)
\begin{align} \boldsymbol{c}\cdot\boldsymbol{a}\otimes\boldsymbol{b}=\left(\boldsymbol{c}\cdot\boldsymbol{a}\right)\boldsymbol{b}, \end{align}

we obtain

(8)
\begin{align} \boldsymbol{e}_\alpha\cdot\left(T_{ij}\boldsymbol{e}_i\otimes\boldsymbol{e}_j\right)\cdot\boldsymbol{e}_\beta=T_{\alpha\beta}. \end{align}

Hence, in general, Eq. (5) is regarded as the definition of the components and Eq. (3) as a representation (the substance) of a tensor.
However, a tensor itself is a highly abstract concept and appears in various representations. See Unit tensor.
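As a numerical sketch of Eq. (5), the components of a tensor in an orthonormal basis can be extracted by sandwiching it between basis vectors; the 3x3 matrix below is an arbitrary illustrative choice:

```python
import numpy as np

# Components of a rank-2 tensor in an orthonormal basis (arbitrary example values).
T = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])

# Standard basis vectors e_i are the rows of the identity matrix.
e = np.eye(3)

# Eq. (5): T_ij = e_i . T . e_j recovers exactly the coefficients of Eq. (3).
T_components = np.array([[e[i] @ T @ e[j] for j in range(3)]
                         for i in range(3)])
assert np.allclose(T_components, T)
```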

General basis and components

Now, we consider cases with a general basis, i.e., we adopt basis vectors such that

(9)
\begin{align} \boldsymbol{g}_i\cdot\boldsymbol{g}_j\neq \delta_{ij}. \end{align}

Sometimes, employing the following "dual" basis contributes to the simplicity of calculations:

(10)
\begin{align} \boldsymbol{g}_i\cdot\boldsymbol{g}^j= \delta_i^{\cdot j}, \end{align}

where the $\boldsymbol{g}_i$ are referred to as the covariant basis and the $\boldsymbol{g}^i$ as the contravariant basis; together they are called dual bases.
Namely, we adopt six basis vectors when working in a 3-dim. space; of course, only three of them are independent.
Here, $\delta_i^{\cdot j}$ is substantially the same as the Kronecker delta, $\delta_{ij}$, but we can give it a different meaning. See Unit tensor.
When using a general basis, to obtain complete information about $\boldsymbol{T}$, it is sufficient to know $n\times n$ independent values of the bilinear mapping.
Now, because we have $n$ covariant and $n$ contravariant basis vectors, we can define four types of components:
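Concretely, Eq. (10) tells us how to compute the dual basis: if the rows of a matrix are the covariant basis vectors $\boldsymbol{g}_i$, the contravariant vectors $\boldsymbol{g}^j$ are the columns of its inverse. A minimal NumPy sketch, with an arbitrary non-orthonormal example basis:

```python
import numpy as np

# A non-orthonormal (covariant) basis; rows are g_1, g_2, g_3 (arbitrary example).
g_cov = np.array([[1.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0],
                  [0.0, 1.0, 2.0]])

# Eq. (10): g_i . g^j = delta_i^j, so the contravariant vectors g^j are the
# columns of inv(g_cov); the transpose stores them as rows, like g_cov.
g_con = np.linalg.inv(g_cov).T

# Duality check: the Gram matrix between the two bases is the identity.
assert np.allclose(g_cov @ g_con.T, np.eye(3))
```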

(11)
\begin{align} T_{ij}\equiv\boldsymbol{g}_i\cdot\boldsymbol{T}\cdot\boldsymbol{g}_j. \end{align}
(12)
\begin{align} T_i^{\cdot j}\equiv\boldsymbol{g}_i\cdot\boldsymbol{T}\cdot\boldsymbol{g}^j. \end{align}
(13)
\begin{align} T^i_{\cdot j}\equiv\boldsymbol{g}^i\cdot\boldsymbol{T}\cdot\boldsymbol{g}_j. \end{align}
(14)
\begin{align} T^{ij}\equiv\boldsymbol{g}^i\cdot\boldsymbol{T}\cdot\boldsymbol{g}^j. \end{align}

For example, in a 3-dim. problem, these definitions give $4\times 9=36$ components, of which only 9 are independent.
When defining components in this way, the position (up or down) of each index is determined unambiguously.
Using Eq. (10), the representations of $\boldsymbol{T}$ that correspond to each type of components are given as

(15)
\begin{align} \boldsymbol{T}=T_{ij}\boldsymbol{g}^i\otimes\boldsymbol{g}^j. \end{align}
(16)
\begin{align} \boldsymbol{T}=T_i^{\cdot j}\boldsymbol{g}^i\otimes\boldsymbol{g}_j. \end{align}
(17)
\begin{align} \boldsymbol{T}=T^i_{\cdot j}\boldsymbol{g}_i\otimes\boldsymbol{g}^j. \end{align}
(18)
\begin{align} \boldsymbol{T}=T^{ij}\boldsymbol{g}_i\otimes\boldsymbol{g}_j. \end{align}
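The four component types of Eqs. (11)-(14) and the representations of Eqs. (15) and (18) can be checked numerically; the basis and the tensor below are arbitrary examples:

```python
import numpy as np

g_cov = np.array([[1.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0],
                  [0.0, 1.0, 2.0]])        # covariant basis g_i (rows)
g_con = np.linalg.inv(g_cov).T             # contravariant basis g^i (rows)

rng = np.random.default_rng(0)
T = rng.normal(size=(3, 3))                # components of T in the standard basis

# Eqs. (11)-(14): the four component types.
T_dd = g_cov @ T @ g_cov.T                 # T_ij
T_du = g_cov @ T @ g_con.T                 # T_i^{.j}
T_ud = g_con @ T @ g_cov.T                 # T^i_{.j}
T_uu = g_con @ T @ g_con.T                 # T^{ij}

# Eq. (18): T = T^{ij} g_i (x) g_j reproduces the standard-basis matrix.
T_from_uu = np.einsum('ij,ia,jb->ab', T_uu, g_cov, g_cov)
assert np.allclose(T_from_uu, T)

# Eq. (15): T = T_ij g^i (x) g^j does as well.
T_from_dd = np.einsum('ij,ia,jb->ab', T_dd, g_con, g_con)
assert np.allclose(T_from_dd, T)
```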

Inner product

There are two different inner products that can be defined between two rank-2 dyadic products:

(19)
\begin{align} \boldsymbol{a}\otimes\boldsymbol{b}:\boldsymbol{c}\otimes\boldsymbol{d}\equiv(\boldsymbol{a}\cdot\boldsymbol{d})(\boldsymbol{b}\cdot\boldsymbol{c}) \end{align}

and

(20)
\begin{align} \boldsymbol{a}\otimes\boldsymbol{b}\cdot\cdot\boldsymbol{c}\otimes\boldsymbol{d}\equiv(\boldsymbol{a}\cdot\boldsymbol{c})(\boldsymbol{b}\cdot\boldsymbol{d}). \end{align}

Note that, in general, they give different results.
Based on these definitions, inner products between rank-2 tensors can be defined as follows:

(21)
\begin{align} T_{ij}\boldsymbol{e}_i\otimes\boldsymbol{e}_j:H_{\alpha\beta}\boldsymbol{e}_\alpha\otimes\boldsymbol{e}_\beta\equiv T_{ij}H_{ji} \end{align}

and

(22)
\begin{align} T_{ij}\boldsymbol{e}_i\otimes\boldsymbol{e}_j\cdot\cdot H_{\alpha\beta}\boldsymbol{e}_\alpha\otimes\boldsymbol{e}_\beta\equiv T_{ij}H_{ij}. \end{align}

When using the standard (orthonormal) basis, similarly to the standard inner product between vectors, both give a summation over products of components; the two definitions differ only in how the indices are paired.
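In an orthonormal basis, the two double contractions of Eqs. (21)-(22) reduce to simple index sums, which a short NumPy sketch with arbitrary example components makes explicit:

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.normal(size=(3, 3))
H = rng.normal(size=(3, 3))

# Eq. (21): T : H = T_ij H_ji, i.e. the trace of the matrix product.
colon_product = np.einsum('ij,ji->', T, H)
assert np.isclose(colon_product, np.trace(T @ H))

# Eq. (22): T .. H = T_ij H_ij, i.e. the elementwise (Frobenius-type) sum.
dotdot_product = np.einsum('ij,ij->', T, H)
assert np.isclose(dotdot_product, np.sum(T * H))
```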
When using a general basis,

(23)
\begin{align} T_i^{\cdot j}\boldsymbol{g}^i\otimes\boldsymbol{g}_j:H_\alpha^{\cdot\beta}\boldsymbol{g}^\alpha\otimes\boldsymbol{g}_\beta\equiv T_i^{\cdot j}H_j^{\cdot i} \end{align}
(24)
\begin{align} T^i_{\cdot j}\boldsymbol{g}_i\otimes\boldsymbol{g}^j:H^\alpha_{\cdot\beta}\boldsymbol{g}_\alpha\otimes\boldsymbol{g}^\beta\equiv T^i_{\cdot j}H^j_{\cdot i} \end{align}
(25)
\begin{align} T_i^{\cdot j}\boldsymbol{g}^i\otimes\boldsymbol{g}_j\cdot\cdot H^\alpha_{\cdot\beta}\boldsymbol{g}_\alpha\otimes\boldsymbol{g}^\beta\equiv T_i^{\cdot j}H^i_{\cdot j} \end{align}

and

(26)
\begin{align} T^i_{\cdot j}\boldsymbol{g}_i\otimes\boldsymbol{g}^j\cdot\cdot H_\alpha^{\cdot\beta}\boldsymbol{g}^\alpha\otimes\boldsymbol{g}_\beta\equiv T^i_{\cdot j}H_i^{\cdot j}. \end{align}

Note that the former two give the same result, and the latter two also give the same result.
When using a general basis, further variations of inner products can be defined; in these, $g_{ij}$ or $g^{ij}$ will appear. However, they give the same scalars as the inner products above.
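That these mixed-component expressions all reproduce the same scalar as the orthonormal-basis contraction can be checked numerically, again with an arbitrary example basis:

```python
import numpy as np

g_cov = np.array([[1.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0],
                  [0.0, 1.0, 2.0]])        # covariant basis g_i (rows)
g_con = np.linalg.inv(g_cov).T             # contravariant basis g^i (rows)

rng = np.random.default_rng(2)
T = rng.normal(size=(3, 3))                # standard-basis components
H = rng.normal(size=(3, 3))

# Mixed components, Eqs. (12)-(13).
T_du, H_du = g_cov @ T @ g_con.T, g_cov @ H @ g_con.T   # T_i^{.j}, H_i^{.j}
T_ud, H_ud = g_con @ T @ g_cov.T, g_con @ H @ g_cov.T   # T^i_{.j}, H^i_{.j}

# Eqs. (23)-(24): both expressions give the same scalar,
# equal to the orthonormal-basis result T_ij H_ji.
s1 = np.einsum('ij,ji->', T_du, H_du)
s2 = np.einsum('ij,ji->', T_ud, H_ud)
assert np.isclose(s1, s2)
assert np.isclose(s1, np.einsum('ij,ji->', T, H))
```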

Bilinear mapping

A rank-2 tensor is a bilinear mapping of two vectors to a scalar. $\boldsymbol{T}=T^{ij}\boldsymbol{g}_i\otimes\boldsymbol{g}_j$ can be understood as representing such a bilinear mapping if one reads

(27)
\begin{align} \boldsymbol{a}\cdot\boldsymbol{T}\cdot\boldsymbol{b}=T^{ij}a_ib_j \end{align}

as

(28)
\begin{align} \left(\boldsymbol{a},\boldsymbol{b}\right)\mapsto T^{ij}a_ib_j. \end{align}
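A minimal sketch of this viewpoint: with fixed components $T^{ij}$, the tensor acts as a function of two vector arguments, linear in each argument separately (the components and vectors below are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(3)
T = rng.normal(size=(3, 3))          # the components T^{ij}

# The tensor as a bilinear map: (a, b) -> T^{ij} a_i b_j.
def bilinear(a, b):
    return np.einsum('ij,i,j->', T, a, b)

a = np.array([1.0, 0.0, 2.0])
b = np.array([0.0, 1.0, 1.0])

# Bilinearity: the map is linear in each slot separately.
assert np.isclose(bilinear(2 * a, b), 2 * bilinear(a, b))
assert np.isclose(bilinear(a + b, b), bilinear(a, b) + bilinear(b, b))
```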

Basic calculation rules

(29)
\begin{align} \left(\boldsymbol{a}\otimes\boldsymbol{b}\right)\otimes\boldsymbol{c}=\boldsymbol{a}\otimes\boldsymbol{b}\otimes\boldsymbol{c}, \end{align}
(30)
\begin{align} \left(\boldsymbol{a}\otimes\boldsymbol{b}\right)\otimes\left(\boldsymbol{c}\otimes\boldsymbol{d}\right)=\boldsymbol{a}\otimes\boldsymbol{b}\otimes\boldsymbol{c}\otimes\boldsymbol{d},\quad\text{and so on.} \end{align}
(31)
\begin{align} \boldsymbol{a}\otimes\boldsymbol{b}\cdot\boldsymbol{c}=\boldsymbol{a}\left(\boldsymbol{b}\cdot\boldsymbol{c}\right) \end{align}
(32)
\begin{align} \left(\boldsymbol{a}\otimes\boldsymbol{b}\right)\cdot\left(\boldsymbol{c}\otimes\boldsymbol{d}\right)=\boldsymbol{a}\otimes\left(\boldsymbol{b}\cdot\boldsymbol{c}\right)\otimes\boldsymbol{d}=\left(\boldsymbol{b}\cdot\boldsymbol{c}\right)\boldsymbol{a}\otimes\boldsymbol{d} \end{align}
(33)
\begin{align} \boldsymbol{c}\cdot\boldsymbol{a}\otimes\boldsymbol{b}=\left(\boldsymbol{c}\cdot\boldsymbol{a}\right)\boldsymbol{b} \end{align}
(34)
\begin{align} \left(\boldsymbol{c}\otimes\boldsymbol{d}\right)\cdot\left(\boldsymbol{a}\otimes\boldsymbol{b}\right)=\boldsymbol{c}\otimes\left(\boldsymbol{d}\cdot\boldsymbol{a}\right)\otimes\boldsymbol{b}=\left(\boldsymbol{d}\cdot\boldsymbol{a}\right)\boldsymbol{c}\otimes\boldsymbol{b} \end{align}
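These rules can be checked by identifying dyadic products with outer products of component arrays; the vectors below are arbitrary examples:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([0.0, 1.0, 1.0])
c = np.array([2.0, 0.0, 1.0])
d = np.array([1.0, 1.0, 0.0])

# Dyadic products correspond to outer products; a single dot contracts
# the two adjacent vectors.
ab = np.outer(a, b)
cd = np.outer(c, d)

# Eq. (31): (a (x) b) . c = a (b . c)
assert np.allclose(ab @ c, a * (b @ c))

# Eq. (7): c . (a (x) b) = (c . a) b
assert np.allclose(c @ ab, (c @ a) * b)

# Eq. (32): (a (x) b) . (c (x) d) = (b . c) a (x) d
assert np.allclose(ab @ cd, (b @ c) * np.outer(a, d))
```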

Vectors and their components

In a general basis, $\boldsymbol{a}=a_i\boldsymbol{e}_i$ is rewritten as

(35)
\begin{align} \boldsymbol{a}=a^i\boldsymbol{g}_i. \end{align}
(36)
\begin{align} \boldsymbol{a}=a_i\boldsymbol{g}^i. \end{align}

These components can be defined by

(37)
\begin{align} a^i=\boldsymbol{a}\cdot\boldsymbol{g}^i, \end{align}
(38)
\begin{align} a_i=\boldsymbol{a}\cdot\boldsymbol{g}_i. \end{align}
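Eqs. (35)-(38) can be verified numerically in the same style as the earlier sketches, with an arbitrary example basis and vector:

```python
import numpy as np

g_cov = np.array([[1.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0],
                  [0.0, 1.0, 2.0]])   # covariant basis g_i (rows)
g_con = np.linalg.inv(g_cov).T        # contravariant basis g^i (rows)

a = np.array([3.0, -1.0, 2.0])        # a vector in the standard basis

# Eqs. (37)-(38): contravariant and covariant components.
a_con = g_con @ a                     # a^i = a . g^i
a_cov = g_cov @ a                     # a_i = a . g_i

# Eqs. (35)-(36): both expansions reproduce the same vector.
assert np.allclose(a_con @ g_cov, a)  # a = a^i g_i
assert np.allclose(a_cov @ g_con, a)  # a = a_i g^i
```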