Since scalars and vectors are both subclasses of tensors, we expect that tensors obey the basic rules of arithmetic that we are already familiar with. This is, for the most part, true. As will be seen, tensors obey addition, subtraction, multiplication, and division, but with some modifications and restrictions. They also introduce some new capabilities that we have not encountered directly before.
Two tensors of the same rank and type may be added together to give another tensor. For example, if A^{m}_{n} and B^{m}_{n} are both tensors, then the quantities

A^{m}_{n} + B^{m}_{n}

also form the components of a tensor.
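As a minimal sketch (using NumPy arrays as stand-ins for the components of two tensors in a fixed coordinate system; the names and values are hypothetical), addition is carried out component by component:

```python
import numpy as np

# Components of two rank-two tensors of the same type, in a given
# coordinate system (hypothetical values).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# Addition is component-wise: each component of the sum is the sum
# of the corresponding components.
C = A + B
print(C)
```

Note that the two arrays must have the same shape, mirroring the requirement that only tensors of the same rank and type may be added.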
To multiply two tensors together, simply write them side by side, carrying every free index of each factor into its appropriate position in the product.
Let the first tensor be A^{ab} and the second tensor be B_{c}. Then the product of these two tensors is

A^{ab} B_{c} = T^{ab}_{c}. (2.20)
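This outer product can be sketched with NumPy's einsum, which uses exactly this index notation (the component values below are hypothetical):

```python
import numpy as np

# Components of a rank-two tensor and a rank-one tensor
# (hypothetical values).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # A^{ab}
B = np.array([5.0, 6.0])     # B_c

# The product carries every free index of each factor, so a rank-two
# tensor times a rank-one tensor yields a rank-three tensor.
T = np.einsum('ab,c->abc', A, B)
print(T.shape)
```

Each component of T is the product of the corresponding components of the factors, e.g. T[1, 0, 1] = A[1, 0] * B[1].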
Since subtraction can be defined as the combined operations of multiplication by the scalar (−1) followed by addition, it is immediately seen that subtraction follows the same rules as addition. Thus, only tensors of the same rank and type can be subtracted.
Division of one tensor by another is not defined in general. This is because there is no way of knowing which component of the numerator is being acted upon, or which component of the denominator is doing the acting. The only exception is division by a scalar. In this case the tensor is simply rescaled: each component of the original tensor is divided by the scalar value.
Consider three coordinate systems, x^{a}, x'^{a}, x''^{a}. If a tensor T_{rs} transforms as a covariant tensor between the first and second coordinate systems,

T'_{rs} = (∂x^{a}/∂x'^{r})(∂x^{b}/∂x'^{s}) T_{ab},

and it also transforms as a covariant tensor between the second and third coordinate systems,

T''_{rs} = (∂x'^{a}/∂x''^{r})(∂x'^{b}/∂x''^{s}) T'_{ab},

then the transformation between the first and third coordinate systems is transitive; that is, it is itself tensorial:

T''_{rs} = (∂x^{a}/∂x''^{r})(∂x^{b}/∂x''^{s}) T_{ab}.

Note that while covariant tensors were used to define transitivity, this rule also applies to contravariant and mixed tensors.
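The transitivity is a consequence of the chain rule for partial derivatives; writing out the composition of the two transformations makes this explicit:

```latex
T''_{rs}
  = \frac{\partial x'^{a}}{\partial x''^{r}} \frac{\partial x'^{b}}{\partial x''^{s}} T'_{ab}
  = \frac{\partial x'^{a}}{\partial x''^{r}} \frac{\partial x'^{b}}{\partial x''^{s}}
    \frac{\partial x^{c}}{\partial x'^{a}} \frac{\partial x^{d}}{\partial x'^{b}} T_{cd}
  = \frac{\partial x^{c}}{\partial x''^{r}} \frac{\partial x^{d}}{\partial x''^{s}} T_{cd}
```

where the last step uses the chain rule, (∂x'^{a}/∂x''^{r})(∂x^{c}/∂x'^{a}) = ∂x^{c}/∂x''^{r}.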
The addition of two tensors is commutative, i.e., A^{m} + B^{m} = B^{m} + A^{m}. Multiplication by two successive scalars is associative,

k(cT^{m}) = (ck)T^{m}. 

In general, tensors do not commute. Thus,

A^{m} B^{n} ≠ B^{m} A^{n}. (2.21)
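The non-commutativity of the outer product can be sketched numerically (component values are hypothetical): the two orderings produce tensors whose components sit in different index positions.

```python
import numpy as np

# Components of two rank-one tensors (hypothetical values).
A = np.array([1.0, 2.0])
B = np.array([3.0, 4.0])

# Outer products taken in opposite order.
AB = np.einsum('m,n->mn', A, B)   # A^m B^n
BA = np.einsum('m,n->mn', B, A)   # B^m A^n

# The two products are transposes of one another, not equal.
print(np.array_equal(AB, BA))
```

The individual component values are the same numbers, but they occupy transposed positions, so the two tensors are distinct.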
Before introducing the last arithmetic operation, it is worthwhile to consider the question of symmetry. A tensor is said to be symmetric if

T_{ab} = T_{ba}. (2.22)
A tensor is said to be skew-symmetric, or antisymmetric, if

T_{ab} = −T_{ba}. (2.23)
Since these are tensor equations, these properties are preserved under coordinate transformations. A similar pair of definitions applies to contravariant tensors, but note that they do not apply to mixed tensors: for a mixed tensor, the relationship does not, in general, carry over from one coordinate system to another.
For higher-ranked tensors, the concept of symmetry can still be applied. A tensor is symmetric with respect to a pair of indices (both superscripts or both subscripts) if the value of the component is unchanged on interchanging these indices. It is antisymmetric if interchanging the indices results only in a change of sign for the component.
An important result of these rules is that any rank two tensor (contravariant or covariant) may be expressed as the sum of a symmetric tensor and an antisymmetric tensor,

T_{ab} = (1/2)(T_{ab} + T_{ba}) + (1/2)(T_{ab} − T_{ba}). (2.24)
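This decomposition is easy to verify numerically; a minimal sketch with a hypothetical rank-two tensor:

```python
import numpy as np

# Components of an arbitrary rank-two tensor (hypothetical values).
T = np.array([[1.0, 2.0],
              [5.0, 3.0]])

# Symmetric and antisymmetric parts, per the decomposition above.
S = 0.5 * (T + T.T)   # symmetric part, T_{(ab)}
A = 0.5 * (T - T.T)   # antisymmetric part, T_{[ab]}

# The parts have the claimed symmetries and sum back to T.
print(np.array_equal(S, S.T), np.array_equal(A, -A.T))
print(np.array_equal(S + A, T))
```
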
We can denote symmetry in a pair of indices by enclosing them in a set of parentheses, T^{(ab)}. Similarly, antisymmetry can be denoted by using a set of square brackets, T^{[ab]}.
Consider a mixed tensor of rank two or higher, T^{a}_{b}. Contraction is the operation of summing over a pair of indices, one superscript and one subscript. In other words, to carry out a contraction, one of the nonrepeating indices is changed to match one of the others, and the tensor is then summed over this repeated pair.
Consider a mixed rank-two tensor T^{a}_{b} in two dimensions. Its components are

T^{a}_{b} =
| T^{1}_{1}  T^{1}_{2} |
| T^{2}_{1}  T^{2}_{2} |.

Contracting this tensor yields the scalar

T^{a}_{a} = T^{1}_{1} + T^{2}_{2}.

Note that any odd-ranked tensor can be repeatedly contracted down to a rank-one tensor (a vector), while any even-ranked tensor can be repeatedly contracted down to a rank-zero tensor (a scalar). Also note that contraction cannot be applied to two indices on the same level.
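The contraction of a mixed rank-two tensor is just the trace of its component matrix; a sketch with hypothetical component values:

```python
import numpy as np

# Components of a mixed rank-two tensor T^a_b (hypothetical values).
T = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Contracting over the paired upper and lower index gives a scalar:
# T^a_a = T^1_1 + T^2_2 (the trace).
scalar = np.einsum('aa->', T)
print(scalar)

# A rank-three mixed tensor contracts down to a rank-one tensor:
U = np.arange(8.0).reshape(2, 2, 2)   # U^{a}_{bc}, hypothetical values
V = np.einsum('aac->c', U)            # V_c = U^{a}_{ac}, a vector
print(V.shape)
```

Each contraction removes one superscript and one subscript, so the rank drops by two at every step, which is why the parity of the rank decides whether the process terminates at a vector or a scalar.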