Tensor product of two linear maps
Let \(A \in \operatorname{End}(V)\) and \(B \in \operatorname{End}(W)\) be two linear maps. We can naturally define the tensor product \(A \otimes B\) of \(A\) and \(B\), a linear map from \(V \otimes W\) to itself, by sending \(v \otimes w\) to \(Av \otimes Bw\). In this post, I am going to realize \(A \otimes B\) as a matrix and relate the determinant and trace of \(A \otimes B\) to those of \(A\) and \(B\).
Let \(V\) and \(W\) be two vector spaces over a field \(K\) with \(\dim V=n\) and \(\dim W=m\). Let \(e_1, \dots, e_n\) be a basis of \(V\) and let \(f_1, \dots, f_m\) be a basis of \(W\). With respect to these bases, a linear map \(A: V \to V\) can be realized as an \(n \times n\) matrix \((a_{ij})_{1 \le i,j \le n}\), where \(a_{ij} \in K\). Similarly, a linear map \(B: W \to W\) can be realized as an \(m \times m\) matrix \((b_{kl})_{1 \le k, l \le m}\). Now, \(V \otimes W\), as a vector space, has basis \(\{e_i \otimes f_j: 1 \le i \le n, 1 \le j \le m\}\). And, \[\begin{align*} A \otimes B (e_i \otimes f_j) &= A(e_i) \otimes B(f_j) \\ &= \sum_{k}a_{ki}e_k \otimes \sum_{l}b_{lj}f_l \\ &= \sum_{k,l}a_{ki}b_{lj} e_k \otimes f_l. \end{align*}\]
From the above calculation, we know that the entry of the matrix of \(A \otimes B\) in row \((k,l)\) and column \((i,j)\) is \(a_{ki}b_{lj}\). To write down the matrix explicitly, we need to give an order to \(\{(i,j):1 \le i \le n, 1 \le j \le m\}\). For instance, we can sort the indices lexicographically, in ascending order, using the first coordinate as the primary key and the second coordinate as the secondary key. Let’s look at the simplest example. Let \(A=\left(\begin{array}{cc} a & b \\ c & d \end{array}\right)\) and \(B=\left(\begin{array}{cc} x & y \\ z & w \end{array}\right)\). Then \[A \otimes B = \left(\begin{array}{cccc} ax & ay & bx & by\\ az & aw & bz & bw\\ cx & cy & dx & dy\\ cz & cw & dz & dw \end{array}\right).\]
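To make the indexing concrete, here is a minimal pure-Python sketch (the helper name `kron` is my own choice, not anything from the text above) that builds the matrix of \(A \otimes B\) directly from the coefficient rule \(a_{ki}b_{lj}\), with the index pairs ordered lexicographically as described:

```python
def kron(A, B):
    """Matrix of A tensor B: the entry in row (k, l), column (i, j) is
    A[k][i] * B[l][j], with pairs ordered lexicographically, so that
    (k, l) maps to flat index k*m + l (0-indexed)."""
    n, m = len(A), len(B)
    C = [[0] * (n * m) for _ in range(n * m)]
    for k in range(n):
        for l in range(m):
            for i in range(n):
                for j in range(m):
                    C[k * m + l][i * m + j] = A[k][i] * B[l][j]
    return C

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
for row in kron(A, B):
    print(row)
# [5, 6, 10, 12]
# [7, 8, 14, 16]
# [15, 18, 20, 24]
# [21, 24, 28, 32]
```

With this ordering the result is exactly the block matrix \(\left(\begin{array}{cc} aB & bB \\ cB & dB \end{array}\right)\), i.e. the Kronecker product of the two matrices.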
Proposition. Let \(\alpha\) be an eigenvalue of \(A\) with eigenvector \(v\) and let \(\beta\) be an eigenvalue of \(B\) with eigenvector \(w\). Then \(A \otimes B\) has eigenvalue \(\alpha\beta\) with eigenvector \(v \otimes w\).
Proof. This is an easy calculation: \[\begin{align*} A \otimes B (v \otimes w) &= Av \otimes Bw \\ &= \alpha v \otimes \beta w \\ &= \alpha\beta \, v \otimes w. \end{align*}\] This completes the proof. \(\square\)
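The proposition can also be checked numerically. In the sketch below (the helpers `kron` and `matvec` are my own, not from the text), \(A\) has eigenpair \(\bigl(3, (1,1)\bigr)\) and \(B\) has eigenpair \(\bigl(7, (1,1)\bigr)\), so \(A \otimes B\) should scale \(v \otimes w\) by \(3 \cdot 7 = 21\):

```python
def kron(A, B):
    # matrix of A tensor B: entry (k*m + l, i*m + j) is A[k][i] * B[l][j]
    n, m = len(A), len(B)
    return [[A[k][i] * B[l][j] for i in range(n) for j in range(m)]
            for k in range(n) for l in range(m)]

def matvec(M, v):
    # ordinary matrix-vector product
    return [sum(M[r][c] * v[c] for c in range(len(v))) for r in range(len(M))]

A = [[2, 1], [0, 3]]  # eigenvalue 3 with eigenvector v = (1, 1)
B = [[5, 2], [0, 7]]  # eigenvalue 7 with eigenvector w = (1, 1)
vw = [1, 1, 1, 1]     # v tensor w in the ordered basis e_i tensor f_j
print(matvec(kron(A, B), vw))  # -> [21, 21, 21, 21], i.e. 21 * (v tensor w)
```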
Since determinants and traces can be expressed in terms of eigenvalues, the above proposition relates the determinant and trace of \(A \otimes B\) to those of \(A\) and \(B\).
Corollary. \((i) \; \det(A \otimes B) = (\det A)^m (\det B)^n\).
\((ii) \; \text{tr}(A \otimes B) = \text{tr}\, A \cdot \text{tr}\, B\).
Proof. Without loss of generality, we may assume that \(K\) is algebraically closed, since replacing \(K\) by its algebraic closure changes neither determinants nor traces.
Let \(\alpha_1, \dots, \alpha_n\) be all the eigenvalues (with multiplicities) of \(A\) and let \(\beta_1, \dots, \beta_m\) be all the eigenvalues (with multiplicities) of \(B\). Then \(\alpha_i\beta_j\), for \(1 \le i \le n\) and \(1 \le j \le m\), are all the eigenvalues (with multiplicities) of \(A \otimes B\). Indeed, over an algebraically closed field we can choose bases making \(A\) and \(B\) upper triangular with diagonal entries \(\alpha_i\) and \(\beta_j\) respectively; then the matrix of \(A \otimes B\) in the corresponding lexicographically ordered basis is upper triangular with diagonal entries \(\alpha_i\beta_j\), which refines the Proposition. Hence, \[\begin{align*} \det(A \otimes B) &= \prod_{i,j} \alpha_i\beta_j \\ &= \left(\prod_i \alpha_i\right)^m \left(\prod_j \beta_j\right)^n \\ &= (\det A)^m(\det B)^n, \end{align*}\] and \[\begin{align*} \text{tr}(A \otimes B) &= \sum_{i,j} \alpha_i\beta_j \\ &= \left(\sum_i \alpha_i\right)\left(\sum_j \beta_j\right) \\ &= \text{tr} \, A \cdot \text{tr} \, B. \end{align*}\] \(\square\)
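The corollary can be verified numerically for \(n = m = 2\). In this sketch, `kron`, `det`, and `trace` are my own small helpers (not from the text), with the determinant computed by Laplace expansion along the first row:

```python
def kron(A, B):
    # matrix of A tensor B: entry (k*m + l, i*m + j) is A[k][i] * B[l][j]
    n, m = len(A), len(B)
    return [[A[k][i] * B[l][j] for i in range(n) for j in range(m)]
            for k in range(n) for l in range(m)]

def det(M):
    # determinant by Laplace expansion along the first row
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[1, 2], [3, 4]]  # det A = -2, tr A = 5
B = [[5, 6], [7, 8]]  # det B = -2, tr B = 13
C = kron(A, B)
print(det(C), det(A) ** 2 * det(B) ** 2)  # 16 16   ((det A)^m (det B)^n with n = m = 2)
print(trace(C), trace(A) * trace(B))      # 65 65
```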