U Tokyo Grad School Math 2020 Problem 1 / 東大院理工学 数学 2020 問題一

📢 This article was translated by gemini-2.5-flash

Introduction

Despite the Japanese title, this post isn't primarily written in Japanese. I may add more if I find time.

Problem 1: This Article

Problem 2: https://blog.yexca.net/en/archives/187

Problem 3: https://blog.yexca.net/en/archives/188

Lately, I felt like my studies weren’t sticking, even after attending lectures. So I decided to tackle some problems (yeah, studying without doing problems might seem weird, but it felt normal then, probably just laziness). Turns out, I couldn’t solve them. But looking up solutions triggered memories from past courses, so I basically re-learned everything as I worked through this.

Also, since no official solutions are published, I verified the first three problems with Python; those answers should be solid. The last two are proofs, though, so I can't guarantee my derivations are flawless.

Problem

Source: https://www.i.u-tokyo.ac.jp/edu/entra/examarchive.shtml

Problems are copyrighted by the University of Tokyo; cited here for convenience, non-profit use only.

Let square matrices $A, B$ be defined as:

$$ A=\begin{pmatrix} 1 & \sqrt{2} & 0 \\ \sqrt{2} & 1 & \sqrt{2} \\ 0 & \sqrt{2} & 1 \end{pmatrix}, B=\begin{pmatrix} 0 & -\frac{2}{3} & \frac{1}{3} \\ \frac{2}{3} & 0 & -\frac{2}{3} \\ -\frac{1}{3} & \frac{2}{3} & 0 \end{pmatrix} $$

Also, let $I$ be the identity matrix. For a real square matrix $X$, $\exp(X)$ is defined as:

$$ \exp(X)=\sum_{k=0}^{\infty}\frac{1}{k!}X^{k}=I+X+\frac{1}{2!}X^{2}+\frac{1}{3!}X^{3}+\cdots $$

When defined this way, answer the following questions.

(1) Find all eigenvalues of $A$ and their corresponding eigenvectors. For eigenvectors, choose those with a norm of $1$ and a non-negative real first element.

(2) Find $A^{n}$ for a non-negative integer $n$.

(3) Find $\exp(A)$.

(4) Show that when $\alpha$ is a real number, $\exp(\alpha B)$ can be expressed by the following equation:

$$ \exp(\alpha B) = I + (\sin\alpha)B + (1-\cos\alpha)B^{2} $$

You may use the Cayley-Hamilton theorem.

(5) Given a 3-dimensional real vector $a$, define the function $f$ for a 3-dimensional real vector $x$ as:

$$ f(x) = \sum_{k=1}^{n} \left \| \exp(\frac{2\pi k}{n}B)a - x \right \|^{2} $$

Let $n \ge 2$. Show that $f$ is minimized at $x=(I+B^{2})a$.

Problem 1

Problem: Find the eigenvalues and eigenvectors of $A$ (eigenvectors should have a norm of 1 and a non-negative first element).


The eigenvalues are the roots of the characteristic equation $\det(A-\lambda I) = 0$:

$$ \det(A-\lambda I) =\begin{vmatrix} 1-\lambda & \sqrt{2} & 0 \\ \sqrt{2} & 1-\lambda & \sqrt{2} \\ 0 & \sqrt{2} & 1-\lambda \end{vmatrix} = 0 $$

Expanding along the first row yields:

$$ (1-\lambda)[(1-\lambda)^{2}-2]-\sqrt{2}[\sqrt{2}(1-\lambda)]=0 $$

Factoring out $(1-\lambda)$ leaves $(1-\lambda)[(1-\lambda)^{2}-4]=0$, whose roots are:

$$ \lambda_{1}=1, \lambda_{2}=-1, \lambda_{3}=3 $$

Next, find the eigenvectors corresponding to each eigenvalue. For $\lambda_{1}=1$:

$$ (A-\lambda_{1}I) = \begin{pmatrix} 0 & \sqrt{2} & 0 \\ \sqrt{2} & 0 & \sqrt{2} \\ 0 & \sqrt{2} & 0 \end{pmatrix} $$

Solving the equation $(A-\lambda_{1}I)\mathbf{v}_{1}=0$ yields the eigenvector:

$$ \mathbf{v}_{1}=\begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix} $$

For $\lambda_{2}=-1$:

$$ (A-\lambda_{2}I)=\begin{pmatrix} 2 & \sqrt{2} & 0 \\ \sqrt{2} & 2 & \sqrt{2} \\ 0 & \sqrt{2} & 2 \end{pmatrix} $$

Solving the equation $(A-\lambda_{2}I)\mathbf{v}_{2}=0$ yields the eigenvector:

$$ \mathbf{v}_{2}=\begin{pmatrix} 1 \\ -\sqrt{2} \\ 1 \end{pmatrix} $$

For $\lambda_{3}=3$:

$$ (A-\lambda_{3}I)=\begin{pmatrix} -2 & \sqrt{2} & 0 \\ \sqrt{2} & -2 & \sqrt{2} \\ 0 & \sqrt{2} & -2 \end{pmatrix} $$

Solving the equation $(A-\lambda_{3}I)\mathbf{v}_{3}=0$ yields the eigenvector:

$$ \mathbf{v}_{3}=\begin{pmatrix} 1 \\ \sqrt{2} \\ 1 \end{pmatrix} $$

Normalizing each eigenvector to norm $1$ (each already has a non-negative first element), the results are:

$$ \begin{align*} \text{Eigenvalues}&: \lambda_{1}=1, \lambda_{2}=-1, \lambda_{3}=3 \\ \text{Eigenvectors}&: \mathbf{v}_{1}=\frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}, \mathbf{v}_{2}=\frac{1}{2} \begin{pmatrix} 1 \\ -\sqrt{2} \\ 1 \end{pmatrix}, \mathbf{v}_{3}=\frac{1}{2} \begin{pmatrix} 1 \\ \sqrt{2} \\ 1 \end{pmatrix} \end{align*} $$

Python Code

Verify the results with sympy:

import sympy as sp

# Define matrix A
A = sp.Matrix([[1, sp.sqrt(2), 0], 
               [sp.sqrt(2), 1, sp.sqrt(2)], 
               [0, sp.sqrt(2), 1]])

# Calculate eigenvalues and eigenvectors
eigenvals = A.eigenvals()  # eigenvalues
eigenvects = A.eigenvects()  # eigenvectors

print(f"Eigenvalues of A:")
sp.pprint(eigenvals)
print(f"Eigenvectors of A:")
sp.pprint(eigenvects)

Output is as follows:

[Output screenshot: UTokyo-2020-1-1]
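sympy returns unnormalized eigenvectors, so the normalization step isn't covered by that output. As a small follow-up sketch (reusing the eigenvects result from the block above), each vector can be scaled to norm 1 and sign-flipped to enforce a non-negative first element:

# Normalize each eigenvector to norm 1 and flip its sign if the
# first entry is negative, matching the problem's convention.
for val, _, vecs in eigenvects:
    v = vecs[0].normalized()
    if v[0] < 0:
        v = -v
    print(f"lambda = {val}:")
    sp.pprint(sp.simplify(v))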

Problem 2

Problem: Find $A^{n}$ ($n$ is a non-negative integer).


Since $A$ has three distinct eigenvalues, it is diagonalizable as $A=SDS^{-1}$. From Problem 1, we get:

$$ S=\begin{pmatrix} 1 & 1 & 1 \\ 0 & -\sqrt{2} & \sqrt{2} \\ -1 & 1 & 1 \end{pmatrix}, \quad D=\begin{pmatrix} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 3 \end{pmatrix} $$

Calculating the inverse matrix of $S$ yields:

$$ S^{-1}=\begin{pmatrix} \frac{1}{2} & 0 & -\frac{1}{2} \\ \frac{1}{4} & -\frac{\sqrt{2}}{4} & \frac{1}{4} \\ \frac{1}{4} & \frac{\sqrt{2}}{4} & \frac{1}{4} \end{pmatrix} $$

Calculating $D^{n}$ yields:

$$ D^{n}=\begin{pmatrix} 1^{n} & 0 & 0 \\ 0 & (-1)^{n} & 0 \\ 0 & 0 & 3^{n} \end{pmatrix} $$

Multiplying out $A^{n}=S D^{n} S^{-1}$ yields:

$$ A^{n}=\begin{pmatrix} \frac{1}{2}+(-1)^{n}\frac{1}{4}+\frac{3^{n}}{4} & (-1)^{n+1}\frac{\sqrt{2}}{4}+\frac{\sqrt{2}}{4}3^{n} & -\frac{1}{2}+(-1)^{n}\frac{1}{4}+\frac{3^{n}}{4} \\ (-1)^{n+1}\frac{\sqrt{2}}{4}+\frac{\sqrt{2}}{4}3^{n} & (-1)^{n}\frac{1}{2}+\frac{1}{2}3^{n} & (-1)^{n+1}\frac{\sqrt{2}}{4}+\frac{\sqrt{2}}{4}3^{n} \\ -\frac{1}{2}+(-1)^{n}\frac{1}{4}+\frac{3^{n}}{4} & (-1)^{n+1}\frac{\sqrt{2}}{4}+\frac{\sqrt{2}}{4}3^{n} & \frac{1}{2}+(-1)^{n}\frac{1}{4}+\frac{3^{n}}{4} \end{pmatrix} $$

Python Code

Verify the results with sympy:

import sympy as sp

# Define sqrt(2)
sqrt_2 = sp.sqrt(2)

# Define matrices S and S_inv
S = sp.Matrix([[1, 1, 1],
                [0, -sqrt_2, sqrt_2],
                [-1, 1, 1]])

S_inv = sp.Matrix([
    [sp.Rational(1, 2), 0, -sp.Rational(1, 2)],
    [sp.Rational(1, 4), -sqrt_2 / 4, sp.Rational(1, 4)],
    [sp.Rational(1, 4), sqrt_2 / 4, sp.Rational(1, 4)]
])

# Define diagonal matrix D
lambda_1 = 1
lambda_2 = -1
lambda_3 = 3
D = sp.diag(lambda_1, lambda_2, lambda_3)

# Define n
n = sp.symbols('n')

# Calculate D^n
D_n = D**n

# Calculate A^n = S D^n S_inv
A_n = S * D_n * S_inv

# Print result
print(f"A^n = S D^n S_inv =")
sp.pprint(A_n)

Output is as follows:

[Output screenshot: UTokyo-2020-1-2]
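As an extra sanity check (a sketch reusing A_n and the symbol n from the block above), the closed form can be compared against direct matrix powers for a few small exponents:

# Substitute concrete exponents into the symbolic result and
# compare against direct repeated multiplication.
A = sp.Matrix([[1, sp.sqrt(2), 0],
               [sp.sqrt(2), 1, sp.sqrt(2)],
               [0, sp.sqrt(2), 1]])

for k in range(5):
    assert sp.simplify(A_n.subs(n, k) - A**k) == sp.zeros(3, 3)
print("Closed form matches A**k for k = 0..4")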

Alternative Solution

Problem 4 hints at the Cayley-Hamilton Theorem, a tool for calculating matrix powers.

With the eigenvalues from Problem 1, let $f(\lambda)=(\lambda-1)(\lambda+1)(\lambda-3)$ be the characteristic polynomial of $A$, and divide $\lambda^{n}$ by it, writing $g(\lambda)$ for the quotient and $a\lambda^{2}+b\lambda+c$ for the remainder:

$$ \lambda^{n} = f(\lambda)g(\lambda) +a\lambda^{2} +b\lambda +c $$

Substituting the eigenvalues, and since $f(\lambda_{1})=f(\lambda_{2})=f(\lambda_{3})=0$, we get the system of equations:

$$ \left\{\begin{matrix} 1 =a+b+c \\ (-1)^{n} =a-b+c \\ 3^{n} =9a+3b+c \end{matrix}\right. $$

Solving the system of equations yields:

$$ \left\{\begin{matrix} a=\frac{(-1)^{n}+3^{n}-2}{8} \\ b=1-\frac{1+(-1)^{n}}{2} \\ c=\frac{1+(-1)^{n}}{2}-\frac{(-1)^{n}+3^{n}-2}{8} \end{matrix}\right. $$

Since $f(A)=0$ by the Cayley-Hamilton theorem, substituting $A$ into $\lambda^{n}=f(\lambda)g(\lambda)+a\lambda^{2}+b\lambda+c$ gives $A^{n}=aA^{2}+bA+cI$, which evaluates to:

$$ A^{n}=\begin{pmatrix} \frac{1}{2}+(-1)^{n}\frac{1}{4}+\frac{3^{n}}{4} & (-1)^{n+1}\frac{\sqrt{2}}{4}+\frac{\sqrt{2}}{4}3^{n} & -\frac{1}{2}+(-1)^{n}\frac{1}{4}+\frac{3^{n}}{4} \\ (-1)^{n+1}\frac{\sqrt{2}}{4}+\frac{\sqrt{2}}{4}3^{n} & (-1)^{n}\frac{1}{2}+\frac{1}{2}3^{n} & (-1)^{n+1}\frac{\sqrt{2}}{4}+\frac{\sqrt{2}}{4}3^{n} \\ -\frac{1}{2}+(-1)^{n}\frac{1}{4}+\frac{3^{n}}{4} & (-1)^{n+1}\frac{\sqrt{2}}{4}+\frac{\sqrt{2}}{4}3^{n} & \frac{1}{2}+(-1)^{n}\frac{1}{4}+\frac{3^{n}}{4} \end{pmatrix} $$
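This remainder trick is easy to mechanize. A minimal sympy sketch (self-contained; the symbols a, b, c, n are introduced here for illustration):

import sympy as sp

a, b, c, n = sp.symbols('a b c n')

# Evaluate lambda^n = f(lambda)g(lambda) + a lambda^2 + b lambda + c
# at each eigenvalue, where f vanishes.
sol = sp.solve([
    sp.Eq(1, a + b + c),          # lambda = 1
    sp.Eq((-1)**n, a - b + c),    # lambda = -1
    sp.Eq(3**n, 9*a + 3*b + c),   # lambda = 3
], [a, b, c])

A = sp.Matrix([[1, sp.sqrt(2), 0],
               [sp.sqrt(2), 1, sp.sqrt(2)],
               [0, sp.sqrt(2), 1]])

# Cayley-Hamilton: f(A) = 0, so A^n reduces to a quadratic in A.
A_n = sol[a]*A**2 + sol[b]*A + sol[c]*sp.eye(3)
sp.pprint(sp.simplify(A_n))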

Problem 3

Problem: Find $\exp(A)$.


As shown in Problem 2, $A$ is diagonalizable, thus:

$$ \begin{align*} \exp(A)&=\exp(SDS^{-1})\\ &=I+SDS^{-1}+\frac{1}{2!}(SDS^{-1})^{2}+\frac{1}{3!} (SDS^{-1})^{3}+\cdots \\ &= I+SDS^{-1} +\frac{1}{2!}SD^{2}S^{-1}+\frac{1}{3!}SD^{3}S^{-1}+\cdots \end{align*} $$

Factoring out $S$ and $S^{-1}$ gives:

$$ \begin{align*} \exp(SDS^{-1})&=S(I+D+\frac{1}{2!}D^{2}+\frac{1}{3!}D^{3}+\cdots)S^{-1} \\ &=S\exp(D)S^{-1} \end{align*} $$

Since $D$ is a diagonal matrix, we have:

$$ \exp(D)=\begin{pmatrix} e & 0 & 0 \\ 0 & e^{-1} & 0 \\ 0 & 0 & e^{3} \end{pmatrix} $$

Calculating $\exp(A)=S\exp(D)S^{-1}$ yields:

$$ \exp(A)=\begin{pmatrix} \frac{1}{2}e+\frac{1}{4}e^{-1}+\frac{1}{4}e^{3} & -\frac{\sqrt{2}}{4}e^{-1}+\frac{\sqrt{2}}{4}e^{3} & -\frac{1}{2}e+\frac{1}{4}e^{-1}+\frac{1}{4}e^{3} \\ -\frac{\sqrt{2}}{4}e^{-1}+\frac{\sqrt{2}}{4}e^{3} & \frac{1}{2}e^{-1}+\frac{1}{2}e^{3} & -\frac{\sqrt{2}}{4}e^{-1}+\frac{\sqrt{2}}{4}e^{3} \\ -\frac{1}{2}e+\frac{1}{4}e^{-1}+\frac{1}{4}e^{3} & -\frac{\sqrt{2}}{4}e^{-1}+\frac{\sqrt{2}}{4}e^{3} & \frac{1}{2}e+\frac{1}{4}e^{-1}+\frac{1}{4}e^{3} \end{pmatrix} $$

Python Code

Verify the results with sympy:

import sympy as sp

# Define matrix A
A = sp.Matrix([[1, sp.sqrt(2), 0],
                [sp.sqrt(2), 1, sp.sqrt(2)],
                [0, sp.sqrt(2), 1]])

# Calculate exp(A)
exp_A = A.exp()

# Print result
print("exp(A) =")
sp.pprint(exp_A)

Output is as follows:

[Output screenshot: UTokyo-2020-1-3]
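To also confirm the $S\exp(D)S^{-1}$ route from the derivation, here is a short sketch reusing A from the block above, with S as the eigenvector matrix from Problem 2:

# Rebuild S and exp(D), then check S exp(D) S^{-1} against A.exp().
sqrt_2 = sp.sqrt(2)
S = sp.Matrix([[1, 1, 1],
               [0, -sqrt_2, sqrt_2],
               [-1, 1, 1]])
exp_D = sp.diag(sp.E, sp.exp(-1), sp.exp(3))

assert sp.simplify(S * exp_D * S.inv() - A.exp()) == sp.zeros(3, 3)
print("S exp(D) S^{-1} matches exp(A)")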

Problem 4

Problem: Prove the stated identity for a real number $\alpha$. (The Cayley-Hamilton theorem may be used.)


First, compute the characteristic polynomial of matrix $B$:

$$ \det(B-\lambda I)=\begin{vmatrix} -\lambda & -\frac{2}{3} & \frac{1}{3} \\ \frac{2}{3} & -\lambda & -\frac{2}{3} \\ -\frac{1}{3} & \frac{2}{3} & -\lambda \end{vmatrix} = -\lambda^{3}-\lambda $$

By the Cayley-Hamilton theorem, $B$ satisfies its own characteristic polynomial:

$$ -B^{3}-B=0 $$

Thus:

$$ B^{3}=-B $$

From this, we can derive:

$$ \begin{align*} B^{4}&=B^{3}B=-B^{2} \\ B^{5}&=B^{4}B=-B^{3}=B \end{align*} $$

Substitute into $\exp(\alpha B)$:

$$ \begin{align*} \exp(\alpha B) &=I+\alpha B+\frac{1}{2!}(\alpha B)^{2}+\frac{1}{3!}(\alpha B)^{3}+\frac{1}{4!}(\alpha B)^{4}+\frac{1}{5!}(\alpha B)^{5}+\cdots \\ &=I+\alpha B+\frac{1}{2!}\alpha^{2}B^{2}-\frac{1}{3!}\alpha^{3}B-\frac{1}{4!}\alpha^{4}B^{2}+\frac{1}{5!}\alpha^{5}B+\cdots \\ &=I+(\alpha-\frac{1}{3!}\alpha^{3}+\frac{1}{5!}\alpha^{5}-\cdots)B+(\frac{1}{2!}\alpha^{2}-\frac{1}{4!}\alpha^{4}+\cdots)B^{2} \end{align*} $$

Notice the coefficient of $B$ is the Taylor series for $\sin(\alpha)$, and the coefficient of $B^{2}$ is the Taylor series for $1-\cos(\alpha)$. Therefore:

$$ \exp(\alpha B) = I + (\sin\alpha)B + (1-\cos(\alpha))B^{2} $$
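The original post has no Python check for this part, so here is a symbolic verification sketch (using sympy; the symbol alpha is introduced here). Rather than simplifying sympy's own matrix exponential, it uses the fact that $X(\alpha)=\exp(\alpha B)$ is the unique solution of $X'(\alpha)=BX(\alpha)$ with $X(0)=I$, and checks that the right-hand side satisfies that ODE:

import sympy as sp

alpha = sp.symbols('alpha', real=True)

B = sp.Matrix([[0, sp.Rational(-2, 3), sp.Rational(1, 3)],
               [sp.Rational(2, 3), 0, sp.Rational(-2, 3)],
               [sp.Rational(-1, 3), sp.Rational(2, 3), 0]])

# Claimed closed form for exp(alpha B)
rhs = sp.eye(3) + sp.sin(alpha)*B + (1 - sp.cos(alpha))*B**2

# exp(alpha B) is the unique solution of X' = B X with X(0) = I,
# so verifying both properties for rhs proves the identity.
assert sp.simplify(rhs.diff(alpha) - B*rhs) == sp.zeros(3, 3)
assert rhs.subs(alpha, 0) == sp.eye(3)
print("Identity verified via the defining ODE of the matrix exponential")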

Problem 5

Problem: Given a 3D real vector $a$, define the function $f$ for a 3D real vector $x$ (formula omitted). For $n \ge 2$, prove that $f$ is minimized when $x=(I+B^{2})a$.


From Problem 4, we have $\exp(\alpha B)$. Plugging in $\alpha=\frac{2\pi k}{n}$ gives us:

$$ \exp(\frac{2\pi k}{n}B)=I+(\sin(\frac{2\pi k}{n}))B+(1-\cos(\frac{2\pi k}{n}))B^{2} $$

Write $y_{k}=\exp(\frac{2\pi k}{n}B)a$, so that $f(x)=\sum_{k=1}^{n}\left \| y_{k}-x \right \|^{2}$. Expanding each squared norm:

$$ f(x)=\sum_{k=1}^{n}\left \| y_{k} \right \|^{2}-2\left(\sum_{k=1}^{n}y_{k}\right)^{\top}x+n\left \| x \right \|^{2} $$

This is a quadratic function of $x$ with gradient $\nabla f(x)=-2\sum_{k=1}^{n}y_{k}+2nx$ and constant Hessian $2nI$, which is positive definite. Setting the gradient to zero shows that $f$ has a unique minimizer at the centroid of the points $y_{k}$:

$$ x=\frac{1}{n}\sum_{k=1}^{n}y_{k} $$

For $n \ge 2$, the angles $\frac{2\pi k}{n}$ sample one full period of $\sin$ and $\cos$ uniformly, and the sampled values sum to zero:

$$ \sum_{k=1}^{n}\sin(\frac{2\pi k}{n})=\sum_{k=1}^{n}\cos(\frac{2\pi k}{n})=0 $$

(these are the imaginary and real parts of $\sum_{k=1}^{n}e^{2\pi i k/n}=0$, the sum of the $n$-th roots of unity). Substituting the expression for $y_{k}$ into the centroid:

$$ \frac{1}{n}\sum_{k=1}^{n}y_{k}=\left[I+\left(\frac{1}{n}\sum_{k=1}^{n}\sin(\frac{2\pi k}{n})\right)B+\left(1-\frac{1}{n}\sum_{k=1}^{n}\cos(\frac{2\pi k}{n})\right)B^{2}\right]a=(I+B^{2})a $$

Therefore $f$ is minimized at $x=(I+B^{2})a$, which proves the claim.
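Finally, an informal numerical check of this result (a sketch using numpy and scipy; the vector a, the choice n = 7, and the random perturbations are arbitrary test inputs, not part of the problem):

import numpy as np
from scipy.linalg import expm

B = np.array([[0, -2/3, 1/3],
              [2/3, 0, -2/3],
              [-1/3, 2/3, 0]])
I = np.eye(3)

rng = np.random.default_rng(0)
a = rng.standard_normal(3)   # arbitrary test vector
n = 7                        # arbitrary n >= 2

points = [expm(2 * np.pi * k / n * B) @ a for k in range(1, n + 1)]

def f(x):
    return sum(np.sum((y - x) ** 2) for y in points)

x_star = (I + B @ B) @ a

# The centroid of the points should equal (I + B^2) a ...
assert np.allclose(np.mean(points, axis=0), x_star)

# ... and perturbing x away from it should only increase f.
for _ in range(100):
    assert f(x_star) <= f(x_star + 0.1 * rng.standard_normal(3))
print("f is minimized at x = (I + B^2) a")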

References

MIT - Linear Algebra

LaTeX Equation Editor

Linear Algebra tutorial videos by Professor Song Hao

Wikipedia - Cayley-Hamilton theorem

Dragon-Slaying Technique: Crushing High Powers of Matrices with Cayley-Hamilton Theorem, It’s That Easy.


Written with ChatGPT