Kronecker's theorem of the title is the following.
Theorem. — Let $\alpha\in\mathbf C$ be an algebraic integer all of whose conjugates have absolute value at most $1$. Then either $\alpha=0$, or $\alpha$ is a root of unity.
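Before going into the proof, here is a quick numerical illustration of the statement; it is only a sanity check, and it assumes that the Python library sympy is available. A primitive 5th root of unity has all of its conjugates on the unit circle, while the golden ratio is an algebraic integer which is not a root of unity, and indeed one of its conjugates (namely itself) has absolute value larger than $1$.

```python
# Sanity check of the statement on two examples (assumes sympy is installed).
from sympy import symbols, Poly, cyclotomic_poly

T = symbols('T')

# A primitive 5th root of unity: its minimal polynomial is the cyclotomic
# polynomial Phi_5 = T^4 + T^3 + T^2 + T + 1, and all of its roots (the
# conjugates) have absolute value 1.
phi5 = Poly(cyclotomic_poly(5, T), T)
print([abs(complex(r)) for r in phi5.nroots()])    # all (numerically) equal to 1

# The golden ratio, a root of T^2 - T - 1, is an algebraic integer which is
# not a root of unity; accordingly, one of its conjugates has absolute value
# larger than 1.
golden = Poly(T**2 - T - 1, T)
print([abs(complex(r)) for r in golden.nroots()])  # [0.618..., 1.618...]
```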
This theorem has several elementary proofs. In this post, I explain the simple proof proposed by Gebhart Greiter in his American Mathematical Monthly note, adding details so that it would (hopefully) be more accessible to students.
The possibility $\alpha=0$ is rather unentertaining, hence let us assume that $\alpha\neq0$.
Let us first analyse the hypothesis. The assumption that $\alpha$ is an algebraic integer means that there exists a monic polynomial $f\in\mathbf Z[T]$ such that $f(\alpha)=0$. I claim that $f$ can be assumed to be irreducible in $\mathbf Q[T]$; it is then the minimal polynomial of $\alpha$. This follows from Gauss's theorem; let me give a proof of it. Let $g\in \mathbf Q[T]$ be a monic irreducible factor of $f$ such that $g(\alpha)=0$, and let $h\in\mathbf Q[T]$ be such that $f=gh$; since $f$ and $g$ are monic, so is $h$. Chasing denominators, choose a positive integer $a$ such that $ag\in\mathbf Z[T]$, and let $d$ be the gcd of its coefficients; since $g$ is monic, $d$ divides the leading coefficient $a$ of $ag$, so that $u=a/d$ is a positive integer and $g_1=ug$ is a polynomial in $\mathbf Z[T]$ whose coefficients are mutually coprime. Similarly, there are a positive integer $v$ and a polynomial $h_1=vh\in\mathbf Z[T]$ whose coefficients are mutually coprime. Then $(uv)f=g_1 h_1$. Let us prove that $uv=1$; otherwise, it has a prime factor $p$. Considering the relation $(uv)f=g_1h_1$ modulo $p$ gives $0=g_1 h_1 \pmod p$. Since their coefficients are mutually coprime, the polynomials $g_1$ and $h_1$ are nonzero modulo $p$, hence their product is nonzero, because $\mathbf F_p[T]$ is an integral domain. This is a contradiction. Consequently $uv=1$, hence $u=v=1$ and $g=g_1\in\mathbf Z[T]$. Replacing $f$ by $g$, we may thus assume that $f$ is irreducible in $\mathbf Q[T]$.
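To see the denominator-chasing step at work on one concrete polynomial, here is a small Python sketch; the helper `primitive_part` and the example polynomial are mine, chosen only for illustration.

```python
# Chasing denominators and dividing out by the gcd of the coefficients
# (pure Python; the helper primitive_part is only an illustration).
from fractions import Fraction
from functools import reduce
from math import gcd, lcm

def primitive_part(coeffs):
    """Given the rational coefficients of a monic g in Q[T] (leading
    coefficient first), return the positive integer u and the integer
    coefficients of g_1 = u*g, whose coefficients are mutually coprime."""
    a = reduce(lcm, (c.denominator for c in coeffs), 1)  # clear denominators
    ints = [int(c * a) for c in coeffs]
    d = reduce(gcd, ints)                                # divide by their gcd
    return a // d, [c // d for c in ints]

# g = T^2 + (2/3) T + 1/6:  here u = 6 and g_1 = 6 T^2 + 4 T + 1.
u, g1 = primitive_part([Fraction(1), Fraction(2, 3), Fraction(1, 6)])
print(u, g1)   # 6 [6, 4, 1]
```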
So we have a monic polynomial $f\in\mathbf Z[T]$, irreducible in $\mathbf Q[T]$, such that $f(\alpha)=0$. That is to say, $f$ is the minimal polynomial of $\alpha$, so that the conjugates of $\alpha$ are the roots of $f$. Note that the roots of $f$ are pairwise distinct: otherwise, $\gcd(f,f')$ would be a nontrivial factor of $f$, contradicting its irreducibility. Moreover, $0$ is not a root of $f$: since $\alpha\neq0$, one has $f\neq T$, and a factorization $f=T\cdot f_1$ would again contradict irreducibility.
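These two facts are easy to check by machine on a concrete example, say $f=\Phi_5$ as above (again a sympy sketch, only for illustration):

```python
# f = Phi_5 has pairwise distinct roots and does not vanish at 0 (sympy).
from sympy import symbols, gcd, diff, cyclotomic_poly

T = symbols('T')
f = cyclotomic_poly(5, T)    # T**4 + T**3 + T**2 + T + 1
print(gcd(f, diff(f, T)))    # 1: f and f' are coprime, so the roots are simple
print(f.subs(T, 0))          # 1: 0 is not a root of f
```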
Let us now consider the companion matrix to $f$: writing $f=T^n+c_1T^{n-1}+\dots+c_n$, so that $n=\deg(f)$, this is the matrix \[ C_f = \begin{pmatrix} 0 & \dots & 0 & -c_n \\ 1 & \ddots & \vdots & -c_{n-1} \\ 0 & \ddots & & -c_{n-2} \\ 0 & \cdots & 1 & -c_1 \end{pmatrix}.\] If $e_1,\ldots,e_n$ are the canonical column vectors $e_1=(1,0,\dots,0)$, etc., then $C_f\cdot e_1=e_2$, \ldots, $C_f \cdot e_{n-1}=e_{n}$, and $C_f\cdot e_n = -c_{n} e_1-\dots -c_1 e_n$. Consequently, \[ f(C_f)\cdot e_1 = C_f^n\cdot e_1 +c_1 C_f^{n-1} \cdot e_1+ \dots +c_n e_1 = 0.\] Moreover, for $2\leq k\leq n$, one has $f(C_f)\cdot e_k=f(C_f)\cdot C_f^{k-1}\cdot e_1=C_f^{k-1}\cdot f(C_f)\cdot e_1=0$. Consequently, $f(C_f)=0$, so that the complex eigenvalues of $C_f$ are roots of $f$. Since $f$ has simple roots, $C_f$ is diagonalizable: there exist a matrix $P\in\mathrm{GL}(n,\mathbf C)$ and a diagonal matrix $D$ such that $C_f=P\cdot D\cdot P^{-1}$, and the diagonal entries of $D$ are roots of $f$. Since $0$ is not a root of $f$, the matrix $D$ is invertible, and $C_f$ is invertible as well. More precisely, since the vectors $e_1$, $C_f\cdot e_1=e_2$, \ldots, $C_f^{n-1}\cdot e_1=e_n$ are linearly independent, one has $g(C_f)\cdot e_1\neq 0$ for any nonzero polynomial $g$ of degree $<n$, so that $f$ is the minimal polynomial of $C_f$. Consequently, all of its roots are actually eigenvalues of $C_f$, and appear in $D$; in particular, $\alpha$ is an eigenvalue of $C_f$.
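Here is how one could build the companion matrix and check these properties in sympy, for the concrete choice $f=\Phi_5$; the matrix is filled in exactly as in the display above, and the variable names are mine.

```python
# Companion matrix of f = T^4 + T^3 + T^2 + T + 1 (assumes sympy).
from sympy import Matrix, Poly, cyclotomic_poly, eye, symbols, zeros

T = symbols('T')
f = Poly(cyclotomic_poly(5, T), T)
c = f.all_coeffs()[1:]        # [c_1, c_2, c_3, c_4] = [1, 1, 1, 1]
n = len(c)

# Columns of C_f: C_f e_k = e_{k+1} for k < n, and the last column
# carries -c_n, ..., -c_1 from top to bottom.
C = zeros(n, n)
for k in range(n - 1):
    C[k + 1, k] = 1
for k in range(n):
    C[k, n - 1] = -c[n - 1 - k]

print(C.charpoly(T).as_expr())                            # T**4 + T**3 + T**2 + T + 1
print((C**4 + C**3 + C**2 + C + eye(n)).is_zero_matrix)   # True: f(C_f) = 0
print(C.det())                                            # 1, so C_f is invertible
```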
For every $k\geq 1$, one has $C_f^k=P\cdot D^k\cdot P^{-1}$. Since all diagonal entries of $D$ have absolute value at most $1$, the set of all $D^k$, for $k\geq 1$, is bounded in $\mathrm{M}_n(\mathbf C)$. Consequently, the set $\{C_f^k\,;\, k\geq 1\}$ is bounded in $\mathrm M_n(\mathbf C)$. On the other hand, this set consists of matrices in $\mathrm M_n(\mathbf Z)$; since a bounded set of matrices with integer entries is finite, it follows that this set is finite.
Consequently, there are integers $k$ and $\ell$ such that $k< \ell$ and $C_f^k=C_f^\ell$. Since $C_f$ is invertible, one has $C_f^{\ell-k}=1$, the identity matrix. If $v$ is a nonzero eigenvector of $C_f$ for the eigenvalue $\alpha$, then $v=C_f^{\ell-k}\cdot v=\alpha^{\ell-k} v$, hence $\alpha^{\ell-k}=1$. We thus have proved that $\alpha$ is a root of unity.
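On the example $f=\Phi_5$ used above, one can watch the whole argument terminate: the integer matrices $C_f^k$ repeat after finitely many steps, and the exponent at which they do gives the order of $\alpha$. A last sketch (sympy again; the loop and the names are mine):

```python
# Powers of the companion matrix of Phi_5 repeat: C^1 = C^6, so C^5 = I
# and every eigenvalue of C (in particular alpha) is a 5th root of unity.
from sympy import Matrix, eye

C = Matrix([[0, 0, 0, -1],
            [1, 0, 0, -1],
            [0, 1, 0, -1],
            [0, 0, 1, -1]])

seen = {}                      # entries of C^k  ->  exponent k
P, k = C, 1
while tuple(P) not in seen:
    seen[tuple(P)] = k
    P, k = P * C, k + 1

print(seen[tuple(P)], k)       # 1 6: C^1 = C^6
print(C**5 == eye(4))          # True: C^5 is the identity, hence alpha^5 = 1
```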