Eigenvalues and invertible matrices for fields and vector spaces:
Let $K$ be a field (so $K^n$ is a $K$-vector space) and let $A \in K^{n\times n}$ be an $n\times n$-matrix. Then we have the following definition and theorem:
Definition: An element $\lambda\in K$ is called an eigenvalue of $A$ iff there is a $v\in K^n\setminus\{0\}$ such that $Av=\lambda v$.
Theorem: The matrix $A$ is invertible iff all of its eigenvalues are invertible (i.e. $0$ is not an eigenvalue).
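For concreteness, here is a quick SymPy illustration of the field case over $\mathbb{Q}$ (the two example matrices below are just ones I picked arbitrarily; this is only an illustration, not part of the argument):

```python
# Field case: invertible <=> 0 is not an eigenvalue.
from sympy import Matrix

A_inv = Matrix([[2, 1], [0, 3]])   # det = 6 != 0, eigenvalues 2 and 3
A_sing = Matrix([[1, 2], [2, 4]])  # det = 0,      eigenvalues 0 and 5

print(A_inv.det(), A_inv.eigenvals())    # 6 {2: 1, 3: 1}  -- 0 is not an eigenvalue
print(A_sing.det(), A_sing.eigenvals())  # 0 {0: 1, 5: 1}  -- 0 is an eigenvalue
```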
"Eigenvalues" and invertible matrices for rings and modules:
Let $R$ be a commutative ring with a multiplicative identity (so $R^n$ is an $R$-module) and let $B\in R^{n\times n}$ be an $n\times n$-matrix. Then let's define what an eigenvalue of $B$ is:
Definition: An element $\mu\in R$ is called an eigenvalue of $B$ iff there is a $v\in R^n\setminus\{0\}$ such that $Bv=\mu v$.
Clearly, it is possible for a matrix in $R^{n\times n}$ to have only invertible eigenvalues and yet not be invertible itself. Take for example $\begin{pmatrix} 3 & -13 \\ 1 & -3\end{pmatrix}\in \mathbb{Z}^{2\times 2}$. All of its eigenvalues are invertible, simply because it doesn't have any eigenvalues in $\mathbb{Z}$ at all: over the integral domain $\mathbb{Z}$, an eigenvalue $\mu$ would have to be a root of the characteristic polynomial $\mu^2+4$, which has no roots in $\mathbb{Z}$. However, the matrix is not invertible, because its determinant is $4$, which is not a unit in $\mathbb{Z}$. So the following is certainly false:
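Just to double-check the arithmetic, here is a small SymPy computation (assuming SymPy is available; again only a sanity check of the determinant and characteristic polynomial, not part of the argument):

```python
# Sanity check of the Z-counterexample with SymPy.
from sympy import Matrix, symbols

B = Matrix([[3, -13], [1, -3]])
mu = symbols('mu')

# Determinant is 4, which is not a unit in Z, so B is not invertible over Z.
print(B.det())                   # 4

# Characteristic polynomial is mu**2 + 4; it has no roots in Z,
# so B has no eigenvalues in Z at all.
print(B.charpoly(mu).as_expr())  # mu**2 + 4
```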
Not a theorem! The matrix $B$ is invertible if all of its eigenvalues are invertible (i.e. are units in $R$).
Now, I wonder if the following is true:
Theorem? If the matrix $B$ is invertible, then all of its eigenvalues are invertible (i.e. are units in $R$).
If it is true, how can it be proved? If it is false, what would be a counter-example?