Sunday, November 30, 2025

The determinant of transvections (an update)

In the previous post, I had explained how I could prove a general version of the classic fact that transvections have determinant 1. Recall here that transvections in an $R$-module $M$ are linear maps of the form $x\mapsto x+f(x)v$, where $f$ is a linear form on $M$ and $v\in M$ is a vector such that $f(v)=0$. To be able to talk of determinants, I assumed that $M$ had a finite basis, so that linear maps correspond to matrices and the determinant of the matrix serves as a definition of the determinant of the corresponding linear map. (It does not depend on the choice of a basis.) 
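
For instance, take $M=R^3$ with its canonical basis, $v=e_1$ and $f=e_2^*$; then $f(v)=0$, the transvection $x\mapsto x+f(x)v$ maps $(x_1,x_2,x_3)$ to $(x_1+x_2,x_2,x_3)$, and its matrix $$\begin{pmatrix}1&1&0\\0&1&0\\0&0&1\end{pmatrix}$$ is triangular with ones on the diagonal, so that its determinant is $1$.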

That proof had three steps: 

  1.  When $R$ is a field $F$ (and $\dim(M)\geq 2$), one can use linear algebra to get a basis $(e_1,\dots,e_n)$ of $M$ such that $v=e_1$ and $f=e_2^*$. Then the matrix of the transvection is triangular with ones on the diagonal, so its determinant is $1$. 
  2. When $R$ is a domain, one can consider its field of fractions $F$ and the base change to $F$ of the given transvection. By the case of fields, its determinant is $1$, and since $R$ is a subring of $F$, the determinant of the initial transvection is $1$ as well. 
  3. In general, one observes that our transvection is deduced, by base change, from the “universal” transvection, which lives on the ring $R=\mathbf Z[f_1,\dots,f_n,v_1,\dots,v_n]/\langle\sum f_i v_i\rangle$, and one proves that this ring is an integral domain when $n\geq 2$. 

But one can do more with less effort! 

Indeed, the linear maps given by an expression of the form $x\mapsto x+f(x)v$ are also interesting when $f(v)=-2$: they give symmetries in the direction of $v$ with respect to the hyperplane defined by $f$, and, at least over fields, their determinant is $-1$. This suggests the following, more general, result.  
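
For instance, take $M=F^2$ with its canonical basis, $v=e_1$ and $f=-2e_1^*$; then $f(v)=-2$, the map $x\mapsto x+f(x)v$ sends $e_1$ to $-e_1$ and fixes $e_2$, so it is the symmetry with respect to the line $Fe_2$ in the direction of $e_1$; its matrix is $\begin{pmatrix}-1&0\\0&1\end{pmatrix}$ and its determinant is $-1=1+f(v)$.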

Theorem. — Let $M$ be an $R$-module with a finite basis, let $f\in M^*$ be a linear form and let $v\in M$. The linear map $u$ given by $x\mapsto x + f(x)v$ has determinant $1+f(v)$. 
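
In matrix terms: choose a basis $(e_1,\dots,e_n)$ of $M$, let $V$ be the column of coordinates of $v$ and let $F=(f(e_1),\dots,f(e_n))$ be the row matrix of $f$; then the matrix of $u$ is $I_n+VF$, and since $f(v)=FV$, the theorem asserts that $\det(I_n+VF)=1+FV$.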

To prove this result, we will first prove the case where $R$ is a field $F$. Then, we have two subcases, according to whether $f(v)=0$ or $f(v)\neq 0$. 

  • If $f(v)=0$, then $u$ is a transvection and its determinant is $1=1+f(v)$, as shown in the previous post.
  • Otherwise, $f(v)\neq 0$ and there is a basis $(e_1,\dots,e_n)$ of $M$ such that $(e_1,\dots,e_{n-1})$ is a basis of $\ker(f)$ and $e_n=v$. Then the matrix of $u$ is diagonal, with entries $(1,\dots,1,1+f(v))$, and its determinant is $1+f(v)$, as claimed. 

The case where $R$ is a domain is proved as before, by embedding $R$ into its field of fractions $F$. 

Finally, the general case reduces to the universal one, which now lives on the ring $R=\mathbf Z[f_1,\dots,f_n,v_1,\dots,v_n]$, without any need to quotient by the ideal $\langle \sum f_i v_i\rangle$. That ring is an integral domain, hence the theorem. The gain is that we no longer need to prove that the polynomial $\sum f_i v_i$ is irreducible.
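
As a sanity check (not part of the argument), one can let a computer algebra system verify the identity $\det(I_n+VF)=1+\sum f_iv_i$ in $\mathbf Z[f_1,\dots,f_n,v_1,\dots,v_n]$ for a small value of $n$. Here is a minimal sketch using sympy for $n=3$; the names F, V and u follow the matrix notation above and are mine.

```python
# Symbolic check of det(I_n + V F) = 1 + sum f_i v_i for n = 3,
# carried out in the polynomial ring Z[f_1,...,f_n, v_1,...,v_n].
from sympy import Matrix, eye, expand, symbols

n = 3
f = symbols(f"f1:{n + 1}")  # (f1, f2, f3): coordinates of the linear form f
v = symbols(f"v1:{n + 1}")  # (v1, v2, v3): coordinates of the vector v

F = Matrix(1, n, f)         # row matrix of f in the chosen basis
V = Matrix(n, 1, v)         # column of coordinates of v

u = eye(n) + V * F          # matrix of x |-> x + f(x) v

# Exact computation over the integers; prints f1*v1 + f2*v2 + f3*v3 + 1.
print(expand(u.det()))
```

The computation is exact, since it takes place in the polynomial ring over $\mathbf Z$; of course it only checks one value of $n$, whereas the argument above treats all $n$ at once.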
