# Hilbert’s Theorem 90

A proposition states that for a finite Galois extension $k\subset K$, the first group cohomology of the multiplicative group is trivial: $H^1(\mathrm{Gal}(K/k),K^*)=1$.

We denote the Galois group by $G=\mathrm{Gal}(K/k)$. Let $F:G\rightarrow K^*$ be a $1$-cocycle, that is to say, $F(gh)=F(g)\,g(F(h))$ for all $g,h\in G$. We want to find some $a\in K^*$ such that $F(g)=g(a)/a$, i.e. every cocycle is a coboundary.

By Dedekind’s theorem, the distinct automorphisms $g:K\rightarrow K$ ($g\in G$) are linearly independent over $K$, so there is some $b\in K$ such that $\sum_{g\in G}F(g)g(b)=c\neq0$. Applying any $h\in G$ and using the cocycle identity in the form $h(F(g))=F(hg)F(h)^{-1}$, we get $h(c)=\sum_g h(F(g))\,hg(b)=\sum_g F(hg)F(h)^{-1}\,hg(b)=F(h)^{-1}\sum_g F(g)g(b)=F(h)^{-1}c$. In other words, we have that

$F(h)=c/h(c)$

Then $a=1/c$ is such a candidate: $F(h)=h(1/c)\big/(1/c)$.
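We can sanity-check this construction numerically in the smallest concrete instance, $K=\mathbb{Q}(i)$, $k=\mathbb{Q}$, $G=\{\mathrm{id},\sigma\}$ with $\sigma$ complex conjugation. The example (not part of the proof, just an illustration using floating-point complex numbers) verifies $F(h)=c/h(c)$:

```python
# Illustration: K = Q(i), k = Q, sigma = complex conjugation.
# A 1-cocycle on the cyclic group of order 2 is determined by
# F(sigma) = a with N(a) = a * sigma(a) = 1.
a = complex(3, 4) / 5            # |a|^2 = (9 + 16)/25 = 1
F = {"id": 1 + 0j, "sigma": a}   # the cocycle F
G = {"id": lambda z: z, "sigma": lambda z: z.conjugate()}

# Dedekind: pick b so that c = sum_g F(g) g(b) is nonzero.
b = 1 + 0j
c = sum(F[g] * G[g](b) for g in G)
assert abs(c) > 1e-9

# The proof predicts F(h) = c / h(c) for every h in G.
for h in G:
    assert abs(F[h] - c / G[h](c)) < 1e-12
print("F(h) = c/h(c) holds; c =", c)
```

Note that exactness is only up to floating-point error, hence the tolerances in the assertions.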

One special case of this proposition is Hilbert’s Theorem 90, which states:

Let $k\subset K$ be a cyclic Galois extension with Galois group $G=\langle\sigma\rangle$, where $\sigma$ is a generator of $G$. Then for any $a\in K^*$ such that $N_{K/k}(a)=1$, there is a $b\in K^*$ such that $a\sigma(b)=b$.

What does this theorem mean? It means that if $a$ satisfies some condition (the norm is $1$), then the $k$-linear transformation $a\sigma:K\rightarrow K$ has an eigenvector $b$ of eigenvalue $1$.

One crucial observation is that the linear transformation $f=a\sigma$ has finite order: $f(b)=a\sigma(b)$, $f^2(b)=a\sigma(a)\sigma^2(b)$, ..., $f^k(b)=a\sigma(a)\sigma^2(a)\cdots\sigma^{k-1}(a)\,\sigma^k(b)$. Since $G$ has finite order, say $n$, we get $f^n(b)=N_{K/k}(a)b=b$ for every $b$, i.e. $f^n=\mathrm{id}_K$. This suggests, loosely speaking, that $f=a\sigma$ has eigenvalue $1$, and hence a corresponding eigenvector $b$.
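The finite-order claim is easy to check in the concrete instance $K=\mathbb{Q}(i)$, $k=\mathbb{Q}$, $\sigma$ complex conjugation, $n=2$ (an illustration, not in the original):

```python
# Illustration: f = a*sigma with sigma = complex conjugation on Q(i).
a = complex(3, 4) / 5                  # N(a) = a * conj(a) = 1
f = lambda z: a * z.conjugate()        # the k-linear map f = a*sigma

# f^2(z) = a * conj(a) * z = N(a) * z = z, so f^2 = id.
for z in [1 + 0j, 2 - 1j, 0.5 + 3j]:
    assert abs(f(f(z)) - z) < 1e-12

# An explicit eigenvector of eigenvalue 1:
# f(1 + a) = a * conj(1 + a) = a + a*conj(a) = a + 1.
b = 1 + a
assert abs(f(b) - b) < 1e-12
print("f^2 = id and f fixes b =", b)
```

The eigenvector $b=1+a$ here is a special case of the general averaging construction discussed further below.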

So, what is the role of the condition $N_{K/k}(a)=1$ here? In some sense it is a coincidence: even if the norm is not $1$, it must still lie in the base field $k$, say $N_{K/k}(a)=a'\in k$, and the problem becomes finding an eigenvector of eigenvalue $a'^{1/n}$ instead of $1$.

Another proof of this theorem using the proposition is to construct a $1$-cocycle $f:G\rightarrow K^*$. We do as follows: set $f(\mathrm{id})=1$, $f(\sigma)=a$, and use the property $f(\sigma^{i+1})=f(\sigma)\sigma(f(\sigma^i))$ to construct the other values $f(\sigma^i)=a\sigma(a)\cdots\sigma^{i-1}(a)$. Note that we should verify that $f$ is indeed a well-defined cocycle: for example, $f(\sigma^n)=a\sigma(a)\sigma^2(a)\cdots\sigma^{n-1}(a)=N_{K/k}(a)$. Here we use the condition $N_{K/k}(a)=1$ in an essential way: it gives $f(\sigma^n)=1=f(\mathrm{id})$, which shows the coherence. So, according to the proposition, there exists $t\in K^*$ such that $a=f(\sigma)=\sigma(t)/t$; taking $b=1/t$ gives $a\sigma(b)=b$, which completes the proof.
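The cocycle construction can be carried out by hand in a finite field, where everything is exactly computable. The following sketch (an illustrative choice, not from the original) takes $K=\mathbb{F}_8=\mathbb{F}_2[x]/(x^3+x+1)$, $k=\mathbb{F}_2$, $\sigma$ the Frobenius $u\mapsto u^2$, so $G=\langle\sigma\rangle$ has order $n=3$; here $N_{K/k}(a)=a^{1+2+4}=a^7=1$ for every nonzero $a$:

```python
# Illustration: K = GF(8) = GF(2)[x]/(x^3 + x + 1), elements as bitmasks 0..7.
def mul(u, v):
    """Multiply in GF(2)[x]/(x^3 + x + 1)."""
    r = 0
    for i in range(3):
        if (v >> i) & 1:
            r ^= u << i
    for i in (4, 3):               # reduce degrees 4 and 3
        if (r >> i) & 1:
            r ^= 0b1011 << (i - 3) # x^3 + x + 1
    return r

def frob(u, i=1):
    """sigma^i(u) = u^(2^i)."""
    for _ in range(i):
        u = mul(u, u)
    return u

n = 3
a = 0b010                          # a = x; N(a) = a^7 = 1 automatically
f = {0: 1}                         # f(sigma^0) = 1
for i in range(n):                 # f(sigma^(i+1)) = f(sigma) * sigma(f(sigma^i))
    f[i + 1] = mul(a, frob(f[i]))

assert f[n] == 1                   # f(sigma^n) = N(a) = 1: the cocycle closes up
for i in range(n):                 # full cocycle condition f(gh) = f(g) g(f(h))
    for j in range(n):
        assert f[(i + j) % n] == mul(f[i], frob(f[j], i))

# Hilbert 90 in action: search for b with a * sigma(b) = b.
b = next(u for u in range(1, 8) if mul(a, frob(u)) == u)
print("cocycle values:", {i: bin(f[i]) for i in range(n)}, " b =", bin(b))
```

The brute-force search at the end is only feasible because the field is tiny; the point of the proofs above is that such a $b$ always exists.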

This inspires us to reconsider the meaning of the proposition. The proposition says that we want to find a common eigenvector $b$, of common eigenvalue $1$, for the linear transformations $F_g=F(g)g:K\rightarrow K$. Suppose that $G$ has order $n$; then using the fact that $F$ is a cocycle, we can show that $(F(g)g)^n=\mathrm{id}_K$. So these $F(g)g$ have the common eigenvalue $1$, and the problem is reduced to seeking a common eigenvector for it. One might next try to prove that the $F_g$ commute. Instead, we find that

$F_gF_h(b)=F_g(F(h)h(b))=F(g)g(F(h)h(b))=F(g)g(F(h))gh(b)=F(gh)gh(b)=F_{gh}(b)$.

This shows that $\bar{F}:G\rightarrow GL_k(K),\,g\mapsto F_g$ is a group homomorphism! This is rather interesting: we can use representation theory to find a common eigenvector (a pity is that we cannot in general show that the $F_g$ commute — they do only when $G$ is abelian). So one common device is averaging: choose (supposing that $n$ and the characteristic of $k$ are coprime) some $b'\in K$, and set $b=\frac{1}{n}\sum_{g\in G}F_g(b')$; then $F_h(b)=b$ for every $h$, since $F_hF_g=F_{hg}$ merely permutes the summands. We have, of course, to show that $b\neq0$ for a suitable $b'$. This can be done by showing that the $g\in G$ are linearly independent over $K$, so we can use Dedekind’s lemma: the operator $\sum_g F(g)g$ is nonzero.
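The averaging step is again easy to see in the concrete instance $K=\mathbb{Q}(i)$, $k=\mathbb{Q}$, $n=2$ (characteristic $0$, so $1/n$ makes sense); this is an illustration, not part of the argument:

```python
# Illustration: averaging over G = {id, sigma}, sigma = complex conjugation.
a = complex(3, 4) / 5                       # cocycle: F(id) = 1, F(sigma) = a
F_id = lambda z: z                          # F_id    = F(id) * id
F_sigma = lambda z: a * z.conjugate()       # F_sigma = F(sigma) * sigma

b_prime = 1 + 2j                            # any b' making the average nonzero
b = (F_id(b_prime) + F_sigma(b_prime)) / 2  # b = (1/n) sum_g F_g(b')
assert abs(b) > 1e-9                        # nonzero, as Dedekind's lemma promises

# b is a common eigenvector of eigenvalue 1: F_g(b) = b for all g.
assert abs(F_id(b) - b) < 1e-12
assert abs(F_sigma(b) - b) < 1e-12
print("common fixed vector b =", b)
```

For an unlucky $b'$ the average can vanish; Dedekind’s lemma only guarantees that *some* $b'$ works.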

There is another proof of Hilbert’s Theorem 90 which is rather insightful. The problem is to find an eigenvector for $a\sigma$. Perhaps one good idea is to look at this question over a larger field, namely $K$ itself. So, we take the tensor product $V=K\otimes_k K$. Suppose $K=k(x_1)$, where $x_1$ has $n$ conjugates $x_1,\dots,x_n$ in $K$. Then $V\cong K\times K\times\cdots\times K$ as a $K$-algebra; write $y_1,\dots,y_n$ for the corresponding $K$-basis of idempotents. We can extend $F=a\sigma$ to $V$ by defining $\sigma(xy_i)=\sigma(x)y_{i+1}$ (with $y_{n+1}=y_1$), so $\sigma$ in fact permutes the factors in $V=K\times K\times\cdots\times K$. Now for an element $(b_1,b_2,\dots,b_n)=\sum_i b_iy_i\in V$, we have that $F(b_1,b_2,\dots,b_n)=(a\sigma(b_n),a\sigma(b_1),\dots,a\sigma(b_{n-1}))$. So, if we can find an eigenvector for this $F:V\rightarrow V$, then we are done. Quite surprisingly, we indeed can find one! Look at the element $(1,a,a\sigma(a),a\sigma(a)\sigma^2(a),\dots,a\sigma(a)\sigma^2(a)\cdots\sigma^{n-2}(a))$. Using again the essential hypothesis that $N_{K/k}(a)=1$, we find that this is an eigenvector of eigenvalue $1$ for $F$. The last argument is essentially the same as the first one: $F$ in fact permutes the factors in $V$, so again it is of finite order.
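We can model this shift operator on $V\cong K^n$ directly. The sketch below (an illustrative choice, not from the original) reuses $K=\mathbb{F}_8$, $k=\mathbb{F}_2$, $\sigma$ the Frobenius, $n=3$, and checks that $(1,a,a\sigma(a))$ is fixed by $F$:

```python
# Illustration: V = K^3 for K = GF(8) = GF(2)[x]/(x^3 + x + 1),
# with F(b1, b2, b3) = (a*sigma(b3), a*sigma(b1), a*sigma(b2)).
def mul(u, v):
    """Multiply in GF(2)[x]/(x^3 + x + 1), elements as bitmasks 0..7."""
    r = 0
    for i in range(3):
        if (v >> i) & 1:
            r ^= u << i
    for i in (4, 3):
        if (r >> i) & 1:
            r ^= 0b1011 << (i - 3)
    return r

def frob(u):
    return mul(u, u)                 # sigma(u) = u^2

a = 0b010                            # a = x; N(a) = a^7 = 1

def F(v):
    b1, b2, b3 = v
    return (mul(a, frob(b3)), mul(a, frob(b1)), mul(a, frob(b2)))

# The predicted eigenvector (1, a, a*sigma(a)) of eigenvalue 1:
# its first coordinate comes back as a*sigma(a*sigma(a)) = N(a) = 1.
v = (1, a, mul(a, frob(a)))
assert F(v) == v
print("fixed vector of F on V = K^3:", tuple(bin(t) for t in v))
```

The first coordinate of $F(v)$ is where the hypothesis $N_{K/k}(a)=1$ enters: it wraps the product around to $a\sigma(a)\sigma^2(a)=N(a)$.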

The last proof can be largely generalized; see this (the answer by Emerton). On that page we can find another proof, which reduces the problem to a rather simple fact: all $k$-vector spaces of the same finite dimension are isomorphic.