A proposition states that for a finite Galois extension $L/K$, the first group cohomology of the multiplicative group is trivial: $H^1(\mathrm{Gal}(L/K), L^\times) = 1$.

We denote the Galois group by $G = \mathrm{Gal}(L/K)$. Take any $1$-cocycle $(a_\sigma)_{\sigma \in G}$ with values in $L^\times$, that is to say, a family satisfying $a_{\sigma\tau} = a_\sigma \cdot \sigma(a_\tau)$. We want to find some $b \in L^\times$ such that $a_\sigma = b/\sigma(b)$ for all $\sigma \in G$.

Then, according to Dedekind's lemma that distinct automorphisms $\tau\colon L \to L$ are linearly independent over $L$, the map $c \mapsto \sum_{\tau \in G} a_\tau\,\tau(c)$ is not identically zero, which means that there is some $c \in L$ such that $b := \sum_{\tau \in G} a_\tau\,\tau(c) \neq 0$. Applying $\sigma$ and using the cocycle relation in the form $\sigma(a_\tau) = a_\sigma^{-1} a_{\sigma\tau}$, we get that $\sigma(b) = \sum_{\tau} \sigma(a_\tau)\,\sigma\tau(c) = a_\sigma^{-1} \sum_{\tau} a_{\sigma\tau}\,\sigma\tau(c) = a_\sigma^{-1} b$. In other words, we have that $a_\sigma = b/\sigma(b)$.

Then $b$ is such a candidate: $a_\sigma = b/\sigma(b)$ for every $\sigma \in G$.
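To see the construction in action, here is a small computational check (not from the original text: the field GF(9)/GF(3), the Frobenius as $\sigma$, and the cocycle with $a_\sigma = i$ are all my own choices). It computes $b = \sum_\tau a_\tau\,\tau(c)$ for the first $c$ that makes $b$ nonzero, and verifies $a_\sigma = b/\sigma(b)$.

```python
# Numerical check of b = sum_tau a_tau * tau(c) from the proof, in the cyclic
# extension GF(9)/GF(3) (a sketch; the field and the cocycle are my choices).
# GF(9) = F_3[i]/(i^2 + 1); the element x + y*i is stored as the pair (x, y).

P = 3

def add(u, v):
    return ((u[0] + v[0]) % P, (u[1] + v[1]) % P)

def mul(u, v):
    # (x + yi)(s + ti) = (xs - yt) + (xt + ys)i, coefficients mod 3
    return ((u[0]*v[0] - u[1]*v[1]) % P, (u[0]*v[1] + u[1]*v[0]) % P)

def sigma(u):
    # the Frobenius u -> u^3, i.e. "conjugation" x + yi -> x - yi
    return (u[0], (-u[1]) % P)

# A 1-cocycle on G = {id, sigma}: a_id = 1 and a_sigma = i.  The cocycle
# condition forces a_sigma * sigma(a_sigma) = a_id = 1, i.e. N(i) = 1: true.
a = {0: (1, 0), 1: (0, 1)}
gal = {0: lambda u: u, 1: sigma}

# Dedekind's lemma guarantees some c making b = sum_tau a_tau * tau(c) nonzero.
elements = [(x, y) for x in range(P) for y in range(P)]
for c in elements:
    b = (0, 0)
    for j in (0, 1):
        b = add(b, mul(a[j], gal[j](c)))
    if b != (0, 0):
        break

# The proof predicts a_sigma = b / sigma(b), equivalently a_sigma * sigma(b) = b.
assert mul(a[1], sigma(b)) == b
print(b)
```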
One special case of this proposition is Hilbert's Theorem 90, which states:

If $L/K$ is a cyclic Galois extension of degree $n$ with Galois group $G = \langle \sigma \rangle$, where $\sigma$ is one generator of $G$, then for any $a \in L^\times$ such that $N_{L/K}(a) = a\,\sigma(a)\cdots\sigma^{n-1}(a) = 1$, there is a $b \in L^\times$ such that $a = b/\sigma(b)$.
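The statement can be checked by brute force in a small cyclic extension; the example GF(9)/GF(3) below, with the Frobenius as the generator $\sigma$, is my own choice and not from the text.

```python
# Sanity check of Hilbert's Theorem 90 in the cyclic extension GF(9)/GF(3).
# GF(9) = F_3[i]/(i^2 + 1); an element x + y*i is stored as the pair (x, y).
# The Galois group is generated by the Frobenius sigma(u) = u^3 = x - y*i.

P = 3

def mul(u, v):
    # (x + yi)(s + ti) = (xs - yt) + (xt + ys)i, coefficients mod 3
    return ((u[0]*v[0] - u[1]*v[1]) % P, (u[0]*v[1] + u[1]*v[0]) % P)

def sigma(u):
    # Frobenius u -> u^3, i.e. "conjugation" x + yi -> x - yi
    return (u[0], (-u[1]) % P)

def norm(u):
    # N(u) = u * sigma(u), which lands in the base field GF(3)
    return mul(u, sigma(u))

one = (1, 0)
units = [(x, y) for x in range(P) for y in range(P) if (x, y) != (0, 0)]

# Every norm-1 element a should be of the form b / sigma(b), i.e. a*sigma(b) = b.
norm_one = [a for a in units if norm(a) == one]
for a in norm_one:
    witnesses = [b for b in units if mul(a, sigma(b)) == b]
    assert witnesses, f"no b found for a = {a}"

print(len(norm_one))  # the kernel of the norm has (9-1)/(3-1) = 4 elements
```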
What does this theorem mean? It means that if $a$ satisfies some condition (the norm of $a$ is $1$), then the $K$-linear transformation $T\colon L \to L$, $x \mapsto a\,\sigma(x)$, has an eigenvector of eigenvalue $1$: indeed, $a = b/\sigma(b)$ is equivalent to $a\,\sigma(b) = b$.
One crucial observation is that the linear transformation $T$ has finite order. Since $\sigma$ is of finite order, say $n = [L:K]$, we have that $T^n(x) = a\,\sigma(a)\cdots\sigma^{n-1}(a)\,\sigma^n(x) = N_{L/K}(a)\,x = x$, that is, $T^n = \mathrm{id}$. This shows, not strictly speaking, that $T$ has eigenvalue $1$, thus the existence of a corresponding eigenvector $b$.
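The eigenvalue picture can be made concrete by writing out the matrix of $T$ over the base field; the toy example below (GF(9)/GF(3), $a = i$, basis $\{1, i\}$) is my own, not the text's.

```python
# T(x) = a * sigma(x) is GF(3)-linear on GF(9); we write its 2x2 matrix in the
# basis {1, i} and check that T^2 = id when a has norm 1 (my own toy example).
# GF(9) = F_3[i]/(i^2 + 1); x + y*i is stored as (x, y).

P = 3

def mul(u, v):
    return ((u[0]*v[0] - u[1]*v[1]) % P, (u[0]*v[1] + u[1]*v[0]) % P)

def sigma(u):
    return (u[0], (-u[1]) % P)

a = (0, 1)                       # a = i, which has norm i * (-i) = 1
def T(u): return mul(a, sigma(u))

basis = [(1, 0), (0, 1)]         # 1 and i, a GF(3)-basis of GF(9)
# column j of M is T(basis[j]) written in coordinates
cols = [T(e) for e in basis]
M = [[cols[j][i] for j in range(2)] for i in range(2)]

def matmul(A, B):
    return [[sum(A[i][k]*B[k][j] for k in range(2)) % P for j in range(2)]
            for i in range(2)]

I = [[1, 0], [0, 1]]
assert matmul(M, M) == I         # T has finite order: T^2 = N(a) * id = id

# so the minimal polynomial of T divides X^2 - 1, suggesting eigenvalue 1;
# brute-force an eigenvector b with T(b) = b:
b = next(u for u in [(x, y) for x in range(P) for y in range(P)]
         if u != (0, 0) and T(u) == u)
print(b)                         # a fixed vector, i.e. a = b / sigma(b)
```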
So, what is the role of the condition $N_{L/K}(a) = 1$ here? In some sense it is a coincidence: even if the norm is not $1$, it must anyway lie in the base field $K$, say $N_{L/K}(a) = \lambda \in K^\times$, so that $T^n = \lambda\cdot\mathrm{id}$, and the problem is changed to finding an eigenvector of some eigenvalue $\mu$ with $\mu^n = \lambda$ instead of $1$.
Another proof of this theorem using the proposition is to construct a cocycle $(a_\tau)_{\tau \in G}$ from $a$. We do as follows: set $a_\sigma = a$, and use the cocycle property $a_{\sigma\tau} = a_\sigma\,\sigma(a_\tau)$ to construct the others: $a_{\sigma^i} = a\,\sigma(a)\cdots\sigma^{i-1}(a)$. Note that we should verify that this is indeed a cocycle: for example, $a_{\sigma^n}$ must equal $a_{\mathrm{id}} = 1$. Here we use essentially the condition that $N_{L/K}(a) = 1$, and see that $a_{\sigma^n} = a\,\sigma(a)\cdots\sigma^{n-1}(a) = N_{L/K}(a) = 1$, which shows the coherence. So, according to the proposition, we have that there exists a $b \in L^\times$ such that $a = a_\sigma = b/\sigma(b)$, which completes the proof.
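The cocycle construction can be carried out explicitly in a degree-$3$ example. Everything below is my own illustrative setup: GF(27)/GF(3) with the Frobenius as $\sigma$, and the norm-$1$ element $a = t$; the assertions verify the coherence $a_{\sigma^{i+j}} = a_{\sigma^i}\,\sigma^i(a_{\sigma^j})$, which for $i + j \geq 3$ uses exactly $a_{\sigma^3} = N(a) = 1$.

```python
# Building the cocycle a_{sigma^i} = a * sigma(a) * ... * sigma^{i-1}(a) in a
# degree-3 example of my own choosing: GF(27)/GF(3), Frobenius sigma(x) = x^3.
# GF(27) = F_3[t]/(t^3 - t - 1); elements are coefficient triples (c0, c1, c2).

P = 3

def mul(u, v):
    c = [0] * 5
    for i in range(3):
        for j in range(3):
            c[i + j] += u[i] * v[j]
    c[2] += c[4]; c[1] += c[4]   # reduce with t^4 = t^2 + t
    c[1] += c[3]; c[0] += c[3]   # reduce with t^3 = t + 1
    return (c[0] % P, c[1] % P, c[2] % P)

def sigma(u, times=1):
    # the Frobenius x -> x^3, iterated
    for _ in range(times):
        u = mul(u, mul(u, u))
    return u

one = (1, 0, 0)
a = (0, 1, 0)   # a = t; its norm t^13 equals 1, as asserted below

# norm = a * sigma(a) * sigma^2(a)
norm = mul(a, mul(sigma(a), sigma(a, 2)))
assert norm == one

# the cocycle indexed by sigma^i, forced by a_sigma = a and the cocycle rule
coc = {0: one, 1: a, 2: mul(a, sigma(a))}

# coherence: a_{sigma^{i+j}} = a_{sigma^i} * sigma^i(a_{sigma^j}) for all i, j;
# when i + j >= 3 this uses exactly a_{sigma^3} = N(a) = 1.
for i in range(3):
    for j in range(3):
        assert coc[(i + j) % 3] == mul(coc[i], sigma(coc[j], i))

print("cocycle OK")
```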
This inspires us to reconsider the meaning of the proposition. The proposition says that we want to find a common eigenvector, of common eigenvalue $1$, for the linear transformations $T_\sigma\colon x \mapsto a_\sigma\,\sigma(x)$, $\sigma \in G$. Suppose that $\sigma$ is of order $m$; then, using the fact that $(a_\sigma)$ is a cocycle, we can show that $a_\sigma\,\sigma(a_\sigma)\cdots\sigma^{m-1}(a_\sigma) = a_{\sigma^m} = a_{\mathrm{id}} = 1$, hence $T_\sigma^m = \mathrm{id}$. So, this shows that these $T_\sigma$ have a common eigenvalue $1$. So, the problem is reduced to seeking a common eigenvector for this common eigenvalue. So, next perhaps we should try to prove that these $T_\sigma$ commute. Yet we have that $T_\sigma T_\tau(x) = a_\sigma\,\sigma(a_\tau\,\tau(x)) = a_\sigma\,\sigma(a_\tau)\,\sigma\tau(x) = a_{\sigma\tau}\,\sigma\tau(x) = T_{\sigma\tau}(x).$
This shows that $\sigma \mapsto T_\sigma$ is a group homomorphism $G \to \mathrm{GL}_K(L)$! This is rather interesting: we can use the theory of representations to find a common eigenvector (a pity is that we can not in general show that these $T_\sigma$ commute). So, one common way is to choose (suppose that $n = |G|$ and the characteristic of $K$ are coprime) some $c \in L$ and average: $b = \frac{1}{n}\sum_{\sigma \in G} T_\sigma(c) = \frac{1}{n}\sum_{\sigma \in G} a_\sigma\,\sigma(c)$, so that $T_\tau(b) = \frac{1}{n}\sum_{\sigma} T_{\tau\sigma}(c) = b$. We have, of course, to show that $b \neq 0$ for a suitable $c$. This can be done by showing that $\sum_\sigma a_\sigma\,\sigma$ is not the zero map, since the $\sigma$ are linearly independent over $L$, so we can use Dedekind's lemma.
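Here is the fixed-vector trick tried out in GF(27)/GF(3) (again entirely my own example, with the cocycle $a_{\sigma^i}$ generated from $a = t$ as above). One caveat worth noting: in this example the characteristic $3$ divides $|G| = 3$, so the $1/n$ normalization is unavailable; but the un-normalized sum $b = \sum_\sigma T_\sigma(c)$ is already fixed by every $T_\tau$, because $T_\tau$ just permutes the summands.

```python
# The representation-theoretic fixed-vector trick in GF(27)/GF(3) (my example):
# b = sum_i T_{sigma^i}(c) satisfies T_{sigma^j}(b) = b because the T's form a
# group; Dedekind's lemma promises a c making b nonzero.
# GF(27) = F_3[t]/(t^3 - t - 1); elements are coefficient triples (c0, c1, c2).

P = 3

def add(u, v):
    return tuple((x + y) % P for x, y in zip(u, v))

def mul(u, v):
    c = [0] * 5
    for i in range(3):
        for j in range(3):
            c[i + j] += u[i] * v[j]
    c[2] += c[4]; c[1] += c[4]   # t^4 = t^2 + t
    c[1] += c[3]; c[0] += c[3]   # t^3 = t + 1
    return (c[0] % P, c[1] % P, c[2] % P)

def sigma(u, times=1):
    for _ in range(times):
        u = mul(u, mul(u, u))     # Frobenius x -> x^3
    return u

one, zero = (1, 0, 0), (0, 0, 0)
a = (0, 1, 0)                     # a = t has norm 1 in this field
coc = {0: one, 1: a, 2: mul(a, sigma(a))}   # the cocycle a_{sigma^i}

def T(i, u):
    # the twisted operator T_{sigma^i}(u) = a_{sigma^i} * sigma^i(u)
    return mul(coc[i], sigma(u, i))

# search for a c making b = sum_i T_{sigma^i}(c) nonzero (Dedekind: one exists)
elements = [(x, y, z) for x in range(P) for y in range(P) for z in range(P)]
for c in elements:
    b = zero
    for i in range(3):
        b = add(b, T(i, c))
    if b != zero:
        break

# b is a common fixed vector: T_{sigma^i}(b) = b for every i
for i in range(3):
    assert T(i, b) == b
print(b)
```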
There is another proof of Hilbert's Theorem 90 which is rather insightful. The problem is to find an eigenvector for $T\colon x \mapsto a\,\sigma(x)$. Perhaps one good idea is to look at this question from a larger field, say $L$ itself. So, we take the tensor product $A = L \otimes_K L$, regarded as an $L$-vector space through the left factor. Suppose that $L$ is spanned over $K$ by an element $u$ and all its conjugates, $u, \sigma(u), \ldots, \sigma^{n-1}(u)$ (a normal basis). So, we can extend $T$ to $\tilde T = \mathrm{id}_L \otimes T$ on $A$. Note that $A$ has a basis $e_0, \ldots, e_{n-1}$ of orthogonal idempotents, where $e_i$ corresponds to the $i$-th factor of the isomorphism $A \cong \prod_{i=0}^{n-1} L$ given by $x \otimes y \mapsto (x\,\sigma^i(y))_i$. So, we can identify an element of $A$ with a tuple, and in fact $\mathrm{id}\otimes\sigma$ permutes the factors in $A$. Now for an element $z = (z_0, \ldots, z_{n-1})$, we have that $\tilde T(z)_i = \sigma^i(a)\,z_{i+1}$, indices taken modulo $n$. So, if we can find an eigenvector of eigenvalue $1$ for this $\tilde T$, then we are done (the fixed space of $\tilde T$ is the base change of the fixed space of $T$, so if one is nonzero, so is the other). Quite surprisingly, we indeed can find one! Look at the element $z$ with $z_i = \sigma^i(a)\,\sigma^{i+1}(a)\cdots\sigma^{n-1}(a)$. Using again the essential hypothesis that $N_{L/K}(a) = 1$ (so that $z_0 = 1$), we find that this is one eigenvector of eigenvalue $1$ for $\tilde T$. The last argument is essentially the same as the first one: $\tilde T$ in fact permutes the factors in $A$ (up to scalars), so at last it is of finite order.
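The base-changed operator and its explicit fixed vector can be verified in coordinates. As before, the concrete field GF(27)/GF(3) and the element $a = t$ are my own choices; the coordinate formula $\tilde T(z)_i = \sigma^i(a)\,z_{i+1 \bmod n}$ and the candidate $z_i = \sigma^i(a)\cdots\sigma^{n-1}(a)$ are the ones from the argument above.

```python
# Under L (x) L ~ L^n via x (x) y -> (x * sigma^i(y))_i, the extension of
# T(x) = a * sigma(x) acts by (T~ z)_i = sigma^i(a) * z_{(i+1) mod n}.
# Checked in GF(27)/GF(3), a degree-3 cyclic extension (my own example).
# GF(27) = F_3[t]/(t^3 - t - 1); elements are coefficient triples (c0, c1, c2).

P, n = 3, 3

def mul(u, v):
    c = [0] * 5
    for i in range(3):
        for j in range(3):
            c[i + j] += u[i] * v[j]
    c[2] += c[4]; c[1] += c[4]   # t^4 = t^2 + t
    c[1] += c[3]; c[0] += c[3]   # t^3 = t + 1
    return (c[0] % P, c[1] % P, c[2] % P)

def sigma(u, times=1):
    for _ in range(times):
        u = mul(u, mul(u, u))     # Frobenius x -> x^3
    return u

one = (1, 0, 0)
a = (0, 1, 0)                     # a = t, which has norm t^13 = 1

def T_ext(z):
    # the base-changed operator in coordinates
    return tuple(mul(sigma(a, i), z[(i + 1) % n]) for i in range(n))

# the claimed fixed vector: z_i = sigma^i(a) * sigma^{i+1}(a) * ... * sigma^{n-1}(a)
z = []
for i in range(n):
    prod = one
    for j in range(i, n):
        prod = mul(prod, sigma(a, j))
    z.append(prod)
z = tuple(z)

assert z[0] == one    # z_0 = N(a) = 1: the norm hypothesis enters exactly here
assert T_ext(z) == z  # an eigenvector of eigenvalue 1 for the extended operator
print(z)
```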
The last proof can be largely generalized; see this (the answer by Emerton). In that page, we can find another proof, which reduces the problem to a rather simple fact: all vector spaces over $K$ of the same finite dimension are isomorphic.