Semisimplicity and Representations, Part 2

This entry is a direct continuation of this post and is part of a larger series on representation theory that starts with that post. In this blog post, we continue our study of semisimple rings and bring it to a conclusion by proving a classification theorem. Then we apply this insight into the structure of semisimple algebras to the group algebra to get more information on irreducible representations.

Endomorphism Rings of Semisimple Modules

The goal behind the following lemmas and corollaries is to compute the endomorphism ring of a semisimple module that is a finite direct sum of simple modules, the motivation being that this applies to finite-dimensional representations in the semisimple case (cf. 3.20) and to a semisimple ring as a module over itself (cf. 3.24).
The idea behind the proof is simply to “multiply out” the hom set that gives us endomorphisms by pulling out (finite) direct sums from both arguments and then use Schur’s lemma to understand the hom sets between simple modules.

Definition 4.1 For a ring R and a natural number n, M_n(R) denotes the matrix ring with entries in R with the usual formulas for addition and multiplication.

The following lemma is a generalization of a well-known fact from linear algebra:

Lemma 4.2 Let S be a right module over a ring R, then \mathrm{End}_R(S^n) \cong M_n(\mathrm{End}_R(S))

Proof Let's index the direct summands in S^n = \bigoplus_{i=1}^n S_i (so S_i=S for all i, but we're keeping track of which component it is).
Note that as abelian groups, the isomorphism is clear, since we have \mathrm{End}_R(S^n) = \mathrm{Hom}_R( \bigoplus_{i=1}^n S_i,  \bigoplus_{j=1}^n S_j) = \bigoplus_{i=1}^n\bigoplus_{j=1}^n \mathrm{Hom}_R(S_i,S_j)
Now each summand satisfies \mathrm{Hom}_R(S_i,S_j)=\mathrm{End}_R(S), at least as an abelian group, which we can use to identify elements in \bigoplus_{i=1}^n\bigoplus_{j=1}^n \mathrm{Hom}_R(S_i,S_j) with elements in M_n(\mathrm{End}_R(S)) by sending the component living in \mathrm{Hom}_R(S_i,S_j) to the entry at the position (i,j). The verification that this respects multiplication amounts to the same calculation that shows that composition of linear maps corresponds to matrix multiplication.
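As a small sanity check, here is a computational sketch of that last calculation (Python, with integers standing in for \mathrm{End}_R(S), so a block endomorphism is just a matrix of scalars; the helper names are mine): composing two endomorphisms given componentwise agrees with multiplying their component matrices.

```python
# Sketch: for S = R = the integers (so End_R(S) is just R), an endomorphism
# of S^n decomposes into component maps phi[i][j]: S_i -> S_j, here scalars.
# Composing two such block maps corresponds to the matrix product phi @ psi.

def apply_blocks(phi, v):
    # the j-th output component collects the contributions from every S_i
    n = len(v)
    return [sum(phi[i][j] * v[i] for i in range(n)) for j in range(n)]

def compose_blocks(phi, psi):
    # matrix of "psi after phi": entry (i, k) is sum_j phi[i][j] * psi[j][k],
    # i.e. the ordinary matrix product of the two component matrices
    n = len(phi)
    return [[sum(phi[i][j] * psi[j][k] for j in range(n)) for k in range(n)]
            for i in range(n)]

phi = [[1, 2], [3, 4]]
psi = [[5, 6], [7, 8]]
v = [1, -1]

# applying psi after phi agrees with applying the composed block matrix
lhs = apply_blocks(psi, apply_blocks(phi, v))
rhs = apply_blocks(compose_blocks(phi, psi), v)
assert lhs == rhs
```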

This gives a description of the endomorphism rings of finite direct sums of simple modules, where each simple submodule is in the same isomorphism class. To properly “multiply out” our endomorphism rings, we just need another application of Schur’s lemma:

Lemma 4.3 Let M and N be modules over a ring R and suppose that \mathrm{Hom}_R(M,N) = 0 = \mathrm{Hom}_R(N,M), then we have as rings \mathrm{End}_R(M \oplus N) \cong \mathrm{End}_R(M) \times \mathrm{End}_R(N)
If R is a K-algebra for some field K, then this is an isomorphism of K-algebras.

Proof We always have an injection \mathrm{End}_R(M) \times \mathrm{End}_R(N) \to \mathrm{End}_R(M \oplus N) given by (\varphi,\psi) \mapsto \varphi \oplus \psi. Here \varphi \oplus \psi means that we apply \varphi on the first, and \psi on the second summand. Since addition and composition of such endomorphisms can be carried out component-wise, this is a ring homomorphism and if everything is a K-vector space, it is also K-linear. The image of that map is the set of endomorphisms of M \oplus N that map M and N to themselves. Given our conditions, if we have any endomorphism of M \oplus N, restricting it to M and composing with the projection M \oplus N \to N gives a homomorphism M \to N, which is zero by assumption. This shows that the endomorphism maps M to itself, and by the same argument N as well, thus the map is surjective.
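To see the hypotheses in action, here is a small sketch (Python; the specific modules are my choice of example): for R = \Bbb Z, M = \Bbb Z/2 and N = \Bbb Z/3, both hom groups vanish, and indeed every endomorphism of \Bbb Z/6 \cong M \oplus N preserves both summands.

```python
# Sketch with R = Z, M = Z/2, N = Z/3 (so Hom(M,N) = 0 = Hom(N,M)):
# Z/6 is isomorphic to M ⊕ N, every additive endomorphism of Z/6 is
# multiplication by some a (it is determined by the image of 1), and each
# one maps the 2-part {0,3} and the 3-part {0,2,4} to themselves, so the
# endomorphism ring splits as End(M) × End(N).
endos = [lambda x, a=a: (a * x) % 6 for a in range(6)]
two_part = {0, 3}
three_part = {0, 2, 4}
for f in endos:
    assert {f(x) for x in two_part} <= two_part
    assert {f(x) for x in three_part} <= three_part
```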

Corollary 4.4 Let M be a right module over a ring R that is a finite sum of simple submodules M=\bigoplus_{i=1}^k S_i^{n_i} such that the S_i are pairwise non-isomorphic. Let D_i=\mathrm{End}_R(S_i) be the endomorphism rings, then \mathrm{End}_R(M) \cong \prod_{i=1}^k M_{n_i}(D_i)

Proof By Schur’s lemma, there are no nonzero homomorphisms between semisimple modules that don’t have a simple submodule in common, so that we can apply 4.3 k times to get that \mathrm{End}_R(M) \cong \prod_{i=1}^k \mathrm{End}_R(S_i^{n_i}). Now apply 4.2 to each factor.

Using this, we get a partial converse to Schur’s lemma:

Corollary 4.5 Let M be a module that is a finite sum of simple submodules, then M is simple if and only if the endomorphism ring of M is a division ring.

Proof One direction is just Schur’s lemma. The other direction follows from 4.4 by noting that the only way that \prod_{i=1}^k M_{n_i}(D_i) is a division ring is if k=1=n_1.

Enter the Matrix Ring

Corollary 4.4 already gives us a description of endomorphism rings of modules which are finite sums of simple submodules. In this description, matrix rings over division algebras appear. This means that to deepen our understanding of those endomorphism rings, we should study matrix rings.

We begin by studying their ideals:

Lemma 4.6 Let R be a ring and n \in \Bbb N, then the map from two-sided ideals of R to two-sided ideals of M_n(R), sending I to M_n(I) is a bijection. (Here M_n(I) is the subset of all elements in M_n(R) where each entry is contained in I)

Proof It’s clear that M_n(I) is a two-sided ideal in M_n(R) when I is a two-sided ideal in R. It’s also clear that the map I \mapsto M_n(I) is injective (we can recover I from M_n(I) just by looking at the set of elements of R that appear in the entries of the matrices in M_n(I)).
For surjectivity, let E_{ij} be the matrix that is zero everywhere except at (i,j), where it is 1. Then if J \subset M_n(R) is a two-sided ideal and A \in J, we compute that E_{11}AE_{11} is the matrix that agrees with A at the place (1,1) and is zero everywhere else. A calculation shows that the set I of elements in R that appear as an entry in the place (1,1) of some matrix in J is a two-sided ideal in R. Then J\supset M_n(I), because we can permute the entries of a matrix that has only one non-zero entry by multiplying from the left and right by permutation matrices (and then by taking sums, we get that every element in M_n(I) is contained in J).
On the other hand, J \subset M_n(I), because we can first multiply a matrix from the left and right by permutation matrices and then multiply by E_{11} from the left and right to see that we could have also defined I as the set of elements which appear as any matrix entry of elements in J. Thus J \subset M_n(I).
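The two matrix manipulations driving this proof are easy to check numerically. The sketch below (Python with numpy; an illustrative computation, not part of the proof) cuts out the (1,1) entry with E_{11} and then moves it around with permutation matrices.

```python
import numpy as np

# Sketch of the two moves used in the proof: E_11 A E_11 isolates the (1,1)
# entry of A, and conjugating a one-entry matrix by permutation matrices
# moves that entry to any other position.
n = 3
A = np.arange(1, 10).reshape(3, 3)  # entries 1..9, so A[0,0] = 1

E11 = np.zeros((n, n), dtype=int)
E11[0, 0] = 1

cut = E11 @ A @ E11                 # keeps only the (1,1) entry of A
assert cut[0, 0] == A[0, 0] and np.count_nonzero(cut) == 1

# permutation matrix swapping indices 0 and 2
P = np.eye(n, dtype=int)[[2, 1, 0]]
moved = P @ cut @ P                 # the entry now sits at position (3,3)
assert moved[2, 2] == A[0, 0]
```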

This lemma implies in particular that if D is a division ring, then M_n(D) has no proper non-zero two-sided ideals. We can use this together with another property that holds in general for finite-dimensional modules:

Lemma 4.7 Let M be a module over a ring R and suppose that R contains a division ring D such that M is a finite-dimensional D-vector space. (We don’t require that R is a D-algebra, i.e. that D is commutative and contained in the center of R.) Then M contains a simple submodule.

Proof Start with any non-zero submodule of M, e.g. M itself. If M is not simple, choose a proper non-zero submodule of M; if that submodule is not simple, choose a proper non-zero submodule of it, etc.
Since the dimension over D has to decrease at every step, this process has to terminate, which gives us a simple submodule.

Lemma 4.8 Let R be a ring that has no proper nonzero two-sided ideals and that has a minimal right ideal I, then R is semisimple and every simple module over R is isomorphic to I.

Proof Let R be such a ring and let I be a minimal right ideal, then we can make a two-sided ideal out of I by taking the product RI=\sum_{r \in R} rI.
Since R doesn’t contain proper nonzero two-sided ideals and RI \neq 0, we get that R=RI=\sum_{r \in R} rI. For each fixed r, rI is the image of the simple module I under the right R-linear map x \mapsto rx, so it is either isomorphic to I (and hence simple) or zero by Schur’s lemma.
The implication (2.) implies (3.) in 3.14 (compare the proof or the remark after the proof) gives us that there is a subset T \subset R such that R=\bigoplus_{r \in T} rI. Thus R is semisimple. 3.25 implies that every simple module is isomorphic to a direct summand in the sum \bigoplus_{r \in T} rI, but every summand of that sum is isomorphic to I, which implies that every simple module is isomorphic to I.

A natural question is how to recover R from M_n(R). The following lemma provides this, if we also have the module R^n:

Lemma 4.9 Let R be a ring and consider R^n as the space of row vectors, which is a right M_n(R)-module, then we get that \mathrm{End}_{M_n(R)}(R^n)=R. If we think of \mathrm{End}_{M_n(R)}(R^n) as a subring of \mathrm{End}_{R}(R^n)=M_n(R), then \mathrm{End}_{M_n(R)}(R^n) is given by all scalar multiples of the identity.

Proof Let E_{i,j} be the matrix that has zeroes everywhere except a 1 at (i,j), let e_i be the vector in R^n that has zeroes everywhere except a one in the i-th component. For any \sigma \in S_n let P_\sigma be the permutation matrix associated to \sigma such that applying that matrix corresponds to permuting the entries with \sigma.
Then for any \varphi \in \mathrm{End}_{M_n(R)}(R^n), we get that \varphi(e_1)=\varphi(e_1 E_{1,1})=\varphi(e_1)E_{1,1}. Now multiplying with E_{1,1} corresponds to making all entries zero except the first entry which is left untouched. Thus \varphi(e_1) is a multiple of e_1, so we can find a unique \lambda \in R such that \varphi(e_1)=\lambda e_1.
For every i let \tau_{1,i} \in S_n be the transposition that switches 1 and i, then we have \varphi(e_i)=\varphi(e_1 P_{\tau_{1,i}})=\varphi(e_1) P_{\tau_{1,i}} = \lambda e_1 P_{\tau_{1,i}} = \lambda e_i.
Since the e_i form a basis, we have seen that \varphi is just scalar multiplication (from the left) with \lambda, or as a matrix \lambda \cdot \mathrm{Id}, where \mathrm{Id} denotes the identity matrix.
This proves \mathrm{End}_{M_n(R)}(R^n)=R.
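The commutation conditions extracted in this proof can be sanity-checked numerically. In the sketch below (Python with numpy, with floats standing in for R; the helper name is mine), a matrix that commutes with all the matrix units E_{ij} must be a scalar multiple of the identity, while a non-scalar diagonal matrix already fails the test.

```python
import numpy as np

# Sketch: an M_n(R)-linear endomorphism of the row-vector module R^n is
# right multiplication by a matrix B satisfying AB = BA for all A; testing
# against the matrix units E_ij already singles out the scalar matrices.
n = 3

def commutes_with_all_Eij(B):
    # check E_ij B = B E_ij for every matrix unit E_ij
    for i in range(n):
        for j in range(n):
            E = np.zeros((n, n))
            E[i, j] = 1.0
            if not np.allclose(E @ B, B @ E):
                return False
    return True

assert commutes_with_all_Eij(4.0 * np.eye(n))               # scalar matrices pass
assert not commutes_with_all_Eij(np.diag([1.0, 2.0, 3.0]))  # non-scalar fails
```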

Putting the last few lemmas together, we obtain the main result on modules over matrix rings over a division algebra:

Lemma 4.10 Let D be a division algebra and n \in \mathbb{N}, then M_n(D) is semisimple, the unique simple right module over M_n(D) is D^n, and \mathrm{End}_{M_n(D)}(D^n) = D. (Here we’re regarding D^n as row vectors so that the right M_n(D)-action makes sense.)

Proof 4.6 implies that M_n(D) has no nonzero proper two-sided ideals and 4.7 implies that it has a simple submodule, so that 4.8 applies and M_n(D) is semisimple with only one isomorphism class of simple modules.
The statement that D^n is a simple M_n(D)-module is just linear algebra:
Given any non-zero vector v in D^n and any w \in D^n, there’s a linear transformation (i.e. a matrix) that sends v to w. Thus D^n is a simple module over M_n(D), as every non-zero submodule is the whole module. The part about the endomorphism ring follows from 4.9.
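The matrix promised in this linear-algebra step can be written down explicitly. Here is a sketch (Python with numpy, floats standing in for D; the helper name is made up): if v_k \neq 0, a matrix whose k-th row is w/v_k sends v to w.

```python
import numpy as np

# Sketch: for any nonzero row vector v and any target w over a field,
# a matrix A with v A = w can be written down explicitly, so the
# submodule generated by v is all of D^n.
def matrix_sending(v, w):
    v, w = np.asarray(v, float), np.asarray(w, float)
    k = np.flatnonzero(v)[0]      # some nonzero coordinate of v
    A = np.zeros((len(v), len(w)))
    A[k, :] = w / v[k]            # then v A = v[k] * (w / v[k]) = w
    return A

v = [0.0, 2.0, 0.0]
w = [1.0, -1.0, 3.0]
assert np.allclose(np.asarray(v) @ matrix_sending(v, w), w)
```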

Now we come to the main theorem about semisimple rings that classifies them and also relates their ring-theoretic properties to their modules.

Theorem 4.11 (Artin-Wedderburn) A ring R is semisimple iff there exist natural numbers k, n_1, \dots, n_k and division rings D_1, \dots, D_k such that \displaystyle R \cong \prod_{i=1}^k M_{n_i}(D_i).
Here k is the number of simple modules M_1, \dots, M_k up to isomorphism, D_i is the endomorphism ring of M_i, and \mathrm{dim}_{D_i}(M_i)=n_i. In particular, k is uniquely determined and the n_i and D_i are unique up to isomorphism and permutation of the factors.

Proof Note that for every ring R, considered as a right module over itself, we have an isomorphism of rings R \cong \mathrm{End}_R(R) by applying 4.9 with n=1. Let R=\bigoplus_{i=1}^k M_i^{n_i} be a decomposition of R into simple right submodules such that the M_i are pairwise non-isomorphic. Let D_i=\mathrm{End}_R(M_i).
By 4.4, we have an isomorphism \mathrm{End}_R(R) \cong \prod_{i=1}^k M_{n_i}(D_i). Here k is uniquely determined as the number of simple modules over R up to isomorphism. The D_i together with the n_i are uniquely determined as the endomorphism rings of the simple modules and the dimensions of the simple modules over their endomorphism rings (cf. lemma 4.10 for modules over matrix rings over a division algebra and 2.5 for modules over a product).
For the reverse direction, note that matrix rings over division rings are semisimple by 4.10 and 3.15 implies that finite products of semisimple rings are semisimple.
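As a concrete illustration of the theorem, consider the standard example G = S_3 over \mathbb{C}: the group algebra decomposes as \mathbb{C}[S_3] \cong \mathbb{C} \times \mathbb{C} \times M_2(\mathbb{C}), corresponding to the trivial, sign and two-dimensional irreducible representations. Here k=3, all D_i=\mathbb{C}, and the dimensions check out as 1^2+1^2+2^2=6=\mathrm{dim}_{\mathbb{C}} \mathbb{C}[S_3].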

After having finally proved the main theorem for semisimple rings, we can give lots of applications:

Corollary 4.12 Semisimplicity for rings is left-right symmetric, i.e. if R is semisimple, so is its opposite ring R^{op}.

Proof Using 4.11, it’s enough to show that M_n(D)^{op}\cong M_n(D^{op}), since the opposite ring of a division ring is a division ring as well.
An isomorphism M_n(D)^{op} \to M_n(D^{op}) is given by transposition A \mapsto A^{T}.
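The transposition argument is easy to check numerically when D is commutative (so that D^{op} = D); a quick sketch in Python with numpy:

```python
import numpy as np

# Sketch: transposition reverses the order of products, (AB)^T = B^T A^T,
# which is exactly multiplication in the opposite ring.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 1.0]])
assert np.allclose((A @ B).T, B.T @ A.T)
```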

Corollary 4.13 Let R be a semisimple ring, then the following are equivalent:

  1. R is commutative
  2. For all simple R-modules M, the endomorphism ring D=\mathrm{End}_R(M) is a field and \mathrm{dim}_D(M)=1.

Proof We apply 4.11: \prod_{i=1}^k M_{n_i}(D_i) is commutative iff all n_i are 1 and all D_i are commutative, i.e. fields.

Corollary 4.14 Let G be a finite abelian group, then every irreducible real representation of G is at most 2-dimensional.

Proof Let G be a finite abelian group. Since \mathbb{C} is the algebraic closure of \mathbb{R} and has dimension 2 over \mathbb{R}, all finite field extensions of \mathbb{R} have dimension at most 2. By 4.13, all simple \Bbb{R}[G]-modules are one-dimensional over their endomorphism rings, which are finite field extensions of \mathbb{R} by 3.27 and 4.13, so the endomorphism rings are at most two-dimensional; thus all simple \Bbb{R}[G]-modules have dimension at most 2 over \mathbb{R}.
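A standard example showing that the bound 2 is attained (sketched in Python with numpy): rotation by 90 degrees generates a representation of \Bbb Z/4 on \mathbb{R}^2 with no real eigenvector, hence no 1-dimensional invariant subspace, so it is a 2-dimensional irreducible real representation of an abelian group.

```python
import numpy as np

# Sketch: the 90-degree rotation matrix has order 4, so it generates a real
# representation of Z/4 on R^2; its eigenvalues are ±i, so there is no real
# eigenvector and hence no 1-dimensional invariant subspace.
rot = np.array([[0.0, -1.0], [1.0, 0.0]])
assert np.allclose(np.linalg.matrix_power(rot, 4), np.eye(2))  # rot has order 4
eigvals = np.linalg.eigvals(rot)
assert all(abs(ev.imag) > 1e-9 for ev in eigvals)  # no real eigenvalue
```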

Corollary 4.15 Let G be a finite group and K be an algebraically closed field such that the characteristic of K doesn’t divide the order of G, then the following are equivalent:

  1. G is abelian
  2. All irreducible representations of G are one-dimensional
  3. The number of irreducible representations is equal to the order of G

Proof The equivalence of (1) and (2) follows from 4.13, since 3.27 and 3.29 together imply that the endomorphism ring of any simple K[G]-module is K.
The equivalence of (2) and (3) follows from 3.30.

Reminder For a group G, the abelianization G^{ab} is the largest commutative quotient of G. Explicitly, it is given by the quotient of G by the subgroup generated by all commutators. It has the universal property that every morphism from G to an abelian group factors uniquely through the map G \to G^{ab}.

Corollary 4.16 Let G be a finite group and K be an algebraically closed field such that the characteristic of K doesn’t divide the order of G, then the number of one-dimensional representations of G up to isomorphism is equal to the order of the abelianization |G^{ab}|.

Proof One-dimensional representations are homomorphisms G \to \mathrm{GL}_1(K) = K^\times.
Since K^\times is commutative, these correspond to homomorphisms G^{ab} \to \mathrm{GL}_1(K), i.e. one-dimensional representations of G^{ab}. One-dimensional representations are automatically irreducible, and 4.15 tells us that since G^{ab} is abelian, there are exactly |G^{ab}| of them up to isomorphism.
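For a cyclic group this count can be made completely explicit. The sketch below (Python, numerical complex characters; the choice G = \Bbb Z/6 is my example) lists the one-dimensional representations of G = \Bbb Z/n, where G^{ab} = G, and checks that there are exactly n of them.

```python
import cmath

# Sketch for G = Z/n (so G^ab = G): the one-dimensional complex
# representations are the characters k -> exp(2*pi*i*j*k/n), one for each
# j in {0, ..., n-1}, giving exactly |G^ab| = n of them.
n = 6
chars = [[cmath.exp(2j * cmath.pi * j * k / n) for k in range(n)]
         for j in range(n)]

for chi in chars:
    for a in range(n):
        for b in range(n):
            # homomorphism property: chi(a + b) = chi(a) * chi(b)
            assert abs(chi[(a + b) % n] - chi[a] * chi[b]) < 1e-9

# the characters are pairwise distinct: they already differ at the generator
values_at_generator = [chi[1] for chi in chars]
assert all(abs(values_at_generator[i] - values_at_generator[j]) > 0.1
           for i in range(n) for j in range(i + 1, n))
```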

The Center of the Group Algebra

The last corollary gives a partial answer to the question for the number of irreducible representations of a group, by characterizing the number of one-dimensional representations in terms of the group theory of G. In this section, we give a group-theoretic characterization of the number of all irreducible representations. (Given suitable assumptions on the field)

Definition 4.17 If R is a ring, then the center Z(R) is defined as Z(R)=\{z \in R \mid \forall r \in R: zr=rz \}. It is a commutative subring of R.

Lemma 4.18 If R is a ring and n \in \mathbb{N}, then Z(M_n(R)) consists of the diagonal matrices where all diagonal entries are equal to the same value lying in Z(R). So we have an isomorphism of rings Z(R) \cong Z(M_n(R)).

Proof We regard R^n as row vectors, which is naturally a right M_n(R)-module. Then for z \in Z(M_n(R)), the map R^n \to R^n, v \mapsto vz is M_n(R)-linear, because for M \in M_n(R), we have vMz=vzM.
By 4.9, z is a diagonal matrix where all diagonal entries are the same. The diagonal entry must lie in Z(R), as z commutes with all matrices of the form r \cdot \mathrm{Id} for r \in R.
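For a small finite example, the center can even be computed by brute force. The sketch below (Python; the choice R = \Bbb Z/2 is my example) finds the center of M_2(\Bbb Z/2) and confirms it is \{0, \mathrm{Id}\} \cong Z(\Bbb Z/2) = \Bbb Z/2.

```python
from itertools import product

# Brute-force sketch: compute the center of M_2(Z/2) by checking every
# 2x2 matrix over Z/2 against every other; the result is exactly the set
# of scalar matrices {0, Id}, matching Z(Z/2) = Z/2.
def mat_mul(A, B, mod=2):
    return tuple(tuple(sum(A[i][k] * B[k][j] for k in range(2)) % mod
                       for j in range(2)) for i in range(2))

all_mats = [((a, b), (c, d)) for a, b, c, d in product(range(2), repeat=4)]
center = [Z for Z in all_mats
          if all(mat_mul(Z, A) == mat_mul(A, Z) for A in all_mats)]

identity = ((1, 0), (0, 1))
zero = ((0, 0), (0, 0))
assert sorted(center) == sorted([zero, identity])
```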

Lemma 4.19 If R and S are rings, then Z(R \times S)=Z(R) \times Z(S).

Proof This is an easy computation. Multiplication in R \times S is defined component-wise.

Corollary 4.20 If R is a semisimple ring with the Artin-Wedderburn decomposition R \cong \prod_{i=1}^k M_{n_i}(D_i), then Z(R) \cong \prod_{i=1}^k Z(D_i).

Corollary 4.21 If A is a semisimple algebra over a field K, then \mathrm{dim}_K(Z(A)) \geq k where k is the number of simple A-modules up to isomorphism. We have equality if K is algebraically closed.

Proof By 4.11, the number of simple A-modules up to isomorphism is k if A \cong \prod_{i=1}^k M_{n_i}(D_i). By 4.20, we have Z(A) \cong \prod_{i=1}^k Z(D_i). Each factor has dimension at least 1, which shows the inequality.
If K is algebraically closed, then D_i=K for all i by 3.29 which shows that equality holds.

Corollary 4.22 If A is a finite-dimensional semisimple real algebra and k is the number of simple A-modules up to isomorphism, then 2k \geq \mathrm{dim}_{\mathbb{R}}(Z(A)) \geq k.

Proof Argue as in the proof of 4.21 and use the fact that Z(D), where D is a finite-dimensional division algebra over \mathbb{R}, has to be isomorphic to \mathbb{R} or \mathbb{C}, since it is a finite field extension of \mathbb{R}, so the dimension is at most two.

These corollaries relate the number of simple modules of a semisimple algebra to the dimension of its center. Thus the next thing to do is to study the center of group algebras.

Reminder If G is a group, then for g \in G, the conjugacy class of g consists of all elements in G that are conjugate to g. In other words, if we consider the action G \times G \to G (h,g) \mapsto hgh^{-1}, then this is the orbit of g under this action. The different conjugacy classes are the minimal subsets of G that are closed under conjugation and form a partition of G.

Lemma 4.23 Let K be a field and let G be a group (we don’t need it to be finite). Then Z(K[G]) has a K-basis given by the set of elements of the form \sum_{g \in C} g where C varies over all conjugacy classes of G with finitely many elements.

Proof Since K[G] is a K-algebra with a K-basis given by G, we can check if an element is in the center just by checking if it commutes with every element of G. Let x=\sum_{g \in G} \lambda_g g be an element of K[G] (so all but finitely many \lambda_g are zero). Then x is in the center of K[G] if and only if for all h \in G, we have xh=hx \Leftrightarrow x=h^{-1}xh.
Writing out x, this equation becomes \sum_{g \in G} \lambda_g g = h^{-1}(\sum_{g \in G} \lambda_g g)h =\sum_{g \in G} \lambda_g h^{-1}gh=\sum_{g \in G} \lambda_{hgh^{-1}} g. Here we used that g \mapsto hgh^{-1} is a bijection.
Comparing coefficients, this is equivalent to \lambda_g = \lambda_{hgh^{-1}} for all g.
If this holds for all g and h, then the (finite) set X of elements g such that \lambda_g \neq 0 is closed under conjugation, and if two elements of X are conjugate, their coefficients are equal. Take a system of representatives for X/G, where G acts on X by conjugation. Then by the above considerations, we get that x=\sum_{g \in G} \lambda_g g=\sum_{g \in X} \lambda_g g= \sum_{g \in X/G} \lambda_g \sum_{h \in C(g)} h, where C(g) is the conjugacy class of g. This shows that the set of elements of the form \sum_{g \in C} g, where C is a finite conjugacy class, is a generating system for Z(K[G]). It is linearly independent because distinct conjugacy classes are disjoint.
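The class-sum basis can be verified directly for a small group. The following sketch (Python, with made-up helper names) models elements of K[S_3] as coefficient dictionaries over permutations and checks that each of the three class sums is invariant under conjugation, i.e. central.

```python
from itertools import permutations
from collections import Counter

# Sketch for G = S_3: a group-algebra element is a Counter mapping
# permutations (tuples) to coefficients; the three conjugacy-class sums
# commute with every group element, as the lemma predicts.
def compose(p, q):            # (p ∘ q)(i) = p(q(i))
    return tuple(p[q[i]] for i in range(len(q)))

def inverse(p):
    inv = [0] * len(p)
    for i, pi in enumerate(p):
        inv[pi] = i
    return tuple(inv)

G = list(permutations(range(3)))

def conj_class(g):
    return frozenset(compose(compose(h, g), inverse(h)) for h in G)

classes = {conj_class(g) for g in G}
assert len(classes) == 3      # the identity, the transpositions, the 3-cycles

def conjugate(x, h):          # the element h^{-1} x h of the group algebra
    return Counter({compose(compose(inverse(h), g), h): c for g, c in x.items()})

for C in classes:
    class_sum = Counter({g: 1 for g in C})
    assert all(conjugate(class_sum, h) == class_sum for h in G)
```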

Theorem 4.24 Let G be a finite group and K a field such that the characteristic of K doesn’t divide the order of G, let k be the number of irreducible representations up to isomorphism, and let c be the number of conjugacy classes of G. Then k \leq c, with equality if K is algebraically closed. If K=\mathbb{R}, then we have k \leq c \leq 2k.

Proof By 4.23 c is the dimension of the center of K[G]. Now apply 4.21 and 4.22.

Thus we have obtained a relation between the number of irreducible representations and the element structure of G by heavily employing the structure theory for semisimple rings. This is a nice illustration of the power of ring-theoretic tools in representation theory.
