Tuesday, November 1, 2016

Polynomial Identities

Let $F$ be a field and $F\langle X\rangle$ the free associative algebra on an alphabet $X$. Let $f(x_1,\ldots,x_m)\in F\langle X\rangle$ be a nonzero polynomial (in non-commuting variables). We say that an algebra $A$ satisfies the identity $f=0$ if $f(a_1,\ldots,a_m)=0$ for all $m$-tuples $(a_1,\ldots,a_m)$ of elements of $A$.


Examples:
  1. $A$ commutative: then $A$ satisfies $x_1x_2-x_2x_1=0$.
  2. $A$ nilpotent (say $A^n=0$). Then $A$ satisfies $x_1\ldots x_n=0$.
  3. $A$ finite dimensional (say of dimension $d$). Let $$f(x_1,\ldots,x_{d+1})=\sum_{\sigma\in \mathcal{S}_{d+1}}(-1)^{|\sigma|}x_{\sigma(1)}\ldots x_{\sigma(d+1)}.$$ Then $f$ is multilinear and antisymmetric. If $e_1,\ldots,e_d$ is a basis for $A$, then $f(a_1,\ldots,a_{d+1})$ can be expressed as a linear combination of elements $f(e_{i_1},\ldots,e_{i_{d+1}})$ by multilinearity. However, in each such summand some basis element appears twice, and so $f$ vanishes by antisymmetry. We denote this particular $f$ by $S_{d+1}(x_1,\ldots,x_{d+1})$ and call it the standard polynomial of degree $d+1$. (A computational sanity check appears below.)
We will call algebras satisfying a polynomial identity "PI algebras."
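
As a quick sanity check of Example 3 (a sketch of my own, not part of the notes; the helper names `sign` and `standard_identity` are mine), one can evaluate $S_5$ on random integer $2\times 2$ matrices, since $\dim M_2(F)=4$:

```python
# Sanity check for Example 3: S_{d+1} vanishes on a d-dimensional algebra.
# Here A = M_2(Q), so d = 4 and S_5 should evaluate to the zero matrix.
from itertools import permutations
import random

import sympy as sp

def sign(sigma):
    # Sign of a permutation, computed from its inversion count.
    inv = sum(1 for i in range(len(sigma))
                for j in range(i + 1, len(sigma)) if sigma[i] > sigma[j])
    return -1 if inv % 2 else 1

def standard_identity(mats):
    # S_k(mats) = sum over sigma in Sym(k) of sign(sigma) * (product in that order).
    n = mats[0].shape[0]
    total = sp.zeros(n, n)
    for sigma in permutations(range(len(mats))):
        prod = sp.eye(n)
        for i in sigma:
            prod = prod * mats[i]
        total = total + sign(sigma) * prod
    return total

random.seed(0)
mats = [sp.Matrix(2, 2, lambda i, j: random.randint(-5, 5)) for _ in range(5)]
print(standard_identity(mats))  # Matrix([[0, 0], [0, 0]])
```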

Lemma: If $A$ satisfies a polynomial identity, it satisfies a multilinear polynomial identity.

Proof:
Let $f$ be a nonzero identity satisfied by $A$. We show how to linearize in the first variable $x_1$. Suppose $x_1$ appears with maximal degree $d$. Introduce new variables $y_1,\ldots, y_d$ and consider the sum
$$f(y_1+\cdots+y_d,\,x_2,\ldots, x_n)-\sum_{i=1}^{d} f(y_1+\cdots+\hat{y}_i+\cdots+y_d,\,x_2,\ldots,x_n)+\cdots,$$
an alternating sum whose second term ranges over the ways of omitting one of the $y_i$, whose next term ranges over the ways of omitting two of them, and so on. This expression is still an identity for $A$, since it is a sum of evaluations of $f$, each of which is $0$; its effect is to kill every monomial involving fewer than $d$ instances of $x_1$ and to linearize those involving exactly $d$. Indeed, thinking of the $x_1$'s as marking $d$ spots in a word, the alternating sum places the variables $y_1,\ldots, y_d$ into those spots in all possible ways. So for example $f(x_1,x_2)=x_1^2x_2x_1$ would get linearized to
$$\sum_{\sigma\in \mathcal{S}_3} y_{\sigma(1)}y_{\sigma(2)}x_2y_{\sigma(3)}.$$
Now that we have linearized $x_1$, repeat the process to linearize the remaining variables.
$\Box$
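
Here is a small sympy sketch (my own illustration of the proof's inclusion-exclusion step; the names `f`, `ys`, `lin` are mine) that linearizes $f(x_1,x_2)=x_1^2x_2x_1$ in $x_1$ and checks the result against the symmetrized sum above:

```python
# Linearize f(x1, x2) = x1^2 * x2 * x1 in x1 (degree d = 3) by the
# inclusion-exclusion trick from the proof, using noncommuting symbols.
from itertools import combinations, permutations

import sympy as sp

x2 = sp.Symbol('x2', commutative=False)
ys = sp.symbols('y1 y2 y3', commutative=False)
d = 3

def f(x1):
    return x1 * x1 * x2 * x1  # the monomial x1^2 * x2 * x1

# Alternating sum over nonempty subsets S of {y1, y2, y3}:
# sum of (-1)^(d - |S|) * f(sum of S).
lin = sp.expand(sum((-1) ** (d - k) * f(sp.Add(*S))
                    for k in range(1, d + 1)
                    for S in combinations(ys, k)))

# The claimed linearization: sum over sigma in Sym(3).
expected = sum(ys[s[0]] * ys[s[1]] * x2 * ys[s[2]]
               for s in permutations(range(3)))
print(sp.expand(lin - expected))  # 0
```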

Lemma: If $A$ is a PI algebra and $R$ is a commutative $F$-algebra, then $A\otimes_F R$ is also a PI algebra.

Proof:
In view of the previous lemma, we may assume that the polynomial identity $f(x_1,\ldots, x_m)=0$ satisfied by $A$ is multilinear, so that $$f=\sum_{\sigma\in \mathcal{S}_m}\alpha_\sigma x_{\sigma(1)}\cdots x_{\sigma(m)}$$ for some coefficients $\alpha_\sigma\in F$.
We claim that $A\otimes_F R$ satisfies the same polynomial identity. Since $f$ is linear in each variable, $f$ applied to a general $m$-tuple of elements of $A\otimes_F R$ is a linear combination of values of $f$ on basic tensors. Now
$\begin{align*}
f(a_1\otimes r_1,\ldots ,a_m\otimes r_m)&=\sum_{\sigma\in \mathcal{S}_m}\alpha_\sigma\, a_{\sigma(1)}\cdots a_{\sigma(m)}\otimes r_{\sigma(1)}\cdots r_{\sigma(m)}\\
&= \sum_{\sigma\in \mathcal{S}_m}\alpha_\sigma\, a_{\sigma(1)}\cdots a_{\sigma(m)}\otimes r_{1}\cdots r_{m}\quad\text{(since $R$ is commutative)}\\
&=f(a_1,\ldots,a_m)\otimes r_1\cdots r_m\\
&=0.
\end{align*}
$
$\Box$

Corollary: The algebra $M_n(R)$ of $n\times n$ matrices over a commutative $F$-algebra $R$ is a PI algebra.

Proof:
$M_n(F)$ is finite dimensional, so it is a PI algebra, as we saw earlier. On the other hand,
$M_n(R)\cong M_n(F)\otimes _F R$, so the previous lemma applies. $\Box$
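
As a concrete illustration of the corollary (again my own sketch, repeating the helpers from the first code block so it runs standalone), $S_5$ also vanishes on $2\times 2$ matrices with polynomial entries, i.e. on $M_2(\mathbb{Q}[t])\cong M_2(\mathbb{Q})\otimes_{\mathbb{Q}} \mathbb{Q}[t]$:

```python
# Corollary check: S_5 vanishes on M_2(Q[t]) = M_2(Q) tensor Q[t] as well.
from itertools import permutations
import random

import sympy as sp

def sign(sigma):
    inv = sum(1 for i in range(len(sigma))
                for j in range(i + 1, len(sigma)) if sigma[i] > sigma[j])
    return -1 if inv % 2 else 1

def standard_identity(mats):
    n = mats[0].shape[0]
    total = sp.zeros(n, n)
    for sigma in permutations(range(len(mats))):
        prod = sp.eye(n)
        for i in sigma:
            prod = prod * mats[i]
        total = total + sign(sigma) * prod
    return total

t = sp.Symbol('t')
random.seed(1)
# 2x2 matrices whose entries are random degree-1 polynomials in Q[t].
mats = [sp.Matrix(2, 2, lambda i, j: random.randint(-3, 3) + random.randint(-3, 3) * t)
        for _ in range(5)]
print(standard_identity(mats).expand())  # Matrix([[0, 0], [0, 0]])
```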

Indeed, since the dimension of $M_n(F)$ is $n^2$, we know that $M_n(R)$ satisfies the identity $S_{n^2+1}(x_1,\ldots,x_{n^2+1})=0$. It is natural to ask whether this is the best that can be done: does $M_n(R)$ satisfy an identity of lower degree? The answer turns out to be yes.

Theorem (Amitsur-Levitzki): $M_n(R)$ satisfies the identity $S_{2n}(x_1,\ldots,x_{2n})=0$ but does not satisfy any identity of degree $<2n$.

Proof (of the second claim): Suppose that $M_n(R)$ satisfies an identity of degree $k<2n$. Linearize, rename variables, and multiply by a scalar so that $$f=x_1\ldots x_k+\sum_{\sigma\neq 1} \alpha_\sigma x_{\sigma(1)}\cdots x_{\sigma(k)}.$$ Let $e_{ij}$ denote the matrix unit: the matrix with a $1$ in the $(i,j)$ entry and $0$ in every other entry.
Now let $$x_1=e_{11},\quad x_2=e_{12},\quad x_3=e_{22},\quad x_4=e_{23},\ldots$$
Then the only ordering of these matrices whose product is nonzero is $x_1 x_2\cdots x_k$, so $f(x_1,\ldots,x_k)= x_1\cdots x_k\neq 0$. Therefore $M_n(R)$ does not satisfy the identity $f=0$, a contradiction. $\Box$
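
The staircase substitution is easy to verify by machine. Here is a sketch (mine; the helper `e` builds matrix units) for $n=2$, $k=3$, where the substitution is $x_1=e_{11}$, $x_2=e_{12}$, $x_3=e_{22}$:

```python
# Staircase check for n = 2, k = 3 < 2n: only the identity ordering of
# e11, e12, e22 multiplies to a nonzero matrix.
from itertools import permutations

import sympy as sp

def e(i, j, n=2):
    # Matrix unit: 1 in the (i, j) entry, 0 elsewhere (1-indexed).
    m = sp.zeros(n, n)
    m[i - 1, j - 1] = 1
    return m

staircase = [e(1, 1), e(1, 2), e(2, 2)]
for sigma in permutations(range(3)):
    prod = sp.eye(2)
    for i in sigma:
        prod = prod * staircase[i]
    if prod != sp.zeros(2, 2):
        print(sigma, prod)  # only (0, 1, 2) prints: Matrix([[0, 1], [0, 0]])
```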

Stay tuned for the next blog post for the proof that $M_n(R)$ does satisfy $S_{2n}=0$.
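
In the meantime, here is a quick numerical spot check of mine (a sanity check, of course not a proof) that $S_4$ really does vanish on $2\times 2$ matrices:

```python
# Spot check of the theorem for n = 2: S_4 vanishes on M_2, even though
# dim M_2 = 4 only guarantees S_5 = 0. Exact integer arithmetic via numpy.
from itertools import permutations

import numpy as np

def sign(sigma):
    inv = sum(1 for i in range(len(sigma))
                for j in range(i + 1, len(sigma)) if sigma[i] > sigma[j])
    return -1 if inv % 2 else 1

rng = np.random.default_rng(0)
mats = [rng.integers(-5, 6, size=(2, 2)) for _ in range(4)]

total = np.zeros((2, 2), dtype=np.int64)
for sigma in permutations(range(4)):
    prod = np.eye(2, dtype=np.int64)
    for i in sigma:
        prod = prod @ mats[i]
    total += sign(sigma) * prod
print(total)  # [[0 0]
              #  [0 0]]
```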






