Tensor Products: A few explicit calculations.
December 17, 2010
I planned to do a post about tensor products (what they are, why we should care, what we do with them, etc.) but because I’m not comfortable with all of that quite yet, I’m going to assume you know what tensor products are, and do a few explicit calculations. So, in short, if you don’t already know what tensor products are, don’t read this post.
Our notation will be as follows: $k$ is a field, $R$ is a commutative ring with $1$, and $\otimes_R$ will denote the tensor product of modules over a ring $R$. As usual, $R[x]$ will denote the polynomials in $x$ with coefficients in $R$.
(Note: My thanks to Brooke, who pointed out that I kept writing "+" when I meant "$\otimes$." I hope I’ve not made this error elsewhere, as tensors are "pretty different" from standard addition.)
Let’s show that $R[x] \otimes_R R[y] \cong R[x,y]$.
Let’s quietly drop the subscript now. We ask ourselves, what does $R[x] \otimes R[y]$ have in it? It’s made up of a finite sum of elements which look like

$$f(x) \otimes g(y),$$

and what does $f(x)$ look like? Since it’s just a polynomial with coefficients in $R$, it looks like

$$f(x) = a_0 + a_1 x + \cdots + a_n x^n,$$

where $a_i \in R$. Putting this together, we have that an element in the tensor product looks something like:

$$\sum_{k} \left( a_{k,0} + a_{k,1} x + \cdots + a_{k,n} x^{n} \right) \otimes \left( b_{k,0} + b_{k,1} y + \cdots + b_{k,m} y^{m} \right),$$

which is slightly messy. Note that the sum on the left is summing up a finite number of those little tensor elements. This is a lot to work with, but because our mapping will be linear, it will suffice to do this for the single tensor element

$$\left( a_0 + a_1 x + \cdots + a_n x^n \right) \otimes \left( b_0 + b_1 y + \cdots + b_m y^m \right),$$
which makes things a bit nicer. Now, if we want to make an isomorphism, we need some maps. Let’s define

$$\phi\left( \left(\sum_i a_i x^i\right) \otimes \left(\sum_j b_j y^j\right) \right) = \sum_{i,j} a_i b_j\, x^i y^j,$$

where this sum spans over all possible $i, j$ considered. Proving that the underlying map is bilinear on $R[x] \times R[y]$ is not difficult, and this (by the universal property of tensor products) implies that $\phi$ is a well-defined $R$-module homomorphism from $R[x] \otimes R[y]$ to $R[x,y]$. You can show this for yourself directly!
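To make the formula concrete, here is a minimal Python sketch of this map on a single tensor element, with each polynomial stored as a dict from exponent to coefficient. The name `phi` and this representation are illustrative choices of mine, not something from the post itself.

```python
from itertools import product

def phi(f, g):
    """Send the simple tensor f(x) (x) g(y) to the product polynomial.

    f and g are coefficient dicts: {i: a_i} stands for sum_i a_i x^i,
    {j: b_j} for sum_j b_j y^j.  The result {(i, j): a_i * b_j}
    stands for sum_{i,j} a_i b_j x^i y^j, an element of R[x, y].
    """
    return {(i, j): f[i] * g[j] for i, j in product(f, g)}

# (1 + 2x) (x) 3y  maps to  3y + 6xy
print(phi({0: 1, 1: 2}, {1: 3}))  # {(0, 1): 3, (1, 1): 6}
```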
So, what does our map do? It takes our tensor product, and it "shifts" the coefficients over and multiplies them. This should seem reasonable, since this is what we’d expect the tensor product to do. Conversely, given an element like this, how can we "get back" to the tensor product?
Define the following map:

$$\psi\left( c\, x^i y^j \right) = c \left( x^i \otimes y^j \right),$$

extended linearly to all of $R[x,y]$. What does this map do? It takes an element like the one we get from $\phi$ and places a tensor in between the $x$ and $y$ terms and pulls out the coefficient. This should seem reasonable to you, since if we got $c\, x^i y^j$ from $c\, x^i \otimes y^j$, then by the properties of tensors:

$$c\, x^i \otimes y^j = c \left( x^i \otimes y^j \right).$$
Take note that this, too, is linear and a well-defined homomorphism (as you can check); there is, in fact, not a whole lot to check here, because given some other element of $R[x,y]$, you can always reduce it to a sum of terms of the form above.
Now all we need to do is show that the compositions of these maps give us the identity. Notice that, so far, this is pretty standard stuff. The only thing that is really different is the properties of tensor products. Let’s just get down to it:

$$\psi\left( \phi\left( \left(\sum_i a_i x^i\right) \otimes \left(\sum_j b_j y^j\right) \right) \right) = \psi\left( \sum_{i,j} a_i b_j\, x^i y^j \right) = \sum_{i,j} a_i b_j \left( x^i \otimes y^j \right) = \left(\sum_i a_i x^i\right) \otimes \left(\sum_j b_j y^j\right),$$

which is exactly what we wanted. Notice that we get a number of these equalities by linearity. Nice. Similarly,
$$\phi\left( \psi\left( c\, x^i y^j \right) \right) = \phi\left( c \left( x^i \otimes y^j \right) \right) = c\, x^i y^j,$$

and so we see that this actually gives an isomorphism. Thus, we have that $R[x] \otimes_R R[y] \cong R[x,y]$.
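The round trip can also be checked mechanically. Below is a small sketch that stores a polynomial in two variables as a dict from exponent pairs to coefficients; the names `phi_simple` and `psi` are illustrative choices of mine.

```python
def phi_simple(c, i, j):
    # the forward map on a simple tensor: c(x^i (x) y^j) maps to c x^i y^j
    return {(i, j): c}

def psi(p):
    # the reverse map: each monomial c x^i y^j goes back to c(x^i (x) y^j),
    # recorded here as a triple (c, i, j)
    return [(c, i, j) for (i, j), c in p.items()]

# round trip on p = 5x^2 y^3 + 7xy
p = {(2, 3): 5, (1, 1): 7}
back = {}
for c, i, j in psi(p):
    back.update(phi_simple(c, i, j))
print(back == p)  # True
```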
This isomorphism should tell you a little something about the tensor product. We’ve taken two polynomial rings and "linked them up" by being able to exchange the ring coefficients back and forth across the tensor product.
This is a shorter example, but it’s a nice one. In the last example, I noted that being able to exchange ring elements back and forth allowed us to join two polynomial rings together. But what if our rings are not equal? Well, let’s see for a simple case how the tensor would work.
For example, let’s look at $\mathbb{Q} \otimes_{\mathbb{Z}} \mathbb{Q}$. Notice what the tensor here says: we can exchange back and forth values in $\mathbb{Z}$. I claim that this is isomorphic to $\mathbb{Q}$. Let’s make the isomorphism. Recall that the tensor product is finite sums of elements which look like

$$\frac{a}{b} \otimes \frac{c}{d},$$

but we’ll just work with a single one of these, since the rest will follow by linearity.
Define the map:

$$\phi\left( \frac{a}{b} \otimes \frac{c}{d} \right) = \frac{ac}{bd}.$$
Define another map:

$$\psi(q) = q \otimes 1$$

for $q \in \mathbb{Q}$. Check both of these are well defined, etc., etc.
There is a bit to be said about fractions being equivalent to other fractions and the like, but not much.
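Assuming the example here is the rationals tensored with themselves over the integers (as the remark about equivalent fractions suggests), Python’s `Fraction` type makes the two maps easy to sanity-check; the function names are mine.

```python
from fractions import Fraction

def phi(a, b):
    # phi(a (x) b) = ab: just multiply the two fractions
    return a * b

def psi(q):
    # psi(q) = q (x) 1, recorded here as a pair
    return (q, Fraction(1))

q = Fraction(3, 4)
print(phi(*psi(q)) == q)  # True

# equivalent fractions give the same answer, as they must
print(phi(Fraction(1, 2), Fraction(2, 3)) == phi(Fraction(2, 4), Fraction(2, 3)))  # True
```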
I’ll expand this section when I think of other cool examples related to this.
As I get more examples typed up (and it is a pain to type them up!) I’ll post them here. In particular, it’s interesting to think about what happens to a fixed module when you tensor it with various things.