(SR) So, what's a Tensor, for Chrissake?

I will assume we all know what's meant by a vector space.

I guess I'll be the one to give a definition of a vector space. I'm going to paraphrase (and add some information) from a Dover book called "Linear Algebra", since I cannot remember every single property without forgetting at least two or three.

A vector space V (over a field R) is a set of objects called vectors together with two operations. The first is addition of vectors. For every u, v, w in V we have:

1) u + v is a uniquely defined element of V
2) u + (v + w) = (u + v) + w
3) u + v = v + u
4) There exists a vector $$0_{V}$$ such that $$u + 0_{V} = 0_{V} + u = u$$
5) For each u in V there exists vector $$-u$$ such that $$u + (-u) = (-u) + u = 0_{V}$$

In other words, we can say that $$(V, +)$$ is an abelian group.

There is also an operation between elements of R and elements of V, called scalar multiplication, such that for all a, b in R and for all u, v in V:

1) au is a vector in V
2) a(u + v) = au + av
3) (a + b)u = au + bu
4) a(bu) = (ab)u
5) 1u = u
6) 0u = $$0_{V}$$

I think that for the purpose of this thread, you can let R (the field in the above definition) be the set of real numbers under normal multiplication and addition of real numbers.
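The axioms above can be spot-checked numerically. Here is a minimal sketch (my own illustration, not part of the original definition) using $$\mathbb{R}^3$$ over the reals:

```python
import numpy as np

# Spot-check the vector space axioms for R^3 over the reals.
# (Illustrative values chosen arbitrarily; any vectors/scalars would do.)
u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 2.0])
w = np.array([4.0, -2.0, 1.0])
a, b = 2.5, -3.0
zero = np.zeros(3)  # the additive identity 0_V

assert np.allclose(u + (v + w), (u + v) + w)    # 2) associativity
assert np.allclose(u + v, v + u)                # 3) commutativity
assert np.allclose(u + zero, u)                 # 4) additive identity
assert np.allclose(u + (-u), zero)              # 5) additive inverses
assert np.allclose(a * (u + v), a * u + a * v)  # 2) distributes over vectors
assert np.allclose((a + b) * u, a * u + b * u)  # 3) distributes over scalars
assert np.allclose(a * (b * u), (a * b) * u)    # 4) compatibility
assert np.allclose(1 * u, u)                    # 5) unit scalar
assert np.allclose(0 * u, zero)                 # 6) zero scalar
```

Of course, a finite numerical check is not a proof; it just makes the axioms concrete.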
 
Yeah, thanks for that Absane. Maybe I should have gone through that, but I didn't want to front-load too heavily with definitions. But thanks, I hope others find it useful.
 
but I didn't want to front-load too heavily with definitions.

There are enough new definitions and ideas in a single class session to bog down even the brightest students. Why not confuse ourselves some more ;)
 
So you think my thread was not well-motivated? Maybe you're right. But, as there are members here who seem to feel that General Relativity is something they can have opinions about, and are using the word "tensor" in the wrong way, I thought, at the very least, I might show that there are some pretty heavy-duty concepts you need under your belt before you can even begin to do that.

Plus I hoped someone might learn something.

Plus I had fun doing it.

Plus, I find I have a problem myself; more on that later.
 

I too think that all the posts in this thread show great effort. Thank you. If this goes much further, might it qualify as a sticky thread?
 
QuarkHead: In post 12, is R the real numbers?

That is, the dual vector space maps the vector space to the real numbers---i.e. just a statement that you can define a good inner product?
 
Yeah, Absane, I must have had a humourectomy last night. Anyway, I guess I can draw this thread to a natural close by showing how tensor arithmetic works, in very approximate (but I think true enough) form.

First a couple of general remarks. Some of you may be wondering "when are we going to see the terms covariant and contravariant defined?" Well you won't from me; I had their usage thrashed out of me early on, and you might see why by looking at the Wiki article. Anyway, if I were to use them, I suspect it would be oppositely to most people.

So, I mentioned last night I had a slight problem. I think I've fixed it, but go cautiously, dear reader, until we get the thumbs up from a physicist.

Recall I defined the bilinear map $$g: V \times V \to \mathbb{R}, \;\; g(u, v) = \alpha \in \mathbb{R}$$. This is called the inner product on $$V \times V$$. This, by definition, gives metric information about pairs of elements in V, i.e. length and angle.

But the tensor space $$V^* \otimes V^*$$ is by definition the space of all bilinear real-valued maps on $$ V \times V$$, so $$g \in V^* \otimes V^*$$ must be a type (0, 2) tensor. Under certain restricted conditions (not requiring positive-definiteness, for one thing), I will call this the metric tensor. (In this simple form, I think this may only be true in Euclidean or pseudo-Euclidean space. Is this right? Please verify this, somebody!)
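To make the claim that g is a (0, 2) tensor concrete, here is a sketch (my own, with a Euclidean metric chosen for simplicity): once a basis is fixed, g is represented by a symmetric array of components $$g_{ij}$$, and $$g(u,v) = g_{ij}u^iv^j$$.

```python
import numpy as np

# The bilinear map g as a (0,2) tensor. In the Euclidean case the
# component matrix g_ij is just the identity (the Kronecker delta).
g = np.eye(3)
u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 0.0, 1.0])

# g(u, v) = g_ij u^i v^j  -- a real number, i.e. a (0,0) tensor.
alpha = np.einsum('ij,i,j->', g, u, v)
assert np.isclose(alpha, u @ v)  # with g = identity, this is the dot product

# Bilinearity check: g(a*u + w, v) = a*g(u, v) + g(w, v)
w = np.array([0.0, 1.0, -1.0])
a = 3.0
lhs = np.einsum('ij,i,j->', g, a * u + w, v)
rhs = a * alpha + np.einsum('ij,i,j->', g, w, v)
assert np.isclose(lhs, rhs)
```

A non-Euclidean example would simply use a different symmetric matrix for g (e.g. `np.diag([-1, 1, 1, 1])` for a Minkowski-style pseudo-metric, which is not positive-definite).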

So let's put this in tensor form. Let $$u =A^i,\;v =B^j$$ be type (1, 0) tensors, i.e. vectors, and write $$g(u,v)=g_{ij}A^iB^j = C = \alpha$$, since we now know that scalars are type (0, 0) tensors. Look closely:

We have a pair of lowered indices on g, and a matching pair, one each on A and B, which somehow disappear on the equality. We can make this a general rule, $$A_iB^i=C$$, and by extension $$A_{ij}B^i = C_j$$. This is called tensor contraction.

With the following rule for tensor multiplication - multiplying a rank $$r$$ tensor by a rank $$s$$ tensor yields a rank $$r+s$$ tensor - I find I can write the above as $$A_iB^i =C^i_i = C$$. We see that tensor contraction reduces the rank of the tensor by 2.
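These contraction rules can be demonstrated with `np.einsum`, which sums over any repeated index. A sketch (my own illustration; NumPy does not distinguish raised from lowered indices, so the metric is implicit here):

```python
import numpy as np

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])

# Multiplication rule: rank 1 times rank 1 gives rank 2, C^i_j = A_j B^i.
C = np.einsum('j,i->ij', A, B)
assert C.shape == (3, 3)

# Contraction of the matching index pair: C^i_i = A_i B^i, rank drops by 2
# from 2 down to 0, leaving a scalar.
scalar = np.einsum('ii->', C)
assert np.isclose(scalar, np.einsum('i,i->', A, B))  # same as direct A_i B^i

# The mixed case A_ij B^i = C_j: rank 2 + 1 = 3, contracted down to rank 1.
Aij = np.arange(9.0).reshape(3, 3)
Cj = np.einsum('ij,i->j', Aij, B)
assert Cj.shape == (3,)
```

The index letters in the `einsum` subscript strings play exactly the role of the i, j indices in the tensor expressions above.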

All we need to know now is that for multiplication, $$r, \;s$$ need not be equal, but for addition $$A^i+B^i = C^i$$ they must be.

I think that's about it, folks! If anyone wants to talk about transformations, we can, but as this is about what a tensor is, I suggest a separate thread for that.
 
QuarkHead: In post 12, is R the real numbers?
That is, the dual vector space maps the vector space to the real numbers---i.e. just a statement that you can define a good inner product?
Sorry, Ben, missed this. Yes, since I changed my field from the generic F to R, R is the real field.

I had been going to cover that, but decided in the end it wasn't needed for my rather superficial development. But anyway, you are right. It goes like this:

To any vector space V over R, I can associate a vector space V* of real-valued linear functionals such that, for each $$u \in V$$, there is some $$\varphi_u$$. In the case that V is an inner product space (not all are), for each $$\varphi_u$$ I can always find some $$v$$ such that $$\varphi_u(v) = g(v,u) = g(u,v)$$. The second equality holds in that form only if V is over R. If the base field is $$\mathbb{C}$$ this becomes $$g(v,u) = \overline{g(u,v)}$$, i.e. the complex conjugate.
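This association of a functional $$\varphi_u$$ to each vector u can be sketched directly in code (the names `phi` and the Euclidean choice of g are mine, purely for illustration):

```python
import numpy as np

g = np.eye(3)  # a Euclidean metric, chosen for concreteness

def phi(u):
    """The covector (linear functional) associated to u by the metric:
    phi_u(v) = g(u, v)."""
    return lambda v: u @ g @ v

u = np.array([1.0, -2.0, 0.5])
v = np.array([3.0, 1.0, 2.0])
w = np.array([0.0, 4.0, -1.0])

# With the Euclidean g, phi_u(v) is just the dot product g(u, v).
assert np.isclose(phi(u)(v), u @ v)

# phi_u is linear in its argument v (over R, no conjugation needed):
assert np.isclose(phi(u)(2 * v + w), 2 * phi(u)(v) + phi(u)(w))
```

The map `u -> phi(u)` is exactly the V -> V* correspondence described above; over the reals it is linear in u as well.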

EDIT: in the above, something doesn't look quite right. I seem to be abusing notation: I defined g to be bilinear, whereas $$\varphi_u(v)$$ is only linear in v. Let me ponder on that.
I am aware that those who use bra-ket notation adopt a slightly different convention.
 
I am aware that those who use bra-ket notation adopt a slightly different convention.

That's certainly the notation that I grew up with :)

Now that we have all of this beautiful machinery, can we take a specific example of all of this? For example, using something that would tie this to quantum mechanics, i.e. position and momentum spaces?
 
Oh, you will sometimes see these called 1-forms; I should avoid the terminology if I were you, as some writers call 1-forms covectors, as we just did, while others call 1-forms a field of covectors. This is just one of a few areas in this subject where the terminology is highly muddled.
QH
I have enjoyed this thread enormously, but I have to say you floored me with your last sentence here. Tensors have been around for a long, long time, and mathematics is all about precision and formalism; if there is anything on this earth that is precise, it is mathematics.

So how can there be confusion or muddled understandings about terminology ?
Is this a new extension of tensor theory that has not yet settled, brought about by say ... recent experimental results in particle physics or something like that ?
 
EDIT: in the above, something doesn't look quite right. I seem to be abusing notation: I defined g to be bilinear, whereas $$\varphi_u(v)$$ is only linear in v. Let me ponder on that.
Well OK, it is a slight abuse, due to the way in which I first presented the inner product. It's pretty innocuous, so I hope you'll forgive it. But if not, there is a very straightforward way to shimmy around it, which I can show if really required.
 
That's certainly the notation that I grew up with
Well, we could get into a discussion on that, if you like. Suffice it to say I don't like it (I have my reasons!), but it's a matter of taste I guess.

Now that we have all of this beautiful machinery, can we take a specific example of all of this? For example, using something that would tie this to quantum mechanics, i.e. position and momentum spaces?
Not from me, mate, I don't really "do" applications (see my opening disclaimer). I might be able to give a very, very inexpert guide to tangent tensor spaces on manifolds (relevant to GR!), but that would be all. Sorry.
 
So how can there be confusion or muddled understandings about terminology ?
Is this a new extension of tensor theory that has not yet settled, brought about by say ... recent experimental results in particle physics or something like that ?
As per my post to Ben - I know almost nothing about "experimental results in physics" and the like.

But let me give a few examples of what I called confusion: as I said, physicists sometimes equate 1-forms with covectors, and sometimes with covector fields. And in general, they often use the term "tensor" to refer to what I might call (were I clever enough) a tensor field.

Moreover, the usage of the terms "covariant" and "contravariant" is a muddle, as I hinted. This discussion gives some insights.

I guess it all boils down to this: are tensors of more intrinsic interest to mathematicians than they are useful to engineers and physicists? I suspect that real mathematicians would regard them as a bit "clunky" and opt for something like differential forms? Who knows?
 
Certainly the trend in physics is to use differential forms. The whole point of general relativity is to write things in arbitrary coordinates, and differential forms are tailored for this purpose.

I have to go to work now, but I will try to add something about physical applications vis a vis quantum mechanics this evening or tomorrow or something.

Good work though---you know they say physicists like to pretend to be mathematicians when nobody is looking :)
 
Good work though---you know they say physicists like to pretend to be mathematicians when nobody is looking :)


My high school physics teacher was like that. He's a brilliant man working towards his PhD in physics in his mid-to-late 40s... just to have it. But when he would explain things to us, it would get on my nerves when he would butcher mathematics for the sake of deriving an equation.

:mad:
 
The Grassman ring is a good way to collate these concepts. Perhaps the thread maker will say more.
 
I take it you mean me? Well I can't! What's a "Grassman ring"? If you know, do please say what it is, and how it is helpful here.

Cheers.
 
Great thread. I think to actually get people to realise that tensors are useful things, it's worth giving a specific, non-trivial example. Something like the Riemann tensor perhaps? I.e given a connection $$\nabla$$ on $$\mathcal{M}$$, define the Riemann tensor via:

$$ R(X,Y,Z) \equiv \nabla_X \nabla_Y Z - \nabla_Y \nabla_X Z- \nabla_{[X,Y]}Z $$

for vector fields X, Y, Z. I remember this example explicitly because it suddenly made me realise why tensors are (a) non-trivial and (b) awesome for something such as GR.
 