
If anyone could explain how the following is done, it would be greatly appreciated!

 

Consider the set L of all linear transformations. Let L_1: V -> W and L_2: V -> W be linear transformations. Define a vector addition on linear transformations as (L_1 + L_2): V -> W, where (L_1 + L_2)(v) = L_1(v) + L_2(v). Define also a scalar multiplication on linear transformations as (c * L_1): V -> W, where (c * L_1)(v) = cL_1(v). Using these operations, we may consider L a vector space.
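To make the pointwise definitions concrete, here is a minimal Python sketch (the helper names `add_maps` and `scale_map` are made up for illustration, not part of the question) treating linear maps R^2 -> R^3 as plain functions on tuples:

```python
# Sketch: linear maps R^2 -> R^3 represented as plain Python functions
# on tuples. Helper names (add_maps, scale_map) are illustrative only.

def add_maps(L1, L2):
    """Pointwise sum: (L1 + L2)(v) = L1(v) + L2(v)."""
    return lambda v: tuple(a + b for a, b in zip(L1(v), L2(v)))

def scale_map(c, L):
    """Pointwise scaling: (c * L)(v) = c * L(v)."""
    return lambda v: tuple(c * a for a in L(v))

# Two sample linear maps R^2 -> R^3.
L1 = lambda v: (v[0], v[1], v[0] + v[1])
L2 = lambda v: (2 * v[0], -v[1], 0)

v = (3, 4)
assert add_maps(L1, L2)(v) == (9, 0, 7)    # (3+6, 4-4, 7+0)
assert scale_map(2, L1)(v) == (6, 8, 14)   # 2 * (3, 4, 7)
```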

 

Here is the question:

Show that for every L_1 ∈ L, there exists some L_2 ∈ L such that L_1 + L_2 + 0_L.


- We have a TeX implementation in here that you can use. Usage is [ math] <tex code here> [ /math] without the spaces in the brackets.

- I assume you meant [math] L_1 + L_2 = 0 [/math].

- A natural start is to convince yourself what 0 is/looks like.

- You might (I did) need one of the conditions for linear transformations, namely L(ax) = aL(x).


  • 4 months later...

I'm going to assume that the last condition is L_1 + L_2 = 0_L, where 0_L is the "0" transformation that maps every vector in V to the 0 vector in W.

 

Take a look at L_2(v)= -(L_1(v)).
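One way to spell that hint out (a sketch, using the condition L(ax) = aL(x) mentioned above): given [math]L_1[/math], define [math]L_2 = (-1) \cdot L_1[/math], which lies in L because it is a scalar multiple of a linear transformation. Then for every [math]v \in V[/math],

[math](L_1 + L_2)(v) = L_1(v) + (-1)L_1(v) = L_1(v) - L_1(v) = 0_W[/math],

so [math]L_1 + L_2[/math] agrees with the zero transformation on all of V, i.e. [math]L_1 + L_2 = 0_L[/math].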


I'm going to assume that the last condition is L_1 + L_2 = 0_L, where 0_L is the "0" transformation that maps every vector in V to the 0 vector in W.

 

I wonder if there's not a slight slip o' the tongue here. For: let L be the set of all linear transformations V -> W. Then L will be a vector space (we know that it is) if, among other things, for some [math]L_m \in L[/math] there is some [math]L_n \in L[/math] s.t. [math]L_m + L_n = 0_L,\ m \neq n[/math]; one says that [math]0_L[/math] is the identity on L, i.e. the identity operator.

 

But the identity operator [math]0_L[/math] sends each [math]v \in V[/math] to itself, and not to the zero vector in W, surely? Am I mad?


But the identity operator [math]0_L[/math] sends each [math]v \in V[/math] to itself, and not to the zero vector in W, surely? Am I mad?

 

Not too sure about that: [imath]0_L \in L[/imath] necessarily, which means [imath]0_L : V \to W[/imath]. It is quite possible that [imath]V \neq W[/imath] - indeed it is entirely possible that [imath]V \cap W = \emptyset[/imath] - so [imath]0_L[/imath] can't be the identity function.

 

I'm a little confused as to the original poster's question. If you're taking it as given that L is a vector space, then this is a trivial question because it's one of the axioms (additive inverse). However, if you're actually trying to prove that L is a vector space, then do as HallsofIvy suggests.


I never really understood the need to demonstrate the other axioms in order to show that an object is a vector space. If you just show that it is closed under vector addition and scalar multiplication, then the existence of additive inverses soon follows, as does the existence of a zero vector, which you could simply define as a - a.

 

The only other axiom that I see a need for is that every vector is unique; otherwise the rest follows from the closure.


Not too sure about that: [imath]0_L \in L[/imath] necessarily, which means [imath]0_L : V \to W[/imath]. It is quite possible that [imath]V \neq W[/imath] - indeed it is entirely possible that [imath]V \cap W = \emptyset[/imath] - so [imath]0_L[/imath] can't be the identity function.
Now I am confused.

 

Every vector space admits of an identity, right? Let [math]L(V, W) [/math] be the space of all linear maps [math]V \rightarrow W[/math], with [math]I_L[/math] defined by

 

for all [math]L_i \in L(V,W),\ I_L + L_i = L_i + I_L = L_i[/math]. This is our identity on L(V,W), right?

 

Suppose [math]V \cap W = \emptyset[/math]. But [math]I_L \in L(V,W)[/math], so as you say, for all [math]L_i \in L(V,W),\ L_i: V \rightarrow W[/math], including [math]I_L[/math]. So what is the action of [math]I_L[/math] on [math]V[/math]? I haven't a clue; does anyone? Am I being dumb here?

 

(Tactful answers only, please!)


for all [math]L_i \in L(V,W),\ I_L + L_i = L_i + I_L = L_i[/math]. This is our identity on L(V,W), right?

 

Yes, that's the additive identity. Generally, I write [imath]0_L[/imath] to avoid confusion between that and the multiplicative identity (which lies in the field over the vector space). In pretty much all the linear algebra I've done, it's called the zero vector, zero element, etc.

 

Simply put, the identity function is [imath]0_L(v) = 0_W \ \forall v \in V[/imath]. This is easily checked. Fix [imath]v \in V, T \in L[/imath]. Then, clearly, [imath]T(v) \in W[/imath]. Hence, [imath]T(v) + 0_L(v) = T(v) + 0_W = T(v)[/imath] since W is a vector space.
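The same check can be run numerically; here's a small Python sketch (illustrative names only, not from the thread), with maps on R^2 represented as functions on pairs:

```python
# Sketch: checking pointwise that the zero map acts as the additive
# identity on linear maps R^2 -> R^2 (names here are illustrative).

def add_maps(L1, L2):
    """Pointwise sum of two maps: (L1 + L2)(v) = L1(v) + L2(v)."""
    return lambda v: tuple(a + b for a, b in zip(L1(v), L2(v)))

zero_map = lambda v: (0, 0)            # 0_L: sends every v to 0_W
T = lambda v: (3 * v[0] + v[1], -v[1]) # an arbitrary linear map

# T + 0_L agrees with T on every sample vector.
for v in [(1, 0), (0, 1), (2, -5)]:
    assert add_maps(T, zero_map)(v) == T(v)
```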

 

Let me try and clarify my previous post a bit. Here's what you posted earlier:

 

But the identity operator [math]0_L[/math] sends each [math]v \in V[/math] to itself, and not to the zero vector in W, surely? Am I mad?

 

What I was trying to say, in a rather convoluted fashion, is that there's absolutely no guarantee that you can do this since you know nothing about V and W. If the two vector spaces are disjoint (i.e. [imath]V \cap W = \emptyset[/imath]) then there's no possible way that the function [imath]0_L[/imath], which maps elements of V into W, could map a vector [imath]v \in V[/imath] to itself.

 

Does this clear it up a bit?


Yes, thank you (I had more or less come to that conclusion myself), but for one small point.

 

The identity operator [math]I[/math] satisfies [math]I(v) = v [/math] for all [math]v \in V[/math]. Confusingly, this is not the identity on the vector space of all linear maps [math]V \rightarrow W[/math]. Call this the vector identity.

 

This is, as you rightly showed, and as HoI originally claimed, the zero operator, [math]0_{L(V,W)}[/math], the zero vector in L(V,W), which satisfies [math]0_{L(V,W)}(v) = 0_W[/math] for all [math]v \in V[/math].

 

My slight niggle is that you did call this the identity function, which it isn't.

 

(This result is no more than we should expect, as the vector space operation is addition)

 

Apologies to both of you.


I wonder if there's not a slight slip o' the tongue here. For: let L be the set of all linear transformations V -> W. Then L will be a vector space (we know that it is) if, among other things, for some [math]L_m \in L[/math] there is some [math]L_n \in L[/math] s.t. [math]L_m + L_n = 0_L,\ m \neq n[/math]; one says that [math]0_L[/math] is the identity on L, i.e. the identity operator.

 

But the identity operator [math]0_L[/math] sends each [math]v \in V[/math] to itself, and not to the zero vector in W, surely? Am I mad?

Apparently you are mad! If V and W are not the SAME vector space, then there is NO "identity operator".

 

Yes, the set of linear transformations from V -> W is a dim(V)·dim(W)-dimensional vector space with addition of vectors defined by (L_1 + L_2)(v) = L_1(v) + L_2(v). In particular, the 0 transformation must satisfy (L_1 + 0_L)(v) = L_1(v) + 0_L(v) = L_1(v). In other words, the 0 transformation must be the transformation that takes every vector in V into the 0 vector in W.
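In matrix terms, that dimension count looks like this; a Python sketch (helper names are made up, and it assumes bases have been fixed so a map R^n -> R^m becomes an m×n matrix):

```python
# Sketch: with bases fixed, a linear map R^n -> R^m is an m x n matrix,
# so the space of maps has dimension m*n, and 0_L is the zero matrix.

def mat_vec(A, v):
    """Apply matrix A (tuple of rows) to vector v."""
    return tuple(sum(a * x for a, x in zip(row, v)) for row in A)

def mat_add(A, B):
    """Entrywise sum of two matrices = pointwise sum of the maps."""
    return tuple(tuple(a + b for a, b in zip(ra, rb)) for ra, rb in zip(A, B))

A = ((1, 2), (0, -1), (3, 0))                  # a map R^2 -> R^3
neg_A = tuple(tuple(-a for a in row) for row in A)
zero = mat_add(A, neg_A)                       # the 0 transformation's matrix

v = (4, 5)
assert zero == ((0, 0), (0, 0), (0, 0))
assert mat_vec(zero, v) == (0, 0, 0)           # 0_L sends every v to 0_W
```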

 

Of course, if we look at linear transformations V -> V, then we can form a RING in which the "multiplication" is composition of transformations. (We cannot do that with V -> W when V and W are different, because then L_1(L_2(v)) would be meaningless: L_2(v) is in W, not V, so we cannot apply L_1 to it.)

In THAT case the "multiplicative" identity is the identity transformation.
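A quick sketch of that composition structure (illustrative Python, with maps V = R^2 -> R^2 as functions on pairs; the names are made up):

```python
# Sketch: for maps V -> V, composition gives a "multiplication" whose
# identity element is the identity transformation (R^2 example).

def compose(L1, L2):
    """The product in the ring: (L1 * L2)(v) = L1(L2(v))."""
    return lambda v: L1(L2(v))

identity = lambda v: v                  # the multiplicative identity
T = lambda v: (v[0] + v[1], 2 * v[1])   # an arbitrary map R^2 -> R^2

# T composed with the identity (on either side) is just T.
v = (3, 4)
assert compose(T, identity)(v) == T(v)
assert compose(identity, T)(v) == T(v)
```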


Now that's unkind. You chose to ignore my Pauline conversion, and my apology to you and Dave. Ah well, maybe I deserved it.

 

Hey, YOU were the one who asked "Am I mad?" Logically I could only agree with your first comment (which I did not) or with your second!

 

 

I think I was typing that WHILE you were typing your response to Dave. Though I do like the "Pauline Conversion". I wondered for a moment who "Pauline" was!

