uniqueness theorem


tymo


Hey guys,

 

I'm a bit new to this forum and this is my first post (I used to just read the threads that came up here). I hope it's not inappropriate to ask something right away, but I'm going to do so now and ask for some advice. I'm studying applied mathematics, and at the moment I'm working on differential equations. I've run into something I couldn't find an answer to in my book or anywhere else on the internet. In my textbook (Martin Braun, Differential Equations and Their Applications), I stumbled onto the following question:

 

Show that the solution y(t) of the given initial-value problem exists on the specified interval:

y' = y^2 + cos(t^2), y(0) = 0; on the interval 0 <= t <= 1/2.

The existence theorem on this subject tells me that I need a rectangle [t_0, t_0 + a] x [y_0 - b, y_0 + b] to be able to use the theorem. But that's my problem here: I can't construct a proper rectangle, because no bound |y(t)| <= b is specified.
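
For reference, here is the statement I'm working from (paraphrasing Theorem 2 of Chapter 1 in Braun, so the exact wording may differ): if f and \partial f/\partial y are continuous on the rectangle

\[ R:\quad t_0 \le t \le t_0 + a, \quad |y - y_0| \le b, \]

and M = \max_{(t,y) \in R} |f(t,y)|, then the initial-value problem y' = f(t,y), y(t_0) = y_0 has a unique solution at least on

\[ t_0 \le t \le t_0 + \alpha, \qquad \alpha = \min\!\left(a, \frac{b}{M}\right). \]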

 

Now my question is: how do I apply the existence theorem to an initial-value problem when the specified interval has boundaries for t but not for y? (If I just use my own brain, I'd say take |y| <= \infty, but I can't justify that.)

 

Could anyone point me in the right direction or give me a helpful answer? That would be great!

 

x tymo



Go read the existence theorem (Chapter 1, Theorem 2) again and see how to apply it. Hint: it is up to you to determine a suitable rectangle.


Well, I did read it again, and I think I'm still missing it. I looked at an example where they gave -\infty < y < \infty; there they used the fact that |f(x,y)| <= K. If I apply that to my problem:

 

Can I make the rectangle dependent on y, then?

 

Then I could take b = y^2 and thus M = y^2 + 1.

But then I still get the minimum of 1/2 and y^2/(y^2 + 1), and that is not a definite minimum. I mean, it's 1/2 for y > 1, but for y < 1 the minimum is y^2/(y^2 + 1)...

 

I think I'm missing a step, or maybe I'm overcomplicating it?
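
Sketching it out under the assumption that b has to be a fixed constant chosen in advance (not a function of y): with y_0 = 0 and |y| <= b,

\[ |f(t,y)| = |y^2 + \cos(t^2)| \le b^2 + 1 = M, \qquad \alpha = \min\!\left(\frac{1}{2}, \frac{b}{b^2 + 1}\right). \]

The fraction b/(b^2 + 1) takes its maximum value 1/2 at b = 1, so the choice b = 1, M = 2 gives \alpha = \min(1/2, 1/2) = 1/2, which covers the whole interval 0 <= t <= 1/2. I haven't checked this against the book, though, so treat it as a sketch.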

