uniqueness theorem

Hey guys,

 

I'm a bit new to this forum and this is my first post (I used to just read what came up here). I hope it's not inappropriate to ask something right away, but I would like some advice. I'm studying applied mathematics, and at the moment I'm working on differential equations. I found something I couldn't find an answer to in my book, or anywhere else on the internet. In my textbook (Martin Braun, Differential Equations and Their Applications), I stumbled onto the following question:

 

Show that the solution y(t) of the given initial-value problem exists on the specified interval:

y' = y^2 + cos(t^2), y(0) = 0; on the interval 0 <= t <= 1/2.

The existence theorem on this subject tells me that I need a rectangle [t_0, t_0 + a] x [y_0 - b, y_0 + b] to be able to use the theorem. But that's my problem here: I can't construct a proper rectangle, because there's no bound |y(t)| <= b specified.

 

Now my question is: how do I apply the existence theorem to an initial-value problem when the specified interval has boundaries for t, but not for y? (If I use my own brain, I'd say just use |y| <= \infty, but I can't justify that.)

 

Could anyone point me in the right direction or give me a helpful answer? That would be great!

 

x tymo


 

Go read the existence theorem (Chapter 1 Theorem 2) again and see how to apply it. Hint: It is up to you to determine a suitable rectangle.

  • Author

Well, I did read it again, and I think I'm missing it. I looked at an example where they gave -\infty < y < \infty; there they used the fact that |f(t,y)| <= K. If I apply that to my problem:

 

Can I make the rectangle depend on y?

 

Then I could take b = y^2 and thus M = y^2 + 1.

But then still: the minimum of 1/2 and y^2/(y^2 + 1), and that is not a definite minimum? I mean, it's 1/2 for y > 1, but for y < 1, y^2/(y^2 + 1) is the minimum...

 

I think I'm missing a step, or I'm overcomplicating it?

  • Author

Thanks, I got it! I can choose b so that it fits the rest of the theorem, not the other way around. Thanks again!
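For anyone reading along, here is a sketch of how the resolved argument goes, using the notation from above (rectangle [t_0, t_0 + a] x [y_0 - b, y_0 + b], M = max |f| on the rectangle); the specific choice b = 1 is one way to make it work and Braun's text may set it up slightly differently:

```latex
Take the rectangle $R = \{\, 0 \le t \le \tfrac{1}{2},\ |y| \le b \,\}$, so $a = \tfrac{1}{2}$.
On $R$,
\[
  |f(t,y)| = |y^2 + \cos(t^2)| \le b^2 + 1 =: M .
\]
The existence theorem then guarantees a solution on $[0, \alpha]$ with
\[
  \alpha = \min\!\left(a, \frac{b}{M}\right)
         = \min\!\left(\frac{1}{2}, \frac{b}{b^2+1}\right).
\]
Choosing $b = 1$ gives $\frac{b}{b^2+1} = \frac{1}{2}$, so $\alpha = \frac{1}{2}$
and the solution exists on all of $0 \le t \le \frac{1}{2}$.
```

The point is that b is a free parameter: you pick it after the fact so that b/(b^2 + 1) is at least as large as the interval you want, rather than trying to read a bound on y off the problem statement.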
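As a sanity check (not part of the proof), one can also integrate the IVP numerically and confirm that the solution stays well inside the bound |y| <= 1 on [0, 1/2]. This is a minimal sketch with a hand-rolled Runge-Kutta stepper; the function names are my own, not from the textbook:

```python
import math

def f(t, y):
    # Right-hand side of the IVP: y' = y^2 + cos(t^2), y(0) = 0
    return y * y + math.cos(t * t)

def rk4(f, t0, y0, t_end, n_steps):
    """Classical 4th-order Runge-Kutta on [t0, t_end], returning all y values."""
    h = (t_end - t0) / n_steps
    t, y = t0, y0
    ys = [y0]
    for _ in range(n_steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
        ys.append(y)
    return ys

ys = rk4(f, 0.0, 0.0, 0.5, 1000)
print(max(abs(y) for y in ys))  # stays comfortably below the bound b = 1
```

Since y' > 0 on the whole interval (cos(t^2) >= cos(1/4) > 0 there), the maximum of |y| is just y(1/2), and it comes out well under 1, consistent with the rectangle chosen in the proof.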
