
phaedo

Members

  • Posts: 10
  • Joined
  • Last visited

Profile Information

  • Favorite Area of Science: Quant Finance
  • Occupation: Researcher

phaedo's Achievements

Quark (2/13)

Reputation: 1

  1. If [math]\| a \cdot x \| = |a| \, \| x \|[/math], then as [math]|a|[/math] goes to infinity the norm of [math]a \cdot x[/math] goes to infinity as well... Thus a norm can't be bounded?
  2. Thanks for pointing groupoids out. It would be nice to find a generalization of addition for any [x, y] + [z, u], but it's not required. One difficulty is that the norm I am considering here is bounded... Not sure if we can make sense of this.
  3. Hello, my knowledge of abstract algebra beyond linear vector spaces is very limited. My problem is inspired by the (not very well-known) triangle inequality between angles of a tetrahedron (see for example http://convexoptimization.com/wikimization/index.php/Fifth_Property_of_the_Euclidean_Metric): [latex] \left| \widehat{x,y}-\widehat{y,z} \right| \leq \widehat{x,z}\leq\widehat{x,y}+\widehat{y,z} [/latex] where x, y, z are three vectors and [latex]\widehat{x,y}[/latex] is the angle formed by the vectors x, y. I am looking for a way to abstract this into some algebra of vertices. Say a vertex is the tuple [x,y]; we would need to define some "addition" operator in a transitive way so that [x,y] + [y,z] = [x,z], and we would define a norm as [latex]\| {[x,y]} \| = \widehat{x,y}[/latex]. The triangle inequality above would then read in the familiar way: [latex] \left| \| [x,y] \| - \| [y,z] \| \right| \leq \| [x,y] + [y,z] \| \leq \| [x,y] \| + \| [y,z] \| [/latex] Does this look familiar to anyone? Thanks in advance for any pointers. p. (A numerical check of the angle inequality appears after this list.)
  4. Does anyone know whether parabolic PDEs of the form [math] u_t(x,t) = -a \cdot x(1-x)^2 u_x(x,t) + \frac {1} {2} b \cdot x^2(1-x)^2 u_{xx}(x,t) [/math] (a, b constants, t > 0, 0 < x < 1) have a closed-form fundamental solution? (A finite-difference sketch for experimenting with this equation follows the list below.)
  5. I think I could relax the problem a little and look for e.g. a vector v such that [math]v^T v = 1,\ v^T A v = \operatorname{tr}(A)[/math]. I am aware of some of the work done in quant finance on optimization problems; not sure this is what I'm looking for, though... Thanks for your very pertinent remarks. p.
  6. I'm sorry if I wasn't clear. I am well aware that R(x) has tight bounds [math]\lambda_{\min} \leq R(x) \leq \lambda_{\max}[/math] over all x, but here x is fixed, and the lower bound [math] (x^T A v)^2/x^T x[/math] I'm interested in depends on x, so it may well be better than [math]\lambda_{\min}[/math] in some cases. I am trying to maximize [math]\textstyle \left( \sum_i \lambda_i x^T v_i \right)^2[/math] over all [math]2^n[/math] possible choices of eigenbases [math](v_i)[/math], not each [math]\lambda_i |x^T v_i|[/math] as you suggest. I'd like to find the optimal "all-ones" vector v* without having to solve for each [math]v_i[/math]. (A brute-force reference for the sign choice appears after this list.)
  7. @DrRocket: my v isn't a free vector; it is the all-ones vector in one of the eigenbases...
  8. Thanks a lot for your interest in my problem. Yes, A is pos-def. The v I was mentioning in [math](x^T A v)^2[/math] was my "all-ones vector" in any of the [math]2^n[/math] normalized eigenbases, so the Schwarz inequality won't give the answer. The idea is to get a better lower bound than [math]\lambda_{\min}[/math] for the Rayleigh ratio [math] R(x) := \frac{x^T A x}{x^T x} [/math] for fixed x; the whole thing is part of a larger problem in quant finance that would be too long to expose here. It can be shown that [math]R(x) \geq \frac{(x^T A v)^2}{x^T x}[/math], and I'd like to find "this" v without doing the whole spectral decomposition thing... p
  9. Right, I understand your point... The analogy I have in mind is the canonical "all-ones" vector e = (1, 1, ..., 1) in [math]\mathbb{R}^n[/math], which is the sum of the canonical basis [math](e_i)[/math]; somehow I want to find the corresponding "all-ones" vector v whose coordinates are (1, 1, ..., 1) in "the" (normalized) eigenbasis. But you are reminding me there are still [math]2^n[/math] of them. What I really want is to maximize the value of [math](x^T A v)^2[/math] for fixed x over all possible such v's... I guess this becomes some sort of discrete optimization problem; I'll keep thinking about it. Thanks p No, I do mean "the" sum of eigenvectors, but there may be some connection with the trace...
  10. Consider a matrix [math]A \in \mathbb{R}^{n \times n}[/math] with eigenvectors [math]v_i[/math] and eigenvalues [math]\lambda_i[/math]. Does anyone know of an efficient method to solve for the vector [math]v := v_1 + \cdots + v_n[/math] in one go (rather than doing the whole spectral decomposition)? (I am especially interested in the case where A is real symmetric.) Thanks p (A reference implementation via the full decomposition appears after this list.)
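
The code sketches referenced above follow. First, re the angle inequality in item 3: a quick numerical sanity check that the two-sided bound holds for random vectors. The language (Python with numpy) and all parameter choices here are mine, not from the original posts.

[code]
import numpy as np

def angle(u, v):
    """Angle between two vectors, in [0, pi]."""
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(c, -1.0, 1.0))

rng = np.random.default_rng(0)
for _ in range(10000):
    x, y, z = rng.standard_normal((3, 5))  # three random vectors in R^5
    axy, ayz, axz = angle(x, y), angle(y, z), angle(x, z)
    # |angle(x,y) - angle(y,z)| <= angle(x,z) <= angle(x,y) + angle(y,z)
    assert abs(axy - ayz) <= axz + 1e-9
    assert axz <= axy + ayz + 1e-9
[/code]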
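
For the PDE in item 4: this is not the closed-form fundamental solution asked about, just a minimal explicit finite-difference sketch for experimenting with the equation numerically. The constants, grid, initial condition, and boundary treatment are all illustrative assumptions; the equation degenerates at x = 0 and x = 1, so the boundary handling in particular is only one plausible choice.

[code]
import numpy as np

a, b = 1.0, 0.5              # example constants, chosen arbitrarily
nx = 101
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
dt = 1e-5                    # explicit scheme: dt must be O(dx^2)
nt = 20000

drift = -a * x * (1 - x) ** 2                # coefficient of u_x
diffusion = 0.5 * b * x ** 2 * (1 - x) ** 2  # coefficient of u_xx

u = np.exp(-200.0 * (x - 0.5) ** 2)          # example initial condition
for _ in range(nt):
    ux = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)        # central u_x
    uxx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2  # central u_xx
    u = u + dt * (drift * ux + diffusion * uxx)
    # np.roll wraps around, so overwrite the two boundary points with a
    # crude no-flux treatment; only one plausible choice given the
    # degeneracy at the endpoints.
    u[0], u[-1] = u[1], u[-2]
[/code]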
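
Re the sign-choice maximization in items 6-9: a brute-force reference, assuming A is symmetric positive-definite. It performs exactly the full eigendecomposition the thread hopes to avoid, so it is only a correctness baseline, and the function name is hypothetical. The point it illustrates: for [math]\lambda_i > 0[/math], maximizing [math]\textstyle \left( \sum_i \lambda_i s_i \, x^T v_i \right)^2[/math] over [math]s_i \in \{\pm 1\}[/math] is achieved by [math]s_i = \operatorname{sign}(x^T v_i)[/math], so the [math]2^n[/math] eigenbases never need to be enumerated once the decomposition is known.

[code]
import numpy as np

def best_all_ones_vector(A, x):
    """Among the 2^n 'all-ones' vectors v = sum_i s_i v_i (s_i = +-1,
    v_i an orthonormal eigenbasis of symmetric pos-def A), return the
    one maximizing (x^T A v)^2, together with that maximum."""
    lam, V = np.linalg.eigh(A)  # eigenvalues lam_i, orthonormal eigenvectors
    t = V.T @ x                 # coordinates t_i = x^T v_i
    s = np.sign(t)              # optimal signs when all lam_i > 0
    s[s == 0] = 1.0             # arbitrary tie-break where t_i = 0
    v = V @ s                   # v = sum_i s_i v_i
    # with these signs, x^T A v = sum_i lam_i * |t_i|
    return v, (x @ A @ v) ** 2
[/code]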
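
Finally, for the original question in item 10: the straightforward baseline, which of course does the full decomposition the question hopes to avoid. Since each eigenvector is defined only up to sign (hence the [math]2^n[/math] candidates discussed above), "the" sum depends on the sign convention the solver returns.

[code]
import numpy as np

def sum_of_eigenvectors(A):
    """v = v_1 + ... + v_n for real symmetric A, with numpy's sign
    convention; equivalently the image of the all-ones vector expressed
    in the eigenbasis."""
    _, V = np.linalg.eigh(A)      # columns are orthonormal eigenvectors
    return V @ np.ones(V.shape[1])

# Example usage on a small symmetric matrix:
v = sum_of_eigenvectors(np.array([[2.0, 1.0], [1.0, 2.0]]))
[/code]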