
Posts posted by datahead8888

  1. Hello,

    Background:
    I was going to implement an implicit approach to 3D tetrahedral deformation (first in Matlab, then in C++).
    I need the Jacobian of force with respect to position to do this.
    I found this paper that gives a method for computing it:
    http://www-bcf.usc.edu/~jbarbic/Barb...nessMatrix.pdf

    The Question:
    In the report, he defines a 2skew function (on the right side of the page). I'm pretty confused about what it means. I think I forgot what skew means from linear algebra - that's probably part of my problem. As I recall, a matrix is skew-symmetric when its transpose equals its negation (A^T = -A). What exactly is this 2skew function for?
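    For reference, the standard skew (cross-product) operator maps a 3-vector v to the skew-symmetric matrix S(v) such that S(v)w = v x w. Whether Barbic's 2skew is exactly this operator (possibly scaled by 2) is something only the paper's own definition settles, but the basic operator can be sketched in a few lines of numpy to make the skew-symmetric property concrete:

```python
import numpy as np

def skew(v):
    """Standard skew-symmetric (cross-product) matrix of a 3-vector v,
    chosen so that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0,  -v[2],  v[1]],
                     [v[2],  0.0,  -v[0]],
                     [-v[1], v[0],  0.0]])

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])
S = skew(v)
print(np.allclose(S.T, -S))                # prints True: transpose equals negation
print(np.allclose(S @ w, np.cross(v, w)))  # prints True: S(v)w reproduces v x w
```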

  2. Suppose we have a function consisting of a series of matrices multiplied by a vector:
    f(X) = A * B * b
    --where X is a vector containing elements that also appear within A, B, and/or b,
    --A is a matrix, B is a matrix, and b is a vector

    Each matrix and the vector are expressed in terms of more variables, ie...
    X = (x1, x2, x3)

    A =
    [ x1 + y1   y4        y7      ]
    [ y2        x2 + y5   y8      ]
    [ y3        y6        x3 + y9 ]

    B =
    [ y1        x2 + y4   x3 + y7 ]
    [ x1 + y2   y5        y8      ]
    [ y3        y6        y9      ]

    b = [y1 y2 y3]' (' means transposed)

    Now we want to find the Jacobian of f - ie the partial derivative of f wrt X.

    One way to do this is to multiply the two matrices, then multiply the result by the vector, producing one 3x1 vector in which each element is an algebraic expression. The partial derivatives could then be computed per element to form a 3x3 Jacobian. This would be feasible in the example above, but the problem I'm actually working on is a lot more complicated (and I would also have to look for patterns afterwards in order to simplify the result).
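    Whichever way the Jacobian is derived, a central-difference approximation gives a cheap sanity check. A minimal numpy sketch of this for the example above, with the constants y1..y9 set to arbitrary test values (an assumption for illustration only):

```python
import numpy as np

# Arbitrary test values for the constants: y[0] = y1, ..., y[8] = y9
y = np.arange(1.0, 10.0)

def f(X):
    """f(X) = A(X) * B(X) * b, with A, B, b as defined in the post."""
    x1, x2, x3 = X
    A = np.array([[x1 + y[0], y[3],      y[6]],
                  [y[1],      x2 + y[4], y[7]],
                  [y[2],      y[5],      x3 + y[8]]])
    B = np.array([[y[0],      x2 + y[3], x3 + y[6]],
                  [x1 + y[1], y[4],      y[7]],
                  [y[2],      y[5],      y[8]]])
    b = y[:3]
    return A @ B @ b

def jacobian_fd(f, X, h=1e-6):
    """Central-difference Jacobian, one column per component of X."""
    X = np.asarray(X, dtype=float)
    J = np.zeros((3, 3))
    for k in range(3):
        dX = np.zeros(3)
        dX[k] = h
        J[:, k] = (f(X + dX) - f(X - dX)) / (2.0 * h)
    return J

X0 = np.array([0.5, -1.0, 2.0])
print(jacobian_fd(f, X0))
```

    Since each entry of A and B is linear in X, f is quadratic in X and the central difference is exact up to rounding, which makes this a reliable check against a hand-derived Jacobian.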

    I wanted to try using the chain rule and/or the product rule for partial derivatives if possible. However, with the product rule you end up with A' * B * b + A * B' * b + A * B * b' (here ' denotes the derivative wrt the vector X, not the transpose). I understand that the derivative of a matrix wrt a vector is actually a 3rd order tensor, which is not easy to deal with. Even if that is not the right way to view it, those terms still have to evaluate to matrices in order for matrix addition to be valid. If I use the chain rule instead, I still end up with the derivative of a matrix wrt a vector.

    Is there an easier way to break down a matrix calculus problem like this? I've scoured the web and cannot seem to find a good direction.
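    One observation: the third-order tensors from the product rule are quite manageable numerically. Stacking the matrices dA/dx_k into a rank-3 array and contracting with numpy's einsum evaluates the product rule directly. A sketch for the example above (same assumed test values y1..y9 as before; note b does not depend on X here, so its product-rule term vanishes):

```python
import numpy as np

y = np.arange(1.0, 10.0)          # test values: y[0] = y1, ..., y[8] = y9
X0 = np.array([0.5, -1.0, 2.0])   # arbitrary evaluation point
x1, x2, x3 = X0

A = np.array([[x1 + y[0], y[3],      y[6]],
              [y[1],      x2 + y[4], y[7]],
              [y[2],      y[5],      x3 + y[8]]])
B = np.array([[y[0],      x2 + y[3], x3 + y[6]],
              [x1 + y[1], y[4],      y[7]],
              [y[2],      y[5],      y[8]]])
b = y[:3]

# Rank-3 tensors dA[i, j, k] = d A_ij / d x_k (constant, since A is linear in X)
dA = np.zeros((3, 3, 3))
dA[0, 0, 0] = dA[1, 1, 1] = dA[2, 2, 2] = 1.0   # x1, x2, x3 on A's diagonal
dB = np.zeros((3, 3, 3))
dB[1, 0, 0] = 1.0                                # B_10 = x1 + y2
dB[0, 1, 1] = 1.0                                # B_01 = x2 + y4
dB[0, 2, 2] = 1.0                                # B_02 = x3 + y7
# b does not depend on X, so db/dX = 0 and the A * B * b' term drops out

# Product rule, contracted with einsum:
#   J_ik = sum_j dA_ijk (Bb)_j  +  sum_{j,m} A_ij dB_jmk b_m
J = np.einsum('ijk,j->ik', dA, B @ b) + np.einsum('ij,jmk,m->ik', A, dB, b)
print(J)
```

    The contraction pattern is the general shape of the answer: each product-rule term is a rank-3 tensor contracted down to a matrix, so the sum is an ordinary matrix sum. For sparse derivative tensors like these, writing out the contraction by hand often reveals the simplified closed form.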
