
akerman

Members
  • Posts: 2
  • Joined
  • Last visited

Profile Information

  • Favorite Area of Science: computer science

akerman's Achievements

Lepton (1/13)

Reputation: 0

  1. Hello, I am really struggling with some basic things related to regression trees. Mainly, I do not understand how I am supposed to calculate the squared error and form the recursive partition. Here is the set-up: I have a dataset $\{((1,1),9),((1,2),-4),((1,3),2),((2,2),4),((2,3),2)\}$ and I need to find the regression tree corresponding to the greedy recursive partitioning procedure with respect to squared error, and then I would like to draw the recursive partition. I literally have no idea how to even start, even though it does not seem too difficult. Could someone help me out and possibly post a detailed or partial solution? Anything will be helpful. (A sketch of the greedy procedure is included after these posts.)
  2. I am preparing for a maths exam and I am really struggling with kernels. I have the following six kernels and I need to prove that each of them is valid and derive a feature map.
     1) K(x,y) = g(x)g(y), with g: R^d -> R. I know this one is valid but I don't know how to prove it. Also, is g(x) a correct feature map?
     2) K(x,y) = x^T D y, where D is a diagonal matrix with non-negative entries. I am also sure that this one is valid, but I have no idea how to prove it or derive a feature map.
     For the following four I don't know anything:
     3) K(x,y) = x^T y - (x^T y)^2
     4) K(x,y) = $\prod_{i=1}^{d} x_i y_i$
     5) K(x,x') = cos(angle(x,x'))
     6) K(x,x') = min(x,x'), with x,x' >= 0
     Please help me, as I am really struggling with kernel methods, and if you could, please provide as much explanation as possible. (A sketch for the first two kernels is included below.)
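Regarding post 1: below is a minimal Python sketch of greedy recursive partitioning with respect to squared error on the dataset from that post. It is not a reference solution; the assumptions are mine: splits are axis-aligned of the form x_j <= t, leaves are grown until their squared error is zero, ties between equally good splits are broken by whichever is found first, and the helper names (sse, best_split, build) are introduced here for illustration.

# A minimal sketch of greedy recursive partitioning for a regression tree,
# minimising squared error. Assumptions (mine, not from the exercise text):
# axis-aligned splits of the form x_j <= t, leaves grown until their squared
# error is zero, ties broken by whichever split is found first.

def sse(ys):
    """Sum of squared errors of the targets around their mean."""
    mean = sum(ys) / len(ys)
    return sum((y - mean) ** 2 for y in ys)

def best_split(data):
    """Return (score, feature, threshold, left, right) for the split x_j <= t
    that minimises the summed squared error of the two children, or None."""
    best = None
    n_features = len(data[0][0])
    for j in range(n_features):
        for t in sorted({x[j] for x, _ in data}):
            left = [(x, y) for x, y in data if x[j] <= t]
            right = [(x, y) for x, y in data if x[j] > t]
            if not left or not right:
                continue
            score = sse([y for _, y in left]) + sse([y for _, y in right])
            if best is None or score < best[0]:
                best = (score, j, t, left, right)
    return best

def build(data, depth=0):
    """Greedily split until the leaf's squared error is zero, printing the tree."""
    ys = [y for _, y in data]
    split = best_split(data)
    indent = "    " * depth
    if split is None or sse(ys) == 0.0:
        print(f"{indent}leaf: predict {sum(ys) / len(ys):g}, points {data}")
        return
    score, j, t, left, right = split
    print(f"{indent}split: x{j + 1} <= {t} (children SSE = {score:g})")
    build(left, depth + 1)
    build(right, depth + 1)

# Dataset from the post: ((x1, x2), y) pairs.
data = [((1, 1), 9), ((1, 2), -4), ((1, 3), 2), ((2, 2), 4), ((2, 3), 2)]
build(data)

For drawing the recursive partition, note that each chosen split $x_j \le t$ cuts the current rectangle of the input space in two, so the leaves of the printed tree correspond exactly to the axis-aligned rectangles of the partition.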
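Regarding the kernel question in post 2, here is a hedged sketch for the first two kernels only; the feature-map symbol $\phi$ and the matrix $D^{1/2}$ are notation introduced here. The standard route is either to exhibit a feature map $\phi$ with $K(x,y) = \langle \phi(x), \phi(y) \rangle$, or to show that every Gram matrix the kernel produces is positive semidefinite.

1) For $K(x,y) = g(x)g(y)$: take the one-dimensional feature map $\phi(x) = g(x) \in \mathbb{R}$, so that $K(x,y) = \phi(x)\phi(y) = \langle \phi(x), \phi(y) \rangle$; so yes, $g$ itself is a correct feature map. Equivalently, for any points $x_1,\dots,x_n$ and coefficients $c_1,\dots,c_n$, $\sum_{i,j} c_i c_j K(x_i,x_j) = \bigl(\sum_i c_i g(x_i)\bigr)^2 \ge 0$, so every Gram matrix is positive semidefinite.

2) For $K(x,y) = x^T D y$ with $D$ diagonal and $D_{ii} \ge 0$: let $D^{1/2}$ be the diagonal matrix with entries $\sqrt{D_{ii}}$ and take $\phi(x) = D^{1/2} x$. Since $D = D^{1/2} D^{1/2}$ and $D^{1/2}$ is symmetric, $K(x,y) = x^T D y = (D^{1/2} x)^T (D^{1/2} y) = \langle \phi(x), \phi(y) \rangle$, which proves validity and gives the feature map in one step.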