naywhayare changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
govg has quit [Ping timeout: 255 seconds]
govg has joined #mlpack
govg has quit [Changing host]
govg has joined #mlpack
andrewmw94 has quit [Quit: Leaving.]
govg has quit [Ping timeout: 264 seconds]
govg has joined #mlpack
govg has quit [Ping timeout: 240 seconds]
govg has joined #mlpack
govg has quit [Ping timeout: 264 seconds]
govg has joined #mlpack
govg has quit [Changing host]
govg has joined #mlpack
udit_s has joined #mlpack
govg has quit [Ping timeout: 240 seconds]
govg has joined #mlpack
< marcus_zoq>
udit_s: Just for clarification, you transform the target to a 2D-vector if we use two labels and the output is a 3D-vector if we use 3 labels, right?
< udit_s>
marcus_zoq: 3D-vector as in a cube? For n class labels, I just have a weight matrix with n rows.
< udit_s>
marcus_zoq: also, I think I've figured out what the problem was and why yesterday's solution wasn't converging.
< udit_s>
- no bias vector.
< udit_s>
I'm taking care of that right now.
< marcus_zoq>
udit_s: If you compute 'weightVectors * data.col(j);' I was expecting a single value, so I thought you transformed the target value into a binary vector.
< marcus_zoq>
udit_s: Something like: y = x(i,1)*w(2,1)+x(i,2)*w(3,1) -> single value
< udit_s>
No, I'm getting the max value from tempLabelMat as the single value.
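(A minimal C++/Armadillo sketch of the scoring step being described here; the names weightVectors and data follow the chat, but the helper itself is purely illustrative and not the actual mlpack code.)

    #include <armadillo>

    // Illustrative only: score every class and take the arg-max as the label.
    arma::uword PredictLabel(const arma::mat& weightVectors, // n_classes x n_features
                             const arma::mat& data,          // n_features x n_points
                             const arma::uword j)            // index of the point
    {
      // One score per class label, as in 'weightVectors * data.col(j)'.
      const arma::vec scores = weightVectors * data.col(j);

      // The predicted label is the row (class) with the maximum score.
      return scores.index_max();
    }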
< udit_s>
marcus_zoq: Oh, and I think I fixed it. Just need to run some more tests to be sure, but it works for AND, OR gate implementations and the random linearly separable test case
< udit_s>
Let me just upload the code.
< udit_s>
udit_s: So basically, I've added a column in weightVector for the bias vector, and inserted a row of ones in trainData and testData for the bias variables. It properly converges now.
< udit_s>
marcus_zoq: ^ oops.
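(A rough sketch of the bias handling udit_s describes, assuming Armadillo matrices named as in the chat; illustrative only, not the actual patch.)

    #include <armadillo>

    // Fold the bias into the weights: one extra (zero-filled) column in
    // weightVectors, and a constant row of ones appended to the data so that
    // every point multiplies that bias weight.
    void AddBiasTerm(arma::mat& weightVectors, arma::mat& trainData)
    {
      weightVectors.insert_cols(weightVectors.n_cols, 1);
      trainData.insert_rows(trainData.n_rows,
                            arma::rowvec(trainData.n_cols, arma::fill::ones));
    }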
< marcus_zoq>
udit_s: Okay, sounds good!
< udit_s>
marcus_zoq: So what next? I'll write a few more tests and edit the documentation; are we assuming linearly separable data as the input?
< marcus_zoq>
udit_s: Yeah, we are assuming linearly separable data. Tests are a great idea, and a main would be great.
< marcus_zoq>
udit_s: Can you also add a test with just two inputs?
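(A self-contained sketch of the kind of two-input test being asked for: a plain multi-class perceptron with a bias row of ones, trained on the AND gate. All names here are illustrative and not taken from the mlpack sources.)

    #include <armadillo>
    #include <iostream>

    int main()
    {
      // Columns are points: the four inputs of an AND gate; the last row of
      // ones feeds the bias weights.
      arma::mat trainData = { { 0.0, 0.0, 1.0, 1.0 },
                              { 0.0, 1.0, 0.0, 1.0 },
                              { 1.0, 1.0, 1.0, 1.0 } };
      arma::Row<arma::uword> labels = { 0, 0, 0, 1 };

      const arma::uword numClasses = 2;
      arma::mat weightVectors(numClasses, trainData.n_rows, arma::fill::zeros);

      // Standard perceptron updates: demote the wrongly predicted class and
      // promote the true class; on linearly separable data this converges.
      for (size_t iter = 0; iter < 1000; ++iter)
      {
        size_t mistakes = 0;
        for (arma::uword j = 0; j < trainData.n_cols; ++j)
        {
          const arma::vec scores = weightVectors * trainData.col(j);
          const arma::uword predicted = scores.index_max();
          if (predicted != labels(j))
          {
            weightVectors.row(predicted) -= trainData.col(j).t();
            weightVectors.row(labels(j)) += trainData.col(j).t();
            ++mistakes;
          }
        }
        if (mistakes == 0)
          break;
      }

      // Re-classify the training points; all four should come back correct.
      for (arma::uword j = 0; j < trainData.n_cols; ++j)
      {
        const arma::vec scores = weightVectors * trainData.col(j);
        std::cout << "point " << j << " -> class " << scores.index_max() << "\n";
      }
      return 0;
    }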
govg has quit [Ping timeout: 240 seconds]
govg has joined #mlpack
udit_s has quit [Quit: Leaving]
andrewmw94 has joined #mlpack
udit_s has joined #mlpack
udit_s has quit [Quit: Leaving]
Anand has joined #mlpack
< Anand>
Marcus: I didn't really get you. What is the m_rates vector? How do I implement the get_probs() method?
< marcus_zoq>
Anand: Hello, here is the shogun nbc header file and the cpp file:
< marcus_zoq>
What we need is the m_rates vector. The problem is, we can't directly access this vector: a protected member is accessible only in the class that defines it and in classes that inherit from that class. So, we create a new class that inherits from the shogun CGaussianNaiveBayes class.
< Anand>
Yeah right, I got that. But, what exactly is m_rates?
< Anand>
log of m_prob?
< marcus_zoq>
Anand: A shogun vector object that contains the probabilities.
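(A minimal sketch of the wrapper class marcus_zoq describes: derive from shogun's CGaussianNaiveBayes so the protected m_rates member becomes reachable. The header path and the SGVector<float64_t> type of m_rates are assumptions and should be checked against the shogun files mentioned above.)

    #include <shogun/multiclass/GaussianNaiveBayes.h> // path may differ across shogun versions
    #include <shogun/lib/SGVector.h>

    using namespace shogun;

    // A protected member is visible to derived classes, so this thin wrapper
    // only adds an accessor and changes no behaviour of the classifier.
    class CExposedGaussianNaiveBayes : public CGaussianNaiveBayes
    {
    public:
      using CGaussianNaiveBayes::CGaussianNaiveBayes;

      // get_probs(): return the protected m_rates vector of probabilities
      // (type assumed here to be SGVector<float64_t>).
      SGVector<float64_t> get_probs() const
      {
        return m_rates;
      }
    };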