ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< Manav-KumarGitte> @zoq can we include it in mlpack?
ghostrider669 has joined #mlpack
ghostrider669 has quit [Ping timeout: 256 seconds]
ImQ009 has joined #mlpack
togo has joined #mlpack
< hemal[m]> ```
< hemal[m]> load_save_test.cpp:(.text+0x21e57): undefined reference to `bool mlpack::data::Load<double>(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, arma::SpMat<double>&, bool, bool)'
< hemal[m]> ```
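For context, the missing symbol is the sparse-matrix overload of mlpack::data::Load; a minimal sketch of a call that needs it at link time (the filename here is a placeholder):

```cpp
#include <mlpack/core.hpp>

int main()
{
  // Loading into an Armadillo sparse matrix dispatches to the
  // Load<double>(..., arma::SpMat<double>&, bool, bool) overload named
  // in the linker error above.
  arma::sp_mat matrix;
  mlpack::data::Load("sparse_data.tsv", matrix, /* fatal */ false,
                     /* transpose */ true);
}
```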
< JoelJosephGitter> > error: no matching function for call to ‘mlpack::ann::FFN<mlpack::ann::MeanSquaredError<>, mlpack::ann::GaussianInitialization>::Backward(arma::mat&, arma::mat&)’
< JoelJosephGitter> what could cause this?
< hemal[m]> Joel Joseph (Gitter): could you paste the lines of code causing the error?
< JoelJosephGitter> https://pastebin.com/Rf0ASne1
< hemal[m]> Backward() requires 2 parameters, and you have passed only one, `actionProbs_target`; that could be the reason. Not sure though.
< JoelJosephGitter> I made an edit in that link now... I did put in two arma::mat matrices; I think the reason is some mistake with my "build"...
< JoelJosephGitter> I'll revert and check.
< JoelJosephGitter> The problem goes away when I remove the `SoftMax` layer.
< JoelJosephGitter> I used the ann folder from @mrityunjay-tripathi's softmax-layer branch
< JoelJosephGitter> On his fork
< chopper_inbound[> Joel Joseph (Gitter) : Let me take a look into it.
< chopper_inbound[> Joel Joseph (Gitter): Can you check the dimensions of the target vector and the dimension of the last "Linear" layer?
< JoelJosephGitter> As you can see from the code I pasted, the last Linear layer has output dimension 2. And the target vector I used for Backward is the vector that I got from the output of Forward.
< chopper_inbound[> ohh... there is more code? I didn't scroll down.
< JoelJosephGitter> `ss` is `3x2` and `as` is `2x2`
< chopper_inbound[> the input to the Backward function is not the actual input; it is the output of the Forward function (here `as`). And in place of `as` in the Backward call there should be the backpropagated error.
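The convention chopper_inbound describes can be illustrated with a hand-rolled column-wise softmax backward pass in Armadillo; this is an illustrative sketch, not the code from the softmax-layer branch:

```cpp
#include <armadillo>

// `output` holds the activations produced by Forward() (what is passed as
// the "input" of Backward() here), and `gy` is the error backpropagated
// from the next layer.
arma::mat SoftmaxBackward(const arma::mat& output, const arma::mat& gy)
{
  // Per column: g_i = y_i * (gy_i - sum_j(gy_j * y_j)).
  return output % (gy - arma::repmat(arma::sum(gy % output, 0),
                                     output.n_rows, 1));
}
```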
< JoelJosephGitter> but they have the dimensions, am I right?
< JoelJosephGitter> *same dimensions
< JoelJosephGitter> in this case
< chopper_inbound[> the `activation` here doesn't have the same dimension as the `input`.
< chopper_inbound[> *activation = output of forward function
< JoelJosephGitter> This code works when I remove the SoftMax layer... Can you check that?
< JoelJosephGitter> `.Backward(INPUT_TO_NET, OUTPUT_OF_NET, GRADIENT_MATRIX)`
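Spelled out as a compilable sketch, assuming the three-argument FFN Forward()/Backward() pair from mlpack's ann module (layer sizes chosen to match the `ss`/`as` shapes above; the forward output stands in for the second argument, mirroring the pasted code):

```cpp
#include <mlpack/core.hpp>
#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>
#include <mlpack/methods/ann/init_rules/gaussian_init.hpp>
#include <mlpack/methods/ann/loss_functions/mean_squared_error.hpp>

using namespace mlpack::ann;

int main()
{
  // 3-dimensional inputs, 2-dimensional outputs, batch size 2.
  FFN<MeanSquaredError<>, GaussianInitialization> net;
  net.Add<Linear<>>(3, 2);

  arma::mat ss(3, 2, arma::fill::randu);  // INPUT_TO_NET (3x2)
  arma::mat as, gradient;
  net.Forward(ss, as);                    // as: OUTPUT_OF_NET (2x2)
  net.Backward(ss, as, gradient);         // fills GRADIENT_MATRIX
}
```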
< JoelJosephGitter> ohh... you mean the softmax has different input and output dimensions?
< JoelJosephGitter> *softmax activation layer
< chopper_inbound[> No. You can refer to `softmax_impl.hpp` line no. 33, where I set the size of the output the same as the input.
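The idea at that line can be sketched as a standalone, numerically stable column-wise softmax in Armadillo (again an illustration, not the branch's implementation):

```cpp
#include <armadillo>

void SoftmaxForward(const arma::mat& input, arma::mat& output)
{
  // Output dimensions match the input dimensions, as described above.
  output.set_size(arma::size(input));
  // Shift each column by its max for numerical stability, then normalize.
  output = arma::exp(input - arma::repmat(arma::max(input, 0),
                                          input.n_rows, 1));
  output /= arma::repmat(arma::sum(output, 0), output.n_rows, 1);
}
```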
< JoelJosephGitter> Hmm, then I don't understand why this error occurs
< JoelJosephGitter> Are you getting this error on your PC?
< JoelJosephGitter> At https://pasteboard.co/J4wNANT.png you can see that my Forward and Backward code should work if you compare it to the "test" for the MeanSquaredError module.
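For comparison, the standalone MeanSquaredError pattern from mlpack's loss-function tests looks roughly like this (values are placeholders; signatures as in the 3.x ann code):

```cpp
#include <mlpack/core.hpp>
#include <mlpack/methods/ann/loss_functions/mean_squared_error.hpp>

int main()
{
  mlpack::ann::MeanSquaredError<> module;
  arma::mat input("1.0 0.0; 0.5 0.5");   // predictions (2x2 batch)
  arma::mat target = arma::zeros(2, 2);  // targets
  arma::mat derivative;

  // Forward() returns the loss; Backward() fills the derivative of the
  // loss with respect to the input.
  const double error = module.Forward(input, target);
  module.Backward(input, target, derivative);
}
```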
< chopper_inbound[> I understand. Basically I am getting results when `activation` is a vector and not a matrix. The Backward function has to be extended for matrix input as well. Maybe you can try flattening the output if that does no harm for now. Thanks.
< JoelJosephGitter> I can't flatten the output since it's going in in batches.
< JoelJosephGitter> Are you suggesting I do a loop?
< JoelJosephGitter> I am putting in several training inputs in one go.
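The loop in question would look roughly like this (a rough sketch; the network type parameter and the summing of per-sample gradients are assumptions):

```cpp
#include <mlpack/core.hpp>

// Process a batch one column (one sample) at a time when a layer only
// supports vector activations, accumulating the per-sample gradients.
template<typename NetType>
void BackwardPerSample(NetType& net, const arma::mat& inputs,
                       const arma::mat& targets, arma::mat& gradients)
{
  for (size_t i = 0; i < inputs.n_cols; ++i)
  {
    arma::mat input = inputs.col(i);
    arma::mat target = targets.col(i);
    arma::mat sampleGrad;
    net.Backward(input, target, sampleGrad);
    if (i == 0)
      gradients = sampleGrad;
    else
      gradients += sampleGrad;
  }
}
```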
< chopper_inbound[> Ok. No problem, I will try to fix that if you give me some time.
< JoelJosephGitter> Thanks for the reply.
< JoelJosephGitter> :) sure.
metaljack34 has joined #mlpack
< metaljack34> Does the ID3 (decision_tree) implementation support pruning?
< chopper_inbound[> <JoelJosephGitter ":) sure."> I have made the changes and https://pastebin.com/embed_iframe/FL831rBR works fine. The earlier test has some flaw (I am not quite sure what the error is), as even after the change there were some errors regardless of the type of "last layer" used. For example: https://pastebin.com/embed_iframe/HVYRwQWf
ImQ009 has quit [Quit: Leaving]
togo has quit [Quit: Leaving]