verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
vivekp has quit [Ping timeout: 255 seconds]
vivekp has joined #mlpack
mikeling has joined #mlpack
govg has quit [Ping timeout: 260 seconds]
govg has joined #mlpack
Thyrix has joined #mlpack
govg has quit [Ping timeout: 260 seconds]
govg has joined #mlpack
vivekp has quit [Ping timeout: 240 seconds]
vivekp has joined #mlpack
Thyrix has quit [Quit: Thyrix]
keshav has joined #mlpack
keshav has quit [Ping timeout: 260 seconds]
< layback> 4
layback has quit [Remote host closed the connection]
keshav has joined #mlpack
keshav has quit [Ping timeout: 260 seconds]
Deep_Thought has joined #mlpack
Deep_Thought has quit [Ping timeout: 255 seconds]
dineshraj01 has joined #mlpack
govg has quit [Ping timeout: 255 seconds]
Deep_Thought has joined #mlpack
Deep_Thought has quit [Ping timeout: 245 seconds]
govg has joined #mlpack
dineshraj01 has quit [Ping timeout: 264 seconds]
mikeling has quit [Quit: Connection closed for inactivity]
dineshraj01 has joined #mlpack
palashahuja has joined #mlpack
< palashahuja> Are there tests of LSTM with RNN in mlpack?
< palash> @zoq, is the training happening from lines 737-774?
< palash> With LSTM?
< palash> I got the idea, though.
< zoq> palash: The GradientLSTMLayerTest test case doesn't really train the network; it's a numerical check of the LSTM gradient function. 'struct GradientFunction' is just an abstraction used for CheckGradient, so we don't have to write a new CheckGradient function for each specific model.
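(For context, a numerical gradient check of the kind described above compares a function's analytic gradient against a finite-difference approximation. The sketch below is a minimal, self-contained illustration of that idea using Armadillo; the struct and function names are hypothetical and do not reproduce mlpack's actual CheckGradient or GradientLSTMLayerTest code.)

```cpp
// Minimal sketch of a numerical gradient check (hypothetical names; this is
// not mlpack's CheckGradient implementation).
#include <armadillo>
#include <algorithm>
#include <iostream>

// A toy "function" abstraction: quadratic loss f(x) = 0.5 * ||x||^2,
// whose analytic gradient is simply x.
struct GradientFunction
{
  double Evaluate(const arma::vec& x) const
  {
    return 0.5 * arma::dot(x, x);
  }

  void Gradient(const arma::vec& x, arma::vec& gradient) const
  {
    gradient = x;  // Analytic gradient of 0.5 * ||x||^2.
  }
};

// Compare the analytic gradient against central finite differences.
double CheckGradientNumerically(const GradientFunction& f,
                                const arma::vec& x,
                                const double eps = 1e-6)
{
  arma::vec analytic;
  f.Gradient(x, analytic);

  arma::vec numeric(x.n_elem);
  for (size_t i = 0; i < x.n_elem; ++i)
  {
    arma::vec xPlus = x, xMinus = x;
    xPlus(i) += eps;
    xMinus(i) -= eps;
    numeric(i) = (f.Evaluate(xPlus) - f.Evaluate(xMinus)) / (2 * eps);
  }

  // Relative difference; a small value indicates the analytic gradient
  // matches the numerical estimate.
  return arma::norm(analytic - numeric) /
      std::max(arma::norm(analytic) + arma::norm(numeric), 1e-12);
}

int main()
{
  GradientFunction f;
  arma::vec x = arma::randu<arma::vec>(10);
  std::cout << "relative gradient error: "
            << CheckGradientNumerically(f, x) << std::endl;
}
```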
< palash> Also, can we measure the cross-entropy error in neural networks? Are there tools present that do so?
< zoq> palash: It looks like I haven't updated the cross-entropy function for the neural network code, so right now there is no cross-entropy error function.
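(For reference, the cross-entropy error asked about here is straightforward to compute by hand from a network's output probabilities. The snippet below is only an illustrative sketch using Armadillo, not an mlpack API; as noted above, mlpack had no cross-entropy error function at the time of this discussion.)

```cpp
// Sketch of the cross-entropy error for one-hot targets (illustrative only;
// this is not an mlpack function).
#include <armadillo>
#include <iostream>

// predictions: each column is a predicted probability distribution.
// targets:     each column is the corresponding one-hot target.
double CrossEntropyError(const arma::mat& predictions,
                         const arma::mat& targets,
                         const double eps = 1e-10)
{
  // -sum(t * log(p)), averaged over the samples (columns); eps avoids log(0).
  return -arma::accu(targets % arma::log(predictions + eps)) /
      predictions.n_cols;
}

int main()
{
  // Two samples (columns), three classes (rows).
  arma::mat predictions = { { 0.7, 0.1 },
                            { 0.2, 0.8 },
                            { 0.1, 0.1 } };
  arma::mat targets = { { 1.0, 0.0 },
                        { 0.0, 1.0 },
                        { 0.0, 0.0 } };

  std::cout << "cross-entropy error: "
            << CrossEntropyError(predictions, targets) << std::endl;
}
```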
< zoq> Btw, any contribution is greatly appreciated :)