verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
mentekid has quit [Ping timeout: 240 seconds]
mentekid has joined #mlpack
aashay has quit [Quit: Connection closed for inactivity]
mikeling has joined #mlpack
andrzejku has joined #mlpack
dfne has joined #mlpack
< dfne> hi ryan, thanks. I saw your message afterwards. I will try adding more noise.
sumedhghaisas has joined #mlpack
< sumedhghaisas> zoq: Hey Marcus... there?
< sumedhghaisas> had a couple of questions...
< sumedhghaisas> why is the identity layer used?
kris1 has quit [Quit: kris1]
< zoq> sumedhghais: Actually, it's expected that a model has at least two layers, so for the gradient check, where you'd like to check a single layer, adding an identity layer is just a cheap trick to fulfill that requirement. I think we could detect that inside the model implementation, but I'm not sure that's necessary right now. Also, I see some of the recurrent test models also use an identity layer; we can remove that one.
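The trick zoq describes shows up in mlpack's ANN layer tests roughly as follows. This is a minimal sketch assuming mlpack 2.x-era names (FFN, IdentityLayer, Linear, LogSoftMax, NegativeLogLikelihood, RandomInitialization); treat the exact headers and signatures as assumptions rather than the test suite's actual code:

    #include <mlpack/core.hpp>
    #include <mlpack/methods/ann/ffn.hpp>
    #include <mlpack/methods/ann/layer/layer.hpp>
    #include <mlpack/methods/ann/init_rules/random_init.hpp>

    using namespace mlpack::ann;

    int main()
    {
      // Linear<> is the layer actually under test. The IdentityLayer<> in
      // front of it exists only so the model has the two layers the
      // implementation expects; it passes inputs and deltas through
      // unchanged, so it cannot alter the gradient being checked.
      FFN<NegativeLogLikelihood<>, RandomInitialization> model;
      model.Add<IdentityLayer<> >();
      model.Add<Linear<> >(10, 2);
      model.Add<LogSoftMax<> >();
      return 0;
    }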
< sumedhghaisas> zoq: ahh okay. I added Forward tests for the GRU; those are passing.
< sumedhghaisas> but the gradient tests are failing. Going over the gradient calculations again.
< zoq> sumedhghais: Okay, I'll recheck the Gradient as well.
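A gradient test of the kind being discussed usually compares the analytic gradient against a central finite difference. A minimal generic sketch in Armadillo, showing the idea rather than mlpack's actual test harness:

    #include <armadillo>
    #include <functional>

    // Returns the largest absolute difference between an analytic gradient
    // and a central finite-difference estimate, for a scalar loss f(w).
    double GradientCheck(const std::function<double(const arma::vec&)>& f,
                         const arma::vec& w,
                         const arma::vec& analyticGrad,
                         const double eps = 1e-6)
    {
      arma::vec numGrad(w.n_elem);
      for (size_t i = 0; i < w.n_elem; ++i)
      {
        arma::vec wp = w, wm = w;
        wp(i) += eps;
        wm(i) -= eps;
        numGrad(i) = (f(wp) - f(wm)) / (2 * eps);
      }
      return arma::abs(numGrad - analyticGrad).max();
    }

A gradient test fails when this difference stays large (e.g. above roughly 1e-4) for the layer's parameters.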
< sumedhghaisas> zoq: I made a stupid mistake there
< sumedhghaisas> forgot to take into consideration the effect of the forget gate. My god. That was really bad...
< sumedhghaisas> and the network was still learning... Fixed it just now.
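For reference, the term most likely dropped: in the standard GRU equations (sign conventions vary across implementations, so this is a sketch rather than the exact code in the PR), the update gate z_t plays the forget-gate role:

    z_t = \sigma(W_z x_t + U_z h_{t-1})
    r_t = \sigma(W_r x_t + U_r h_{t-1})
    \tilde{h}_t = \tanh(W x_t + U (r_t \odot h_{t-1}))
    h_t = (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t

Backpropagation through h_t therefore has a direct path \partial h_t / \partial h_{t-1} \supseteq \mathrm{diag}(1 - z_t), in addition to the paths through z_t, r_t, and \tilde{h}_t. Dropping that (1 - z_t) term gives gradients that are wrong but still correlated with the true ones, which is consistent with the network continuing to learn despite the bug.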