verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
chenzhe has quit [Ping timeout: 258 seconds]
chenzhe has joined #mlpack
chenzhe has quit [Quit: chenzhe]
chenzhe has joined #mlpack
_ohm has joined #mlpack
chenzhe has quit [Ping timeout: 258 seconds]
_ohm has left #mlpack []
chenzhe has joined #mlpack
chenzhe has quit [Ping timeout: 258 seconds]
vinayakvivek has joined #mlpack
hxidkd has joined #mlpack
systematic has quit [Ping timeout: 260 seconds]
hxidkd has quit [Ping timeout: 252 seconds]
hxidkd has joined #mlpack
hxidkd has left #mlpack []
govg has quit [Quit: leaving]
witness_ has joined #mlpack
govg has joined #mlpack
clicksaswat has joined #mlpack
vivekp has quit [Ping timeout: 245 seconds]
< clicksaswat> hi guys! I submitted a proposal for GSoC under the title "Research and Implementation of Augmented RNNs", but due to time constraints I never had a chance to ask for feedback. If you have time for it now, any comments on the proposal would be really appreciated.
vivekp has joined #mlpack
clicksaswat has quit [Ping timeout: 260 seconds]
clicksaswat has joined #mlpack
< zoq> clicksaswat: I can comment on the proposal, if you like.
vivekp has quit [Ping timeout: 240 seconds]
vivekp has joined #mlpack
mikeling has joined #mlpack
govg has quit [Ping timeout: 252 seconds]
Trion has joined #mlpack
govg has joined #mlpack
clicksaswat_ has joined #mlpack
clicksaswat has quit [Ping timeout: 260 seconds]
clicksaswat has joined #mlpack
witness_ has quit [Quit: Connection closed for inactivity]
< clicksaswat> zoq: hey zoq! that would be really great. I would especially like to hear about the part where I've mentioned extending RNN architectures to support external memory. Is it feasible? Also, if you have comments or doubts about any other part of the proposal, feel free to mention them.
clicksaswat_ has quit [Ping timeout: 260 seconds]
Trion has quit [Quit: Have to go, see ya!]
clicksaswat has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
clicksaswat has joined #mlpack
witness_ has joined #mlpack
aashay has joined #mlpack
mikeling has quit [Quit: Connection closed for inactivity]
chenzhe has joined #mlpack
clicksaswat has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
vivekp has quit [Ping timeout: 240 seconds]
vivekp has joined #mlpack
ironstark has quit []
ironstark has joined #mlpack
witness_ has quit [Quit: Connection closed for inactivity]
aashay has quit [Quit: Connection closed for inactivity]
< rcurtin> no word from the team that is supposed to be installing masterblaster...
< rcurtin> don't know if they will have it done before they leave tonight
< rcurtin> if they don't that means downtime until probably Monday or Tuesday
< rcurtin> but... at least it is better than the year it took to get it online...
vinayakvivek has quit [Quit: Connection closed for inactivity]
chenzhe has quit [Ping timeout: 260 seconds]
chenzhe has joined #mlpack
sumedhghaisas has joined #mlpack
< sumedhghaisas> rcurtin: Hey Ryan... did you get a chance to look at that softmax regression problem?
< sumedhghaisas> :)
< rcurtin> it's on my queue
< rcurtin> actually pretty close to the top finally :)
< sumedhghaisas> haha... because I was thinking... I am implementing a neural net architecture right now and suddenly realized that in classification we always have a softmax layer on top
< sumedhghaisas> so I have this mathematical question
< sumedhghaisas> if we call our thing a reduced softmax...
< sumedhghaisas> softmax has many optima... since we can subtract the same vector from every class's weights without changing the output
< sumedhghaisas> but reduced softmax has only 1 optimum
< sumedhghaisas> so optimization should be more stable with reduced softmax, right?
< sumedhghaisas> I am not so sure about it but it feels right
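The point being discussed above can be checked numerically: softmax is invariant to adding the same constant to every logit, so the full parameterization has a whole family of equivalent optima, while pinning one class's parameters (one way to read "reduced softmax") removes that redundancy. The sketch below uses NumPy and operates on logits directly rather than mlpack's actual softmax regression code; the `softmax` helper and the `z_reduced` construction are illustrative assumptions, not the implementation under discussion.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D array of logits.
    e = np.exp(z - z.max())
    return e / e.sum()

z = np.array([1.0, 2.0, 3.0])

# Shifting every logit by the same constant leaves the probabilities
# unchanged -- this is the source of the "many optima".
assert np.allclose(softmax(z), softmax(z + 5.0))

# "Reduced" parameterization: fix the last class's logit to 0. The same
# distribution is still representable, but the shift redundancy is gone,
# since any constant shift would now change the output.
z_reduced = np.append(z[:-1] - z[-1], 0.0)
assert np.allclose(softmax(z), softmax(z_reduced))
```

Whether this actually makes the optimization landscape better conditioned in practice is the open question in the conversation; the sketch only shows that the redundancy exists and that the reduced form eliminates it.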
chenzhe1 has joined #mlpack
chenzhe has quit [Read error: Connection reset by peer]
chenzhe1 is now known as chenzhe