verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
nilay has joined #mlpack
govg has quit [Ping timeout: 260 seconds]
nilay has quit [Ping timeout: 250 seconds]
nilay has joined #mlpack
Mathnerd314 has quit [Ping timeout: 260 seconds]
mentekid has joined #mlpack
Rodya has quit [Quit: Adieu, dear Werther!]
mentekid has quit [Ping timeout: 240 seconds]
mentekid has joined #mlpack
marcosirc has joined #mlpack
< marcosirc> clear
Mathnerd314 has joined #mlpack
chrismeyer has joined #mlpack
govg has joined #mlpack
chrismeyer has quit [Ping timeout: 250 seconds]
mentekid has quit [Ping timeout: 244 seconds]
mentekid has joined #mlpack
mentekid has quit [Ping timeout: 260 seconds]
Rodya has joined #mlpack
mentekid has joined #mlpack
< nilay> zoq: do you think we can put the code for Forward, Backward and Gradient from cnn.hpp into some other utils file, so that we don't need to make a net in order to call those functions?
< nilay> the subnet_layer could then directly call these functions; otherwise I will have to duplicate that code in subnet_layer
< zoq> nilay: Sounds like a good idea, we could put it in network_util
< nilay> ok great..
< zoq> nilay: Btw. nice blog post.
< nilay> thanks :)
sumedhghaisas has joined #mlpack
mentekid has quit [Ping timeout: 276 seconds]
< nilay> zoq: here is the problem with the gradient function...
< nilay> I think instead of doing GradientConvolutionRule::Convolution(inputSlices, deltaSlices, output); what we want to do is output = inputSlices % deltaSlices.
< nilay> also here is the concat_layer : https://gist.github.com/nilayjain/43196fc3956be585bf9c5469c4ce8bc0 , and the concat_layer_test: https://gist.github.com/nilayjain/c000953eee50faf0f496ba399031c0d8 .. let me know what you think... i'll look at it tomorrow..
nilay has quit [Quit: Page closed]
marcosirc has quit [Quit: WeeChat 1.4]
Mathnerd314 has left #mlpack []
mentekid has joined #mlpack
sumedhghaisas has quit [Remote host closed the connection]
< zoq> nilay: Looks like if you use arma::cube delta = arma::ones(2, 2, 1); instead of arma::cube delta = arma::ones(5, 5, 1); you get the right output.
mentekid has quit [Ping timeout: 244 seconds]
< zoq> nilay: The concat layer looks good, didn't even know that 'output = arma::join_slices(output, std::get<I>(layers).OutputParameter());' works if output isn't initialized.