verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
vivekp has quit [Ping timeout: 260 seconds]
govg has joined #mlpack
ImQ009 has joined #mlpack
manish7294 has joined #mlpack
< manish7294> rcurtin: Thanks! The algorithm you suggested helped gain significant speedups: 50k data points -> 43.53 secs :)
manish7294 has quit [Ping timeout: 260 seconds]
travis-ci has joined #mlpack
< travis-ci> ShikharJ/mlpack#133 (Deconv - f27f79f : Shikhar Jaiswal): The build has errored.
travis-ci has left #mlpack []
travis-ci has joined #mlpack
< travis-ci> ShikharJ/mlpack#134 (Deconv - 4b31aef : Shikhar Jaiswal): The build has errored.
travis-ci has left #mlpack []
prakhar_code[m] has quit [Ping timeout: 255 seconds]
killer_bee[m] has quit [Ping timeout: 276 seconds]
prakhar_code[m] has joined #mlpack
killer_bee[m] has joined #mlpack
vivekp has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> ShikharJ/mlpack#135 (AtrousConv - 78e126c : Shikhar Jaiswal): The build has errored.
travis-ci has left #mlpack []
< ShikharJ> rcurtin: zoq: You there?
< zoq> ShikharJ: yes
< ShikharJ> Hi, I wanted to discuss the dilated convolution layer. Is this a good time to talk?
< zoq> about to step out, but I can stay a little bit longer
< ShikharJ> Ah, sure, just a small question. Aarush (in his PR) had created a separate NaiveAtrousConvolution rule for the additional dilation parameter, but I think we can incorporate that into the existing naive_convolution implementation.
< ShikharJ> would this be a good idea?
< ShikharJ> That way, there'd be no need for an additional file, and we could just give the existing layers a default dilation parameter of 1.
< zoq> Right, I don't think anyone is going to use something like a NaiveAtrousConvolution rule for another method anyway.
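(A rough sketch of the change being discussed here: folding a dilation factor into the naive convolution rule, with dilation = 1 as the default so existing layers behave exactly as before. The function name and signature below are illustrative only and do not match mlpack's actual NaiveConvolution class.)

    #include <armadillo>

    // Valid-mode 2D convolution (cross-correlation form, as typically used in
    // neural network layers) with a dilation factor. dilation = 1 reproduces
    // the ordinary naive convolution, so existing layers can keep the default.
    void NaiveDilatedConvolution(const arma::mat& input,
                                 const arma::mat& filter,
                                 arma::mat& output,
                                 const size_t dilation = 1)
    {
      // Dilation inflates the effective filter extent to dilation * (k - 1) + 1.
      const size_t effRows = dilation * (filter.n_rows - 1) + 1;
      const size_t effCols = dilation * (filter.n_cols - 1) + 1;
      output.zeros(input.n_rows - effRows + 1, input.n_cols - effCols + 1);

      for (size_t i = 0; i < output.n_rows; ++i)
        for (size_t j = 0; j < output.n_cols; ++j)
          for (size_t fi = 0; fi < filter.n_rows; ++fi)
            for (size_t fj = 0; fj < filter.n_cols; ++fj)
              output(i, j) += filter(fi, fj) *
                  input(i + fi * dilation, j + fj * dilation);
    }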
< ShikharJ> Yeah, and we can then use a combination of parameters to obtain something like a Dilated Transposed Convolution layer, and so on (something that TensorFlow provides as separate methods).
< zoq> hm, does anyone use a Dilated Transposed Convolution layer?
< ShikharJ> I'm not sure; I never came across a paper that explicitly used that, but theoretical examples of such a convolution wouldn't be hard to find.
< zoq> not sure it makes sense to combine both; one captures a larger receptive field and the other is mainly used for upsampling. I have to think about it. But if I remember right, I saw a paper that used both methods in a network.
< ShikharJ> Agreed. This sounds plausible to me :)
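(To make the receptive-field vs. upsampling point concrete, some back-of-the-envelope output-size arithmetic for dilated and dilated transposed convolution, unpadded case. The helper functions are made up for illustration and are not part of mlpack's API.)

    #include <cstddef>
    #include <iostream>

    // Dilation inflates a k-wide kernel to an effective width of d * (k - 1) + 1,
    // which is what enlarges the receptive field.
    std::size_t EffectiveKernel(std::size_t k, std::size_t d)
    {
      return d * (k - 1) + 1;
    }

    // Standard (downsampling) convolution output width, valid padding.
    std::size_t ConvOut(std::size_t in, std::size_t k, std::size_t s, std::size_t d)
    {
      return (in - EffectiveKernel(k, d)) / s + 1;
    }

    // Transposed convolution output width: upsamples instead of downsampling.
    std::size_t TransposedConvOut(std::size_t in, std::size_t k, std::size_t s, std::size_t d)
    {
      return (in - 1) * s + EffectiveKernel(k, d);
    }

    int main()
    {
      // 3x3 kernel, stride 1, dilation 2 on a 16-wide input:
      std::cout << ConvOut(16, 3, 1, 2) << '\n';          // 12 (receptive field 5)
      // Same kernel and dilation, stride 2, transposed, on an 8-wide input:
      std::cout << TransposedConvOut(8, 3, 2, 2) << '\n'; // 19 (upsampled and dilated)
    }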
< ShikharJ> Pushed the changes; for now, only the additional tests for Layer Normalization and Dilated Convolutions remain.
sumedhghaisas has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> ShikharJ/mlpack#136 (AtrousConv - 876a786 : Shikhar Jaiswal): The build has errored.
travis-ci has left #mlpack []
vivekp has quit [Ping timeout: 240 seconds]
vivekp has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> ShikharJ/mlpack#137 (AtrousConv - bd1ad3d : Shikhar Jaiswal): The build has errored.
travis-ci has left #mlpack []
govg has quit [Ping timeout: 264 seconds]
govg has joined #mlpack
witness_ has joined #mlpack
sumedhghaisas2 has joined #mlpack
sumedhghaisas has quit [Ping timeout: 240 seconds]
sumedhghaisas has joined #mlpack
sumedhghaisas2 has quit [Ping timeout: 248 seconds]
sumedhghaisas has quit [Read error: Connection reset by peer]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Read error: Connection reset by peer]
sumedhghaisas2 has joined #mlpack
sumedhghaisas2 has quit [Read error: Connection reset by peer]
sumedhghaisas2 has joined #mlpack
ImQ009 has quit [Quit: Leaving]
sumedhghaisas2 has quit [Ping timeout: 250 seconds]