verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
vivekp has quit [Ping timeout: 260 seconds]
govg has joined #mlpack
ImQ009 has joined #mlpack
manish7294 has joined #mlpack
< manish7294>
rcurtin: Thanks! The algorithm you suggested helped gain significant speedups: 50k data points -> 43.53 secs :)
manish7294 has quit [Ping timeout: 260 seconds]
travis-ci has joined #mlpack
< travis-ci>
ShikharJ/mlpack#133 (Deconv - f27f79f : Shikhar Jaiswal): The build has errored.
< ShikharJ>
Hi, I wanted to discuss the dilated convolution layer. Is this a good time to talk?
< zoq>
about to step out, but I can stay a little bit longer
< ShikharJ>
Ah, sure, just a small doubt. Aarush (in his PR) had created a separate NaiveAtrousConvolution rule for the additional dilation parameter, but I think we can incorporate that into the existing naive_convolution implementation
< ShikharJ>
would this be a good idea?
< ShikharJ>
That way, there'd be no need for an additional file, and we can just default the dilation parameter to 1 for the existing layers.
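A rough sketch of that idea (simplified; not mlpack's actual NaiveConvolution interface, which uses static template methods and border modes): a dilation parameter defaulting to 1 makes the plain naive convolution a special case of the atrous one.

#include <armadillo>

// Valid-mode naive convolution (cross-correlation, as is usual for NN
// layers). dilationW/dilationH default to 1, so existing callers keep
// their behavior and a dilated layer just passes larger values.
void NaiveConvolution(const arma::mat& input,
                      const arma::mat& filter,
                      arma::mat& output,
                      const size_t dilationW = 1,
                      const size_t dilationH = 1)
{
  // Dilation enlarges the effective filter: kEff = k + (k - 1) * (d - 1).
  const size_t effRows = filter.n_rows + (filter.n_rows - 1) * (dilationH - 1);
  const size_t effCols = filter.n_cols + (filter.n_cols - 1) * (dilationW - 1);

  output.zeros(input.n_rows - effRows + 1, input.n_cols - effCols + 1);

  for (size_t i = 0; i < output.n_rows; ++i)
    for (size_t j = 0; j < output.n_cols; ++j)
      for (size_t fi = 0; fi < filter.n_rows; ++fi)
        for (size_t fj = 0; fj < filter.n_cols; ++fj)
          // Step through the input with a gap of `dilation` between taps.
          output(i, j) += filter(fi, fj) *
              input(i + fi * dilationH, j + fj * dilationW);
}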
< zoq>
Right, I don't think anyone is going to use something like a NaiveAtrousConvolution rule for another method anyway.
< ShikharJ>
Yeah, also, we can then use a combination of parameters to obtain something like a Dilated Transposed Convolution layer, and so on (something that TensorFlow provides as separate methods).
< zoq>
hm, does anyone use a Dilated Transposed Convolution layer?
< ShikharJ>
I'm not sure, I've never come across a paper that explicitly used that, but theoretical examples of such a convolution wouldn't be hard to find.
< zoq>
not sure it makes sense to combine both; one captures a larger receptive field and the other is mainly used for upsampling. Have to think about it. But if I remember right, I saw a paper that used both methods in a network.
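For reference: dilation widens the effective kernel without adding weights, kEff = k + (k - 1) * (d - 1), so a 3x3 filter with dilation 2 covers a 5x5 receptive field. Assuming the standard transposed-convolution output size o = (i - 1) * s - 2p + d * (k - 1) + 1, a dilated transposed layer would upsample while each output pixel draws on that wider field.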
< ShikharJ>
Agreed. This sounds plausible to me :)
< ShikharJ>
Pushed the changes; for now, only the additional tests for Layer Normalization and Dilated Convolutions remain.
sumedhghaisas has joined #mlpack
travis-ci has joined #mlpack
< travis-ci>
ShikharJ/mlpack#136 (AtrousConv - 876a786 : Shikhar Jaiswal): The build has errored.