verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
wenhao has quit [Ping timeout: 260 seconds]
< zoq>
ShikharJ: What about writing a simple shell script that tests a couple of parameters? We can write a simple executable (gan_main.cpp) and use the serialization feature to save the network; we could also just save the weights.
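The shell script zoq suggests could look like the sketch below. The `gan_main` binary name comes from the mentioned `gan_main.cpp`, but its flags (`--noise-dim`, `--batch-size`, `--output`) are assumptions for illustration only; the real executable would define its own options and use mlpack's serialization to save the network (or just the weights).

```shell
#!/bin/sh
# Sketch of a parameter-sweep script for a hypothetical gan_main binary.
# The flag names below are illustrative assumptions, not real options.
for noise_dim in 10 100; do
  for batch_size in 32 64; do
    out="gan_n${noise_dim}_b${batch_size}.xml"
    # Replace 'echo' with the real invocation once gan_main is built;
    # each run would serialize its trained model to ${out}.
    echo "./gan_main --noise-dim ${noise_dim} --batch-size ${batch_size} --output ${out}"
  done
done > sweep_commands.txt
# Show how many configurations the sweep covers.
wc -l < sweep_commands.txt
```

Echoing the commands first (a dry run) makes it easy to check the grid of configurations before spending hours of training time.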
< zoq>
ShikharJ: About the optimization, we can add that support later along with the batch support. I think there are a couple of papers out there that use different optimization methods, but I don't think the difference is huge.

< zoq>
ShikharJ: You can also work on the benchmark system, perhaps preparing the dataset etc. is faster that way, your call. Usually, I use a combination of tmux + mosh + vim which works quite well on a remote machine.
< Atharva>
zoq: In all the layer objects there is an inputParameter member, but it never seems to be used anywhere. Can you tell me what I am missing here?
alsc has quit [Quit: alsc]
mikeling has joined #mlpack
< ShikharJ>
zoq: I'll post the results tomorrow, then we can merge the PR.
travis-ci has joined #mlpack
< travis-ci>
ShikharJ/mlpack#169 (DCGAN - 96b51c1 : Shikhar Jaiswal): The build has errored.
< zoq>
ShikharJ: I like the idea to use the existing class. I thought about the optimizer separation. Let me comment on the PR.
< zoq>
Atharva: This is meant to be used for the recurrent attention model; there is an open PR, but currently it's not used. And I think we could remove the parameter from the layers that aren't going to use it, if you'd like to.
mikeling has quit [Quit: Connection closed for inactivity]
< Atharva>
zoq: Can you please point me to the PR? Is it clear from the PR which layers require that parameter?
< Atharva>
zoq: Is it okay to throw a std::error while constructing a layer object if there is some mistake we think people might make when using it for the first time?
< Atharva>
rcurtin:
< zoq>
I thought there was an open PR, let me check the code.
< zoq>
okay, we need the conv and linear layers.
< zoq>
yeah, this is fine, you can also use Log::Fatal
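A minimal sketch of the constructor-validation pattern discussed above: since `std::error` is not a standard exception type, `std::invalid_argument` is used here instead; `LinearLayer` and its size checks are hypothetical stand-ins, and in mlpack itself `Log::Fatal` would be the idiomatic way to report the error.

```cpp
#include <cstddef>
#include <stdexcept>

// Hypothetical layer, only to illustrate validating arguments in a
// constructor; real mlpack layers could report this via Log::Fatal.
class LinearLayer
{
 public:
  LinearLayer(const std::size_t inSize, const std::size_t outSize) :
      inSize(inSize), outSize(outSize)
  {
    // Catch a likely first-time mistake as early as possible.
    if (inSize == 0 || outSize == 0)
      throw std::invalid_argument("LinearLayer: sizes must be positive.");
  }

 private:
  std::size_t inSize, outSize;
};

// Returns true if constructing with a zero size throws, as intended.
inline bool ThrowsOnZeroSize()
{
  try { LinearLayer layer(0, 10); }
  catch (const std::invalid_argument&) { return true; }
  return false;
}
```

Failing in the constructor means a misconfigured layer can never reach training, which is friendlier to first-time users than a crash deep inside Forward().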
< Atharva>
zoq: Okay, I will open a PR removing inputParameter from all the other layers. I guess the ones in ffn and ffn_impl are required.
< Atharva>
zoq: Cool, thanks for that.
< ShikharJ>
zoq: I'd prefer sticking to the plan I mentioned: implement GAN and DCGAN through Phase I (next 7 days), and then take up the tasks of optimizer separation and batch sizes together.
< ShikharJ>
zoq: What is your preference?
< zoq>
I do not have a preference; either is fine with me.
< ShikharJ>
zoq: Ah, okay. I'll post the results tomorrow and inform you then :)
ImQ009 has quit [Read error: Connection reset by peer]
ImQ009 has joined #mlpack
travis-ci has joined #mlpack
< travis-ci>
ShikharJ/mlpack#170 (DCGAN - af748b1 : Shikhar Jaiswal): The build has errored.