verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
wenhao has quit [Ping timeout: 260 seconds]
< zoq> ShikharJ: What about writing a simple shell script that tests a couple of parameters? We can write a simple executable (gan_main.cpp) and use the serialization feature to save the network; we could also just save the weights.
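    (A minimal sketch of the save step such a gan_main.cpp could use, assuming mlpack's data::Save() model serialization and arma::mat::save() for the raw weights; the layer setup and file names below are placeholders, not the actual GAN configuration.)

    #include <mlpack/core.hpp>
    #include <mlpack/methods/ann/ffn.hpp>
    #include <mlpack/methods/ann/layer/layer.hpp>

    using namespace mlpack;
    using namespace mlpack::ann;

    int main()
    {
      // Placeholder network; the real gan_main.cpp would build the GAN from
      // command-line parameters instead.
      FFN<> model;
      model.Add<Linear<>>(100, 256);
      model.Add<ReLULayer<>>();
      model.Add<Linear<>>(256, 784);

      // ... train the network here ...

      // Serialize the whole network so it can be reloaded later.
      data::Save("gan_network.xml", "network", model, false);

      // Alternatively, save only the trained weights.
      model.Parameters().save("gan_weights.bin");
    }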
< zoq> ShikharJ: About the optimization, we can add the support later along with the batch support. I think there are a couple of papers out there that use different optimization methods, but I don't think the difference is huge.
< zoq> ShikharJ: You can also work on the benchmark system, perhaps preparing the dataset etc. is faster that way, your call. Usually, I use a combination of tmux + mosh + vim which works quite well on a remote machine.
vivekp has joined #mlpack
ImQ009 has joined #mlpack
wenhao has joined #mlpack
ImQ009 has quit [Ping timeout: 268 seconds]
ImQ009 has joined #mlpack
wenhao has quit [Ping timeout: 260 seconds]
< jenkins-mlpack> Project docker mlpack nightly build build #337: SUCCESS in 2 hr 48 min: http://masterblaster.mlpack.org/job/docker%20mlpack%20nightly%20build/337/
alsc has joined #mlpack
< Atharva> zoq: In all the layer objects there is an inputParameter member, but it never seems to be used anywhere. Can you tell me what I am missing here?
alsc has quit [Quit: alsc]
mikeling has joined #mlpack
< ShikharJ> zoq: I'll post the results tomorrow, then we can merge the PR.
travis-ci has joined #mlpack
< travis-ci> ShikharJ/mlpack#169 (DCGAN - 96b51c1 : Shikhar Jaiswal): The build has errored.
travis-ci has left #mlpack []
< zoq> ShikharJ: I like the idea to use the existing class. I thought about the optimizer separation. Let me comment on the PR.
< zoq> Atharva: This is meant to be used for the recurrent attention model; there is an open PR, but you are right, currently it's not used. And I think we could remove the parameter from the layers that aren't going to use it, if you'd like to.
mikeling has quit [Quit: Connection closed for inactivity]
< Atharva> zoq: Can you please point me to the PR? Is it clear in the PR which layers require that parameter?
< Atharva> zoq: Is it okay to throw a std::error while constructing a layer object if there is some mistake we think that people might make when they use it for the first time?
< Atharva> rcurtin:
< zoq> I thought there was an open PR, let me check the code.
< zoq> okay, we need the conv and linear layers.
< zoq> yeah, this is fine, you can also use Log::Fatal
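    (A minimal sketch of the kind of constructor-time check being discussed, assuming mlpack's Log::Fatal, which throws an exception once the message is flushed with std::endl; the layer and its parameters here are hypothetical, not an actual mlpack layer.)

    #include <mlpack/core.hpp>

    using namespace mlpack;

    // Hypothetical layer used only to illustrate the check.
    class ExampleLayer
    {
     public:
      ExampleLayer(const size_t inSize, const size_t outSize)
      {
        // Catch an obvious misuse up front instead of failing later in Forward().
        if (inSize == 0 || outSize == 0)
          Log::Fatal << "ExampleLayer: input and output sizes must be positive!"
              << std::endl;
      }
    };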
< Atharva> zoq: Okay, I will open a PR removing inputParameter from all the other layers. I guess the ones in ffn and ffn_impl are required.
< Atharva> zoq: Cool, thanks for that.
< ShikharJ> zoq: I'd prefer sticking to the plan I mentioned: implement GAN and DCGAN through Phase I (the next 7 days), and then take up the tasks of optimizer separation and batch sizes together.
< ShikharJ> zoq: What is your preference?
< zoq> I do not have a preference, both are fine with me.
< ShikharJ> zoq: Ah, okay. I'll post the results tomorrow and inform you then :)
ImQ009 has quit [Read error: Connection reset by peer]
ImQ009 has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> ShikharJ/mlpack#170 (DCGAN - af748b1 : Shikhar Jaiswal): The build has errored.
travis-ci has left #mlpack []
ImQ009 has quit [Quit: Leaving]
alsc has joined #mlpack
vivekp has quit [Ping timeout: 265 seconds]
alsc has quit [Quit: alsc]
alsc has joined #mlpack
alsc has quit [Quit: alsc]