verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
govg has joined #mlpack
sumedhghaisas__ has quit [Ping timeout: 240 seconds]
kris1 has joined #mlpack
partobs-mdp has joined #mlpack
shikhar has joined #mlpack
 < partobs-mdp> zoq: rcurtin: Finally got non-zero parameters in HAM, but the unit test is still messed up. It seems like I've done some armadillo magic wrong; could you take a look at the issue?
 < partobs-mdp> The code is in the latest commit.
 < partobs-mdp> One of the problems is that the JOIN doesn't get valid parameters and emits some negative values (despite the weight matrix consisting of positive elements).
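(A minimal sanity check one could try for the JOIN issue above, assuming the layer's weight matrix is an armadillo alias into a flat, all-positive parameter vector; joinWeight, the offset, and the sizes below are made-up placeholders, not the actual HAM members.)

    #include <armadillo>
    #include <iostream>

    int main()
    {
      // Stand-in for the network's flat parameter vector (all values in [0, 1]).
      arma::vec parameters = arma::randu<arma::vec>(20);

      // Stand-in for the JOIN layer's weights, viewed as an alias into the
      // parameter vector; offset and dimensions are invented for illustration.
      arma::mat joinWeight(parameters.memptr() + 4, 4, 4,
                           /* copy_aux_mem */ false, /* strict */ true);

      // If the alias points where it should, nothing here can be negative.
      if (joinWeight.min() < 0)
        std::cout << "JOIN sees values outside the positive weights!" << std::endl;
      else
        std::cout << "JOIN weight alias looks consistent." << std::endl;

      return 0;
    }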
shikhar has quit [Ping timeout: 240 seconds]
sumedhghaisas__ has joined #mlpack
shikhar has joined #mlpack
rohit has joined #mlpack
rohit has quit [Quit: Page closed]
partobs-mdp has quit [Remote host closed the connection]
govg has quit [Ping timeout: 248 seconds]
govg has joined #mlpack
vivekp has joined #mlpack
sumedhghaisas__ has quit [Ping timeout: 240 seconds]
kris__ has joined #mlpack
 < kris__> Hi lozhnikov, could you run this file for me?
 < kris__> These are the parameters: ./gan_keras.o -i ./train4.txt -o ./output.txt -m 1000 -e 1000 -r 0.001 -b 100 -g 2 -N 100 -G 1024 -D 1024 -t -1 -s -v
 < lozhnikov> kris__: I have other work to do right now; I'll run your code in an hour.
kris1 has quit [Quit: kris1]
shikhar has quit [Quit: WeeChat 1.4]
vivekp has quit [Ping timeout: 240 seconds]
vivekp has joined #mlpack
kris1 has joined #mlpack
Ram has joined #mlpack
Ram is now known as Guest75930
 < Guest75930> hi. i have just started working on open-source projects.
 < Guest75930> How do I get in touch with this organization?
< zoq> Guest75930: Hello there, http://www.mlpack.org/involved.html and http://www.mlpack.org/gsoc.html are probably helpful.
 < zoq> Guest75930: We are always open to new contributions; let us know if we should clarify anything.
< Guest75930> thank you. will do
Guest75930 has quit [Ping timeout: 260 seconds]
vivekp has quit [Ping timeout: 240 seconds]
vivekp has joined #mlpack
vivekp has quit [Ping timeout: 240 seconds]
vivekp has joined #mlpack
 < kris1> lozhnikov: I think the resize layer is fixed now. You can have a look.
kris1 has quit [Quit: kris1]
kris1 has joined #mlpack
vivekp has quit [Ping timeout: 240 seconds]
vivekp has joined #mlpack
 < kris1> lozhnikov: I was thinking that we should add a parameter for the noise size in Gan.hpp.
 < kris1> Right now we have only one noise sample. On each batch we train the GAN on batchSize * trainData(0, batchSize) + noise * batchSize.
 < kris1> Or we are training on very little noise. So if the batchSize is small, this would not matter much. But if batchSize = 100, the difference is pretty huge. I think this is the reason the gradients are not big in our case.
< kris1> What do you think ?
vivekp has quit [Ping timeout: 246 seconds]
shikhar has joined #mlpack
 < kris1> Also, rather than having an extra parameter, I was thinking we could always set noiseSize = batchSize, so the predictors matrix would then be (trainData.n_rows + batchSize, trainData.n_cols).
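(A rough sketch of the predictors layout kris1 proposes above, assuming the noise is simply stacked under the real data with noiseSize = batchSize; the variable names are illustrative and not the actual Gan.hpp members.)

    #include <armadillo>
    #include <iostream>

    int main()
    {
      const size_t batchSize = 100;

      // Stand-in training set: one column per point (e.g. flattened images).
      arma::mat trainData = arma::randu<arma::mat>(784, 1000);

      // Draw batchSize extra rows of noise, one value per training column.
      arma::mat noise = arma::randn<arma::mat>(batchSize, trainData.n_cols);

      // Stack the noise under the real data, giving a predictors matrix of
      // size (trainData.n_rows + batchSize, trainData.n_cols), as proposed.
      arma::mat predictors = arma::join_cols(trainData, noise);

      std::cout << "predictors: " << predictors.n_rows << " x "
                << predictors.n_cols << std::endl;

      return 0;
    }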
shikhar has quit [Quit: WeeChat 1.4]
kris1 has quit [Quit: kris1]