verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
< zoq> Thanks for all the status updates, really interesting.
govg has quit [Ping timeout: 256 seconds]
govg has joined #mlpack
prakhar_code[m] has quit [Ping timeout: 245 seconds]
killer_bee[m] has quit [Ping timeout: 260 seconds]
vivekp has quit [Ping timeout: 260 seconds]
vivekp has joined #mlpack
prakhar_code[m] has joined #mlpack
killer_bee[m] has joined #mlpack
govg has quit [Ping timeout: 256 seconds]
govg has joined #mlpack
haritha1313 has joined #mlpack
sulan_ has joined #mlpack
sumedhghaisas has joined #mlpack
< haritha1313> zoq: You mentioned writing another class for merging two networks, similar to FAN. I am unable to find any such class. Could you please give some reference to it?
< ShikharJ> haritha1313: I guess it is the Identity layer, but again, better to get confirmation from zoq!
< haritha1313> ShikharJ: Actually, I need to merge two networks (sequential/FNN) using add/multiply merge.
< haritha1313> By identity layer, you mean base layer, right?
< ShikharJ> I'm not sure if you can do this, but maybe you can try doing this (https://github.com/mlpack/mlpack/pull/1204/files#diff-73a3e2abae16e9a84a96856b191900e4R50)?
< haritha1313> Thanks for your help. I'll take a look at it :)
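For reference, the kind of merge being discussed might look roughly like the sketch below, using the Sequential, AddMerge, and Linear layers from mlpack's ann module; the exact constructor signatures and ownership semantics are assumptions and may differ between versions (MultiplyMerge would combine the sub-network outputs by elementwise product instead of a sum):

    #include <mlpack/methods/ann/ffn.hpp>
    #include <mlpack/methods/ann/layer/layer.hpp>

    using namespace mlpack::ann;

    // Two sub-networks, each wrapped in a Sequential module.
    Sequential<>* network1 = new Sequential<>();
    network1->Add<Linear<>>(10, 5);
    network1->Add<SigmoidLayer<>>();

    Sequential<>* network2 = new Sequential<>();
    network2->Add<Linear<>>(10, 5);
    network2->Add<SigmoidLayer<>>();

    // AddMerge sums the outputs of its children elementwise.
    AddMerge<>* merge = new AddMerge<>();
    merge->Add(network1);
    merge->Add(network2);

    // The merged module then acts as an ordinary layer inside an FFN.
    FFN<> model;
    model.Add(merge);
    model.Add<Linear<>>(5, 2);
    model.Add<LogSoftMax<>>();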
Trion has joined #mlpack
< jenkins-mlpack> Project docker mlpack nightly build build #333: SUCCESS in 2 hr 34 min: http://masterblaster.mlpack.org/job/docker%20mlpack%20nightly%20build/333/
sumedhghaisas has quit [Remote host closed the connection]
Trion has quit [Quit: Entering a wormhole]
haritha1313 has quit [Ping timeout: 260 seconds]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Ping timeout: 276 seconds]
sumedhghaisas has joined #mlpack
sumedhghaisas2 has joined #mlpack
sumedhghaisas has quit [Ping timeout: 260 seconds]
sumedhghaisas2 has quit [Ping timeout: 276 seconds]
sumedhghaisas has joined #mlpack
< Atharva> Is it okay to initialise helper matrices inside the function itself, while implementing functions for a class?
ImQ009 has joined #mlpack
sumedhghaisas2 has joined #mlpack
sumedhghaisas has quit [Ping timeout: 260 seconds]
sumedhghaisas2 has quit [Ping timeout: 276 seconds]
sumedhghaisas has joined #mlpack
sumedhghaisas2 has joined #mlpack
sumedhghaisas has quit [Ping timeout: 260 seconds]
< rcurtin> Atharva: I'm not sure I have enough context to say for sure, but I don't see any particular issue with that. I'd say, go ahead and do what you need to do, and if it's an issue it'll probably get brought up during a PR review :)
< Atharva> I figured it out, so no problem now.
< rcurtin> sounds good :)
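In other words, something like the following is fine (a minimal sketch; the function here is hypothetical, and the only cost of the local helper is that it is re-allocated on every call):

    #include <armadillo>

    // Hypothetical forward pass that needs scratch space: declaring the
    // helper matrix locally inside the function is perfectly valid; it
    // is simply re-allocated on each call.
    void Forward(const arma::mat& input, arma::mat& output)
    {
      arma::mat helper = arma::exp(-input);  // local helper matrix
      output = 1.0 / (1.0 + helper);         // logistic sigmoid
    }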
< zoq> haritha1313: Actually I was talking about writing an extra class similar to the GAN class (#1204); the idea is that we can construct the GAN with two networks (generator and discriminator). We could do the same for the NCF module and internally merge the results. Do you think that could work here? If not, let's just use the Sequential layer, which acts as an FFN class but can be used with the merge/add layer.
< zoq> haritha1313: About the failing test: since it's just a simple test to see whether we can retrain, let's use:
< zoq> arma::mat randomData = arma::zeros(100, 100);
< zoq> randomData.diag().ones();
< zoq> instead of
< zoq> arma::mat randomData(100, 100);
< zoq> randomData.randu();
< zoq> which should solve the issue.
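A rough skeleton of the first idea above, a wrapper class in the spirit of the GAN class from #1204 that owns two networks and merges their results internally; the class name, member layout, and the assumption that the sub-networks expose an FFN-style Forward() are all hypothetical:

    #include <armadillo>
    #include <utility>

    // Hypothetical two-network wrapper, analogous to how the GAN class
    // is constructed from a generator and a discriminator.
    template<typename NetworkA, typename NetworkB>
    class NCF
    {
     public:
      NCF(NetworkA networkA, NetworkB networkB) :
          networkA(std::move(networkA)),
          networkB(std::move(networkB))
      { }

      // Forward both networks, then merge the results (here an
      // elementwise sum; a product would mirror MultiplyMerge).
      void Forward(const arma::mat& input, arma::mat& output)
      {
        arma::mat outputA, outputB;
        networkA.Forward(input, outputA);
        networkB.Forward(input, outputB);
        output = outputA + outputB;
      }

     private:
      NetworkA networkA;
      NetworkB networkB;
    };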
< ShikharJ> zoq: Can I have a review of #1204 please? If I can be sure of its correctness for the batchSize = 1 case, then I can also run it for the batchSize > 1 case and get the code merged.
sumedhghaisas2 has quit [Ping timeout: 265 seconds]
sumedhghaisas has joined #mlpack
haritha1313 has joined #mlpack
< haritha1313> zoq: Wow, that just solved it :D . I still don't understand why it was giving an error for random values, though.
< haritha1313> Ah, you meant GAN; a typo then. Actually, there wasn't much documentation available even for the Sequential layer; that's why I thought of looking into the alternative solution you gave.
< haritha1313> I'll see how it turns out with sequential itself then.
sumedhghaisas2 has joined #mlpack
sumedhghaisas has quit [Ping timeout: 276 seconds]
sumedhghaisas2 has quit [Ping timeout: 240 seconds]
sumedhghaisas has joined #mlpack
haritha1313 has quit [Ping timeout: 260 seconds]
vivekp has quit [Ping timeout: 260 seconds]
sumedhghaisas has quit [Ping timeout: 265 seconds]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Ping timeout: 260 seconds]
sumedhghaisas2 has joined #mlpack
__sulan__ has joined #mlpack
sulan_ has quit [Ping timeout: 264 seconds]
sulan_itk has joined #mlpack
__sulan__ has quit [Ping timeout: 268 seconds]
sulan_itk is now known as sulan_
sumedhghaisas has joined #mlpack
sumedhghaisas2 has quit [Ping timeout: 260 seconds]
sumedhghaisas has quit [Ping timeout: 276 seconds]
sumedhghaisas has joined #mlpack
sumedhghaisas2 has joined #mlpack
sumedhghaisas has quit [Ping timeout: 265 seconds]
< ShikharJ> zoq: Could you tell me how I could visualize or plot .arm data?
< zoq> one way is to save the matrix as CSV and to use pyplot or matplotlib
< rcurtin> if you were lazy, you could use the preprocessing utilities as a format converter:
< rcurtin> mlpack_preprocess_split -i data.arm -t train.csv -T test.csv; cat train.csv test.csv > data.csv
< rcurtin> (but that will shuffle the data too, if that matters)
< zoq> what about using Armadillo and saving as PGM?
< rcurtin> oh, yeah, that could work also
< rcurtin> but I think Armadillo will only do 8-bit PGM, which I guess will work if it's MNIST data or something
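Putting the Armadillo-based suggestions together, the conversion might look like the sketch below; it assumes the .arm file holds a matrix in Armadillo's binary format (which load() can auto-detect), and that 8-bit grayscale is acceptable for the PGM route:

    #include <armadillo>

    int main()
    {
      arma::mat data;
      data.load("data.arm");  // auto-detects the Armadillo binary format

      // Option 1: dump to CSV, then plot with e.g. matplotlib.
      data.save("data.csv", arma::csv_ascii);

      // Option 2: save directly as an (8-bit) PGM image.
      data.save("data.pgm", arma::pgm_binary);

      return 0;
    }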
sumedhghaisas has joined #mlpack
sumedhghaisas2 has quit [Ping timeout: 265 seconds]
< Atharva> I am sorry if it's a very basic question, but what do the delta matrices signify in the layer objects? When I studied backpropagation, I don't remember encountering them.
< Atharva> Aren't delta and error the same thing?
sulan_ has quit [Quit: Leaving]
ImQ009 has quit [Quit: Leaving]