verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
vivekp has quit [Read error: Connection reset by peer]
vivekp has joined #mlpack
witness_ has joined #mlpack
< ShikharJ> rcurtin: The current lmnn_test.cpp/LMNNLowRankAccuracyTest seems to have the test statement commented out. Was this done on purpose?
< rcurtin> at the time yes, but I was working on a fix for it
< ShikharJ> rcurtin: Ah, I see :)
travis-ci has joined #mlpack
< travis-ci> manish7294/mlpack#13 (tree - a8bb50a : Manish): The build is still failing.
travis-ci has left #mlpack []
travis-ci has joined #mlpack
< travis-ci> manish7294/mlpack#78 (tree - a8bb50a : Manish): The build failed.
travis-ci has left #mlpack []
wenhao has joined #mlpack
 < wenhao> Hi guys, I'm back from Europe :)
< Atharva> wenhao: Where all did you go?
< wenhao> I first went to ICML at Stockholm and attended some paper presentations on Preference and Rank Learning
 < wenhao> Then we went to Paris, Lucerne and Rome
< Atharva> That's awesome!
 < wenhao> A funny thing is that the last day we were in Paris was the day France won the World Cup
< Atharva> Oh! The environment must have been crazy there.
 < wenhao> so a lot of people were celebrating in the city of Paris
< wenhao> yes totally
 < wenhao> we didn't plan to be a part of the celebration and we were on the metro to our hotel, but the moving crowd in the metro swept us along to the Arc de Triomphe
< Atharva> wenhao: That must have been quite an experience, only a few tourists probably get to see the kind of Paris you saw.
< Atharva> wenhao: Otherwise, Paris is mostly calm.
< wenhao> Atharva: Yes! It was lucky for us to experience that just before we left paris. It was a lot of fun.
 < wenhao> Atharva: Paris on its calm days is also a nice place to stay and explore
 < Atharva> wenhao: Yeah. How was Rome? I really wanted to go there on my trip but didn't have enough time.
< Atharva> wenhao: Yes it is.
< wenhao> Oh did you also go to europe?
< Atharva> Yes, earlier this summer :)
 < wenhao> Atharva: Rome was nice too. Quite a small city with lots of historic art
 < wenhao> Atharva: It was very hot when we were in Rome
 < Atharva> wenhao: Yeah, its historic attractions are the main reason I wanted to go.
< wenhao> Atharva: What cities did you go to?
< Atharva> I went to Paris, Amsterdam, Berlin, Prague, Budapest and Munich in that order
< wenhao> Atharva: wow a lot of cities. Sounds great.
< Atharva> wenhao: Yeah, it was really good.
wenhao_ has joined #mlpack
wenhao_ has quit [Client Quit]
wenhao_ has joined #mlpack
wenhao has quit [Ping timeout: 252 seconds]
< ShikharJ> wenhao_: Were you presenting a paper at ICML this year?
yaswagner has joined #mlpack
ImQ009 has joined #mlpack
wenhao_ has left #mlpack []
wenhao has joined #mlpack
< wenhao> ShikharJ: nope I don't have any paper at ICML. I just listened to the presentations and talked with other attendees.
gtank_ has quit [Ping timeout: 256 seconds]
gtank_ has joined #mlpack
< ShikharJ> wenhao: Ah nice, hope you had a good time there.
witness_ has quit [Quit: Connection closed for inactivity]
wenhao has quit [Quit: Page closed]
travis-ci has joined #mlpack
< travis-ci> manish7294/mlpack#79 (tree - 12e9a36 : Manish): The build is still failing.
travis-ci has left #mlpack []
travis-ci has joined #mlpack
< travis-ci> manish7294/mlpack#14 (tree - 12e9a36 : Manish): The build is still failing.
travis-ci has left #mlpack []
< Atharva> Armadillo doesn't automatically cast from bool to int, does it?
 < Atharva> I tried assigning a boolean matrix to an int or double matrix and the compiler complains: no match for ‘operator=’ (operand types are ‘arma::Mat<int>’ and ‘arma::enable_if2<true, const arma::mtGlue<long long unsigned int, arma::Mat<double>, arma::Mat<double>, arma .....
< zoq> arma::conv_to<arma::Mat<int> >::from(data); should work
< Atharva> zoq: It worked, thanks!
< zoq> Atharva: Great :)
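(A minimal sketch of the conversion zoq suggests above, assuming a bool-valued matrix produced by a comparison; the variable names here are only illustrative, not from the mlpack codebase:)

    #include <armadillo>

    int main()
    {
      // A 0/1 matrix produced by a comparison; Armadillo gives it the uword
      // element type (the "long long unsigned int" in the error above), so a
      // plain operator= into Mat<int> or Mat<double> does not compile.
      arma::mat data = arma::randu<arma::mat>(4, 4);
      arma::umat boolMat = (data > 0.5);

      // Explicit conversion works: conv_to copies and casts the elements.
      arma::Mat<int> intMat = arma::conv_to<arma::Mat<int>>::from(boolMat);
      arma::mat doubleMat = arma::conv_to<arma::mat>::from(boolMat);

      doubleMat.print("converted:");
      return 0;
    }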
< zoq> Atharva: Did you have time to run the experiments against RMSProp?
 < Atharva> zoq: I haven't done it yet; I will start the training shortly, before I go to sleep.
 < Atharva> Should I test both the convolutional and the normal VAE?
< zoq> Atharva: Up to you, I guess if it works we can start another one.
< Atharva> zoq: Okay, I will test it on normal VAE then.
 < ShikharJ> zoq: Just like Weight() returns the weights of the BinaryRBM, WeightCube() returns the weights of the SSRBM. Are we sure we wish to remove the WeightCube function?
ImQ009 has quit [Quit: Leaving]
 < zoq> ShikharJ: I think we could derive the cube form from the mat form, but perhaps it's a good idea to keep both; if we keep both, do you think we could find another function name?
 < ShikharJ> zoq: Neither function is used inside the class anywhere, so I'm guessing Kris provided them for user convenience. I can remove both of them if you think that could work as well.
 < ShikharJ> zoq: We could have just one weight parameter which is a cube, and set its size during Reset() depending on the variant of RBM being used, though I'm not sure whether I could alias a matrix from a cube or not.
< zoq> ShikharJ: That should work; I like the idea.
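(A rough sketch of the aliasing ShikharJ asks about, using Armadillo's advanced mat constructor over a cube's memory; the names and sizes are made up for illustration, not taken from the RBM code:)

    #include <armadillo>

    int main()
    {
      // Suppose the weights are stored as a cube of k slices, each r x c.
      const arma::uword r = 3, c = 4, k = 5;
      arma::cube weightCube(r, c, k, arma::fill::randu);

      // View the same memory as a single (r * c) x k matrix, one column per
      // slice; copy_aux_mem = false makes this an alias, not a copy.
      arma::mat weightMat(weightCube.memptr(), r * c, k, false, false);

      // Writing through the alias also modifies the cube.
      weightMat.col(0).zeros();
      weightCube.slice(0).print("slice 0 after zeroing via the alias:");

      return 0;
    }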
 < ShikharJ> rcurtin: I'm seeing frequent failures in LMNNMainTest every time a Travis build is initiated. I'm not sure if you're aware of it, so just letting you know :)
< rcurtin> right, there is a PR open for it