verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
trapz has quit [Quit: trapz]
flyingpot has joined #mlpack
flyingpot has quit [Ping timeout: 240 seconds]
prasanna082 has quit [Read error: Connection reset by peer]
mikeling has joined #mlpack
flyingpot has joined #mlpack
govg has quit [Ping timeout: 258 seconds]
govg has joined #mlpack
flyingpot has quit [Ping timeout: 260 seconds]
shihao has joined #mlpack
flyingpot has joined #mlpack
flyingpot has quit [Ping timeout: 240 seconds]
trapz has joined #mlpack
vinayakvivek has joined #mlpack
trapz_ has joined #mlpack
trapz has quit [Ping timeout: 240 seconds]
trapz_ is now known as trapz
flyingpot has joined #mlpack
louishenrifranc has joined #mlpack
louishenrifranc has quit [Client Quit]
trapz has quit [Quit: trapz]
usama has joined #mlpack
Arun_r has joined #mlpack
CaptainFreak has joined #mlpack
Arun_r has quit [Client Quit]
shihao has quit [Quit: Page closed]
snd has joined #mlpack
CaptainFreak has quit [Ping timeout: 260 seconds]
NarayanSrinivasa has joined #mlpack
< NarayanSrinivasa> Hello people. I went through the ideas list and quite a few intrigued me, Deep Learning and Reinforcement Learning the most. But I feel I can try adding the cross-validation feature back to mlpack. There are no readings given on the ideas page.
< NarayanSrinivasa> I am not new to ML; I have already completed the Andrew Ng course and did a project in bioinformatics.
< NarayanSrinivasa> Could someone tell me what aspects I should be familiar with to contribute to this project? I have compiled mlpack from source and implemented a few simple programs to familiarise myself.
vivekp has quit [Ping timeout: 240 seconds]
NarayanSrinivasa has quit [Quit: Page closed]
vivekp has joined #mlpack
vivekp has quit [Changing host]
vivekp has joined #mlpack
Thyrix has joined #mlpack
vpal has joined #mlpack
Thyrix has quit [Ping timeout: 260 seconds]
vivekp has quit [Ping timeout: 260 seconds]
vpal is now known as vivekp
drewtran has quit [Ping timeout: 260 seconds]
snd has quit [Ping timeout: 260 seconds]
flyingpot has quit [Remote host closed the connection]
krishna has joined #mlpack
krishna has quit [Client Quit]
usama has quit [Ping timeout: 260 seconds]
KHC has joined #mlpack
rob has joined #mlpack
< rob> hi
rob has quit [Client Quit]
KHC_ has joined #mlpack
KHC_ has quit [Client Quit]
KHC has quit [Quit: Page closed]
tempname_ has joined #mlpack
usama has joined #mlpack
avs has joined #mlpack
avs is now known as Guest53006
Guest53006 has quit [Quit: Page closed]
shikhar has joined #mlpack
trapz has joined #mlpack
govg has quit [Quit: leaving]
chvsp has joined #mlpack
trapz has quit [Quit: trapz]
usama has quit [Quit: Page closed]
trapz has joined #mlpack
govg has joined #mlpack
chvsp has quit [Quit: Page closed]
shikhar has quit [Ping timeout: 260 seconds]
trapz has quit [Quit: trapz]
shikhar has joined #mlpack
trapz has joined #mlpack
trapz has quit [Quit: trapz]
shikhar has quit [Ping timeout: 260 seconds]
< zoq> kris: HasParametersCheck<T, P&(T::*)()>::value is a type trait and, combined with SFINAE, really powerful.
< zoq> T is the type of the object we'd like to test against, and P&(T::*)() is the function signature; in this case we'd like to figure out whether a given layer has a Parameters function.
< zoq> What does the Parameters() function look like? 'OutputDataType& Parameters() { return weights; }' So, let me write the expression a little differently:
< zoq> RETURNTYPE_OF_FUNCTION(T::*)(FUNCTION_ARGUMENTS)::value, where RETURNTYPE_OF_FUNCTION = OutputDataType&, and since the Parameters function has no arguments the argument list is empty.
< zoq> Another example: let's say we'd like to figure out whether the given object has double& Alpha(const size_t a, double b) { return alpha; }. The trait looks like: HasAlphaCheck<T, double&(T::*)(const size_t, double)>::value; RETURNTYPE_OF_FUNCTION = double&, FUNCTION_ARGUMENTS = const size_t, double.
hashcoder has joined #mlpack
< zoq> kris: See my comments on the gist.
zulfiqarjunejo has joined #mlpack
mikeling has quit [Quit: Connection closed for inactivity]
< zulfiqarjunejo> Hello
< zoq> zulfiqarjun: Hello there!
< zulfiqarjunejo> how are you zoq?
< zulfiqarjunejo> I am here looking for the mailing list for GSoC 2017. Can you guide me on what to write in the subscription email?
< zulfiqarjunejo> Oh yes. Earlier when I went to mailing list from GSoC, it redirected me to 'mailto:....'
< zulfiqarjunejo> Thanks zoq :)
< zoq> ah, I see, yeah, you have to subscribe first to be able to send a mail to the list.
zulfiqarjunejo has quit [Quit: Page closed]
Thyrix has joined #mlpack
shihao has joined #mlpack
shikhar has joined #mlpack
shikhar has quit [Quit: Page closed]
tejank10 has joined #mlpack
tejank10 has quit [Client Quit]
Thyrix has quit [Quit: Page closed]
chvsp has joined #mlpack
govg has quit [Ping timeout: 240 seconds]
< chvsp> Hi zoq, following up on what I asked you yesterday about adding a batchnorm layer, I was wondering how we would proceed to write tests for it. I have read that the effect of such a layer is only prominent in deep networks with several layers, and it is not possible for us to train a large network in the test module.
hashcoder has quit [Ping timeout: 260 seconds]
< zoq> chvsp: I can think of two tests right now: 1. a test that checks the gradient; 2. we can use a pre-trained network, run it only for a number of iterations, and see if the error changes over time.
< chvsp> zoq: 1. The gradient wouldn't be deterministic, as we choose a random minibatch every time. Are you hinting at taking the same minibatch every time?
< zoq> Yeah, we have to make it deterministic, but that shouldn't be a problem.
< chvsp> Right. Then we would have to engineer these batches to cater to every corner case, I guess.
< zoq> We can start with a simple example, but I agree creating a worst-case input would be even better.
< chvsp> Cool, I will look into it.
< chvsp> Could you please review my PR about the Kathirvalavakumar-Subavathi tests? It would be great if it were merged.
< zoq> I agree that would be great; I'm trying to review a couple of PRs later today.
< chvsp> Sure. Whenever you are free. :)
kris1 has joined #mlpack
tempname_ has quit [Quit: Page closed]
< chvsp> zoq: About the PR review, can we, as participants, contribute to the review in any way to reduce the burden on you? If so, do let us know; we'd be happy to help.
< zoq> chvsp: It's definitely not a burden; what we like to do is give everyone helpful comments or start a discussion over code parts that could be tackled differently, etc., and that sometimes takes more time than pointing out failures without any direction on how to solve the issue.
< chvsp> zoq: Cool, just a random thought...
supertramp-sid has joined #mlpack
< supertramp-sid> Hello guys, I wanted to ask a question regarding GSoC '17. I want to work on mlpack's cross-validation and hyper-parameter tuning module. As suggested, I will look into the current code base. I wanted to ask if you have any suggestions on how I should go about drafting a simple proposal that you would prefer to see. Thanks.
nklm has joined #mlpack
govg has joined #mlpack
VIvek__ has joined #mlpack
VIvek__ has quit [Client Quit]
sumedhghaisas has joined #mlpack
supertramp-sid has quit [Quit: Page closed]
shihao has quit [Quit: Page closed]
drewtran has joined #mlpack
< kris1> zoq: is there a way to build only the ann code?
shihao has joined #mlpack
< shihao> Hi there!
< shihao> If I make some changes in one of the tests, do I have to rebuild 'mlpack_test' for all tests? It takes a very long time.
< kris1> if you write BOOST_AUTO_TEST_SUITE(TestSuite)
< kris1> you can do this: bin/mlpack_test -t TestSuite
< kris1> it will only run the tests in the TestSuite
< kris1> refer to the description here
< kris1> shihao:
< shihao> oh, got it!
< shihao> Thank you, krisl!
< shihao> If I want to test the correctness of posteriors in nbc, can I add a new csv file which contains posteriors calculated by other tools, like sklearn?
< kris1> zoq: a while back you suggested adding this line for getting the model: std::vector<LayerTypes>& Model() { return network; } but this is giving an error while building
< kris1> shihao: hard-code the posteriors in the test file and then check something like check_is_close(some_function(), real_value)
< shihao> krisl: got it !
diehumblex has quit [Quit: Connection closed for inactivity]
sumedhghaisas has joined #mlpack
kris1 has left #mlpack []
< shihao> Hi guys. I added a new test file 'testResProba.csv' to test posteriors in nbc, and the Travis CI build failed since there is no such file there. How can I solve this problem?
nklm has quit [Quit: Page closed]
sumedhghaisas has quit [Ping timeout: 260 seconds]
kris1 has joined #mlpack
< kris1> @zoq are there any tests for the visitor patterns
< kris1> sorry, I mean visitors
< kris1> std::cout << boost::apply_visitor(WeightSizeVisitor(), model) << std::endl;
< kris1> I was doing this and it gave an error
< kris1> saying FFN has no apply_visitor
< zoq> kris1: Make sure model is of type LayerTypes, also you said "std::vector<LayerTypes>& Model() { return network; }" results in an error, can you give me the error message?
< kris1> FFN<NegativeLogLikelihood<> > model;
< kris1> so it's of FFN type
< kris1> I think FFN should also be in LayerTypes
< zoq> yeah, and the visitor works on LayerTypes, so model.Model()[0] for the first layer model.Model()[1] for the second, with the assumption that model.Model returns std::vector<LayerTypes>
< kris1> I see lstm and convolution, but not ffn
< kris1> ok
< kris1> but this assumes that std::vector<LayerTypes>& Model() { return network; } works, and it doesn't for me
< kris1> right now at least; wait, I will post the error message
< kris1> zoq: I have to add this line to ffn.hpp, right?
< zoq> yes
< zoq> Regarding the build question: you can't just build the ann code, at least not without modifying the CMake file; on the other hand, if you run make it should rebuild modified files only.
< kris1> The error is something like this
< kris1> InitializationRuleType>::Model()’ cannot be overloaded
< kris1> std::vector<LayerTypes>& Model() { return network; }
< kris1> the full error http://pastebin.com/mFaPx5h2
< kris1> zoq:
< zoq> strange, can you also post your ffn.hpp file?
< shihao> zoq: Hi zoq, I added a new test file and the Travis CI build failed. How can it be solved?
< shihao> I was trying to test posteriors in nbc.
< zoq> shihao: Can you send me the travis link?
< shihao> zoq:https://travis-ci.org/mlpack/mlpack/builds/207777119
< kris1> zoq:https://gist.github.com/kris-singh/9c6550f1763578e4723228c017b2f837
< zoq> shihao: I can't see that you pushed testResProba.csv to tests/data; also, maybe we can just define the matrix inline, something like: arma::mat testResProba("1 2 3 4; 1 2 3 4")?
< zoq> kris1: The function is implemented twice, at lines 173 and 49.
< shihao> zoq: Oh, I forgot it. After fixing it, should I close my PR and create another one?
< shihao> zoq: I noticed that travis started to rebuild :)
< zoq> Ah, yeah, every time you make a commit and push, travis will rebuild the PR.
< shihao> I am curious what debugging tool or IDE you guys are using?