verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#1848 (master - 4a324f2 : Ryan Curtin): The build passed.
travis-ci has left #mlpack []
aashay has quit [Quit: Connection closed for inactivity]
flyingpot has joined #mlpack
flyingpot_ has joined #mlpack
mikeling has joined #mlpack
flyingpot has quit [Ping timeout: 240 seconds]
flyingpot_ has quit [Ping timeout: 240 seconds]
govg has quit [Ping timeout: 240 seconds]
govg has joined #mlpack
vivekp has quit [Ping timeout: 268 seconds]
vivekp has joined #mlpack
flyingpot has joined #mlpack
aashay has joined #mlpack
vivekp has quit [Ping timeout: 260 seconds]
vivekp has joined #mlpack
flyingpot has quit [Ping timeout: 240 seconds]
govg has quit [Ping timeout: 268 seconds]
flyingpot has joined #mlpack
vivekp has quit [Ping timeout: 240 seconds]
govg has joined #mlpack
vivekp has joined #mlpack
mikeling has quit [Quit: Connection closed for inactivity]
flyingpot has quit [Ping timeout: 260 seconds]
vpal has joined #mlpack
vivekp has quit [Ping timeout: 260 seconds]
flyingpot has joined #mlpack
vpal has quit [Read error: Connection reset by peer]
vivekp has joined #mlpack
vivekp has quit [Ping timeout: 260 seconds]
flyingpot has quit [Ping timeout: 240 seconds]
flyingpot has joined #mlpack
vivekp has joined #mlpack
flyingpot has quit [Ping timeout: 240 seconds]
flyingpot has joined #mlpack
flyingpot has quit [Remote host closed the connection]
flyingpot has joined #mlpack
< vivekp> zoq: Hi, I've finally completed the implementation of AdaMax. (Sorry, I could only get started today.)
< vivekp> However, I'm still working on its tests.
< vivekp> Looking at the two tests of the Adam optimizer, I think it's possible to write very similar tests for AdaMax as well.
< vivekp> I just had a query regarding:
< vivekp> arma::shuffle(arma::linspace<arma::Col<size_t>>(0, (numFunctions - 1), numFunctions))
< vivekp> in adam_impl.hpp
< vivekp> Please correct me if I'm wrong: this basically creates a column vector of size 3 with three values, 0, 1 and 2, shuffled across its rows, right?
< zoq> vivekp: Sounds good, no worries about the PR, take your time.
< zoq> Regarding linspace, it creates a column vector of size numFunctions, where numFunctions is the number of samples; it looks like [0, 1, 2, 3, ..., numFunctions - 1].
< zoq> It's used to index the sample at iteration i. Afterwards, the elements are randomly shuffled with arma::shuffle, giving e.g. [0, 3, 7, 9, 1].
< vivekp> Okay, got it. numFunctions appears to be 3 in this case. Thanks!
mikeling has joined #mlpack
< vivekp> zoq: looks like SimpleAdaMaxTestFunction (similar to SimpleAdaTestFunction) is failing but the second test passes.
< vivekp> Should I maybe open a PR for further discussion?
< zoq> vivekp: Good idea, makes it easier to take a look at the issue.
< vivekp> yes, indeed
kris has joined #mlpack
flyingpot has quit [Ping timeout: 268 seconds]
flyingpot has joined #mlpack
flyingpot has quit [Ping timeout: 260 seconds]
Smeet has joined #mlpack
Smeet has left #mlpack []
< zoq> rcurtin: Ah, I forgot, the svg is a little bit different, e.g. the right node is at the same height as the left node; not sure the intention was to put it at a different height?
< rcurtin> zoq: I remember saying to Ryan Birmingham "a little bit of randomness is okay", but I think I meant in the structure of the tree itself, not the position of the nodes
< rcurtin> like I think a balanced binary tree would be a little too boring
< rcurtin> but probably all of the nodes should line up :)
< rcurtin> thanks for making the SVG, I think it is a nice improvement... but I am not very good with visual art so I don't think I could have done a good job myself :)
< zoq> rcurtin: I'm also not good at it, I guess a designer would do a better job, but until then it should work. Also since I can use Ryan's work as a template it's not that hard.
< rcurtin> yeah
< rcurtin> I don't think he is a graphic designer either, so maybe all of us are just throwing stuff together and seeing what sticks :)
< zoq> rcurtin: absolutely :)
< zoq> rcurtin: Do you think we have to change the colors? I think it works on the white background, but we can play with it.
< rcurtin> I think maybe lighter shades of red might look better on the white background
< rcurtin> more saturated? I'm not sure what exactly the right word would be
kesslerfrost has joined #mlpack
nish21 has joined #mlpack
kesslerfrost has quit [Read error: Connection reset by peer]
< nish21> hello! I just wanted to confirm something.
< nish21> I was citing a paper in my implementation of FTRL. The list of authors is really long. I can use et al., right?
< rcurtin> nish21: is this for a bibtex citation in the code?
< nish21> rcurtin: yes
< rcurtin> hehe, this might not be the answer you are hoping for, but it would be best if you could add all the author names there :)
< rcurtin> this way, if someone wants to use that code, they have an easy bibtex citation they can immediately use
< rcurtin> you can usually get a bibtex citation pre-generated from Google Scholar, that can be very helpful
< nish21> rcurtin: yeah, I got a generated bibtex entry. If I'm not wrong, I should wrap this to 80 characters as well?
< rcurtin> yeah, that would be appreciated :)
< nish21> rcurtin: will do :)
< rcurtin> great, thanks
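For the wrapping, the usual convention is simply to break long fields and continue on the next line so every line stays under 80 characters. A made-up entry (placeholder names and title, not the actual FTRL citation) to illustrate:

```bibtex
@article{lastname2013hypothetical,
  author  = {Lastname, First and Othername, Second and Thirdname, Third and
             Fourthname, Fourth and Fifthname, Fifth},
  title   = {A Hypothetical Paper Title Used Only to Illustrate 80-Character
             Line Wrapping},
  journal = {Some Journal},
  year    = {2013}
}
```

BibTeX treats the wrapped continuation lines inside a field as ordinary whitespace, so the entry still parses the same.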
dineshraj01 has joined #mlpack
kesslerfrost has joined #mlpack
kesslerfrost has quit [Read error: Connection reset by peer]
kesslerfrost has joined #mlpack
< kris> Just a quick question. When reading ffn_impl.hpp I see the use of network. It's defined as vector<LayerType> network in the ffn.hpp file. What I don't see is it getting initialized anywhere. Am I missing something?
< zoq> kris: Not sure I get your question; what you see is the declaration of the vector that holds the added layers. You can add a new layer with Add(...), lines 162 and 169 in ffn.hpp.
< kris> thanks zoq. I get it now.
< zoq> kris: Here to help, let us know if you have any more questions.
nish21 has quit [Ping timeout: 260 seconds]
flyingpot has joined #mlpack
benchmark has joined #mlpack
benchmark has quit [Client Quit]
flyingpot has quit [Ping timeout: 240 seconds]
kesslerfrost has quit [Quit: kesslerfrost]
< kris> I was reading about the visitor pattern. It was mentioned that it is used for implementing dynamic dispatch in C++.
< kris> Dynamic dispatch means that in dividend.func(divisor), which function is called depends on both the dividend and the divisor, right?
< kris> Does mlpack use it in the same way? boost::apply_visitor(outputParameterVisitor, network.back()). The only type we need here is for network.back()? So how is this dynamic dispatch?
dineshraj01 has quit [Quit: Leaving]
flyingpot has joined #mlpack
flyingpot has quit [Ping timeout: 240 seconds]
< kris> I was working with FFN. If we build a model and want the parameters of the network,
< kris> do we have to do something like boost::apply_visitor with outputParameterVisitor? But I am not sure which layer should be passed, and how.
benchmark has joined #mlpack
benchmark has quit [Client Quit]
dineshraj01 has joined #mlpack
< zoq> kris: You can use Parameters() to get the complete parameter matrix, e.g. model.Parameters().
< dineshraj01> Hi, after refactoring the range search model code, it was quite intuitive to do the same for RASearch. Kindly review the PR
< rcurtin> dineshraj01: great, thanks, I'll take a look when I can
< dineshraj01> okay :)
< zoq> hm, looks like I have to increase the threshold
< rcurtin> I like the change to only 2 significant digits, I think it makes it easier to read and filter out benchmarking noise
< zoq> You can also just take a look at the 3rd parameter which is the difference. Also, the summary at the end should show the number of benchmarks where diff > threshold (8%), which should work for the next run.
< zoq> So, hopefully we will see "Benchmarks 29 of 29 passed"
< rcurtin> ah, nice, maybe it would be good to add "(diff)" after the difference for clarity?
< rcurtin> this will be a lot nicer than 'benchmark' coming into #mlpack and randomly dumping like 100 messages at a time :)
< zoq> yeah, I think we should also go through the dataset list and remove some, e.g. testing rangesearch on wine and cloud is probably not the best idea.
< rcurtin> yeah, there are probably some extras there
< rcurtin> I can help look into that next week... today I am fighting against the ICML deadline...
< zoq> rcurtin: yeah, sure ... have you implemented RandomForest for the ICML paper?
travis-ci has joined #mlpack
< travis-ci> dineshraj01/mlpack#8 (ra-model-boost-variant - 57ef4a6 : Marcus Edel): The build failed.
travis-ci has left #mlpack []
< rcurtin> zoq: nope, not that paper, unfortunately there wasn't time for that
< rcurtin> someone else in my group discovered an interesting defense against adversarial samples for deep neural networks
< rcurtin> so I have been helping him write that paper, since he has never written a paper... it's time-consuming to teach someone to write
< rcurtin> today it became apparent that one of the attacks he implemented wasn't working, so I am now doing everything I can to get it working by tomorrow
< rcurtin> I'll send a link when it's posted on arXiv in a few days, I think it's an interesting paper (but then, my perspective is biased of course!)
< zoq> Sounds definitely interesting, eager to read the paper.
< govg> The deadline's in 24 hours, right?
< rcurtin> yeah, midnight UTC
< rcurtin> so 28 hours remain
< govg> Haha.
< govg> All the best.
< rcurtin> basically everything else is coming together fine, I just wish that it had come together a week ago...
< govg> It's funny how even if you've been working for a year on something, your paper will get written only minutes before the deadline :S
< rcurtin> yeah, I said to myself I would never pull another all-nighter before the ICML or NIPS deadline when I left graduate school
< rcurtin> but that may be what I am doing tonight...
< zoq> :)
travis-ci has joined #mlpack
< travis-ci> dineshraj01/mlpack#9 (ra-model-boost-variant - c0d54ac : dinesh Raj): The build passed.
travis-ci has left #mlpack []
travis-ci has joined #mlpack
< travis-ci> dineshraj01/mlpack#10 (master - a581cef : Marcus Edel): The build was broken.
travis-ci has left #mlpack []
< kris> @zoq actually I was working on the Xavier init method. I wanted to see how to use the visitors for models.
benchmark has joined #mlpack
benchmark has quit [Client Quit]
ozym4nd145 has quit [Ping timeout: 260 seconds]
dineshraj01 has quit [Ping timeout: 240 seconds]
flyingpot has joined #mlpack
flyingpot has quit [Ping timeout: 240 seconds]
kris has quit [Ping timeout: 255 seconds]
flyingpot has joined #mlpack
flyingpot has quit [Ping timeout: 268 seconds]
agneet42 has joined #mlpack
< agneet42> Hello everyone,
< agneet42> I'm a second year CS Undergrad from Jadavpur University India.
< agneet42> My interests lie in deep learning and its application to solving real-life problems. I've worked with various DL models, ranging from CNNs to RNNs (LSTM, vanilla RNNs) to autoencoders, and have a keen interest in them.
< agneet42> I'd really like to contribute to mlpack as part of GSoC 2017. Could someone get me started with the project ideas and their implementation?
< agneet42> Thank you,
< rcurtin> agneet42: hello there, I saw you just sent an email to the mailing list
< rcurtin> some very quick pointers (since I have to get up in a moment) might be to look at these pages:
< rcurtin> I'm not sure which project specifically you are interested in, but you should definitely search the mailing list archives, because the questions that you might have have probably already been answered :)
< agneet42> I have found a project which seems interesting to me.
< agneet42> could you tell me where to start off next? Or what to do next once I've understood the core of these two projects?
aashay has quit [Quit: Connection closed for inactivity]