ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
jerianjer has joined #mlpack
UmarJ has joined #mlpack
UmarJ has quit [Quit: UmarJ]
UmarJ has joined #mlpack
< AakashkaushikGit> So these warnings are just deprecation warnings and are to be solved in mlpack 4.0
ib07 has quit [Ping timeout: 256 seconds]
ImQ009 has joined #mlpack
< rcurtin[m]> zoq: Gaurav Ghati: I took a look through the list of static analysis warnings; here's a filtered list: https://pastebin.com/E38VBQuM ; only a few needed to be filtered out
UmarJ has quit [Remote host closed the connection]
UmarJ has joined #mlpack
< GauravGhati[m]> okay, I'll open the issue then 👍
UmarJ has quit [Quit: UmarJ]
UmarJ has joined #mlpack
PrateekGuptaGitt has quit [Ping timeout: 260 seconds]
NishantNandanGit has quit [Ping timeout: 260 seconds]
PrateekGuptaGitt has joined #mlpack
NishantNandanGit has joined #mlpack
satyz[m] has quit [Quit: Idle for 30+ days]
< rcurtin[m]> Gaurav Ghati: thanks for opening the issue---do you want to modify the description to make it a bit friendlier for new folks? e.g., add some instructions on what needs to be done, how to start, how to check to see if the issue is resolved, etc.
< rcurtin[m]> (unless you were just planning on handling everything yourself)
< NippunSharmaGitt> @rcurtin can you please take a look at #2787 whenever you have time.
< rcurtin[m]> Nippun Sharma (Gitter): I get email notifications for all activity on the mlpack repository; I saw your responses, and I will get to responding to them when I have time
< NippunSharmaGitt> @rcurtin okay thanks
< NippunSharmaGitt> Also I received the mlpack and ensmallen stickers, they look great! thanks!
< rcurtin[m]> awesome, glad they came through 👍️
< NippunSharmaGitt> I had a question, why do we not have bindings for the ANN module ?
< rcurtin[m]> there has been an open issue for it for a while, but the bindings have never been completed
< NippunSharmaGitt> Great, can I work on it ?
< NippunSharmaGitt> I have commented on the issue for the same
< rcurtin[m]> I'm not sure if anyone is already working on it; please go search for open issues and PRs about it
< GauravGhati[m]> Sure, I'll edit the description.
< AnushKiniGitter[> I was playing around with the binder notebooks in mlpack/examples and they are great! I was thinking we could add a few more examples. For starters, I could work on a simple DCGAN for MNIST digits since currently there are no examples on how to use the GAN suite in mlpack.
< zoq> AnushKiniGitter[: Great idea, always looking for new examples.
< AnushKiniGitter[> @zoq Is the version of matplotlib-cpp we use in the binder environment updated? I was trying to use the 'imshow' method but I get an error saying
< AnushKiniGitter[> ```no member named 'imshow' in namespace 'matplotlibcpp'```
< AnushKiniGitter[> The latest version of matplotlib-cpp has added support for the 'imshow' method.
< zoq> AnushKiniGitter[: imshow does not work with the C++ kernel; as a workaround you have to save the plot and load it with xeus image_from_file.
< zoq> NippunSharmaGitt: Nice :)
< rcurtin[m]> zoq: how are you handling serialization for the ANN boost::visitor refactoring? I started doing the same today for the `NSModel` class, but with virtual inheritance, serialization becomes tricky (especially when templates are involved)
< rcurtin[m]> maybe you haven't gotten to that part yet, I was just curious if you had :)
< zoq> I saved that up :)
< rcurtin[m]> :)
< rcurtin[m]> I am not sure yet, but I think we might need some kind of "factory" specifically for serialization. maybe there is a nicer way; cereal supports polymorphic serialization via macros, but you would need one macro for each possible type you would serialize---and with templated types, that's kind of infeasible
< abernauer[m]> rcurtin: Going to revisit that open PR I have.
< rcurtin[m]> 👍️ awesome
< zoq> Hm, I guess for the ann part we could get away with arma::mat and arma::fmat.
< zoq> But ideally we can find a better way.
< zoq> There has to be, we can't be the first ones running into this.
< rcurtin[m]> That might work for our examples and models code, but then serialization wouldn't work the moment someone tried to use a different matrix type
< rcurtin[m]> And, agreed, there has to be some solution here
< zoq> I guess good thing that I saved this for last :)
< rcurtin[m]> :)
< rcurtin[m]> there's a chapter on Object Factories I remember from many years ago in Modern C++ Design... I'll read through it and see if it's helpful
< zoq> Fingers crossed I have the book here as well.
< rcurtin[m]> I'm not sure it'll be too helpful... the object factory depends on `Register()` being called for each type that might be created
< rcurtin[m]> but in the context of serialization, I'm not sure we will have that information
< rcurtin[m]> zoq: I thought about it. one way or another, we have to have some mapping from a serialized type ID to that type's constructor; that makes registration of each type we want to use unavoidable
< rcurtin[m]> I might suggest, then, that we include a macro to do this registration in each layer's header file, and then for people who want to keep the set of registered types small, they can include only the base layer and the specific layers that they want to use
< rcurtin[m]> we also need to choose what matrix types to use; by default, I imagine we would want to specify `arma::mat, arma::mat` and `arma::fmat, arma::fmat`, for the input and output layer types, but perhaps this can be controlled by the user setting a macro for types they want to use
< rcurtin[m]> this is the best idea I can think of so far... I'm not sure if it could be made cleaner, but we definitely want to avoid the amount of manual registration a user needs to do
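[editor's note: the per-header registration macro rcurtin describes could look roughly like the sketch below. All names here (`Layer`, `LayerFactory`, `REGISTER_LAYER`, `Linear`) are hypothetical stand-ins, not mlpack's actual classes; the idea is only that including a layer's header registers its constructor under a serialized type ID.]

```cpp
#include <functional>
#include <map>
#include <memory>
#include <string>

// Hypothetical base class standing in for mlpack's layer hierarchy.
struct Layer
{
  virtual ~Layer() { }
  virtual std::string Name() const = 0;
};

// Map from serialized type ID to a constructor for that type.
inline std::map<std::string, std::function<std::unique_ptr<Layer>()>>&
LayerFactory()
{
  static std::map<std::string, std::function<std::unique_ptr<Layer>()>> f;
  return f;
}

// Registration macro meant to live in each layer's header, so that a user
// who includes only the headers they need registers only those types.
#define REGISTER_LAYER(T) \
    static const bool registered_##T = [](){ \
      LayerFactory()[#T] = [](){ return std::unique_ptr<Layer>(new T()); }; \
      return true; \
    }();

struct Linear : Layer
{
  std::string Name() const override { return "Linear"; }
};
REGISTER_LAYER(Linear)

// During deserialization, the stored type ID selects the constructor.
std::unique_ptr<Layer> Construct(const std::string& id)
{
  return LayerFactory().at(id)();
}
```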
< zoq> puh, this makes the whole idea less attractive :(
< zoq> yes, agreed
< rcurtin[m]> yeah, we could also include some header files that include most common layers to make it easier for users
< rcurtin[m]> in some sense, our previous "registration" mechanism was through the `LayerTypes` and `ExtraTypes` types
< rcurtin[m]> it might be possible to retain that type of functionality, and instead of having the registration mechanism be inside of each layer's header file, it could be in the constructor for `FFN`, via iterating over all the types in the "allowed layer types" list
< zoq> true, I guess if we can encapsulate everything into some macro it's a similar interface
< rcurtin[m]> however, I'm not sure if that's any better (and I'm not sure how easy it will be to guarantee compatibility between different saved FFNs with different allowed layer types)
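[editor's note: registering from an "allowed layer types" list, as rcurtin suggests, could be sketched with a C++17 fold expression over the type list; again the names are hypothetical and `typeid(...).name()` is only an implementation-defined stand-in for a real serialized type ID.]

```cpp
#include <functional>
#include <map>
#include <memory>
#include <string>
#include <typeinfo>

struct Layer
{
  virtual ~Layer() { }
};

struct Linear : Layer { };
struct Sigmoid : Layer { };

using Factory = std::map<std::string, std::function<std::unique_ptr<Layer>()>>;

// Register every type in a compile-time list, analogous to the old
// LayerTypes typedef; this could run once in, e.g., the FFN constructor.
template<typename... LayerTypes>
Factory RegisterAll()
{
  Factory f;
  // C++17 fold expression: one registration per type in the list.
  ((f[typeid(LayerTypes).name()] =
      [](){ return std::unique_ptr<Layer>(new LayerTypes()); }), ...);
  return f;
}
```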
< zoq> I guess we do the same thing for the NSModel part?
< rcurtin[m]> for the NSModel, I actually have an explicit list of types already so I am going to try a `switch` statement over types
< rcurtin[m]> the list of types is much smaller too---there are ~15 possible types, so it's not that bad
< zoq> Yeah, a lot less types
< rcurtin[m]> but maybe if what we come up with for the ANN code is better, we can adapt that solution to NSModel and related classes too
< rcurtin[m]> even a big switch statement for the ANN code is not the end of the world---but it gets a little bit hard if each layer is allowed to have its own template parameters
< rcurtin[m]> hmm, but I'm not sure how we can allow custom layers like that
< rcurtin[m]> for NSModel (and similar classes), there is no custom functionality, since that class exists specifically for the `mlpack_knn` and `mlpack_kfn` bindings, which have restricted functionality to the most common cases anyway
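[editor's note: the switch-over-types approach rcurtin mentions for `NSModel` might look like the following simplified sketch; the wrapper classes and enum values here are invented for illustration, since the real class holds ~15 tree types.]

```cpp
#include <memory>
#include <stdexcept>
#include <string>

// Hypothetical stand-ins for the fixed set of tree types NSModel can hold.
struct NSWrapperBase
{
  virtual ~NSWrapperBase() { }
  virtual std::string TreeName() const = 0;
};

struct KDTreeWrapper : NSWrapperBase
{
  std::string TreeName() const override { return "kd-tree"; }
};

struct BallTreeWrapper : NSWrapperBase
{
  std::string TreeName() const override { return "ball tree"; }
};

enum TreeTypes { KD_TREE = 0, BALL_TREE = 1 };

// During deserialization, a stored integer ID picks the concrete type via a
// plain switch; feasible because the list of types is small and fixed.
std::unique_ptr<NSWrapperBase> ConstructFromID(const int treeType)
{
  switch (treeType)
  {
    case KD_TREE:
      return std::unique_ptr<NSWrapperBase>(new KDTreeWrapper());
    case BALL_TREE:
      return std::unique_ptr<NSWrapperBase>(new BallTreeWrapper());
    default:
      throw std::invalid_argument("unknown tree type ID");
  }
}
```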
< zoq> haha, I have a solution, if someone wants to use custom layers they have to use boost::variant :)
< rcurtin[m]> :)
ImQ009 has quit [Quit: Leaving]