ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< RishabhGarg108Gi> Yeah, you are right. I am not exactly sure, but I was thinking of using something like boost::variant. This way we can store different types of parameters in the `params` list, and then use vector<layer<boost::variant>>.
< RishabhGarg108Gi> As I said, I am not exactly sure, so I will try something and see if I get lucky. If you have any better idea, then please share :+1:
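A minimal standalone sketch of the idea being floated here, with hypothetical names (`ParamValue` and the parameter names are illustrative, not mlpack code):

```cpp
#include <string>
#include <utility>
#include <vector>
#include <boost/variant.hpp>

// One variant type covers every parameter type a layer config might need.
typedef boost::variant<int, double, std::string> ParamValue;

int main()
{
  // A layer's parameters as (name, value) pairs of mixed types.
  std::vector<std::pair<std::string, ParamValue>> params;
  params.emplace_back("units", ParamValue(128));
  params.emplace_back("alpha", ParamValue(0.01));
  params.emplace_back("activation", ParamValue(std::string("relu")));

  // A stored value is recovered with boost::get<T>() (throws on mismatch).
  const int units = boost::get<int>(params[0].second);
  return (units == 128) ? 0 : 1;
}
```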
< rcurtin[m]> I know that mlpack hasn't traditionally used inheritance, but since we are trying to move away from boost, maybe it is a better choice than the visitor paradigm here? I just don't want us to get into a situation where we implement new support with boost, but then have to refactor and redesign as we remove more of our boost usage. Open to other ideas too, of course, just a suggestion 👍️
ImQ009 has joined #mlpack
< AakashkaushikGit> @rcurtin I believe I am very close to running an example with inheritance. Only 4 visitors left.
< shrit[m]> @aakash-kaushik great work, very excited to see some benchmarks
< AakashkaushikGit> @shrit me too.
< RishabhGarg108Gi> Hey @zoq, I tried to use boost::variant, and with it I was successful in creating a config file with different types of parameters and layer types. But there is one issue. Cereal serializes a boost::variant as an object of the form
< RishabhGarg108Gi> "name_of_variable" : {"which": <some number>, "value": <value associated with the variable>}. There is no way to change that "which" and "value" to some desired name, so I think we have reached a dead end here.
< RishabhGarg108Gi> > `rcurtin (@ryan:ratml.org)` I just don't want us to get into a situation where we implement new support with boost, but then have to refactor and redesign as we remove more of our boost usage.
< RishabhGarg108Gi> Yeah, you are right. We should think long term so that we don't waste time unnecessarily refactoring the code again and again.
< RishabhGarg108Gi> Today, I discovered that the standard library provides `std::any` and `std::variant` in C++17 and above, which offer functionality similar to `boost::any` and `boost::variant`. So, we can make use of them to reduce our "boost footprint".
< shrit[m]> @rishabhgarg108 mlpack cannot use C++17, not before 2023 I think
< shrit[m]> Because we still support systems that do not have C++17 support
< rcurtin[m]> RishabhGarg108 (Gitter): shrit : exactly, unfortunately we are limited to some older compilers because people use older versions of RHEL, etc.; I think it would probably be easier to just use inheritance here, even though it would be the first place in all of mlpack that we are using inheritance :)
< rcurtin[m]> I used to be "scared" of inheritance, thinking virtual functions would lead us to inefficiency, but over the years I have realized inheritance is fine (especially for functions that aren't deep inside inner loops), and all we need to do is be careful to not use inheritance in places where it actually would be problematic
< rcurtin[m]> Aakash kaushik (Gitter): awesome, looking forward to the results of the experiment!
< zoq> If it's possible to mark a parameter as optional in cereal, we can get around it without inheritance or boost::variant.
< rcurtin[m]> zoq: good point, we can basically do anything we want in serialize() so maybe we can implement it there
< rcurtin[m]> (I'm not sure of the exact details of what needs to be done; I just want to point out that I think it's ok to use inheritance instead of visitors if computational efficiency is not a concern)
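One possible way to do this inside `serialize()`, sketched with a hypothetical `LayerConfig` struct: cereal's JSON input archive throws when a named value is missing, so a try/catch can effectively make a field optional on load.

```cpp
#include <cereal/archives/json.hpp>
#include <cereal/cereal.hpp>

struct LayerConfig
{
  int units = 0;
  double alpha = 0.01;  // Optional; may be absent from the JSON.

  template<typename Archive>
  void serialize(Archive& ar)
  {
    ar(CEREAL_NVP(units));
    // The JSON input archive throws when a named value is missing,
    // so a try/catch effectively marks the parameter as optional.
    try { ar(CEREAL_NVP(alpha)); }
    catch (const cereal::Exception&) { alpha = 0.01; /* keep default */ }
  }
};
```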
< zoq> Like all we need is float, string and int, so we could have a vector like std::vector<std::pair<std::string, parameterTypes>>;
< zoq> parameterTypes is something like struct { std::string stringParam; int intParam; double doubleParam; };
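Sketched out, that suggestion might look like the following (names are illustrative, and only the slot matching the parameter's real type would be meaningful):

```cpp
#include <string>
#include <utility>
#include <vector>
#include <cereal/cereal.hpp>
#include <cereal/types/string.hpp>
#include <cereal/types/utility.hpp>
#include <cereal/types/vector.hpp>

// One slot per supported type; only the slot matching the parameter's
// actual type carries a meaningful value.
struct ParameterTypes
{
  std::string stringParam;
  int intParam = 0;
  double doubleParam = 0.0;

  template<typename Archive>
  void serialize(Archive& ar)
  {
    ar(CEREAL_NVP(stringParam), CEREAL_NVP(intParam), CEREAL_NVP(doubleParam));
  }
};

// The parameter list described above: a name plus the tagged struct.
typedef std::vector<std::pair<std::string, ParameterTypes>> ParameterList;
```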
< zoq> I don't mind using inheritance, whatever works best.
< RishabhGarg108Gi> @shrit @rcurtin, it's sad that we can't use C++17.
< RishabhGarg108Gi> I am not very good with inheritance. I just know the basic definition, that we can inherit a class from some base class. But I don't quite follow how we can use inheritance to make a container that stores arbitrary types. Could you please elaborate a little more? Thanks.
< zoq> RishabhGarg108Gi: I think there are a bunch of resources available that can explain inheritance better than we can.
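For reference, a minimal standalone example of the pattern in question: a heterogeneous container built from a common base class and virtual dispatch. `Linear` and `Dropout` here are hypothetical stand-ins, not mlpack's layer API.

```cpp
#include <iostream>
#include <memory>
#include <vector>

// A common base class gives every layer type one shared interface.
class Layer
{
 public:
  virtual ~Layer() { }
  virtual void Print() const = 0;
};

class Linear : public Layer
{
 public:
  explicit Linear(int units) : units(units) { }
  void Print() const override { std::cout << "Linear(" << units << ")\n"; }
 private:
  int units;
};

class Dropout : public Layer
{
 public:
  explicit Dropout(double ratio) : ratio(ratio) { }
  void Print() const override { std::cout << "Dropout(" << ratio << ")\n"; }
 private:
  double ratio;
};

int main()
{
  // One container holds any mix of layer types via base-class pointers.
  std::vector<std::unique_ptr<Layer>> network;
  network.emplace_back(new Linear(128));
  network.emplace_back(new Dropout(0.5));

  // Virtual dispatch calls the right implementation at runtime.
  for (const auto& layer : network)
    layer->Print();
}
```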
< RishabhGarg108Gi> @zoq, I also tried something like what you described with "parameterTypes", but I wasn't quite successful. It made the code look too clumsy to me (or maybe I was doing it the wrong way).
< RishabhGarg108Gi> So, I guess I will read up on the inheritance approach, try the other method you suggested again, and see if I am able to do it.
< AyushSingh[m]> After adding a loss function, how should I test whether it is working fine? By building the whole mlpack directory?
< rcurtin[m]> Ayush Singh: you should implement a test for it in `src/mlpack/tests/`, and then build and run the test to make sure it passes :)
< AyushSingh[m]> Okay, thanks, got it.
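A sketch of what such a test might look like, modeled on the existing cases in `src/mlpack/tests/loss_functions_test.cpp`; `MyLoss` and the expected value are placeholders for the real loss function and a hand-computed result.

```cpp
#include <mlpack/core.hpp>
#include "catch.hpp"

// `MyLoss` is a placeholder for the newly added loss function; follow
// the existing cases in loss_functions_test.cpp for the real pattern.
TEST_CASE("MyLossSimpleTest", "[LossFunctionsTest]")
{
  arma::mat prediction = arma::zeros<arma::mat>(1, 3);
  arma::mat target = arma::ones<arma::mat>(1, 3);

  MyLoss<> loss;  // Hypothetical class name.
  const double error = loss.Forward(prediction, target);

  // Expected value computed by hand from the loss formula.
  REQUIRE(error == Approx(1.0).epsilon(1e-5));
}
```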
qur70 has joined #mlpack
< NippunSharmaGitt> The size of the input must be divisible by the number of input channels, like we are doing in [batch_norm_impl](https://github.com/mlpack/mlpack/blob/0b924c09ea3b52ede50ae8c6e5d1079ca4cfd067/src/mlpack/methods/ann/layer/batch_norm_impl.hpp#L82). Should this check be added to these files?
< NippunSharmaGitt> Hi, is my observation correct, or should it be left as is?
qur70 has quit [K-Lined]
ImQ009 has quit [Quit: Leaving]