<IWNMWEIWNMWE[m]>
Hey guys, I would just like to know: if a dataset had null/NaN values in its columns, how would they be represented in an Armadillo matrix? I would like to find them while iterating.
<IWNMWEIWNMWE[m]>
Also, I would like to get some guidance: in a case where I plan to implement another algorithm for an already-implemented method, where the only difference is the Train method of that class, can I directly move to SFINAE, where I can make the class take a string as a template parameter which can then... (full message at <https://libera.ems.host/_matrix/media/v3/download/libera.chat/f4ef10791d0cb986640c2f24aa21d86e34c20c71>)
<vaibhavp[m]>
> <@iwnmwe-63e50ce56da0373984be2ca3:gitter.im> Also i would to get some guidance in a case where I plan to implement another algorithm for an already implemented method where the only difference is the train method of that class then can I directly move to SFINAE where i can make the class such that it takes a... (full message at <https://libera.ems.host/_matrix/media/v3/download/libera.chat/4ff5f8f922e22bbe5296eb2a4c93b5e91fa949ad>)
<vaibhavp[m]>
Also, the specialized Train methods will be named Train1 or Train2, or something better.
<IWNMWEIWNMWE[m]>
Hmmm, makes sense; it would also reduce the effort of writing a wrapper class for accepting string literals.
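A minimal sketch of the tag-dispatch idea discussed above, assuming hypothetical names (SAMME, SAMMER, and AdaBoostSketch are illustrative, not mlpack's actual API): the algorithm is selected with a type tag rather than a string literal, so overload resolution picks the right TrainImpl() at compile time and no wrapper class for string literals is needed.

```cpp
#include <armadillo>

struct SAMME  {};  // tag type selecting one training algorithm
struct SAMMER {};  // tag type selecting another

template<typename AlgorithmTag = SAMME>
class AdaBoostSketch
{
 public:
  void Train(const arma::mat& data, const arma::Row<arma::uword>& labels)
  {
    // Dispatch at compile time based on the tag type.
    TrainImpl(data, labels, AlgorithmTag());
  }

 private:
  void TrainImpl(const arma::mat&, const arma::Row<arma::uword>&, SAMME)
  {
    // ... SAMME-style boosting updates would go here ...
  }

  void TrainImpl(const arma::mat&, const arma::Row<arma::uword>&, SAMMER)
  {
    // ... SAMME.R-style boosting updates would go here ...
  }
};

int main()
{
  arma::mat data(3, 10, arma::fill::randu);
  arma::Row<arma::uword> labels(10, arma::fill::zeros);

  AdaBoostSketch<SAMMER> model;  // pick the SAMME.R variant at compile time
  model.Train(data, labels);
}
```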
<vaibhavp[m]>
<IWNMWEIWNMWE[m]> "Hey guys I would just like to..." <- For this, if you want to replace nan or inf values there is [replace](https://arma.sourceforge.net/docs.html#replace) in which you can pass datum::nan(look at the example in the documentation) and if you want to find nan or inf values you can use [find_nonfinite](https://arma.sourceforge.net/docs.html#find_nonfinite). There are also few other functions in the documenation.
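A minimal sketch of the answer above: NaN values in an Armadillo matrix are ordinary IEEE-754 NaNs, so find_nonfinite() locates them and replace() can substitute them.

```cpp
#include <armadillo>

int main()
{
  arma::mat X = { { 1.0, arma::datum::nan },
                  { 2.0, 3.0              } };

  // Linear indices of all NaN/Inf elements.
  arma::uvec bad = arma::find_nonfinite(X);
  bad.print("non-finite element indices:");

  // Replace every NaN with 0 (e.g., as a simple imputation).
  X.replace(arma::datum::nan, 0.0);
  X.print("cleaned matrix:");

  return 0;
}
```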
<IWNMWEIWNMWE[m]>
Is there any example of such a case in mlpack, where there are multiple training algorithms for the same method?
<vaibhavp[m]>
I am not sure, but can you elaborate on what you are trying to do?
<IWNMWEIWNMWE[m]>
Ya, so I am trying to add other methods of training the AdaBoost method (SAMME and SAMME.R) and would like to overload the Train function such that the user can mention which one he/she wants to use (including the MH algorithm currently available), similar to the scikit-learn implementation.
<vaibhavp[m]>
IWNMWE (IWNMWE): I am not aware of any examples that fit, but considering my limited knowledge of mlpack, maybe someone else knows of such a method.
<IWNMWEIWNMWE[m]>
<vaibhavp[m]> "IWNMWE (IWNMWE): I am not..." <- thanks for helping anyways bro
<vaibhavp[m]>
rcurtin, zoq Hey! I was thinking a little deeper about the DAG network, and I was brainstorming about all the cases where my implementation would work (or not). One of the problems that came to mind is RNNs. The solution I could think of, which would handle both the FNN and RNN cases, was exposing the Forward method so that it could be overloaded after inheriting the DAGNetwork (maybe a better name?) class, such that users could pass the input data appropriately and return the loss (maybe only the output?), and the rest of the computation (backward/gradient) could be done by the DAGNetwork class. This solution would be vastly different from the current approach of the ANN module, so I thought I should get your opinion on it. So, does it make sense to lean towards this approach? Please share your thoughts. Thank you!
<vaibhavp[m]>
Something like the PyTorch approach.
<vaibhavp[m]>
<vaibhavp[m]> "Something like the pytorch..." <- So instead of the current process of creating the topology then calling Train. We would first create the topology, then specify how the forward pass looks which could perform the forward pass multiple times with different input data and optionally storing the output. And return the output. Now the DAG Network would calculate loss and perform the Backward and Gradient passes.
<vaibhavp[m]>
This could be made backward compatible.
<vaibhavp[m]>
This would increase the flexibility of mlpack (which is its tagline).
<vaibhavp[m]>
What do you guys think of this idea?
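A minimal sketch of the proposed design, with hypothetical names (DAGNetwork, Forward, and Train here are illustrative, not mlpack code): users inherit from the DAG class and override Forward() to describe how data flows through the graph, possibly running the traversal several times as an RNN would, while the base class owns the loss, Backward, and Gradient machinery.

```cpp
#include <armadillo>
#include <iostream>

class DAGNetwork
{
 public:
  virtual ~DAGNetwork() {}

  // User-defined: traverse the graph and return the network output.
  virtual arma::mat Forward(const arma::mat& input) = 0;

  double Train(const arma::mat& data, const arma::mat& targets)
  {
    const arma::mat output = Forward(data);
    // The base class would compute the loss here, then run the
    // Backward and Gradient passes over the stored topology.
    return arma::accu(arma::square(output - targets));  // placeholder loss
  }
};

class MyNetwork : public DAGNetwork
{
 public:
  arma::mat Forward(const arma::mat& input) override
  {
    // e.g., run the graph once per time step for an RNN-style model.
    return input;  // identity placeholder
  }
};

int main()
{
  MyNetwork net;
  arma::mat x(4, 5, arma::fill::randu);
  const double loss = net.Train(x, x);  // placeholder: identity target
  std::cout << "loss: " << loss << std::endl;
}
```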