rcurtin_irc changed the topic of #mlpack to: mlpack: a scalable machine learning library (https://www.mlpack.org/) -- channel logs: https://libera.irclog.whitequark.org/mlpack -- NOTE: messages sent here might not be seen by bridged users on matrix, gitter, or slack
<ShubhamAgrawal[m> `error: 'class mlpack::ann::Layer<arma::Mat<double> >' has no member named 'Network'`
<ShubhamAgrawal[m> If I use AddMerge inside AddMerge then I won't be able to access Network()
<ShubhamAgrawal[m> Should I implement blank implementation of Network in `layer.hpp`
<ShubhamAgrawal[m> ?
<rcurtin[m]> can you give more context about the issue?
<ShubhamAgrawal[m> See this testcase
<ShubhamAgrawal[m> `r.Network()[1]->Network()[0]->Parameters().fill(-1.0);` This line
<rcurtin[m]> Try casting the layer to a MultiLayer, then you will have the `Network()` function available
<ShubhamAgrawal[m> <ShubhamAgrawal[m> "`r.Network()[1]->Network()[0]->..." <- Can I do it in this line only?
<ShubhamAgrawal[m> Or do I need to use extra variable?
<rcurtin[m]> Yeah, it will be a long line but you can do it
<ShubhamAgrawal[m> rcurtin[m]: Can you tell me 😅
<ShubhamAgrawal[m> > <@shubhamag:matrix.org> And which is correct... (full message at https://libera.ems.host/_matrix/media/r0/download/libera.chat/ca35724763b6c4e38714fa2b82a67de4907ca9ae)
<rcurtin[m]> You want the reference version; you don't want to make a copy
<ShubhamAgrawal[m> rcurtin[m]: Ok then there is bug in multi_layer file
<ShubhamAgrawal[m> I have fixed it
<rcurtin[m]> 👍️ thank you for finding that!
<rcurtin[m]> the cast will be something like this:
<rcurtin[m]> ((MultiLayer<>*) r.Network()[1])->Network()[0]->Parameters().fill(-1.0);
<rcurtin[m]> or the two-line version:
<rcurtin[m]> MultiLayer<>* child = ((MultiLayer<>*) r.Network()[1]);
<rcurtin[m]> child->Network()[0]->Parameters().fill(-1.0);
<ShubhamAgrawal[m> Thanks
<ShubhamAgrawal[m> Got it
<ShubhamAgrawal[m> Can you tell me when `SetWeights()` is called?
<ShubhamAgrawal[m> During initialization only?
<rcurtin[m]> whenever `FFN::parameters` changes size; let me find the call to it
<rcurtin[m]> that's in `CheckNetwork()`, which is called whenever `Forward()`, `Backward()`, `Predict()`, or `Train()` (and maybe a few other functions) are called
<rcurtin[m]> `layerMemoryIsSet` will be false whenever `inputDimensions` may have changed from the last call to `CheckNetwork()` (and it is false when the `FFN` object is created)
<rcurtin[m]> I hope that helps, let me know if I answered the question well
<ShubhamAgrawal[m> What about loading the model?
<rcurtin[m]> yeah, serialization will set `layerMemoryIsSet` to false too
<ShubhamAgrawal[m> rcurtin[m]: I am confused
<ShubhamAgrawal[m> How do I save the weights in FFN?
<rcurtin[m]> I don't know what you mean? you can just use `serialize()`
<rcurtin[m]> (I am in a meeting, responses may be slow from here on out)
<ShubhamAgrawal[m> rcurtin[m]: ok