rcurtin_irc changed the topic of #mlpack to: mlpack: a scalable machine learning library (https://www.mlpack.org/) -- channel logs: https://libera.irclog.whitequark.org/mlpack -- NOTE: messages sent here might not be seen by bridged users on matrix, gitter, or slack
<ShubhamAgrawal[m> <jeffin143> "https://github.com/mlpack/mlpack..."; <- I think this approach will not work, since it initializes the weights of all layers.
<ShubhamAgrawal[m> I want an approach that works for one layer only.
AhmedAbdelatty has joined #mlpack
<AnwaarKhalid[m]> I think you should be able to access a particular layer's (say the i-th layer's) parameters like:
<AnwaarKhalid[m]> `model.Network()[i]->Parameters()`
<ShubhamAgrawal[m> > <@khalidanwaar:matrix.org> I think you should be able to access a particular layer's (say the i-th layer's) parameters like:
<ShubhamAgrawal[m> > `model.Network()[i]->Parameters()`
<ShubhamAgrawal[m> No, I actually want to initialize the weights inside the layer itself.
<AnwaarKhalid[m]> what do you mean by "inside layer"?
AhmedAbdelatty has quit [Ping timeout: 276 seconds]
<ShubhamAgrawal[m> <AnwaarKhalid[m]> "what do you mean by "inside..." <- I mean inside the BatchNorm layer code.
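A minimal sketch of the approach AnwaarKhalid[m] suggests, assuming an mlpack 4-style `FFN` whose parameter matrix has already been allocated (e.g., after training or an explicit reset); `SetLayerWeights` is a hypothetical helper, not part of mlpack:

```cpp
#include <mlpack.hpp>

using namespace mlpack;

// Overwrite the parameters of the i-th layer of `model` with `value`.
// Network() exposes the layer pointers; Parameters() is that layer's
// weight matrix, which is filled in place.
void SetLayerWeights(FFN<>& model, const size_t i, const double value)
{
  model.Network()[i]->Parameters().fill(value);
}
```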
AkashKumbar[m] has quit [Quit: You have been kicked for being idle]
kuries has quit [Quit: You have been kicked for being idle]
_slack_mlpack_U7 has quit [Quit: You have been kicked for being idle]
_slack_mlpack_U0 has quit [Quit: You have been kicked for being idle]
rcurtin_matrixor has quit [Quit: You have been kicked for being idle]
_slack_mlpack_13 has quit [Quit: You have been kicked for being idle]
<_slack_mlpack_22> Hi everyone, I am Phan Nhat Hoang (John Hoang), a Vietnamese student who is currently studying in Singapore.
<_slack_mlpack_22> I am excited to work on this project with your help. Glad to meet everyone!
<_slack_mlpack_22> A day ago, Google announced that my project "Enhance CMA-ES from existing implementation" was accepted. This is a surprise to me since there were so many promising projects that applied to mlpack.
_slack_mlpack_22 is now known as NhtHongPhan[m]
kuries has joined #mlpack
AhmedAbdelatty has joined #mlpack
AhmedAbdelatty_ has joined #mlpack
AhmedAbdelatty has quit [Read error: Connection reset by peer]
AhmedAbdelatty_ has quit [Quit: Leaving]
<rcurtin[m]> Shubham Agrawal: sorry it took me so long to answer your questions! Here are some thoughts, I hope they are helpful:
<rcurtin[m]> * for initializing with 1 instead of 0, use the `ConstInitialization` class: https://github.com/mlpack/mlpack/blob/master/src/mlpack/methods/ann/init_rules/const_init.hpp ; the constructor takes a parameter, which is the value to set the initial weights to (see the first sketch after this list).
<rcurtin[m]> * `Archive::is_loading::value` is a compile-time check for cereal to see, when we are calling `serialize()`, whether we are loading or saving the instance of the given class. When loading, it is often necessary to e.g. free memory from whatever the class is currently holding (see the second sketch after this list).
<rcurtin[m]> * every layer should call its parent constructor in the initialization list of its constructor, so if you see adapted layers where that is not the case, that is a bug and we should fix it :) (see the third sketch after this list)
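A sketch of the first point, assuming the mlpack 4 single-header layout; passing `ConstInitialization(1.0)` makes every initial weight 1 instead of 0 (the output layer type here is just the default):

```cpp
#include <mlpack.hpp>

using namespace mlpack;

// An FFN whose weights are all initialized to 1.0 rather than randomly.
FFN<NegativeLogLikelihood, ConstInitialization> model(
    NegativeLogLikelihood(), ConstInitialization(1.0));
```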
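For the second point, a hedged sketch of the `serialize()` pattern described; `buffer` and `size` are illustrative members, not real mlpack code:

```cpp
template<typename Archive>
void serialize(Archive& ar, const uint32_t /* version */)
{
  // Archive::is_loading::value is true at compile time when this call
  // is deserializing; free whatever the object currently holds before
  // it is overwritten with the loaded state.
  if (Archive::is_loading::value)
  {
    delete[] buffer;
    buffer = nullptr;
  }

  ar(CEREAL_NVP(size));
  // ... load or save the remaining members ...
}
```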
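And for the third point, a sketch of a layer calling its parent constructor in its initialization list; the class is hypothetical but follows mlpack's ANN layer convention:

```cpp
#include <mlpack.hpp>

template<typename MatType = arma::mat>
class MyCustomLayer : public mlpack::Layer<MatType>
{
 public:
  // The parent constructor runs first in the initialization list.
  MyCustomLayer() : mlpack::Layer<MatType>() { }

  // The same rule applies to copy construction: forgetting to forward
  // `other` to the parent here is the bug pattern mentioned above.
  MyCustomLayer(const MyCustomLayer& other) :
      mlpack::Layer<MatType>(other) { }
};
```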