rcurtin_irc changed the topic of #mlpack to: mlpack: a scalable machine learning library (https://www.mlpack.org/) -- channel logs: https://libera.irclog.whitequark.org/mlpack -- NOTE: messages sent here might not be seen by bridged users on matrix, gitter, or slack
aakashi2001 has joined #mlpack
aakashi2001 has quit [Changing host]
aakashi2001 has joined #mlpack
aakashi2001 has quit [Remote host closed the connection]
_whitelogger has joined #mlpack
texasmusicinstru has joined #mlpack
<ShubhamAgrawal[m]> > <@ryan:ratml.org> Shubham Agrawal: sorry it took me so long to answer your questions! Here are some thoughts, I hope they are helpful:... (full message at https://libera.ems.host/_matrix/media/r0/download/libera.chat/b188574b594cece30af0555c5b9f1cd58021617e)
<ShubhamAgrawal[m]> This will override the initialisation for every layer. 😓
<ShubhamAgrawal[m]> Couldn't we add some method that lets us pick which layers to initialise one way, and call a different initialisation method for the other layers?
<jonpsy[m]> Started learning JavaScript... and I thought C++ was hard 🤦‍♂️
CaCode has joined #mlpack
<rcurtin[m]> Shubham Agrawal: perhaps we have to adapt the initialization strategies to something more like the `Initialize()` function from https://github.com/mlpack/mlpack/blob/master/src/mlpack/methods/ann/init_rules/network_init.hpp ?
<rcurtin[m]> then you could write a custom initialization that checks each layer's type and changes its behavior accordingly
texasmusicinstru has quit [Ping timeout: 256 seconds]
CaCode has quit [Quit: Leaving]