rcurtin_irc changed the topic of #mlpack to: mlpack: a scalable machine learning library (https://www.mlpack.org/) -- channel logs: https://libera.irclog.whitequark.org/mlpack -- NOTE: messages sent here might not be seen by bridged users on matrix, gitter, or slack
fieryblade[m] has quit [Quit: You have been kicked for being idle]
_slack_mlpack_25 has quit [Quit: You have been kicked for being idle]
<GopiMTatiraju[m]> I read that NVIDIA went open source with its GPU drivers, some parts at least...
<GopiMTatiraju[m]> Haven't read the whole article but I guess it's a start...
<rcurtin[m]> Anwaar Khalid: sorry for the slow response! I think the first thing to do is define how the `Concat` layer is supposed to work. I never totally decided this and I think this is the reason that the current sort-of-adapted implementation doesn't completely make sense
<rcurtin[m]> right now, the `Concat` layer seems to want to hold a set of layers itself, and so when you call `Forward()`, it doesn't concatenate the input---what it actually does is run that input through each held layer independently, and then concatenate the outputs. so in that sense, the name `Concat` is somewhat inaccurate
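[a minimal sketch of the behavior rcurtin describes above, using plain Armadillo with `std::function` stand-ins for the held layers; the names here are illustrative, not mlpack API]

```cpp
#include <armadillo>
#include <functional>
#include <vector>

int main()
{
  // Two stand-in "held layers": each maps an input column to an output column.
  std::vector<std::function<arma::vec(const arma::vec&)>> heldLayers = {
    [](const arma::vec& x) { return arma::vec(2.0 * x); },
    [](const arma::vec& x) { return arma::vec(x + 1.0); }
  };

  arma::vec input = {1.0, 2.0, 3.0};

  // What the current Concat's Forward() effectively does: run the *same*
  // input through each held layer, then concatenate the outputs.
  arma::vec output;
  for (const auto& layer : heldLayers)
    output = arma::join_cols(output, layer(input));

  output.print("concatenated output:");
}
```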
<rcurtin[m]> when the DAG-network idea came up, it became clear to me that what might actually be necessary is a *simpler* `Concat` layer that accepts an arbitrary-shape input (as specified with `InputDimensions()`), and then collapses it by concatenating everything along one particular axis
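[a sketch of what that simpler `Concat`'s `Forward()` could look like, assuming the input arrives flattened and `InputDimensions()` supplies its logical shape; `ConcatForward` and its parameters are hypothetical, not existing mlpack code]

```cpp
#include <armadillo>

// Concatenate the slices of a (rows x cols x slices) input along the column
// axis, collapsing the slice axis: the output shape is rows x (cols * slices).
arma::mat ConcatForward(const arma::vec& input,
                        const size_t rows,
                        const size_t cols,
                        const size_t slices)
{
  // Reinterpret the flattened input with its logical dimensions (copies).
  const arma::cube in(input.memptr(), rows, cols, slices);

  arma::mat output(rows, cols * slices);
  for (size_t s = 0; s < slices; ++s)
    output.cols(s * cols, (s + 1) * cols - 1) = in.slice(s);
  return output;
}

int main()
{
  arma::vec input = arma::regspace(1.0, 12.0);  // logical shape 2 x 3 x 2.
  ConcatForward(input, 2, 3, 2).print("concatenated:");
}
```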
<rcurtin[m]> that way, any DAG-network we implement could use this "simpler" `Concat` layer; or, maybe, it makes more sense to simply inline the concatenation implementation into the DAG-network's `Forward()` and `Backward()` methods, meaning that there is no reason to preserve the `Concat` layer
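[a rough sketch of the "inlined" alternative: the DAG-network itself joins its parents' outputs in `Forward()` and splits the delta back apart in `Backward()`, so no standalone `Concat` layer is needed. plain Armadillo with hypothetical helper names, not mlpack API]

```cpp
#include <armadillo>
#include <vector>

// Forward pass: stack each parent layer's output into one column vector.
arma::vec ConcatParents(const std::vector<arma::vec>& parentOutputs)
{
  arma::vec joined;
  for (const arma::vec& out : parentOutputs)
    joined = arma::join_cols(joined, out);
  return joined;
}

// Backward pass: concatenation is linear, so the delta is simply sliced back
// into one piece per parent, using the same sizes as in the forward pass.
std::vector<arma::vec> SplitDelta(const arma::vec& delta,
                                  const std::vector<arma::vec>& parentOutputs)
{
  std::vector<arma::vec> deltas;
  size_t offset = 0;
  for (const arma::vec& out : parentOutputs)
  {
    deltas.push_back(delta.subvec(offset, offset + out.n_elem - 1));
    offset += out.n_elem;
  }
  return deltas;
}

int main()
{
  std::vector<arma::vec> outs = { {1.0, 2.0}, {3.0, 4.0, 5.0} };
  arma::vec joined = ConcatParents(outs);
  joined.print("forward (joined):");
  SplitDelta(joined, outs)[1].print("backward slice for parent 1:");
}
```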
<rcurtin[m]> so... there are lots of options... honestly, I might lean towards the last one (no `Concat` layer that the user can use; the concatenation operation lives entirely in the DAG-network class)
<rcurtin[m]> well... hmm... I'm not sure of that. maybe the simpler `Concat` layer makes sense to use in a regular FFN too? hmm, so maybe that is the better idea... you can see there are lots of tradeoffs and advantages and disadvantages of each approach 😄
<rcurtin[m]> one thing I think I can say for sure is that the current `Concat` layer that is based on `MultiLayer` is probably not the right way to go
<GopiMTatiraju[m]> shrit: I did a fresh installation of Ubuntu, and then added some ssh keys...
<GopiMTatiraju[m]> When I switched to i3, these keys stopped working; any idea why?