<khalidanwaar[m]>
jonpsy[m]: Nice ! Congrats! When are you graduating?
<jonpsy[m]>
2022
aakashi2001 has joined #mlpack
aakashi2001 has quit [Client Quit]
<jonpsy[m]>
how do I find IRC chatrooms via element?
<JeffinSam[m]>
Copy paste the whole thing
<zoq[m]1>
<khalidanwaar[m]> "Oh, wait, I had one more . zoq..." <- Hello, if you can rebase against the latest master branch, that would be awesome.
<khalidanwaar[m]>
<zoq[m]1> "Hello, if you can rebase against..." <- Cool! I'll do that.
<jonpsy[m]>
<JeffinSam[m]> "Copy paste the whole thing" <- irc://irc.oftc.net/#osrm, so I'm trying to enter here...
<jonpsy[m]>
I tried copying this in "join public room"...in vain
<JeffinSam[m]>
Oh I thought it was mlpack
<JeffinSam[m]>
But here I guess copy pasting the below will work
<JeffinSam[m]>
irc://irc.oftc.net/#osrm
<JeffinSam[m]>
But not sure, for me the mlpack one worked
<JeffinSam[m]>
Never tried another room
<jonpsy[m]>
not found ://
<NabanitaDash[m]>
I think there is no feature like ``flatten()`` in mlpack. Should I try writing one? It is required for my PR.
<rcurtin[m]>
Nabanita Dash: maybe `arma::vectorise()` is what you are looking for?
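For reference, a minimal sketch of how `arma::vectorise()` acts as a flatten (toy values, not taken from the conversation):

```cpp
#include <armadillo>

int main()
{
  // A small 2x3 matrix to flatten.
  arma::mat X = { { 1.0, 2.0, 3.0 },
                  { 4.0, 5.0, 6.0 } };

  // Default (column-wise) flatten: 1 4 2 5 3 6, as a 6x1 column vector.
  arma::vec flatCol = arma::vectorise(X);

  // Row-wise flatten: 1 2 3 4 5 6, as a 1x6 row vector.
  arma::rowvec flatRow = arma::vectorise(X, 1);

  flatCol.print("column-wise flatten:");
  flatRow.print("row-wise flatten:");
}
```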
<NabanitaDash[m]>
yes, thank you. I am not good with armadillo.
<rcurtin[m]>
no worries! their docs are great, but sometimes it can be hard to discover what all the functions do (unless you read the whole thing start to finish, but that doesn't sound like a very fun time...)
<NabanitaDash[m]>
I guess writing code with armadillo will make me familiar with the base functions.
<NabanitaDash[m]>
The ``model`` is of type ``FFN``, not a matrix, so ``arma::vectorise()`` can't be applied to it directly. I think introducing a ``vectorise`` class in ``methods/ann/layers`` which applies ``arma::vectorise()`` to its inputs could be a solution. This won't be a problem while using the ``add`` layers. Any thoughts?
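A rough sketch of the idea being proposed; this is a hypothetical class and does not follow mlpack's actual ANN layer interface (the name `Vectorise` and the `Forward()` signature are placeholders):

```cpp
#include <armadillo>

// Hypothetical "flatten" layer: its forward pass just applies
// arma::vectorise() to the input.
class Vectorise
{
 public:
  // Flatten the incoming matrix into a single column vector.
  void Forward(const arma::mat& input, arma::mat& output)
  {
    output = arma::vectorise(input);
  }

  // A real layer would also need a backward pass that reshapes the
  // gradient back to the input's original dimensions (omitted here).
};
```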
<rcurtin[m]>
I would be hesitant about adding too much new support to the neural network toolkit; we're in the process of refactoring it entirely anyway
<NabanitaDash[m]>
hmm.. It is not possible to write an example of ``adaptive pooling`` layers without using ``vectorise``.
<NabanitaDash[m]>
Maybe I will just add move and copy constructors for ``mean`` and ``max`` pooling and leave aside ``adaptive`` pooling for now. We can redo it after refactoring?
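For context, a generic sketch of what adding copy and move constructors to a pooling-style class could look like; the class name and members here are hypothetical and not the actual mlpack code:

```cpp
#include <armadillo>
#include <cstddef>
#include <utility>

// Hypothetical pooling-like class, only to illustrate copy/move constructors.
class MaxPooling
{
 public:
  MaxPooling(size_t kernelWidth, size_t kernelHeight) :
      kernelWidth(kernelWidth), kernelHeight(kernelHeight) { }

  // Copy constructor: duplicate the other layer's state.
  MaxPooling(const MaxPooling& other) :
      kernelWidth(other.kernelWidth),
      kernelHeight(other.kernelHeight),
      pooledIndices(other.pooledIndices) { }

  // Move constructor: take over the other layer's buffers.
  MaxPooling(MaxPooling&& other) :
      kernelWidth(other.kernelWidth),
      kernelHeight(other.kernelHeight),
      pooledIndices(std::move(other.pooledIndices))
  {
    other.kernelWidth = 0;
    other.kernelHeight = 0;
  }

 private:
  size_t kernelWidth;
  size_t kernelHeight;
  arma::umat pooledIndices;  // Cached indices from the forward pass.
};
```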
<rcurtin[m]>
yeah, I think that would be fine; after we merge the ann-vtable refactoring, there will probably be lots of layers that have to be adapted in follow-up PRs