rcurtin_irc changed the topic of #mlpack to: mlpack: a scalable machine learning library (https://www.mlpack.org/) -- channel logs: https://libera.irclog.whitequark.org/mlpack -- NOTE: messages sent here might not be seen by bridged users on matrix, gitter, or slack
<_slack_mlpack_U7> I don't really understand what you are saying. do you mean I should set `model.Parameters()` to something else?
<_slack_mlpack_U7> I just tried to use the cifar CNN from the examples repo but always get the error `error: Mat::operator(): index out of bounds`
<_slack_mlpack_U7> Ok, thank you for the quick response. So if I want to try to write a DeepDream program using mlpack I would need to manually define a Gradient with respect to the input or try to use an Optimizer that doesn't need a Gradient, is that correct?
<_slack_mlpack_U7> Yes, I would only need the Gradients with respect to the activations, often the mean of them
<_slack_mlpack_U7> Well, missing a `const` in the argument list for `Evaluate` was a silly mistake on my part.
<_slack_mlpack_U7> * of them, of the layer one chooses as the last. For a proof of concept, one could make this choosable only at compile time.
<_slack_mlpack_U7> Hey there. I recently discovered this library and wondered if it is possible to optimize/compute gradients w.r.t. something other than the weights/parameters of an ANN.
<_slack_mlpack_U7> <zoq[m]1> "I'm somewhat out of the loop..." <- I'm away from my pc at the moment. base code, only slightly modified, found here: https://github.com/mlpack/examples/blob/master/cifar10_cnn/cifar_train.cpp + https://github.com/mlpack/examples/blob/master/cifar10_cnn/periodic_save.hpp
<_slack_mlpack_U7> <swaingotnochill[> "If you are getting this error..." <- trainX: 3072x45000... (full message at https://libera.ems.host/_matrix/media/r0/download/libera.chat/131ce123d1ae669bce08830fc583497c57fbfead)
<_slack_mlpack_U7> (edited) ... std::abs(mean(0));... (full message at https://libera.ems.host/_matrix/media/r0/download/libera.chat/496d7a43c0e3201cddc16cf77f9938fbfdcba845)
<_slack_mlpack_U7> this would be a model I tried to use... (full message at https://libera.ems.host/_matrix/media/r0/download/libera.chat/996b515e8e9eb8367c9588ab5351e8afaa6bf0e1)
<_slack_mlpack_U7> > <@_slack_mlpack_U02DYKQRX3R:matrix.org> trainX: 3072x45000... (full message at https://libera.ems.host/_matrix/media/r0/download/libera.chat/ec1712768d7d421f7082ae04da000a6046895fd4)
<heisenbuugGopiMT> I have a function like this; now when I call the function with an `int` I get a narrowing-conversion error.
<heisenbuugGopiMT> Should I fix the type instead of using `T`?
<heisenbuugGopiMT> `const long double num = 0.234512`
<rcurtin[m]> it seems like that function wouldn't work unless `T` was a floating-point type?
<heisenbuugGopiMT> Yup Yup
<rcurtin[m]> you could force it to compile by doing, e.g., `const T num = T(0.23456512);`, but if `T` is an integral type like `int`, this will result in `num` having a value of `0`
<heisenbuugGopiMT> Yes, then we would get the wrong answer.
<heisenbuugGopiMT> It's for calculating `Digamma`, actually.
<rcurtin[m]> or, if you want it to fail to compile unless `T` is a floating-point type, you could use SFINAE with `std::is_floating_point<T>` or similar
<heisenbuugGopiMT> No, I don't want it to fail, it should execute and it should consider it as `1.0`
<rcurtin[m]> hmm, maybe you should do something like this?
<rcurtin[m]> ```
<rcurtin[m]> const T num = (std::is_floating_point<T>::value) ? T(0.23456512) : T(1.0);
<rcurtin[m]> ```
<rcurtin[m]> I included some extra paranoia with that cast to `T()`
<heisenbuugGopiMT> `const T num = (std::is_floating_point<T>::value) ? T(num) : T(numf);`
<heisenbuugGopiMT> Will this work?
<heisenbuugGopiMT> I meant that if I get any integer I want to convert it to float.
<rcurtin[m]> what is `numf`?
<heisenbuugGopiMT> Using type literals...
<rcurtin[m]> oh, ok
<heisenbuugGopiMT> I think it should be `f.0`?
<heisenbuugGopiMT> Not sure.
<rcurtin[m]> you mean like, e.g., `1.0f`, right?
<heisenbuugGopiMT> Yes.
<rcurtin[m]> if `T` is `int` and you do `T(1.0f)`, you will get an `int` with value `1`, yeah
<rcurtin[m]> but if you do, e.g., `T(0.5f)` and `T` is `int`, then what will happen is that it will be truncated to an integer, so you will end up with an `int` with value `0`
<heisenbuugGopiMT> Oh, I think I should explain the whole situation.
<heisenbuugGopiMT> So to calculate digamma we have some floating-point constants (some of which are less than 1, e.g. 0.2341325).
<heisenbuugGopiMT> But in Boost it works even when we pass an int.
<heisenbuugGopiMT> Now when I pass an int I get a conversion error.
<rcurtin[m]> maybe are they just casting the given `int` into a `float` or `double`?
<heisenbuugGopiMT> They have their own type, in which they are doing something. Should I cast it? If yes, is there a way to do it at compile time?
<rcurtin[m]> I don't think I understand the situation well enough to give good advice... but you can in general convert an `int` type to a floating-point type simply by casting. if the int is reasonably small (less than something like 2^13 or so) then the conversion will be exact
<heisenbuugGopiMT> Okay, they are using `static_cast`
<heisenbuugGopiMT> Maybe now you can have a better idea of what's going on.
<rcurtin[m]> I hope to be able to take a look eventually, but I definitely don't have time to dig deep on it today :(
<heisenbuugGopiMT> Oh, it's okay. It's not a big deal. I already got an idea of what to do, so your help was enough. I pushed the code just in case you wanna see. I will most likely push the replacement for `boost::digamma()` in a few hours; hope it passes all the cases this time.
<heisenbuugGopiMT> I tested some locally and we are getting exactly the same values as Boost.
<rcurtin[m]> awesome! it will definitely be great when we can replace that part of boost
<heisenbuugGopiMT> On to it!!!
<zoq[m]1> jonpsy: https://github.com/vatlab/jupyterlab-sos/issues/59 let's see if someone has an idea.
<zoq[m]1> Btw. switched to https://notebooks.gesis.org/binder/, faster build times and more memory