ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< OleksandrNikolsk> I was able to reproduce this in PyTorch: https://colab.research.google.com/drive/1uAIBHYyUBsPcV7PQPGMHCn09ANmL6-dX?usp=sharing
< OleksandrNikolsk> But I am not sure how to reproduce the result of `delta`; is it `input.grad`, as in the notebook? Please confirm.
< ABHINAVANAND[m]> Seems like the test cases of the pooling function are not correct. I will take a look.
< jonpsy[m]> Hey Oleksandr Nikolskyy, it looks like your last message got swallowed inside the code block.
< jonpsy[m]> Try pastebin to share errors (it has syntax highlighting too) ;)
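For context on the `delta` question: in mlpack's layer API, `delta` is what `Backward()` writes into its last argument, i.e. the gradient with respect to the layer's input, which is the quantity `input.grad` holds in the PyTorch notebook. A minimal sketch, assuming the mlpack 3.x layer interface (`Forward(input, output)`, `Backward(input, gy, g)`, and the `InputHeight()`/`InputWidth()` setters); treat the exact signatures as assumptions:

    #include <mlpack/methods/ann/layer/max_pooling.hpp>

    using namespace mlpack::ann;

    int main()
    {
      // A 4x4 single-channel input, flattened column-wise into one column,
      // as mlpack's vision layers expect.
      arma::mat input(16, 1, arma::fill::randu);

      // kernelWidth, kernelHeight, strideWidth, strideHeight (mlpack 3.x order).
      MaxPooling<> pool(2, 2, 2, 2);
      pool.InputHeight() = 4;
      pool.InputWidth() = 4;

      arma::mat output;
      pool.Forward(input, output);

      // Backward() writes the gradient w.r.t. the layer's input into `delta`,
      // the analogue of input.grad in torch after backpropagating an upstream
      // gradient gy of ones (as .backward() on a sum would produce).
      arma::mat gy(output.n_rows, output.n_cols, arma::fill::ones), delta;
      pool.Backward(input, gy, delta);

      delta.print("delta");
      return 0;
    }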
< ABHINAVANAND[m]> I was going through the code of max pooling. There I saw `kernelHeight` and `kernelWidth`. `kernelHeight` means the number of rows in the kernel and `kernelWidth` the number of columns, right?
< zoq> ABHINAVANAND[m]: correct.
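In Armadillo (which mlpack builds on), rows are the height dimension and columns the width. A standalone sketch of non-overlapping max pooling under that convention, just to illustrate the naming; this is not mlpack's actual implementation:

    #include <armadillo>

    int main()
    {
      // A 4x6 input: 4 rows (height) by 6 columns (width).
      arma::mat input(4, 6, arma::fill::randu);

      const arma::uword kernelHeight = 2; // rows covered by the kernel
      const arma::uword kernelWidth = 3;  // columns covered by the kernel

      // Each output cell is the max over one kernelHeight x kernelWidth tile.
      arma::mat output(input.n_rows / kernelHeight, input.n_cols / kernelWidth);
      for (arma::uword i = 0; i < output.n_rows; ++i)
      {
        for (arma::uword j = 0; j < output.n_cols; ++j)
        {
          output(i, j) = input.submat(i * kernelHeight, j * kernelWidth,
                                      (i + 1) * kernelHeight - 1,
                                      (j + 1) * kernelWidth - 1).max();
        }
      }

      output.print("pooled (2x2 output)");
      return 0;
    }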
< ABHINAVANAND[m]> <zoq "ABHINAV ANAND: correct."> zoq: If this is correct, then the pooling and unpooling operations of max pooling seem to be wrong.
< zoq> ABHINAVANAND[m]: Happy to take a look at the open PR, once I have a chance.
< ABHINAVANAND[m]> <zoq "ABHINAV ANAND: Happy to take a l"> Okay, I will let you know when the PR is complete.
< zoq> ABHINAVANAND[m]: Super, thanks!
< zoq> OleksandrNikolsk: Nice that you solved the issue.
< zoq> OleksandrNikolsk: At some point I also wrote a python server - https://gist.github.com/zoq/7c1f8d29e7ec8229b67acdaa608e265a
< zoq> OleksandrNikolsk: So in case you prefer Python over Elixir, I think you can use that one as well.
< OleksandrNikolsk> Cool, I'll consider changing it later; for now I'm just happy to have fixed everything and to be able to experiment a little bit.
< zoq> OleksandrNikolsk: You will run into the same problem; you might want to decrease the number of parameters, or split the model up into multiple parts.
< OleksandrNikolsk> <zoq "Oleksandr Nikolskyy: You will ru"> Hm, I'll try to reduce the number of parameters.
< jeffin143[m]> Found this as a comment in one of the issues:
< jeffin143[m]> "Also, I build mlpack regularly (which I kind of think of as a stress test for RAM and processor :))"
< jeffin143[m]> 😂😂😂
< jeffin143[m]> Stress testing
< RishabhGargGitte> Actually, I didn't mean anything 😅. It was just a pun.
< RishabhGargGitte> Hope I didn't offend the community. :)
< rcurtin[m]> Rishabh Garg (Gitter): I can't figure out what you're referring to, but I'm always up for a good pun 😃 Maybe some messages didn't make it to the Matrix room?
< ShahAnwaarKhalid> Hi zoq! I was working on the dual optimizer for WGAN-GP. I have a doubt about how the gradient penalty is currently implemented.
< ShahAnwaarKhalid> 1. Epsilon is supposed to be a learned parameter (in torch you'd do something like `torch.rand(len(real), 1, 1, 1, device=device, requires_grad=True)`). Is there an mlpack equivalent of `requires_grad`? (See the sketch after these two points.)
< ShahAnwaarKhalid> 2. I'm not sure how the gradient of this regularization term will be computed:
< ShahAnwaarKhalid> `res += lambda * std::pow(arma::norm(normGradientDiscriminator, 2) - 1, 2);`
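For reference, in the WGAN-GP paper's recipe epsilon is sampled uniformly per example rather than learned; the `requires_grad` in the torch snippet applies to the interpolated input, not to epsilon itself. mlpack/Armadillo has no autograd tape, so the gradient of the discriminator at the interpolated point has to come from an explicit backward pass. A minimal Armadillo sketch of the interpolation and the penalty term quoted above, with placeholder data (`gradAtXHat` here is random, standing in for a real `Backward()` result):

    #include <armadillo>
    #include <cmath>
    #include <iostream>

    int main()
    {
      const arma::uword dim = 784, batch = 16;
      arma::mat real(dim, batch, arma::fill::randu); // placeholder real batch
      arma::mat fake(dim, batch, arma::fill::randu); // placeholder generated batch

      // Epsilon is drawn uniformly per example; nothing here is learned and
      // there is no requires_grad flag.
      arma::rowvec eps(batch, arma::fill::randu);
      arma::rowvec oneMinusEps = 1.0 - eps;

      // Interpolated samples: xHat = eps * real + (1 - eps) * fake, column-wise.
      arma::mat xHat = real;
      xHat.each_row() %= eps;
      arma::mat fakePart = fake;
      fakePart.each_row() %= oneMinusEps;
      xHat += fakePart;

      // gradAtXHat stands in for dD/dxHat of one example, which in mlpack
      // would come from an explicit Backward() call on the discriminator.
      arma::vec gradAtXHat(dim, arma::fill::randu);
      const double lambda = 10.0;
      const double penalty =
          lambda * std::pow(arma::norm(gradAtXHat, 2) - 1.0, 2);

      std::cout << "gradient penalty: " << penalty << std::endl;
      return 0;
    }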