ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< RishabhGarg108Gi> @Aakash-kaushik thanks :)
< AakashkaushikGit> Just reread my last message and it definitely looks like I was sleepy. Anyway, happy to help :D
< turska79Gitter[m> About the out-of-heap thing again. Just to clarify, I didn't build mlpack myself; I was just using it via vcpkg. I got out-of-heap errors while building my own project, which was using mlpack. The 32-bit build tools did work with clang, though, just not with MSVC.
< turska79Gitter[m> And I did modify the cartpole example quite a lot. I was using it as a first-time reference for how to use mlpack.
< AakashkaushikGit> Just got my mlpack stickers, thanks @zoq and everyone.
< RishabhGarg108Gi> I have a very silly doubt. For every layer in mlpack, let's say a layer x, we have its corresponding `x.hpp` and `x_impl.hpp`. In `x.hpp` we include `x_impl.hpp`, and inside `x_impl.hpp` we include `x.hpp`. So shouldn't that cause an endless loop? It's like a circle, and the compiler should keep including their contents again and again.
ImQ009 has joined #mlpack
yuvraj_2701[m] has quit [Quit: Idle for 30+ days]
< AakashkaushikGit> @RishabhGarg108 we have include guards for that. The line that says `#ifndef` checks whether that file has already been included, and if so the whole file is skipped.
< RishabhGarg108Gi> I see. That's why those `#ifndef`s are there. Thanks :D
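A minimal sketch of the include-guard pattern described above, using hypothetical `x.hpp`/`x_impl.hpp` headers rather than any actual mlpack files:

```cpp
// x.hpp -- declaration header, following the x.hpp / x_impl.hpp layout.
#ifndef EXAMPLE_X_HPP
#define EXAMPLE_X_HPP  // First inclusion defines the guard macro.

class X
{
 public:
  int Value() const;
};

// Pull the implementation in at the end of the declaration header.
#include "x_impl.hpp"

#endif  // EXAMPLE_X_HPP
```

```cpp
// x_impl.hpp -- implementation header.
#ifndef EXAMPLE_X_IMPL_HPP
#define EXAMPLE_X_IMPL_HPP

// Including x.hpp again is a no-op: EXAMPLE_X_HPP is already defined, so the
// preprocessor skips the whole file and the apparent circle terminates.
#include "x.hpp"

inline int X::Value() const { return 42; }

#endif  // EXAMPLE_X_IMPL_HPP
```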
< turska79Gitter[m> The out-of-heap errors started happening to me a few Visual Studio updates ago, and now it chomps about 10 GB of RAM while compiling. It seems to crap out with template-heavy code.
< turska79Gitter[m> When I compile with clang it takes only 3-4 GB of RAM at peak.
< shrit[m]> @zoq, I am getting a matrix multiplication error when adding a Dropout layer to the neural network architecture in the SAC algorithm
< shrit[m]> is this normal?
ib07 has quit [Ping timeout: 256 seconds]
ib07 has joined #mlpack
ib07 has quit [Ping timeout: 240 seconds]
ib07 has joined #mlpack
ib07 has quit [Ping timeout: 258 seconds]
< RishabhGarg108Gi> Hey, I was looking at the code of methods/ann/layer/linear3d.hpp and I found two private members called `weights` and `weight`. Can someone please point out the difference between them? Thanks!
< zoq> shrit[m]: No, do you have an example?
< zoq> RishabhGarg108Gi: `weights` references all the weights including the bias; `weight` is an alias that references all the weights but without the bias.
< zoq> turska79Gitter[m: Yeah, we have seen some huge memory allocations on the Windows build as well.
< turska79Gitter[m> But it works with the 64-bit toolset, so I haven't bothered to investigate too much.
< zoq> turska79Gitter[m: alright, that is good to know
< RishabhGarg108Gi> Ok, thanks @zoq. Just to clarify, if I need to access the bias, can I say `layer_name.weights.bias`?
< zoq> RishabhGarg108Gi: You can use layer.Bias(); inside the layer you can just use `bias`.
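A simplified sketch of the aliasing zoq describes, loosely modeled on how mlpack's linear layers carve `weight` and `bias` out of the single `weights` parameter matrix; the sizes and variable names here are illustrative, not the actual Linear3D code:

```cpp
#include <armadillo>

int main()
{
  const size_t inSize = 4, outSize = 3;

  // `weights` holds every parameter: outSize * inSize weight entries plus
  // outSize bias entries, stored in one contiguous matrix.
  arma::mat weights(outSize * inSize + outSize, 1, arma::fill::randn);

  // `weight` aliases the first outSize * inSize elements (no copy is made).
  arma::mat weight(weights.memptr(), outSize, inSize, false, false);

  // `bias` aliases the remaining outSize elements at the end of `weights`.
  arma::mat bias(weights.memptr() + weight.n_elem, outSize, 1, false, false);

  // Modifying the alias modifies the underlying parameter matrix too.
  bias.zeros();
  weights.print("weights (the bias entries at the end are now zero):");

  return 0;
}
```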
ib07 has joined #mlpack
ib07 has quit [Max SendQ exceeded]
ib07 has joined #mlpack
ib07 has quit [Max SendQ exceeded]
ib07 has joined #mlpack
ImQ009 has quit [Quit: Leaving]
ib07 has quit [Max SendQ exceeded]
ib07 has joined #mlpack
< RishabhGarg108Gi> Okay @zoq :)
< shrit[m]> zoq: I had this problem in my code; I will try to reproduce it in the examples
< zoq> shrit[m]: Thanks a lot.
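For reference, a minimal sketch of the kind of network being discussed: a Dropout layer placed between Linear layers in an mlpack FFN (mlpack 3.x ANN API). The layer sizes, loss function, and training data are placeholders, and this is not the actual SAC network from shrit's code:

```cpp
#include <mlpack/core.hpp>
#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>
#include <mlpack/methods/ann/loss_functions/mean_squared_error.hpp>
#include <mlpack/methods/ann/init_rules/gaussian_init.hpp>

using namespace mlpack::ann;

int main()
{
  // A small feed-forward network; input/hidden/output sizes are arbitrary.
  FFN<MeanSquaredError<>, GaussianInitialization> network;
  network.Add<Linear<>>(4, 128);
  network.Add<ReLULayer<>>();
  // Dropout between two Linear layers; 0.2 is the dropout ratio.
  network.Add<Dropout<>>(0.2);
  network.Add<Linear<>>(128, 2);

  // Dummy data just to check that training runs: one column per sample,
  // as mlpack expects.
  arma::mat inputs(4, 10, arma::fill::randn);
  arma::mat targets(2, 10, arma::fill::zeros);

  network.Train(inputs, targets);
  return 0;
}
```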