rcurtin_irc changed the topic of #mlpack to: mlpack: a scalable machine learning library (https://www.mlpack.org/) -- channel logs: https://libera.irclog.whitequark.org/mlpack -- NOTE: messages sent here might not be seen by bridged users on matrix, gitter, or slack
<zoq[m]> Hello Naman Jain, in case you haven't seen it already, https://www.mlpack.org/community.html has a getting-involved section that should get you started.
<zoq[m]> There is also https://www.mlpack.org/gsoc.html, which has some further information as well.
<zoq[m]> If you struggle at a specific point don't hesitate to ask, happy to help.
aakashi2001 has joined #mlpack
<jonpsy[m]> > what about reshape?
<jonpsy[m]> it's just setting it to 0
<jonpsy[m]> like, making it a column and setting everything else to 0
<shrit[m]> heisenbuug (Gopi M Tatiraju): Let us have a meeting today
<shrit[m]> I did not have time yesterday, I was too busy
<shrit[m]> Let me know if this works for you
<zoq[m]> <jonpsy[m]> "like, making it a column and set" <- Maybe I don't get what you are trying to do, I thought you have some vector and want to bring it into the format you mentioned.
<shrit[m]> rcurtin: I was thinking, instead of inlining functions inside the dists/ directory, we can convert the classes to template classes that take a MatType and a VecType? What do you think?
<rcurtin[m]> yeah, absolutely, that seems totally reasonable to me 👍️
aakashi2001 has quit [Ping timeout: 268 seconds]
<jonpsy[m]> > Maybe I don't get what you are trying to do, I thought you have some vector and want to bring it into the format you mentioned.
<jonpsy[m]> yeah, that's what I want
<ABHINAVANAND[m]> <zoq[m]> "On my list, for today." <- zoq Hey, if it was gradient explosion, shouldn't it show up when ctest is run?
<ABHINAVANAND[m]> But here the build is failing.
<zoq[m]> Was building fine on my local system.
<zoq[m]> You can ignore all the github action builds, only the azure build is interesting.
<zoq[m]> Looks like there are three instances from the first row and 2 for the second one?
<zoq[m]> I see, I don't think we have a direct function to do that; maybe a combination of `repmat` and a for loop, or only a for loop, could work just fine?
<jonpsy[m]> oh dont mind that
<jonpsy[m]> <zoq[m]> "Looks like there are three insta" <- don't mind the count hehe, was just lazy typing the last one
<zoq[m]> I see, but still there is no Armadillo function that would format it right away.
<jonpsy[m]> nw
aakashi2001 has joined #mlpack
<jonpsy[m]> I've a question on the `Forward` thing
<jonpsy[m]> based on our last discussion, how do we calculate the loss?
<jonpsy[m]> because we do `Q - gamma * targetQ`, right?
<zoq[m]> Correct, since we use `Forward` you need to calculate the loss yourself, since the output layer is skipped.
<zoq[m]> So this step is skipped.
<zoq[m]> Which usually returns the loss, given the input and target.
<jonpsy[m]> hmm, this is shaping out to be interesting..
<jonpsy[m]> when we do `arma::mat target`
<zoq[m]> Q in `Q - gamma * targetQ`
<jonpsy[m]> yessir
<jonpsy[m]> now
<jonpsy[m]> the R.H.S. is `gamma * targetQ`
<zoq[m]> So you could save the `Q` before the `gamma * targetQ` step to compute the loss with `Q` right?
<jonpsy[m]> yes but they haven't
<jonpsy[m]> after this they've directly calculated `Backward`
<jonpsy[m]> not sure what's happening in the code... hehe
<zoq[m]> I mean the function I just referenced performs a forward step, gets the loss with the given target, and performs the backward step using the calculated loss.
<zoq[m]> on the learning network
<jonpsy[m]> ah!!!
<jonpsy[m]> wow, okay
<jonpsy[m]> very roundabout way, but I get it.
<jonpsy[m]> <zoq[m]> "So you could save the `Q` before" <- so, he's putting `gamma * targetQ` in `Q`, right? But it's only put for the selected action. The pass with backward would automatically find the loss and get gradients, perfect!
<ABHINAVANAND[m]> zoq If the PR is failing due to gradient explosion then it should happen in ctest, but the PR is failing in the build itself.
<ABHINAVANAND[m]> Not sure what is happening
aakashi2001 has quit [Ping timeout: 252 seconds]
<zoq[m]> Do you have a link?
<heisenbuugGopiMT> Sorry @shrit:matrix.org, I went somewhere today and didn't see your message
<heisenbuugGopiMT> Can we have it tomorrow if you are free?
<heisenbuugGopiMT> Also, I started making the project report; I will send it to you soon
<jonpsy[m]> zoq: ping me when you're free
<zoq[m]> jonpsy: ping
<jonpsy[m]> pong
<jonpsy[m]> can you check the diary once?
<jonpsy[m]> the very latest one
<zoq[m]> "How does mlpack handle this?" this one?
<jonpsy[m]> yeah, i think you understood what i was saying
<zoq[m]> yep
<jonpsy[m]> okay, now I'm going to explain the dilemma with EQL
<jonpsy[m]> in the doc
<jonpsy[m]> zoq: hey, sorry i need one small detail in our current Backward()
<jonpsy[m]> can you hop in the doc?
<jonpsy[m]> We should prolly adopt a similar [culture](https://stackoverflow.com/a/482129) as well!
<shrit[m]> <jonpsy[m]> "We should prolly adopt a similar" <- these are perfect to look at to kill time when compiling mlpack
<jonpsy[m]> haha yeap
<shrit[m]> We already have one,
<shrit[m]> We have a comment like that in mlpack
<jonpsy[m]> really? :D show showww
<jonpsy[m]> btw im calling dibs on "hours wasted on optimizing counter" comment
<rcurtin[m]> I know it may reduce the humor in the codebase but one of my biggest hopes about making mlpack header-only is that we can remove that file 😃
<shrit[m]> The same when I read it last year
<shrit[m]> I mean, the comment is so good that I would like to keep the comment even if we remove the file
<jonpsy[m]> // somedev1 - 6/7/02 Adding temporary tracking of Login screen
<jonpsy[m]> // somedev2 - 5/22/07 Temporary my ass
<rcurtin[m]> There was once some code inside of mlpack where we did `#define private public` to test some internal functionality. I suppose it *might* still be there 😱
<shrit[m]> do you remember where it was?
<shrit[m]> I just opened this one https://github.com/mlpack/mlpack/pull/3037/files; I thought it might take me an hour to convert all of the dists/ dir, but I ended up spending several hours on only one file 😂
PranshuSrivastav has joined #mlpack
<PranshuSrivastav> Hey, I was trying to build mlpack from source and I received these errors. Can someone help me fix them?
<shrit[m]> Why are you building mlpack as a static library?
<shrit[m]> You are trying to link statically with armadillo, which is dynamic on your machine
<shrit[m]> That is the reason you are getting this error