ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< chopper_inbound4> Can someone have a look at issue https://github.com/mlpack/mlpack/issues/2360? I am getting confused about what 'gy' actually is.
ImQ009 has joined #mlpack
< PrinceGuptaGitte> gy is the error that backpropagates from the layers ahead of the current layer
< chopper_inbound4> I meant the current implementation of log_softmax looks a little confusing in that sense (in reference to the issue mentioned above). The derivative should actually involve the Kronecker delta function, but gy is used in the current implementation. It is not digestible to me :(
< chopper_inbound4> I am not sure if this calculation is correct.
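The two points above are compatible: the Kronecker delta appears in the layer's Jacobian, while gy is the upstream error that gets multiplied through it. Below is a minimal standalone sketch (not mlpack's actual API; plain `std::vector` stands in for Armadillo types) showing how, for log-softmax, the Jacobian `dy_i/dx_j = delta_ij - softmax(x)_j` turns gy into the input gradient:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Forward pass: y_i = x_i - log(sum_j exp(x_j)), computed stably.
std::vector<double> LogSoftmax(const std::vector<double>& x)
{
  double m = x[0];
  for (double v : x) m = std::max(m, v);
  double sum = 0.0;
  for (double v : x) sum += std::exp(v - m);
  const double lse = m + std::log(sum);
  std::vector<double> y(x.size());
  for (std::size_t i = 0; i < x.size(); ++i) y[i] = x[i] - lse;
  return y;
}

// Backward pass: gy is the error backpropagated from the next layer.
// Multiplying gy by the Jacobian (delta_ij - softmax_j) gives
//   g_i = gy_i - softmax_i * sum_j gy_j,
// where softmax_i = exp(y_i) is recovered from the forward output.
std::vector<double> LogSoftmaxBackward(const std::vector<double>& y,
                                       const std::vector<double>& gy)
{
  double gySum = 0.0;
  for (double v : gy) gySum += v;
  std::vector<double> g(y.size());
  for (std::size_t i = 0; i < y.size(); ++i)
    g[i] = gy[i] - std::exp(y[i]) * gySum;
  return g;
}
```

The Kronecker delta never shows up as an explicit matrix: its `delta_ij` term contributes the bare `gy_i`, and the `softmax_j` term contributes the subtracted sum, which is why the implementation only needs gy and the forward output.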
< JoelJosephGitter> @mrityunjay-tripathi is the softmax implementation in your PR complete? 'cause I tried using those files and they were not working
< chopper_inbound4> No...no
ImQ009 has quit [Quit: Leaving]
ImQ009 has joined #mlpack
< chopper_inbound4> No...no. The backward function still has to be completed; the forward function is correct (if that's all you need :))
< chopper_inbound4> Also, some work still has to be committed.
< JoelJosephGitter> ok :)
< JoelJosephGitter> good luck, hope it gets implemented soon 'cause I really need it
< chopper_inbound4> I am also trying for that. Most of us need a softmax layer in our work :)
favre49 has joined #mlpack
< favre49> zoq: On the ELiSH PR, I saw you said we should squash and merge it. Why was that?
< sreenik[m]> freenode_gitter_prince776[m]: I am not sure if I clearly understood what exact change you wanted to make, but I'll be happy to take a look.
< sreenik[m]> #2243 isn't the one I think, is it?
< zoq> favre49: To have a cleaner commit history; that PR had a lot of commits with messages like "update file", "style changes", etc., so I thought it made sense to combine those commits.
< favre49> Oh alright, that's what I thought. Do you have some sort of threshold over which you choose to do that? Or do you just eyeball it? Or perhaps I should always squash and merge?
favre49 has quit [Remote host closed the connection]
< zoq> That depends on the PR: if there is an important change, it's nice to have a commit (or several commits) for it in the history, and in that case I usually don't squash the PR.
< zoq> I try to encourage contributors to use descriptive commit messages, but I often fail to check the messages at an early stage.
ImQ009 has quit [Quit: Leaving]
< hemal[m]> Could I stop the Azure pipeline execution for my PR?
< hemal[m]> I know there are code updates coming, so the current build and test run on the Azure pipeline would not be useful.
< hemal[m]> Basically, that compute time and those resources would be wasted, which I would like to avoid.
< himanshu_pathak[> hemal: Maybe you can try adding [skip ci] to your commit messages. I haven't tried it myself, so I'm not sure, but it may work.
< zoq> hemal[m]: yes, [skip ci] should work just fine.
< himanshu_pathak[> Hey zoq: I am trying to add a copy constructor for BRNN, but I am facing a problem copying forwardRNN and backwardRNN. I am thinking of changing them to pointers instead of class objects. What do you suggest?
< hemal[m]> Thank you !
< zoq> himanshu_pathak[: Hm, I would have to look at the PR.
< himanshu_pathak[> zoq: I was trying it locally; I will push the update to my PR.
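For context on the pointer-vs-object question above, here is a minimal sketch of the pattern being considered: holding the sub-networks by pointer so the copy constructor can deep-copy them explicitly. `Net` and `BiNet` are hypothetical stand-ins, not mlpack's BRNN/RNN classes:

```cpp
#include <memory>

// Hypothetical stand-in for the forwardRNN/backwardRNN member type.
struct Net
{
  int layers = 3;
};

class BiNet
{
 public:
  BiNet() : forward(new Net()), backward(new Net()) { }

  // Deep-copying copy constructor: allocate fresh objects from the
  // other instance's state instead of sharing or shallow-copying.
  BiNet(const BiNet& other) :
      forward(new Net(*other.forward)),
      backward(new Net(*other.backward)) { }

  std::unique_ptr<Net> forward;
  std::unique_ptr<Net> backward;
};
```

One trade-off of switching members to pointers: copies become explicit and cheap to customize, but every constructor and assignment operator must now manage ownership (here `std::unique_ptr` handles deletion automatically).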
< PrinceGuptaGitte> Hello sreenik, I've completed redesigning the layer names visitor to make it generalized. It's #2362.
< jeffin143[m]> People here are night owls
< jeffin143[m]> Message at 3:00 am
< PrinceGuptaGitte> yes :)
< zoq> totally
< himanshu_pathak[> :)
< metahost> Haha.
< metahost> It's 4:24 am rn.