ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< rcurtin>
shrit[m]: can you tell me how you ran valgrind and what it showed?
< rcurtin>
shrit[m]: I used gdb and caught the exception in BinarySpaceTreeTest; it looks like the datasets are all either NULL pointers or randomly initialized memory
< rcurtin>
according to my output here it seems only `jsonTree` has a valid dataset, and the others have not deserialized correctly
< rcurtin>
(I'm also not certain what has happened to `tree.dataset`)
ImQ009 has joined #mlpack
adwaitbhope has joined #mlpack
adwaitbhope has quit [Ping timeout: 245 seconds]
adwaitbhope has joined #mlpack
gtank___ has quit [Ping timeout: 272 seconds]
gtank___ has joined #mlpack
< shrit[m]>
rcurtin: it is exactly as you describe. I have tried to run DistributionTest, and everything seems to be fine. However, when I run it with valgrind, I get a possible memory loss with 2 errors
< shrit[m]>
RectangleTreeTest is running perfectly fine, both with and without valgrind
adwaitbhope has quit [Ping timeout: 245 seconds]
< shrit[m]>
Considering BinarySpaceTreeTest, valgrind reports a memory leak, but I am not able to find what is causing it right now
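For reference, a leak check along the lines shrit[m] describes might look like the following (the test binary name/path and the Boost.Test `--run_test` flag are assumptions based on a typical mlpack build of that era, not taken from the log):

```text
$ valgrind --leak-check=full --show-leak-kinds=all \
      ./bin/mlpack_test --run_test=BinarySpaceTreeTest
```

`--leak-check=full` makes valgrind print a stack trace for each leaked allocation, which helps locate the deserialization code responsible.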
travis-ci has joined #mlpack
< travis-ci>
mlpack/ensmallen#1013 (2.14.1 - 0431e31 : Marcus Edel): The build passed.
< rcurtin>
shrit[m]: try running the BinarySpaceTreeTest with gdb and catching the exception, and then you can inspect and see that the pointers are taking strange and unexpected values
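A gdb session along the lines rcurtin suggests might look like this (again, the binary name and the `--run_test` flag are assumptions from a typical mlpack build; the inspected variable is the `tree.dataset` member mentioned above):

```text
$ gdb ./bin/mlpack_test
(gdb) catch throw
(gdb) run --run_test=BinarySpaceTreeTest
# ... execution stops at the point where the exception is thrown ...
(gdb) backtrace
(gdb) print tree.dataset
```

`catch throw` stops execution at the throw site rather than after the exception propagates, so the tree objects can be inspected while the bad pointer values are still in scope.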
< AakashkaushikGit>
hey, I would like to take up a loss function from issue #2200 (https://github.com/mlpack/mlpack/issues/2200), but there seems to be a lot of confusion about who has taken up which one, and the list doesn't seem to be updated. Can someone help me choose one?
< zoq>
AakashkaushikGit: There is no PR for the MultiLabelMargin Loss, so if you would like to work on that, feel free.
< AakashkaushikGit>
sure, I will take that. Should I mention it on the issue, or is it fine as-is?
< zoq>
AakashkaushikGit: Let's leave a simple comment on the issue.
< AakashkaushikGit>
done.
< zoq>
AakashkaushikGit: great
< anjishnu[m]>
https://github.com/mlpack/mlpack/issues/2200#issuecomment-644205201 This comment on that issue lists the currently available functions which don't have any open PRs. R Aravind was working on Multi Label Margin Loss a few weeks back. Maybe he didn't complete it.
< zoq>
anjishnu[m]: Yeah, since there has been no open PR for the last 10 days or so, he is probably not working on it.
< R-Aravind[m]>
@anjishnu:matrix.org: I am still working on it; sorry for being late. I am getting familiar with the code base by going through the other loss functions that were already implemented.
< AakashkaushikGit>
As @R-Aravind is working on the multi-label margin loss, I am taking up the mean absolute percentage error, as suggested by @zoq
< anjishnu[m]>
Great! Feel free to ask if you guys need any help :)