verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
yaswagner has quit [Quit: Page closed]
Atharva has quit [Ping timeout: 260 seconds]
witness_ has quit [Ping timeout: 256 seconds]
gtank has quit [Ping timeout: 256 seconds]
gtank_ has joined #mlpack
witness_ has joined #mlpack
witness_ has quit [Ping timeout: 240 seconds]
gtank_ has quit [Ping timeout: 256 seconds]
Atharva has joined #mlpack
witness_ has joined #mlpack
gtank_ has joined #mlpack
< Atharva>
sumedhghaisas: Hey Sumedh
< Atharva>
I have been trying to debug the gradient check for the reconstruction loss since yesterday with no luck. Will you please check the LogProbBackward function once?
< Atharva>
I checked it multiple times and can't find any mistake
< Atharva>
zoq: Do you think there could be any other reasons for the gradient check to fail?
< Atharva>
sumedhghaisas: zoq: No worries, it just passed! :D
< zoq>
Atharva: Good, how did you solve the issue?
< Atharva>
It turns out I had changed the loss function from NegativeLogLikelihood to ReconstructionLoss but was still using the LogSoftMax activation. Also, I wasn't applying softmax before using the standard deviation.
< Atharva>
zoq: I had a doubt.
< Atharva>
I want to add this gradient check test for the reconstruction loss, should I add it in the layer test file or the loss test file?
< Atharva>
If the loss test file, then we would have to define the CheckGradient function again.
< Atharva>
softplus*
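As an illustration of the fix described above, here is a minimal sketch in plain Armadillo of applying a softplus transform so that the predicted standard deviation is strictly positive before it enters the reconstruction (Gaussian negative log-likelihood) loss. The variable names are hypothetical and not taken from the actual PR.

```cpp
#include <armadillo>

int main()
{
  // Hypothetical pre-activation output of the decoder; entries may be
  // negative, so they cannot be used directly as standard deviations.
  arma::vec preStdDev = { -1.3, 0.2, 2.5 };

  // Softplus: log(1 + exp(x)) maps any real value to a positive one.
  arma::vec stdDev = arma::log(1.0 + arma::exp(preStdDev));

  stdDev.print("positive standard deviations:");
  return 0;
}
```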
< zoq>
We could write a new file, something like ann_test_tools.hpp, which implements the gradient check function, and include that file in the tests; but I'm fine with either one.
< Atharva>
Yes, I think ann_test_tools.hpp is a good option. That way it can be used in both ann_layer_test and loss_function_test.
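A rough sketch of what such an ann_test_tools.hpp could contain, modeled on the numerical gradient check pattern used in the ANN layer tests; the exact helper in mlpack's test suite may differ in details, and the FunctionType interface assumed here (Gradient() returning the cost, Parameters() returning the weights) is an assumption.

```cpp
// ann_test_tools.hpp (sketch): a reusable numerical gradient check.
#include <mlpack/core.hpp>

// Compare the analytic gradient of `function` against a central-difference
// approximation and return the relative error; a small value means the two
// gradients agree.
template<typename FunctionType>
double CheckGradient(FunctionType& function, const double eps = 1e-7)
{
  // Analytic gradient at the current parameters.
  arma::mat orgGradient, gradient, estGradient;
  function.Gradient(orgGradient);

  estGradient = arma::zeros(orgGradient.n_rows, orgGradient.n_cols);

  // Perturb each parameter in turn to approximate its partial derivative.
  for (size_t i = 0; i < orgGradient.n_elem; ++i)
  {
    const double tmp = function.Parameters()(i);

    function.Parameters()(i) += eps;
    const double costPlus = function.Gradient(gradient);

    function.Parameters()(i) -= 2 * eps;
    const double costMinus = function.Gradient(gradient);

    // Restore the original parameter value.
    function.Parameters()(i) = tmp;

    estGradient(i) = (costPlus - costMinus) / (2 * eps);
  }

  // Relative difference between analytic and numerical gradients.
  return arma::norm(orgGradient - estGradient) /
      arma::norm(orgGradient + estGradient);
}
```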
manish7294 has joined #mlpack
< manish7294>
zoq: Is it necessary to have labels in integer format for mlpack?
< zoq>
manish7294: That depends on the method; some store the labels as arma::Row<size_t>.
< zoq>
manish7294: You could map the labels before passing them and remap the results afterwards; not sure if that is an option.
< manish7294>
zoq: Thanks, I am doing that for now.
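For reference on the mapping/remapping approach, mlpack already provides data::NormalizeLabels() and data::RevertLabels() in mlpack/core/data/normalize_labels.hpp. A small self-contained sketch (the raw label values here are made up):

```cpp
#include <mlpack/core.hpp>
#include <mlpack/core/data/normalize_labels.hpp>

int main()
{
  // Hypothetical raw labels: arbitrary numeric values, not 0..(numClasses - 1).
  arma::rowvec rawLabels = { 3, 7, 3, 5, 7 };

  // Map the raw labels to contiguous unsigned integers in [0, numClasses).
  arma::Row<size_t> labels;
  arma::vec mapping;
  mlpack::data::NormalizeLabels(rawLabels, labels, mapping);

  // ... train / predict with `labels` here ...

  // Map (predicted) labels back to the original label values.
  arma::rowvec revertedLabels;
  mlpack::data::RevertLabels(labels, mapping, revertedLabels);

  labels.print("normalized labels:");
  revertedLabels.print("reverted labels:");
  return 0;
}
```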
< manish7294>
zoq: Then it seems strange that lmnn is throwing a labels-related error for the letters and balance datasets, while working on the integer-format versions of the same.
< manish7294>
I shall look more into why it's happening.
< zoq>
manish7294: For the balance dataset isn't the label a string?
< manish7294>
zoq: It's a char
< zoq>
manish7294: And if you load the dataset, is it converted to int?
< manish7294>
zoq: verifying it.
< manish7294>
zoq: somehow after normalize(), all labels are turning to 0.
< manish7294>
Even before normalize, rawLabels has all-zero entries.
< zoq>
are you passing a separate labels file, or do you use the last column?
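One plausible explanation for the all-zero labels, offered as a guess: if the label column of the CSV holds characters ('L', 'B', 'R' for the balance dataset) and the file is loaded into a plain arma::mat, the non-numeric entries cannot be parsed and can end up as 0. Loading with a data::DatasetInfo instead maps categorical dimensions to numeric category indices. A sketch, where the file name balance.csv and the assumption that the labels sit in the last column are both illustrative:

```cpp
#include <mlpack/core.hpp>

int main()
{
  // Load with a DatasetInfo so categorical (string/char) dimensions are
  // mapped to category indices instead of possibly being parsed as 0.
  arma::mat dataset;
  mlpack::data::DatasetInfo info;
  mlpack::data::Load("balance.csv", dataset, info, true /* fatal */);

  // mlpack loads column-major (observations as columns), so the last column
  // of the file becomes the last row of `dataset`.
  arma::Row<size_t> labels = arma::conv_to<arma::Row<size_t>>::from(
      dataset.row(dataset.n_rows - 1));
  arma::mat features = dataset.rows(0, dataset.n_rows - 2);

  labels.print("labels:");
  return 0;
}
```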
< rcurtin>
I saw your post, let me finish this other thing first. just at a quick glance it looks great so far (but I need to look closer)
< manish7294>
rcurtin: sure
manish7294 has quit [Ping timeout: 260 seconds]
manish7294 has joined #mlpack
< manish7294>
rcurtin: I saw your comment. Additionally, I would like to say that we have quite a number of parameters, and I strongly feel that we can get accuracy comparable to Shogun's with some tuning.
< manish7294>
And then there's the balance dataset, on which mlpack performs on a totally different level :)
< manish7294>
and I think shogun is using pca for distance initialization process since I have been passing anything to shogunLMNN
< manish7294>
*not been passing
manish7294 has quit [Ping timeout: 260 seconds]
manish7294 has joined #mlpack
< manish7294>
zoq: rcurtin: MATLAB doesn't seem to be accessible from the benchmarks. The error this time is [FATAL] Exception: 'MATLAB_BIN'. Can you help with this?
< manish7294>
zoq: I saw the MATLAB scripts, and looking at the way you implemented them, I think it is possible to take the lmnn implementation out of drtoolbox and include it similarly to the existing ones.
< rcurtin>
try 'export MATLAB_BIN=/opt/matlab/bin/matlab', I think that will fix it
< rcurtin>
and I agree, I think we can just take lmnn.m and drop it into place
< rcurtin>
I agree with your comments too---I don't think we need to exactly match shogun's accuracy everywhere
< rcurtin>
we just need to get an idea that we perform roughly the same, and that we could tune to match the accuracy
< manish7294>
rcurtin: Thanks! It looks like there is some progress now.
< manish7294>
Ah! The letters dataset finally comes to a stop with a timing of 6416.926464s against ours of 19.975593s, with the accuracies almost the same :)
manish7294 has quit [Ping timeout: 260 seconds]
< ShikharJ>
rcurtin: Could you review the BatchSupport PR, so that we may merge it?
< rcurtin>
sure, give me a little while and I will do that
sumedhghaisas_ has joined #mlpack
< sumedhghaisas_>
Atharva: Hi Atharva, could you fix the third static code check error so that we can merge that PR? :)
< Atharva>
sumedhghaisas_: Yes, I will do it right now.
< Atharva>
Do you mean this one - "Not all members of a class are initialized inside the constructor." ?
< zoq>
Atharva: Yes, that's the one.
< Atharva>
zoq: Okay, so in the constructor I will just initialize them to zero.
< zoq>
Atharva: or you can use the constructor initializer list
< Atharva>
Okay
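A tiny sketch of the two options being discussed; the class and member names are made up purely for illustration. Either assigning the members in the constructor body or initializing them in the member initializer list will satisfy the static check, with the initializer list being the more idiomatic C++ choice.

```cpp
#include <cstddef>

// Hypothetical layer-like class used only to illustrate the two options.
class ExampleLayer
{
 public:
  // Option 1: assign inside the constructor body.
  //   ExampleLayer() { inSize = 0; outSize = 0; deterministic = false; }

  // Option 2 (what zoq suggests): use the member initializer list.
  ExampleLayer() : inSize(0), outSize(0), deterministic(false) { }

 private:
  size_t inSize;
  size_t outSize;
  bool deterministic;
};
```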
ImQ009 has quit [Quit: Leaving]
< Atharva>
sumedhghaisas_: Are you free right now?
sumedhghaisas_ has quit [Ping timeout: 260 seconds]