< oldbeardo>
naywhayare: I remembered one more thing that we had discussed earlier
< oldbeardo>
any optimizer in the package looks for Evaluate(arma::mat& parameters) and Gradient(arma::mat& parameters, arma::mat& gradient) functions in an algorithm's implementation
< oldbeardo>
instead of this we should have a default like CostGradient(arma::mat& parameters, arma::mat& gradient), since many methods involve computations common to both functions
< oldbeardo>
also, splitting it into two effectively achieves nothing (there's no reason that comes to mind immediately)
< oldbeardo>
I know that making this change will mean a lot of refactoring, but I think it is worth doing now; it could potentially make some methods run up to twice as fast
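(A minimal sketch of the FunctionType contract being discussed, using a toy objective f(x) = sum of squared entries; the combined CostGradient() name and the exact signatures here are illustrative assumptions, not mlpack's actual API.)

    #include <armadillo>

    // Toy FunctionType: f(x) = sum of squared entries, gradient 2x.  Shows how
    // a combined call can share work that separate Evaluate() and Gradient()
    // calls would otherwise repeat.
    class SquaredNormFunction
    {
     public:
      // Objective value at the given parameters.
      double Evaluate(const arma::mat& parameters)
      {
        return arma::accu(arma::square(parameters));
      }

      // Gradient at the given parameters, written into 'gradient'.
      void Gradient(const arma::mat& parameters, arma::mat& gradient)
      {
        gradient = 2 * parameters;
      }

      // Proposed combined form (name illustrative): one pass over the
      // parameters produces both the objective and the gradient.
      double CostGradient(const arma::mat& parameters, arma::mat& gradient)
      {
        gradient = 2 * parameters;
        return arma::accu(arma::square(parameters));
      }
    };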
oldbeardo has quit [Quit: Page closed]
cuphrody has joined #mlpack
marcus_z1q has joined #mlpack
marcus_z1q is now known as marcus_zoq
oldbeardo has joined #mlpack
< naywhayare>
oldbeardo: I think there are some cases where Evaluate() is called more often than Gradient() so it makes sense to split them up
< naywhayare>
but you are right that in some cases simultaneously evaluating the objective function and the gradient is more efficient
< oldbeardo>
naywhayare: okay, in which cases does that happen?
< naywhayare>
I can't remember right now. I'd have to look it up
< oldbeardo>
okay, because as I see it, the optimizer calls Evaluate() and Gradient() only once per iteration
< oldbeardo>
but I haven't seen every algorithm in the library, so you must be right
< naywhayare>
well, it's possible that I'm wrong. but I seem to remember some instances where the objective function is desired and the gradient is not
< naywhayare>
it makes sense to find some way to avoid duplicating work, though, so there is definitely some improvement that we could do
< oldbeardo>
okay, then we could probably associate a flag with the function, where 0 means only the objective, 1 means only the gradient, and 2 means both
< naywhayare>
maybe... but flags are checked at runtime; it'd be better to have all this figured out at compile-time
< naywhayare>
I have to step out for a little while... I'll be back later
< oldbeardo>
sure
< naywhayare>
ok, so I think one important thing is that we don't modify the FunctionType abstraction too much
< naywhayare>
we don't want to have so many methods that a user has to implement... ideally just Evaluate() and Gradient()
< naywhayare>
but, I think what we can do is use some template metaprogramming techniques to use an EvaluateGradient() function if the user implemented it in their FunctionType class
< naywhayare>
(the EvaluateGradient() function might need a better name that indicates it's giving both the objective and gradient as results)
< oldbeardo>
okay, that seems fair, won't require refactoring, just writing a new function if needed
< naywhayare>
actually implementing the template metaprogramming bit might be a little bit complicated though
< naywhayare>
I'm not sure what exactly that will entail yet
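(Roughly the kind of compile-time detection being discussed: a member-detection trait that checks whether a FunctionType provides a combined call, and a dispatcher that falls back to separate Evaluate() and Gradient() calls when it does not. All names below are illustrative, not mlpack's actual implementation.)

    #include <armadillo>

    // Compile-time check: does FunctionType have a member
    // double EvaluateGradient(const arma::mat&, arma::mat&)?
    template<typename FunctionType>
    class HasEvaluateGradient
    {
     private:
      template<typename T, double (T::*)(const arma::mat&, arma::mat&)>
      struct Check { };

      template<typename T> static char Test(Check<T, &T::EvaluateGradient>*);
      template<typename T> static long Test(...);

     public:
      static const bool value = (sizeof(Test<FunctionType>(0)) == sizeof(char));
    };

    // Dispatcher: use the combined call when it exists; otherwise fall back
    // to separate Evaluate() and Gradient() calls.
    template<bool HasCombined>
    struct EvaluateGradientCaller
    {
      template<typename FunctionType>
      static double Call(FunctionType& f, const arma::mat& parameters,
                         arma::mat& gradient)
      {
        f.Gradient(parameters, gradient);
        return f.Evaluate(parameters);
      }
    };

    template<>
    struct EvaluateGradientCaller<true>
    {
      template<typename FunctionType>
      static double Call(FunctionType& f, const arma::mat& parameters,
                         arma::mat& gradient)
      {
        return f.EvaluateGradient(parameters, gradient);
      }
    };

    // An optimizer could then write:
    //   double objective = EvaluateGradientCaller<
    //       HasEvaluateGradient<FunctionType>::value>::Call(function,
    //       parameters, gradient);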
< oldbeardo>
yeah, well, you know the intricacies better, I just thought of an improvement
< oldbeardo>
though I would like to see the solution to the problem
< oldbeardo>
also, I have written a Softmax Regression implementation, should I open a ticket?
< naywhayare>
sure, feel free
< naywhayare>
I am pretty busy for the next few weeks so I don't know when I'll get to it, but I'll take a look
< naywhayare>
you do have commit access too now, so you could just check it into a directory and leave it uncompiled in trunk/ for now
< oldbeardo>
okay, well I wanted you to have a quick look, haven't written the tests yet
< naywhayare>
ok; still, you can commit it to svn; that's what trunk/ is for
< naywhayare>
if we do a release soon and it isn't done, I'll just pull it out of the release
< naywhayare>
and leave it in trunk/ for future work
< oldbeardo>
okay, for now I will just upload the files to the ticket, haven't acquainted myself with svn yet
< naywhayare>
ok, whatever you like :)
< oldbeardo>
when are you planning for a release?
< naywhayare>
I was hoping to get it done sometime this month
< naywhayare>
but I have a paper deadline on June 6th and I'm pretty far away from having the paper done...
< naywhayare>
so I don't know if I'll have time to do it before mid-June
< naywhayare>
there are a couple minor things to clean up -- Saheb patched most of the dual-tree algorithm constructors but not all of them, so I need to finish that
< naywhayare>
lots of other little things too; I just can't remember them all right now. I think I could look through the open tickets and that would show a bunch of them
< naywhayare>
the first step toward a release is sitting down and writing down everything that needs to be finished before that particular release can happen, and I haven't done that yet :)
< oldbeardo>
okay, that sounds like mid June then :)
< naywhayare>
yeah, probably...
cuphrody has quit [Ping timeout: 276 seconds]
< oldbeardo>
just opened the ticket, can you have a quick look?
< naywhayare>
I don't really have the time to take a good look right now
< naywhayare>
can you detail how it is meant to be used? you use softmax regression after training a set of stacked sparse autoencoders, right?
< naywhayare>
i.e. you take the output of the last sparse autoencoder and then use softmax regression, I think. I could be wrong; I'm not sure
< oldbeardo>
yes, so this is how it's done
< oldbeardo>
you decide on a number of layers that you want in your network
< oldbeardo>
you train that many autoencoders greedily one after the other
< oldbeardo>
then you attach a classifier (Softmax Regression) to the last layer, and fine-tune the parameter weights
< naywhayare>
ok
< oldbeardo>
apart from this you can also use Softmax independently as a classifier
< naywhayare>
so I see this from the UFLDL wiki:
< naywhayare>
"In these notes, we describe the Softmax regression model. This model generalizes logistic regression to classification problems where the class label y can take on more than two possible values. "
< oldbeardo>
yes
< naywhayare>
mlpack already has logistic regression for two-class problems; would it make more sense to generalize the logistic regression code to be softmax regression?
< naywhayare>
so that we don't have separate logistic regression and softmax regression implementations
< naywhayare>
I don't know the answer to this of course because I have done very little reading on softmax regression. you are the expert :)
< oldbeardo>
well, Softmax is the generalization, so what you are suggesting is removing Logistic Regression
< naywhayare>
yeah; either removing logistic regression or merging the two implementations in a way that provides both logistic regression and softmax regression
< naywhayare>
if two-class softmax regression is equivalent to logistic regression, then it doesn't make sense to have a separate logistic regression implementation, in my opinion
< oldbeardo>
ummm, you won't be merging the two for sure, there's nothing in the Logistic Regression module that will improve the Softmax implementation
< naywhayare>
true -- probably the only part that could be merged is the logistic_regression_main.cpp executable and the documentation it contains
< oldbeardo>
the two are equivalent, read the comment in the Evaluate() function in softmax_regression_function.cpp, just above the 'probabilities' calculation
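(The equivalence in question: with K = 2 the softmax probabilities reduce to the logistic sigmoid, since)

    P(y = 1 \mid x) = \frac{\exp(\theta_1^T x)}{\exp(\theta_1^T x) + \exp(\theta_2^T x)}
                    = \frac{1}{1 + \exp\!\left(-(\theta_1 - \theta_2)^T x\right)},

(which is exactly logistic regression with parameter vector \theta = \theta_1 - \theta_2.)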
< naywhayare>
okay
< naywhayare>
I don't have too much time I can put into this today... unfortunately I have to do some grading work, which is time-consuming and tedious...
< oldbeardo>
sure, I was going to end the discussion but you asked me about Stacked Autoencoders :)
< naywhayare>
yeah; thank you for the explanation
< naywhayare>
that will help my understanding of what the code does and how it will be used
< oldbeardo>
no problem
oldbeardo has quit [Quit: Page closed]
witness___ has quit [Quit: Connection closed for inactivity]
ryde has joined #mlpack
< ryde>
hello. Can anyone here provide hints on compiling mlpack with BLAS (Intel MKL) living in a non-standard location?
< ryde>
armadillo was a bit of a challenge with MKL, but I eventually found the right paths. Now mlpack, on the other hand, finds armadillo but is unable to locate the MKL includes. Has anyone had similar problems in the past?
< naywhayare>
ryde: I think the issue is that when you use armadillo with MKL you have to compile with -lmkl, but mlpack doesn't automatically do that