verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
Rodya has quit [Ping timeout: 272 seconds]
Rodya has joined #mlpack
rsv has joined #mlpack
< rsv> hi again, i've been attempting to implement L1 regularization in the logistic regression class. modifying the Evaluate function is straightforward enough, but (as expected) one has to decide how to implement the gradient, since the derivative of the absolute value is undefined at x=0
< rsv> i am wondering if there is a standard way this is handled in other LR packages
< rsv> i've been doing research with this, and i *think* it's appropriate to do something like slide 20 here: http://www.mit.edu/~9.520/spring09/Classes/class11_sparsity.pdf
< rsv> i know this isn't really an mlpack question, but i wanted to see if anyone had an opinion on this
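
Below is a minimal sketch of the usual subgradient convention for the L1 term (take sign(w_j) for w_j != 0 and 0 at w_j = 0), written against plain Armadillo rather than mlpack's LogisticRegressionFunction API; the function names Evaluate and Gradient and the symbols lambda, X, y, and w are illustrative assumptions, not mlpack's. Soft-thresholding (proximal) updates are the other standard route when the optimizer supports them.

#include <armadillo>
#include <iostream>

// Objective: sum_i log(1 + exp(-y_i * w^T x_i)) + lambda * ||w||_1,
// with labels y_i in {-1, +1} and X holding one point per column.
// (Illustrative sketch only; not mlpack's actual API.)
double Evaluate(const arma::mat& X, const arma::rowvec& y,
                const arma::vec& w, const double lambda)
{
  const arma::rowvec margins = y % (w.t() * X);             // y_i * w^T x_i
  return arma::accu(arma::log(1.0 + arma::exp(-margins)))   // logistic loss
      + lambda * arma::accu(arma::abs(w));                   // L1 penalty
}

// The loss term is smooth; for the L1 term we use a subgradient: sign(w_j)
// when w_j != 0, and 0 when w_j == 0 (any value in [-1, 1] is a valid
// subgradient there; 0 is the common choice, and arma::sign() gives exactly
// that).
arma::vec Gradient(const arma::mat& X, const arma::rowvec& y,
                   const arma::vec& w, const double lambda)
{
  const arma::rowvec margins = y % (w.t() * X);
  // d/dm log(1 + exp(-m)) = -exp(-m) / (1 + exp(-m))
  const arma::rowvec sigma = arma::exp(-margins) / (1.0 + arma::exp(-margins));
  arma::vec grad = -X * (y % sigma).t();   // gradient of the logistic loss
  grad += lambda * arma::sign(w);          // subgradient of lambda * ||w||_1
  return grad;
}

int main()
{
  // Random toy data, just to exercise the two functions.
  arma::arma_rng::set_seed(1);
  const arma::mat X(5, 100, arma::fill::randn);
  const arma::rowvec y = arma::sign(arma::rowvec(100, arma::fill::randn));
  arma::vec w(5, arma::fill::zeros);
  const double lambda = 0.1;

  // A single plain subgradient step.
  w -= 0.01 * Gradient(X, y, w, lambda);
  std::cout << "objective: " << Evaluate(X, y, w, lambda) << std::endl;
  return 0;
}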