naywhayare changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
< jbc__>
naywhayare: use your GaTech email address? gth671b…?
< naywhayare>
jbc__: yeah, that's fine, thanks
< naywhayare>
sumedhghaisas: I can implement sign() for sp_mat soon
< naywhayare>
unless you want to :-)
Anand has joined #mlpack
< Anand>
Marcus : So basically we want different tables each sorted on a particular metric. We will have 7-8 tables then. Did I get you right?
< marcus_zoq>
Anand_: Okay, the code sorts by the first metric, but it would be great if we could sort the results in descending order.
< Anand_>
Marcus : Sure we can do that. I will make the change
< Anand_>
And I am still looking at the responsiveness thing.
sumedhghaisas has quit [Read error: Connection reset by peer]
< Anand_>
Marcus : Can you run the reports once and let me know if it is still working for you? I think I messed up some code.
< Anand_>
You need not pull code
< Anand_>
Ok working now!
Anand_ has quit [Ping timeout: 246 seconds]
< jenkins-mlpack>
Starting build #2077 for job mlpack - svn checkin test (previous build: SUCCESS)
sumedhghaisas has joined #mlpack
< sumedhghaisas>
naywhayare: you there??
Anand has joined #mlpack
< naywhayare>
sumedhghaisas: yeah; did you see the email from Conrad?
< sumedhghaisas>
yes... that is exactly the problem I am working on...
< naywhayare>
okay, thanks
< sumedhghaisas>
if the index is zero and user calls operator--
< sumedhghaisas>
can I set it to max value of uword??
< sumedhghaisas>
actually I want to see what is followed in stl libraries... but for that I have to see the source code...
< naywhayare>
you already have the source code to the STL libraries; it's all templated :)
< naywhayare>
probably somewhere like /usr/include/c++/4.7/bits/stl_vector.h
< sumedhghaisas>
ahh yes...
< naywhayare>
anyway, I would look at what Conrad does for arma::Mat<eT>::row_iterator
< sumedhghaisas>
yes... that's a better solution...
sumedhghaisas has quit [Ping timeout: 272 seconds]
sumedhghaisas has joined #mlpack
< sumedhghaisas>
naywhayare: okay I did this...
< sumedhghaisas>
if(internal_row != 0)
< sumedhghaisas>
{
< sumedhghaisas>
current_pos--;
< sumedhghaisas>
internal_row--;
< sumedhghaisas>
}
< sumedhghaisas>
else if(internal_col != 0)
< sumedhghaisas>
{
< sumedhghaisas>
current_pos--;
< sumedhghaisas>
internal_col--;
< sumedhghaisas>
internal_row = M->n_rows - 1;
< sumedhghaisas>
}
< sumedhghaisas>
is this okay??
< naywhayare>
yeah, although I'd use > instead of != for clarity, although the two are equivalent for unsigned types (and uword is unsigned in this case)
< naywhayare>
can you add a short test to arma_extend test to make sure that if you decrement an iterator from the beginning, it does not actually move backwards?
< naywhayare>
it'll be like four lines, but simple test cases like that are helpful in ensuring that everything actually works
< naywhayare>
you'll want to test both prefix and postfix decrement for row_col_iterator and const_row_col_iterator
< sumedhghaisas>
okay I will do that...
< sumedhghaisas>
okay right now I am restructuring CF class...
< sumedhghaisas>
I don't see a point in copying the data passed by the user...
< sumedhghaisas>
CF class is storing that data matrix...
< sumedhghaisas>
and also cleanedData matrix...
< naywhayare>
yeah, if it is not necessary, we shouldn't store it
< naywhayare>
when you make changes, try to make little changes and check them in as you go, if you can work in a way where that's relatively easy to do
< sumedhghaisas>
also I moved the factorization to the constructor...
< naywhayare>
this way Siddharth (or I) can look through the commit messages and digest them as simple changes and say "ah, ok, I understand" instead of getting everything all at once and going "uh, wow, that's a lot of changes" :)
< naywhayare>
ok, yeah. I don't think he had a problem with that
< naywhayare>
when you commit the factorization change, you can resolve the ticket associated with it
< sumedhghaisas>
hehe... yes... sure :)
< naywhayare>
:)
< sumedhghaisas>
naywhayare: and what about plain_svd tests...
< sumedhghaisas>
and yes... there was a bug in plain svd...
< sumedhghaisas>
CF convention of SVD factorization is V = W * H right?
< jenkins-mlpack>
* Ryan Curtin: Oops, I fixed this backwards. The actual error is the output.
< jenkins-mlpack>
* Ryan Curtin: Fix incorrect check (how did this happen?).
< naywhayare>
let me finish what I'm doing and I'll think about the plain_svd tests... probably 25 to 30 minutes
sumedhghaisas has quit [Ping timeout: 272 seconds]
Anand has quit [Ping timeout: 246 seconds]
< jenkins-mlpack>
Starting build #2078 for job mlpack - svn checkin test (previous build: SUCCESS)
sumedhghaisas has joined #mlpack
< sumedhghaisas>
naywhayare: and how to resolve a ticket??
< naywhayare>
sumedhghaisas: open the 'modify ticket' section, then set its status to 'resolved', and add a comment or something indicating the svn revision that resolved it and the fix that you applied
sumedhghaisas has quit [Ping timeout: 240 seconds]
udit_s has joined #mlpack
sumedhghaisas has joined #mlpack
< naywhayare>
sumedhghaisas: not sure if you got my message -- open the 'modify ticket' section, then set its status to 'resolved', and add a comment or something indicating the svn revision that resolved it and the fix that you applied
< sumedhghaisas>
yes... I only just saw it... okay, will do that...
< jenkins-mlpack>
* added documentation to termination policies
< jenkins-mlpack>
* minor fix of PlainSVD module
< sumedhghaisas>
naywhayare: should we shift numUsersForSimilarity from the constructor to GetRecommendations()??
< naywhayare>
no, I don't think a user should have to specify that every single time they call GetRecommendations()
< naywhayare>
they might call GetRecommendations() lots of times for different target users
< sumedhghaisas>
default parameter??
< naywhayare>
so we should set numUsersForSimilarity in the constructor, and then the user can change it with CF::NumUsersForSimilarity()
< sumedhghaisas>
ohh okay I get it..
< naywhayare>
yeah, do you think that is a decent approach or can you think of something better?
< sumedhghaisas>
it's basically a user preference... so if it is changed frequently, then it should be a function parameter...
< sumedhghaisas>
but if it's not, then a constructor parameter seems fine...
< naywhayare>
I don't think it would be changed frequently
< naywhayare>
I think the most common use case for CF will be something like this:
< sumedhghaisas>
I don't know how frequently it's changed... so I would go with the current style...
< naywhayare>
CF c(parameters); while (a user submits a request for recommendations) { c.GetRecommendations(thatUser); }
< naywhayare>
i.e. in the situation where you're running a website where you can give preferences to a user, you tend to only get recommendations for one user at a time
< naywhayare>
like when a user clicks on a button or something
< naywhayare>
I can't say I'm completely sure how people will use CF in practice, but that would be my best guess for now :)
< sumedhghaisas>
hmmm... that does sound right... okay we leave it as it is...
Anand has joined #mlpack
< Anand>
Marcus : I was trying the JS way to make the code responsive, but then I found a better way to do it. I am not sure if it works as expected, but I have used Bootstrap to do it. Bootstrap has an inbuilt responsiveness feature and hence should work for us.
< sumedhghaisas>
naywhayare: any thoughts on plain_svd test??
< naywhayare>
yeah; so I think we should do this
< naywhayare>
first we should remove the math::RandomSeed(10) since that makes the test non-portable
< naywhayare>
but we can still do a randomized test
< naywhayare>
our objective is to make sure that SVD is providing a good decomposition
< marcus_zoq>
Anand: Sounds good, this already works?
< naywhayare>
which we can check by reconstructing the original matrix and making sure the frobenius norm of the difference between the two (arma::norm(test - reconstructedTest, "fro")) is small
< naywhayare>
I'd actually use the normalized frobenius norm... arma::norm(test - reconstructedTest, "fro") / arma::norm(test, "fro")
< naywhayare>
and I would check that the normalized frobenius norm is below 0.01 (i.e. less than 1% elementwise error)
< naywhayare>
and then you could run this test a few times, for matrices of decent size (20x20? 50x50? something like that)
< naywhayare>
and you can do the same thing for the case where the rank is lower by constructing a low-rank matrix
< naywhayare>
which you might do by randomly initializing a low-rank W and H matrix and then multiplying them to get the test matrix
< Anand>
Marcus : I haven't tested it yet.
< naywhayare>
I have to run for a little while... let me know if I can clarify my idea or anything
< jbc__>
naywhayare: got your email. Still a bit confused; it seems pretty clear that the Bayes input format is column-major (the last column is the label), though the code does indicate row shedding. Check out https://gist.github.com/jcarlson23/6ce8497b3c5cf7657e9f
< jbc__>
no worries
< jbc__>
Oh, ok, I think I understand that transposition now.. :)
< Anand>
Marcus : It doesn't completely work as expected. Looks better, though. I don't think it will be easy to fit the complete graph on a mobile screen! I am trying, though. Not finding anything appropriate.
< naywhayare>
jbc__: yeah, the data are stored on disk as row-major, but data::Load() transposes, so anything loaded by data::Load() is column-major if it was row-major on disk
< jbc__>
cool - just got little disoriented while playing ;)
Anand has quit [Quit: Page closed]
< sumedhghaisas>
naywhayare: As every Apply() function returns some index... I have returned the normalized Frobenius norm from the plain_svd Apply() functions...
< sumedhghaisas>
I have checked that norm against 0.01 as you suggested, for 20x20 and 50x50 matrices...
< sumedhghaisas>
and another test for low-rank matrix factorization...
< sumedhghaisas>
a 30x40 matrix...
< jbc__>
naywhayare: did you reproduce the zero-filled results for the Bayes classifier? If I use x = linspace(0,6.28,20); [sin(x); cos(x)] -> [[0],[1]] as the training/label set, then I get zeros if I pass the training set back in as the test set…
< naywhayare>
jbc__: no, I haven't had a chance yet. but I will use what you've suggested to try it
sumedhghaisas has quit [Ping timeout: 255 seconds]
< jbc__>
naywhayare: actually, I think I might understand… let me write it up and pass it your way tomorrow
< jenkins-mlpack>
Starting build #2079 for job mlpack - svn checkin test (previous build: SUCCESS)
sumedhghaisas has joined #mlpack
< jbc__>
be back later
jbc__ has quit [Quit: jbc__]
sumedhghaisas has quit [Ping timeout: 272 seconds]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Ping timeout: 244 seconds]