naywhayare changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
jbc_ has quit [Quit: jbc_]
jbc_ has joined #mlpack
andrewmw94 has quit [Quit: Leaving.]
jbc_ has quit [Quit: jbc_]
< jenkins-mlpack> Yippie, build fixed!
< jenkins-mlpack> Project mlpack - nightly matrix build build #564: FIXED in 7 hr 15 min: http://big.cc.gt.atl.ga.us:8080/job/mlpack%20-%20nightly%20matrix%20build/564/
< jenkins-mlpack> * Ryan Curtin: How did I accidentally remove two lines? I'm not actually sure.
< jenkins-mlpack> * Ryan Curtin: First pass: make things 80 characters, minor style fixes.
< jenkins-mlpack> * Ryan Curtin: When Conrad took the patch, he stripped out a lot of compatibility between
< jenkins-mlpack> iterators, so... take the tests depending on that functionality out.
< jenkins-mlpack> * Ryan Curtin: Don't add row_col_iterator support after 4.349 (currently svn trunk) since
< jenkins-mlpack> Conrad accepted our patches.
< jenkins-mlpack> * Ryan Curtin: Typing failure.
< jenkins-mlpack> * Ryan Curtin: First pass: make lines 80 characters long, tabs to spaces, and bracket surgery
< jenkins-mlpack> (or something).
< jenkins-mlpack> * andrewmw94: X tree commit
jbc_ has joined #mlpack
< jenkins-mlpack> Starting build #2097 for job mlpack - svn checkin test (previous build: FIXED)
< marcus_zoq> Hello, maybe someone has an idea and can explain this: I get different signs for pca with svd and pca with eig_sym. I've compared the results with matlab and scikit; only the first column has a different sign (using svd).
< naywhayare> marcus_zoq: I'm not sure this is an issue; a "negative" principal component is still that same principal component
< naywhayare> that is, it still describes the same one-dimensional subspace
< naywhayare> or have I misunderstood?
< marcus_zoq> No, this is right, but as pointed out by ftrovato, pca and kpca with a linear kernel should have the same result, right?
govg has quit [Ping timeout: 255 seconds]
< naywhayare> yeah, but I would say if the principal components are just negative, then it is the same result
< naywhayare> whether or not the principal components are negative is a function of the eigendecomposition implementation or the SVD implementation
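(A minimal sketch, not mlpack code, of the point above: Armadillo's eig_sym and svd recover the same principal directions, but the sign of each column is left to the implementation. The row-per-observation layout and the random test data are only for this demo; mlpack itself stores points as columns.)

    #include <armadillo>
    #include <iostream>

    int main()
    {
      arma::arma_rng::set_seed(42);
      arma::mat X = arma::randu<arma::mat>(100, 3);   // one observation per row
      X.each_row() -= arma::mean(X, 0);               // center the data

      // PCA via eigendecomposition of the covariance matrix.
      arma::vec eigval;
      arma::mat eigvec;
      arma::eig_sym(eigval, eigvec, arma::cov(X));    // eigenvalues in ascending order
      eigvec = arma::fliplr(eigvec);                  // largest component first

      // PCA via SVD of the centered data matrix.
      arma::mat U, V;
      arma::vec s;
      arma::svd(U, s, V, X);                          // columns of V are the directions

      // The directions agree in absolute value; individual columns may be flipped.
      std::cout << "||abs(eig) - abs(svd)||_F = "
                << arma::norm(arma::abs(eigvec) - arma::abs(V), "fro") << std::endl;
      return 0;
    }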
< marcus_zoq> matlab uses a sign convention on the coefficients: "the largest element in each column will have a positive sign."
< naywhayare> I don't think Armadillo's eigensolvers or SVD implementations make the same guarantee
< naywhayare> although we could add code to give the same guarantee as MATLAB, I'm not sure it's necessary for correctness
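(A rough sketch of what such a sign convention could look like; EnforceSignConvention is a made-up name and this is not part of mlpack. It flips each component so that its largest-magnitude entry is positive, roughly matching the MATLAB behaviour quoted above.)

    #include <armadillo>

    // Hypothetical helper: flip each principal component so that its
    // largest-magnitude entry has a positive sign, similar to MATLAB's
    // convention.  Not mlpack code.
    void EnforceSignConvention(arma::mat& coeff)
    {
      for (arma::uword c = 0; c < coeff.n_cols; ++c)
      {
        // Find the entry with the largest absolute value in this column.
        const arma::vec a = arma::abs(coeff.col(c));
        const arma::uword i = a.index_max();
        if (coeff(i, c) < 0)
          coeff.col(c) *= -1.0;   // negate the whole component
      }
    }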
< jenkins-mlpack> Project mlpack - svn checkin test build #2097: SUCCESS in 1 hr 30 min: http://big.cc.gt.atl.ga.us:8080/job/mlpack%20-%20svn%20checkin%20test/2097/
< jenkins-mlpack> * Marcus Edel: Use maxIteration as a template parameter.
< jenkins-mlpack> * Marcus Edel: Scale the transformed data matrix.
< jenkins-mlpack> * Marcus Edel: Center the reconstructed approximation and the kernel matrix.
< marcus_zoq> The result is just unexpected: https://urgs.org/kpca_pca_plot.pdf, but you are right, the sign shouldn't matter.
< naywhayare> where is the blue kpca-linear plot? I don't see it
< naywhayare> maybe it's exactly the same as one of the others and gets overwritten by it?
< marcus_zoq> yeah, KPCA (linear - naive) and matlab pca have the same values
< naywhayare> so PCA (mlpack) appears to have a flipped axis with respect to PCA (matlab)
< naywhayare> so I would call that, and KPCA (naive - linear), correct
< naywhayare> but I can't say whether or not the Nystroem implementation is right; because it is sampling points, it won't necessarily give exactly the same result as PCA or naive KPCA
< marcus_zoq> yeah, right, it is just an approximation
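(A small sketch of why the Nystroem method only approximates the full kernel matrix: it works from a random subset of landmark points, so the result depends on which points were sampled. A linear kernel and plain uniform sampling are assumed here purely for illustration; mlpack's Nystroem KPCA differs in its details, e.g. centering and how the sampled kernel blocks are used.)

    #include <armadillo>
    #include <iostream>

    int main()
    {
      arma::arma_rng::set_seed(1);
      const arma::uword n = 200;   // number of points
      const arma::uword m = 20;    // number of sampled landmark points
      arma::mat X = arma::randn<arma::mat>(n, 50);   // one observation per row

      arma::mat K = X * X.t();     // exact linear-kernel matrix (n x n)

      // Sample m landmarks and build the Nystroem blocks.
      arma::uvec idx = arma::randperm(n, m);
      arma::mat C = K.cols(idx);            // n x m block of sampled columns
      arma::mat W = K.submat(idx, idx);     // m x m kernel among the landmarks
      arma::mat Kapprox = C * arma::pinv(W) * C.t();

      // The error depends on which points were sampled.
      std::cout << "relative error: "
                << arma::norm(K - Kapprox, "fro") / arma::norm(K, "fro")
                << std::endl;
      return 0;
    }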
< naywhayare> marcus_zoq: thanks for taking the time to respond to #361 :)
< marcus_zoq> naywhayare: no thanks needed
< naywhayare> :)