ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< shrit[m]>
rcurtin, is there any mlpack svg logo available?
< ka_shGitter[m]>
Hi everyone, I'm trying to solve a sparse decomposition problem using mlpack. So far I've been trying to solve it with mlpack::sparse_coding::SparseCoding but am unable to get the desired results. I want to solve the equation X ~= coeff * D, where X dim(1 x n) is the input data, D dim(m x n) is another input (the dictionary) against which coeff is solved, and coeff dim(1 x m) is the output of the solver.
< ka_shGitter[m]>
The output codes I got are a vector of dimension `nAtoms`, which is expected, with values ranging between (-1, 1); however, it gave 0.5 as the first element and the rest of the values are not at all close to zero
< rcurtin>
ka_shGitter[m]: I don't follow why you are doing step 3; shouldn't you just be able to pass the data in directly?
< rcurtin>
also, with sparse coding, the dictionary is not necessarily sparse---instead, each data point is encoded as a sparse linear combination of elements in the dictionary
< ka_shGitter[m]>
Step 3 is more of a test for me; in my application I will be passing further unknown Y(nPoints, 1) matrices to get the codes(nAtoms, 1) for the given dictionary
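For reference, a rough sketch of that encode-against-a-given-dictionary workflow with mlpack's `SparseCoding` (the header, constructor, `Dictionary()`, and `Encode()` names are to the best of my recollection of mlpack 3.x; the sizes, `lambda1` value, and matrix names are illustrative):

```cpp
#include <mlpack/core.hpp>
#include <mlpack/methods/sparse_coding/sparse_coding.hpp>

using namespace mlpack::sparse_coding;

int main()
{
  const size_t dims = 136;   // dimensionality of each point
  const size_t nAtoms = 10;  // number of dictionary atoms

  // The given dictionary D: one atom per column (dims x nAtoms).
  arma::mat D(dims, nAtoms, arma::fill::randn);

  SparseCoding sc(nAtoms, /* lambda1 */ 0.1);
  sc.Dictionary() = D;  // use the supplied dictionary instead of training one

  // Encode a new point Y: a single column with the same dimensionality as the atoms.
  arma::mat Y(dims, 1, arma::fill::randn);
  arma::mat codes;
  sc.Encode(Y, codes);  // codes is nAtoms x 1 and should be mostly zeros

  return 0;
}
```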
rishishounakGitt has joined #mlpack
< rishishounakGitt>
could someone suggest issues for a newcomer?
< rcurtin>
ka_shGitter[m]: also, maybe you need a larger lambda1 parameter?
< ka_shGitter[m]>
I've tried values as large as 100.0 but the results aren't as expected; with 0.1 I'm getting 0.65700 and with 100.0 I'm getting 0.66011
< rcurtin>
what do the 0.65700 and 0.66011 represent?
< ka_shGitter[m]>
(edited) ... getting 0.66011 => ... getting 0.66011 for first atom
< rcurtin>
ok, I see, and you are saying that the rest of the elements in the encoded vector also have all nonzero values?
< ka_shGitter[m]>
Yup
< rcurtin>
can you provide a minimal reproducible example?
< rcurtin>
like some code that I can run on my end to see what happens
< ka_shGitter[m]>
Sure
< ka_shGitter[m]>
(posted a code snippet beginning with `double ...`; the edited message is truncated in the log)
Earendil_14 has quit [Ping timeout: 260 seconds]
< rcurtin>
ka_shGitter[m]: thanks, let me try it; I ran the sparse coding tests and ensured that at least for those test cases, the result given was sparse
< rcurtin>
I didn't tune any parameters; I just ensured that the number of rows in `dataset` was the dimensionality of the data (31), and the number of columns was the number of points (136)
< rcurtin>
sorry I got that backwards! the dimensionality is 136, and the number of points is 31
< rcurtin>
so `dataset` has size `136x31` to give those results, and `Y` has size `136x1`
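Spelled out, that layout would look roughly like this (same API assumptions as the sketch above; only the matrix sizes come from the discussion):

```cpp
// One point per column: 136 rows (the dimensionality) and 31 columns (the points).
arma::mat dataset(136, 31, arma::fill::randn);

mlpack::sparse_coding::SparseCoding sc(/* atoms */ 10, /* lambda1 */ 0.1);
sc.Train(dataset);   // learn the dictionary from the 31 points

// A query point is a single column in the same 136-dimensional space.
arma::mat Y(136, 1, arma::fill::randn);
arma::mat codes;
sc.Encode(Y, codes); // codes has size atoms x 1
```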
< rcurtin>
did I overlook something in the example? that seems like it is performing correctly
< ka_shGitter[m]>
The code looks perfect to me; I'll give it a try and get back in an hour
Earendil_14 has joined #mlpack
Earendil_14 has left #mlpack []
< ka_shGitter[m]>
Thank you very much for the help, it worked for me as well :)
< rcurtin>
awesome, I guess the matrices were just transposed or something?
< ka_shGitter[m]>
True; also, I was loading it directly from the application interface, which was a bit messed up, so I didn't notice that the dimensions were wrong
< rcurtin>
ok, great that you got it worked out though
< rcurtin>
but even though it had the wrong dimensions, it still ran correctly?
< rcurtin>
(I'm just trying to figure out if there is anything to fix here)
< rcurtin>
(if SparseCoding let you run on invalid data and still gave a result, it may be that we can issue a warning in such a situation to have saved you the debugging time)
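Purely as a sketch of the idea (this is not existing mlpack code), such a check at the top of `SparseCoding::Encode()` might look like:

```cpp
// Hypothetical dimensionality check inside SparseCoding::Encode().
if (data.n_rows != dictionary.n_rows)
{
  mlpack::Log::Warn << "SparseCoding::Encode(): data dimensionality ("
      << data.n_rows << ") does not match dictionary dimensionality ("
      << dictionary.n_rows << ")!" << std::endl;
}
```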
ImQ009 has quit [Quit: Leaving]
< NippunSharmaGitt>
Hi all, if anyone is free, can you please review #2704?
pradkrish has joined #mlpack
pradkrish has quit [Remote host closed the connection]
pradkrish has joined #mlpack
< abernauer[m]>
Technical Python Assessment with EY went well today.
pradkrish has quit [Ping timeout: 245 seconds]
< rcurtin>
abernauer[m]: awesome, hoping for the best