ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< rcurtin>
I don't know if anyone else was watching the SpaceX launch, but there are now four more people in space... really cool to see!
< zoq>
yep
< zoq>
stream still open :)
< zoq>
so cool
< rcurtin>
:)
< rcurtin>
awesome that they stuck the drone ship landing perfectly
< rcurtin>
although, these days, that seems to be the norm!
< zoq>
yeah, unfortunately it is super dark, so I couldn't see much.
ImQ009 has joined #mlpack
francode_7 has joined #mlpack
francode_7 has quit [Remote host closed the connection]
francode_7 has joined #mlpack
francode_7 has quit [Remote host closed the connection]
jenkins-mlpack2 has joined #mlpack
< rcurtin>
okay... got jenkins back online and restored from backup
< rcurtin>
it seems like everything is restored correctly, but I will save that backup .tar.gz just in case this ever happens again :)
< rcurtin>
I still have no idea what went wrong
jenkins-mlpack2 has quit [Remote host closed the connection]
jenkins-mlpack2 has joined #mlpack
nalinwadhwa has joined #mlpack
ImQ009 has quit [Quit: Leaving]
nalinwadhwa has quit [Remote host closed the connection]
gtank___ has quit [Read error: Connection reset by peer]
gtank___ has joined #mlpack
taqd has joined #mlpack
< taqd>
Hello! New to mlpack, so please excuse my noviceness :) First, I've been playing around with mlpack_perceptron, and while I can get it to run, take in my data, train, and make predictions, the model file it saves (using --output_model_file) never changes. By all appearances it seems to be saving the model to file, and it does create what looks to be a proper model file, but all the "elem"s are always 0.0, which I assume should
< taqd>
be the model weights... I tried using mlpack_preprocess_scale to see if that helped anything, but it did not; it did, however, save a model file as one would expect. I'm using the most recent source, built on Ubuntu. And, just to check, data input should be attributes-on-columns with instances/examples-on-rows, right?
< taqd>
Also, does mlpack expect a header row in the files? With --verbose, it looks like it is trying to use a 1x0 labels matrix. Thanks for any help :)
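A typical invocation of the mlpack_perceptron CLI being discussed here looks roughly like the following (the filenames are placeholders):

    mlpack_perceptron --training_file train.csv --labels_file labels.csv \
                      --output_model_file perceptron_model.xml --verbose

For the data layout: mlpack's loaders expect one instance per row and one attribute per column in the file on disk, and transpose the data internally into Armadillo's column-major representation.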
< zoq>
taqd: Indeed, the values you see should be the weights, and if the model is trained correctly I'd expect at least some nonzero values.
< zoq>
Do you only use the CLI, or the C++ interface as well?
< zoq>
If you use the C++ interface, maybe you can print the weights before saving, using model.Weights().print();
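For anyone following along, here is a minimal C++ sketch of that suggestion, assuming mlpack 3.x headers and namespaces; the filenames and class count are placeholders:

    #include <mlpack/core.hpp>
    #include <mlpack/methods/perceptron/perceptron.hpp>

    int main()
    {
      // Load the training data; data::Load() transposes by default, so each
      // column of 'dataset' holds one instance.
      arma::mat dataset;
      mlpack::data::Load("train.csv", dataset, true);

      // Load the labels as a row of class indices (0, 1, ...).
      arma::Row<size_t> labels;
      mlpack::data::Load("labels.csv", labels, true);

      // Train the perceptron (assuming two classes here) and print the learned
      // weights before saving, as suggested above.
      mlpack::perceptron::Perceptron<> model(dataset, labels, 2);
      model.Weights().print("weights:");

      return 0;
    }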
< taqd>
I've just been using the CLI for now. Not sure what changed, but the weights are changing now; model loads are segfaulting, though... I'll figure that out. My last question (for now) is: does adding -O3 optimizations to the build lead to any issues?
< zoq>
Usually no; the default mlpack release build uses -O3.
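For context, the standard out-of-source release build, which picks up -O3 through CMake's Release flags, looks roughly like this from the mlpack source tree:

    mkdir build && cd build
    cmake -DCMAKE_BUILD_TYPE=Release ../
    make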
< zoq>
I can test the perceptron CLI later; maybe I can reproduce the issue.
< taqd>
No no, I am sure it is my own novice incompetence still :) Now that I've remembered the --verbose flag, I have more to work with. Thank you for your help, though! Are you a maintainer?
< zoq>
Yep. There are some tests for the perceptron, including tests for the CLI, but it looks like there is none that checks the weights.
< taqd>
Cool, I would like to contribute eventually, but it'll take me some time to learn the codebase. I really love what I've seen so far; the organizational structure and project ethos really speak to me :) <3