ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
ayesdie has joined #mlpack
< ayesdie> rcurtin: for the `LinearSVM`, I was thinking of adding the bias term `b_i` similarly to the way it has been added to `SoftmaxRegression`, using a boolean parameter `fitIntercept`. Would that be fine?
< ayesdie> Also, for the test that has a tolerance value of `1e-2`: reducing that value sometimes causes the tests to fail.
< rcurtin> yeah, I think that's totally reasonable
< rcurtin> I feel like 0.01% is too large a tolerance; see my comment about the gradient implementation... there could be an error
< ayesdie> Ah, ok. I'll see what the issue might be.
govg has joined #mlpack
bhavya01 has joined #mlpack
vivekp has quit [Ping timeout: 255 seconds]
vivekp has joined #mlpack
bhavya01 has quit [Ping timeout: 255 seconds]
chaithu has joined #mlpack
chaithu has quit [Client Quit]
bhavya01 has joined #mlpack
< Soonmok> Hi! I'm trying to add a GAN application on the MNIST dataset in mlpack/models, but it couldn't pass the Travis CI build because the code in mlpack/models hasn't been updated to use the ensmallen.hpp file. So I updated it and added ensmallen to the build process (CMakeLists.txt). I sent a pull request, but Travis said: "The build phase is set to "MSBuild" mode (default), but no Visual Studio project or solution files were found in the root
< Soonmok> directory. If you are not building Visual Studio project switch build mode to "Script" and provide your custom build command."
< Soonmok> Should I add something to CMakeLists.txt for Visual Studio users?
< Soonmok> my pull request is here [ https://github.com/mlpack/models/pull/22 ]
travis-ci has joined #mlpack
< travis-ci> Soonmok/models#4 (master - 2dec6a6 : Soonmok): The build is still failing.
travis-ci has left #mlpack []
bishwa has joined #mlpack
bishwa has quit [Client Quit]
< jenkins-mlpack2> Project docker mlpack nightly build build #255: UNSTABLE in 3 hr 29 min: http://ci.mlpack.org/job/docker%20mlpack%20nightly%20build/255/
< ShikharJ> Soonmok: It could be an issue with the Travis script for the models repository. Travis is primarily used for Linux builds, and Appveyor for MS builds.
< Soonmok> Okay ! I will look it up!
< Soonmok> Thank you for your reply
< Soonmok> I have another question: are the ensmallen headers going to be added into /src/mlpack/core/optimizers?
< ShikharJ> You'll have to wait for Marcus/Ryan to reply, for queries regarding ensmallen :)
< ShikharJ> rcurtin: I was thinking of maybe opening an issue for minimizing the use of casts, and using C++ style casts across the codebase, wherever applicable? What do you think?
< ShikharJ> zoq: I'd like your input on that as well?
bhavya01 has quit [Ping timeout: 255 seconds]
< Soonmok> Thank you for letting me know it !
ayesdie has quit [Quit: Connection closed for inactivity]
abhiyad has joined #mlpack
abhiyad has quit [Quit: Page closed]
KimSangYeon-DGU has quit [Ping timeout: 256 seconds]
ahmedmaher has joined #mlpack
aman_p has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> Soonmok/models#6 (master - c0c6422 : Soonmok): The build has errored.
travis-ci has left #mlpack []
lozhnikov has quit [Quit: ZNC 1.7.1 - https://znc.in]
ayesdie has joined #mlpack
KimSangYeon-DGU has joined #mlpack
saksham189 has joined #mlpack
aman_p has quit [Ping timeout: 255 seconds]
travis-ci has joined #mlpack
< travis-ci> Soonmok/models#7 (master - af9ef5b : Soonmok): The build was fixed.
travis-ci has left #mlpack []
aman_p has joined #mlpack
ronit_ has joined #mlpack
rajmon has joined #mlpack
rajmon has quit [Client Quit]
aman_p has quit [Ping timeout: 244 seconds]
KimSangYeon-DGU has quit [Quit: Page closed]
aman_p has joined #mlpack
ayesdie has quit [Quit: Connection closed for inactivity]
lozhnikov has joined #mlpack
rajiv_ has joined #mlpack
< rajiv_> zoq: When I add Dense<> to model->Add<>, I get the following output in the build and it fails: https://pastebin.com/xPA00QCv Am I missing some header file or some definition?
rajiv_ has quit [Client Quit]
i8hantanu has joined #mlpack
saksham189 has quit [Ping timeout: 256 seconds]
bhavya01 has joined #mlpack
ayesdie has joined #mlpack
riaash04 has joined #mlpack
< riaash04> Hi, I was writing tests for the isomap PR. While checking whether the output of classical multidimensional scaling is equal to MATLAB's cmdscale, I found there are very slight differences in the precision of matrix multiplications and other operations' output (107.999998 in MATLAB vs. 108.00 in this).
< riaash04> Due to this, the algorithm sometimes leaves out one dimension in the final output (as the eigenvalues come out around 1.89e-16), although that dimension has values close to 0 (like 0.00001). Is this reasonably equal? Or can I do something so that the precision matches?
< riaash04> My implementation for one example input gives as output 1 dimension whose values are the same as the first dimension of MATLAB's output, but it also has another dimension with values very close to 0.
< ayesdie> rcurtin: hey there, you suggested I use a subset of the dataset for the evaluation of the loss in LinearSVM, but for that, I also need to use a subset of `groundTruth`, which is an `sp_mat`, and using `cols()` on a sparse matrix is turning everything to `0`. Do I need to convert `groundTruth` to `arma::mat`?
< ayesdie> Making such conversions on every call of `Evaluate()` and `Gradient()` may slow things down... (not completely sure about this)
ayesdie has quit [Ping timeout: 256 seconds]
ronit_ has quit [Quit: Connection closed for inactivity]
ayesdie has joined #mlpack
bhavya01 has quit [Quit: Ex-Chat]
i8hantanu has quit [Quit: Connection closed for inactivity]
< rcurtin> ayesdie: I'll respond in the PR when I have a chance
< rcurtin> riaash04: right, so there are things like "BOOST_REQUIRE_CLOSE()" that can help deal with small floating point issues
< rcurtin> but based on what you described (and I have not reviewed the code or the algorithm recently) it sounds like your implementation is working correctly
< rcurtin> Soonmok: the current CMake configuration downloads ensmallen if it isn't available on the system, and installs into build/deps/ensmallen-*/
< rcurtin> CMake will also make sure that <ensmallen.hpp> is in the compiler include search path, so anywhere in mlpack "#include <ensmallen.hpp>" should work fine
< rcurtin> ShikharJ: no issue from my end; what's the compelling reason to switch from C-style casts?
< rcurtin> Soonmok: sorry, I only described the main repository. there is another PR open in the models repository to switch to using ensmallen correctly, but it's not 100% done yet
< ayesdie> Alright, I'll keep on adding changes.
< riaash04> rcurtin: but since the dimensionalities of the matrices are not the same, BOOST_REQUIRE_CLOSE() will not pass (I think). So should I include this test? Maybe we can discuss this after you review. I'll try to add other tests until then.
riaash04 has quit [Quit: Page closed]
bishwa has joined #mlpack
vinayak has joined #mlpack
suji has joined #mlpack
suji has quit [Client Quit]
< vinayak> Hello everyone, I am Vinayak Tyagi, a final-year Computer Science Engineering student. I am interested in contributing to or learning more about the following topics: 1. Neuroevolution 2. Reinforcement Learning 3. String Processing Utilities. I would love to know where to start, especially with these topics. Thanks and regards, Vinayak Tyagi
vinayak has quit [Quit: Page closed]
omarsgalal has joined #mlpack
AyushS has joined #mlpack
< AyushS> Hello, can someone help me get started?
< AyushS> Thanks a lot @zoq
AyushS has quit [Ping timeout: 256 seconds]
omarsgalal has quit [Quit: Page closed]
rajiv_ has joined #mlpack
< rajiv_> zoq: Yes, I have added it to layer_types.hpp, you can see here: https://github.com/Rajiv2605/mlpack/blob/master/src/mlpack/methods/ann/layer/layer_types.hpp
rajiv_ has quit [Client Quit]
bishwa has quit [Quit: Page closed]
Nisarg has joined #mlpack
< zoq> rajiv_: It's not part of 'using LayerTypes = boost::variant<...'
< Nisarg> Hello
Nisarg has quit [Client Quit]
Nisarg has joined #mlpack
Nisarg_ has joined #mlpack
< zoq> Nisarg_: Hello there!
ayesdie has quit [Quit: Connection closed for inactivity]
aman_p has quit [Ping timeout: 240 seconds]
Nisarg has quit [Ping timeout: 256 seconds]
< Nisarg_> Hello, I am interested in the Particle Swarm Optimisation project. As per my understanding of the project, we have to create a class, implement the PSO algorithm, and make it functional for all the algorithms. As a start, would it be good to begin by implementing linear regression with PSO and sending a pull request for it?
Nisarg_ has quit [Quit: Page closed]
< zoq> Nisarg_: There is already an open PR for a basic implementation, but there are plenty of different variants; I think adding another variant or extending an existing optimizer is a good way to get familiar with the codebase.
Suryo has joined #mlpack
< Suryo> zoq: I've found the emails you were talking about. I'll go through them in detail today.
ayesdie has joined #mlpack
< Suryo> I'm kind of inclined towards following the same paradigm for constrained optimization as in the paper of mine that I shared with you earlier, but I'll weigh various factors and update you
Suryo has quit [Client Quit]
ahmedalamir has joined #mlpack
ahmedmaher has quit [Read error: Connection reset by peer]
< rcurtin> Suryo: do note that there is also some support for constrained optimization in ensmallen... we should try and keep the API the same if possible
rajat has joined #mlpack
rajat has quit [Client Quit]
aman_p has joined #mlpack
aman_p has quit [Ping timeout: 244 seconds]
Yoda360 has joined #mlpack
Yoda360 has quit [Client Quit]
ayesdie has quit [Quit: Connection closed for inactivity]
SinghKislay has joined #mlpack
< SinghKislay> Hi
< zoq> SinghKislay: Hey there!
< SinghKislay> I'm Kislay Kunal Singh; I'm interested in GSoC 2019 and I want to contribute. I have a few questions I would like to ask.
< zoq> Sure go ahead.
< SinghKislay> Each class in the layers API has a forward method, a backward method, and a gradient method. I'm confused as to how exactly to use them. I have seen a code example/tutorial on the mlpack website of an FNN, in which an FNN model is initialized and the model has add methods. I'm trying to implement ResNet, so I want control over the forward flow. Is there an implementation of the layers API you can point me to?
< zoq> SinghKislay: http://mlpack.org/docs/mlpack-git/doxygen/anntutorial.html could be helpful here, also you could take a look at the linear layer for a more straightforward example: https://github.com/mlpack/mlpack/blob/master/src/mlpack/methods/ann/layer/linear_impl.hpp
< SinghKislay> And also, can I contribute this to the mlpack/models repository if I get it right (accurate :))? I mean to ask: is there already an implementation? I would like to contribute this before 27th March. Thank you.
< zoq> SinghKislay: Sure, would be great to have a working example.
< SinghKislay> Thank you for your time sir.
SinghKislay has quit [Quit: Page closed]
< zoq> rcurtin: Maybe you can tell right away if I'm doing something stupid here: if I use "mlpack_fastmks -r optdigits.csv -v -k 10 -K hyptan" it returns a segmentation fault, but only for the hyptan kernel and only if "-k" is set.
< rcurtin> strange... but that looks like a correct invocation to me
< rcurtin> want to file a bug report? it seems like a nice starter issue for someone :)
< rcurtin> at the very least, even if that command were incorrect, it should give an error, not a segfault
Suryo has joined #mlpack
< Suryo> rcurtin: oh yeah, absolutely. But I was talking more from an algorithm point of view than an API point of view.
Suryo has quit [Client Quit]
< rcurtin> Suryo: ah ok. sounds good