ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/
lupulo has quit [Ping timeout: 260 seconds]
< rcurtin> zoq: okay, I looked at the STB headers more closely... I was surprised to see that the headers actually contain the implementations, *not* marked inline
< rcurtin> then I realized those are in a section that is only compiled if, e.g., the STB_IMAGE_IMPLEMENTATION macro is defined
< rcurtin> so, this means that we can only include <stb_image.hpp> with STB_IMAGE_IMPLEMENTATION defined in *one* place in the entire mlpack codebase
< rcurtin> zoq: I opened https://github.com/mlpack/mlpack/pull/2355, let me know what you think
ImQ009 has joined #mlpack
togo has joined #mlpack
< chopper_inbound4> zoq: Can you please provide some reference to the JacobianTest implemented in test/ann_test_tools.hpp? I don't quite understand it.
< naruarjun[m]> <chopper_inbound4 "zoq: Can you please provide some"> Basically this formula
< naruarjun[m]> It computes that and compares the value to the output given by Backward
< naruarjun[m]> Both should be relatively close
< naruarjun[m]> h is the perturbation
< chopper_inbound4> ohh....thanks naruarjun
< kartikdutt18Gitt> Hey zoq, ensmallen does work for training both LSTMs and MNIST. In the new notebook, I was able to train as well as get the final predictions for both the LSTM and MNIST. However, the MSE still gives NaN in the LSTM example because one or two datapoints in the predictions have NaNs. A solution that sometimes works was removing the dropout layer and using a different initialization (He init).
lupulo has joined #mlpack
lupulo has quit [Read error: Connection reset by peer]
ImQ009 has quit [Read error: Connection reset by peer]
< zoq> kartikdutt18Gitt: Can you export the notebook?
< PrinceGuptaGitte> I added it to CMakeLists.txt
< zoq> And you also have to include the visitor, but other than that nothing else is needed.
< PrinceGuptaGitte> I also included it.
< PrinceGuptaGitte> (edited) I also included it. => I included it too.
< zoq> And I guess it has a similar interface to the existing visitors?
< PrinceGuptaGitte> Yes, it is pretty much a copy of OutputParameterVisitor
< zoq> Hm, you might have to push the code to your fork; if the interface is the same, there is something else we are missing.
< PrinceGuptaGitte> I did it. It's available here https://github.com/prince776/mlpack/commits/redesign-layer-names
< PrinceGuptaGitte> changes are in last 2 commits
< zoq> PrinceGuptaGitte: Looks okay to me, let me build the code.
< zoq> PrinceGuptaGitte: You have to make sure the layer implements the Name() function before you can call it.
< zoq> You can check the other visitors for some examples.
< PrinceGuptaGitte> Oh. That is the reason
< PrinceGuptaGitte> Thanks.
< PrinceGuptaGitte> Hello @zoq , I modified the visitor to check whether the function exists, but it still terminates with the same error :(
< zoq> PrinceGuptaGitte: Maybe network[0] instead of network.begin() works?
< PrinceGuptaGitte> Yes, this was the problem. It should've been network[i] or network.front(); .begin() returns an iterator.
< zoq> PrinceGuptaGitte: right
< PrinceGuptaGitte> Thanks for all the help
< AnjishnuGitter[m> In PR #2345 , which adds a Loss function, one of the tests for Linear SVM failed randomly (it is not related to the work in the PR). The error message was ```fatal error: in "LinearSVMTest/LinearSVMParallelSGDTwoClasses": difference{0.0245902} between testAcc{0.97599999999999998} and 1.0{1} exceeds 2%```. Just reporting this here for further discussion.
< himanshu_pathak[> Anjishnu (Gitter): It's unrelated to your PR; you can ignore it.
togo has quit [Quit: Leaving]