ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
ImQ009 has joined #mlpack
cjlcarvalho has quit [Ping timeout: 250 seconds]
akhandait has joined #mlpack
< akhandait> zoq: In the Sequential object, the destructor only deletes the layers in the Sequential object if `model == false`. Could you explain why it is this way? When we use the Sequential object with `model == true`, the memory of the layers within gets leaked.
< akhandait> zoq: Should we add a public function to the Sequential object which allows us to manually delete all the layers, since the destructor won't do it? That way, we won't have to change the recurrent layer implementation.
ShikharJ_ has joined #mlpack
ImQ009 has quit [Read error: Connection reset by peer]
ShikharJ_ has quit [Remote host closed the connection]
davida has joined #mlpack
ShikharJ_ has joined #mlpack
< jenkins-mlpack2> Project docker mlpack nightly build build #149: STILL UNSTABLE in 6 hr 52 min: http://ci.mlpack.org/job/docker%20mlpack%20nightly%20build/149/
ShikharJ_ has quit [Ping timeout: 246 seconds]
ShikharJ_ has joined #mlpack
ShikharJ_ has quit [Ping timeout: 246 seconds]
ShikharJ_ has joined #mlpack
< jenkins-mlpack2> Project mlpack - git commit test build #64: STILL UNSTABLE in 33 min: http://ci.mlpack.org/job/mlpack%20-%20git%20commit%20test/64/
< jenkins-mlpack2> noreply: Merge pull request #1583 from rcurtin/cmake-fixes
ShikharJ_ has quit [Ping timeout: 245 seconds]
ShikharJ_ has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#5657 (master - b5d3720 : Ryan Curtin): The build has errored.
travis-ci has left #mlpack []
ShikharJ_ has quit [Quit: Leaving]
< jenkins-mlpack2> Project mlpack - git commit test build #65: STILL UNSTABLE in 33 min: http://ci.mlpack.org/job/mlpack%20-%20git%20commit%20test/65/
< jenkins-mlpack2> noreply: Merge pull request #1591 from rcurtin/travis-keyserver
< zoq> rcurtin: Do you think anonymous users will be able to see the build status again if we restart jenkins? The option is enabled.
ShikharJ_ has joined #mlpack
< zoq> akhandait: It's because one layer could be held by multiple layers, so there is not a single destruction chain.
< zoq> You could set the model parameter and call `delete` on the object to delete the layers.
< zoq> Isn't that the same? Maybe I missed something.
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#5658 (master - 7e05c30 : Ryan Curtin): The build passed.
travis-ci has left #mlpack []
< akhandait> zoq: To be clear, are you saying that we can set `model = true` and then delete the layers that we added to the sequential object individually?
< rcurtin> zoq: let's try it and see. thanks for looking into it :)
jenkins-mlpack2 has quit []
jenkins-mlpack2 has joined #mlpack
< rcurtin> zoq: looks like no success, I'm still not able to look at jobs anonymously
< rcurtin> the plane internet is too bad for me to try and debug further
< zoq> akhandait: Right, I agree this is not the best option.
< akhandait> zoq: Yeah, we can't do that when we don't maintain pointers to those layers and directly add them to the Sequential object like this:
< akhandait> Sequential<>* seq = new Sequential<>();
< akhandait> seq->Add<Linear<> >(a, b);
< zoq> right
< akhandait> Now we have no way to free the memory of that Linear layer
< akhandait> I think we can call the delete visitor on the Sequential layer for that, but I am not sure
< zoq> I have to think about that, but if you'd like to test it out with a couple of examples, please feel free :)
< akhandait> Okay, I will try and get back.
< zoq> Sounds good to me.
< akhandait> zoq: Do you think the Transposed conv PR is ready to be merged? #1493
< zoq> akhandait: Will take a look at the PR, later today.
< akhandait> zoq: Sure, whenever you are free
davida has quit [Read error: Connection reset by peer]
sayan_ has joined #mlpack
sayan_ has quit [Quit: Page closed]
< ShikharJ_> akhandait: I would appreciate it if we could wait a bit on that. Though I'm confident about most aspects of the PR, there are some things I wish to test out before we merge, if that's okay with you?
ShikharJ_ has quit [Remote host closed the connection]
davida has joined #mlpack
akhandait has quit [Quit: Connection closed for inactivity]