ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
varuns has quit [Ping timeout: 250 seconds]
varuns has joined #mlpack
Suryo has joined #mlpack
< Suryo> zoq: A quick question regarding PSO in ensmallen. As per a previous discussion, the consensus was that in the most plain implementation of PSO, we should probably not include the gradient descent hybrid.
< Suryo> Since we don't have to compute the gradients within vanilla LBestPSO and just evaluate every point in the swarm, would it be okay to cast an input objective function as ArbitraryFunctionType instead of FullFunctionType?
< Suryo> Chintan's code casts the input function to FullFunctionType. I don't think that we actually need it to begin with...
< Suryo> And another general question, and this is probably really stupid, but I don't fully know how the process of licensing and authorship of code works.
< Suryo> More specifically: right now, I've read Chintan's code, tweaked it and moved it to a branch of my fork of ensmallen. So if I submit a pull request, would the author of the code be only Chintan, or only me, or both?
< Suryo> Once again, I'm sorry if this last question is really stupid but I need some insights. Thank you.
varuns has quit [Ping timeout: 268 seconds]
varuns has joined #mlpack
soonmok has joined #mlpack
Suryo has quit [Ping timeout: 256 seconds]
soonmok has quit [Remote host closed the connection]
soonmok has joined #mlpack
soonmok has quit []
soonmok has joined #mlpack
varuns has quit [Ping timeout: 250 seconds]
vivekp has quit [Ping timeout: 244 seconds]
vivekp has joined #mlpack
< jenkins-mlpack2> Project docker mlpack nightly build build #231: STILL UNSTABLE in 3 hr 31 min: http://ci.mlpack.org/job/docker%20mlpack%20nightly%20build/231/
picklerick has joined #mlpack
picklerick has quit [Ping timeout: 250 seconds]
KimSangYeon-DGU has quit [Quit: Page closed]
soonmok has quit [Remote host closed the connection]
soonmok has joined #mlpack
soonmok has quit [Remote host closed the connection]
soonmok has joined #mlpack
< zoq> Suryo: Using ArbitraryFunctionType instead of FullFunctionType makes sense.
< zoq> Suryo: I think it's fair to list Chintan and you as the authors, I see this as some sort of collaboration. Once the PR is open, we should ask him for input. Let me know what you think.
varuns has joined #mlpack
Suryo has joined #mlpack
< Suryo> zoq: Once again, thank you for your response on this. For the most part, what I've been really learning through this process is about the codebase.
< Suryo> Honestly, as a plain implementation, everything that was a part of the mlpack codebase was enough -- I didn't have to do much in a technical sense.
< Suryo> But of course, this is just a starting point and I'm already considering how to bring in constraints with PSO. I'll be gathering my ideas soon.
< zoq> Suryo: It's a really nice project to get familiar with the codebase.
< Suryo> Now that you've told me about the code authorship, I will be pushing my changes to GitHub.
< zoq> Suryo: Looking forward to reviewing/merging PSO into the repo.
< zoq> Great :)
< Suryo> But before I submit the PR, I still want to see how to get more flexibility into this...
< Suryo> For instance, PSO is something that should work for any arbitrary objective.
< Suryo> But what happens if we want to do a PSO and gradient descent hybrid? In that case, we have to make sure that the function is a differentiable function.
< Suryo> Also, do we want to have the GD hybridization as a part of PSO, or do we want to split it into two parts at a high level?
< Suryo> zoq, I think that you missed a few of my chat logs from a few days back. There were a lot of important discussions going on and I was trying to reach you through it. It was on 6th Feb.
< Suryo> So yes, a gist... At this point, the arbitrary function type is working, and I got Chintan's PSO to work within ensmallen. And I'm guessing that the only thing remaining between now and submitting the PR is writing the tests.
< Suryo> But that's not going to be a problem.
< zoq> Suryo: I think it makes sense to split the two, gradient and non-gradient.
< zoq> Ohh, I think I did miss the messages, let's see.
< Suryo> I agree. I'll take some time to understand how to incorporate different function types for different variants of PSO. So just give me a few more days. I will be submitting a PR soon.
< Suryo> Also, there were a couple of mistakes I made previously with regards to understanding how the optimum point is being returned. So don't worry about that when you read it up.
< Suryo> Since this is still quite early, and it hasn't been too long that I've been working on it, I actually don't want to rush it. Different variants of PSO might work on different function types.
< Suryo> So I want my inputs to be such that incorporating different update and initialization methods becomes easy.
< Suryo> Thank you again for your response, zoq. I'll keep you updated.
< zoq> About the particle class: agreed, ideally we can use all the nice functionality that armadillo has, so if we can avoid an extra class we should probably do that.
< zoq> On some level you can configure armadillo to use OpenBLAS, which does some parallelization; another idea might be to use OpenMP, we did this in some places, see https://github.com/mlpack/mlpack/blob/2326bf8bd0b380c17eed6b5b1c34fbfab1563774/src/mlpack/methods/reinforcement_learning/async_learning_impl.hpp#L100 for an example.
< zoq> Really like all the thoughts you put into the idea.
< zoq> Agreed, no need to rush anything.
< Suryo> Okay, so here are the things I'll be doing next. Mostly, just gathering my ideas on (i) abstraction of objectives and optimization procedures (ii) parallelization.
< Suryo> Thanks again! :)
varuns has quit [Ping timeout: 244 seconds]
< zoq> Sounds good :)
Suryo has quit [Quit: Page closed]
soonmok has quit [Remote host closed the connection]
soonmok has joined #mlpack
soonmok has quit [Remote host closed the connection]
varuns has joined #mlpack
varuns has quit [Ping timeout: 272 seconds]
soonmok has joined #mlpack
vivekp has quit [Ping timeout: 244 seconds]
soonmok has quit [Ping timeout: 250 seconds]
KimSangYeon-DGU has joined #mlpack
Prinshu_Kumar has joined #mlpack
Prinshu_Kumar has quit [Quit: Page closed]
KimSangYeon-DGU has quit [Quit: Page closed]
KimSangYeon-DGU has joined #mlpack
varuns has joined #mlpack
gitty-boi has joined #mlpack
gitty-boi has quit [Client Quit]
varuns has quit [Ping timeout: 250 seconds]
KimSangYeon-DGU has quit [Quit: Page closed]
varuns has joined #mlpack
KimSangYeon-DGU has joined #mlpack
soonmok has joined #mlpack
soonmok has quit [Ping timeout: 268 seconds]
varuns has quit [Ping timeout: 240 seconds]
varuns has joined #mlpack
soonmok has joined #mlpack
varuns has quit [Ping timeout: 268 seconds]