ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at
< zoq> FawwazMaydaGitte: just commented on the proposal.
< rcurtin[m]1> shrit: I think I've zeroed in on a bug in the cover tree recursion... not exactly certain what the fix is yet, but I think I have the cause of the infinite recursion pinpointed
< rcurtin[m]1> of course, there could be many issues :) but at least I am getting somewhere, more quickly than I expected, too
< AnushKiniGitter[> Hi @rcurtin , @zoq . I had submitted a draft proposal a few days ago on the GSoC website. Would love to hear your comments on the same.
< zoq> AnushKiniGitter[: I thought I already reviewed that one; I'll take a look in the next few minutes.
< RishabhGargGitte> Hi, I also haven't received any comments on my draft proposal other than from @jeffin143. Is it yet to be reviewed? Thanks!
< AnushKiniGitter[> @zoq Thanks, just got your comments. Will work on reflecting the same in the proposal.
gitter_vstark21[ has joined #mlpack
< gitter_vstark21[> Hello everyone, I have submitted a draft proposal for GSOC 21. Looking forward to hearing your feedback.
< Roshan[m]> Ryan Curtin @shrit Below is my proposal for GSoC. Please give some feedback on it:
< Roshan[m]> <|>
< Roshan[m]> Thanks
< rcurtin[m]1> I agree, that seems like it would be the right approach here 👍
< RishabhGargGitte> Also, I have opened a PR for it, so I would like to know: would you prefer having these long discussions there, or is it fine here on IRC?
< rcurtin[m]1> either is fine, maybe on the PR is better :)
< OleksandrNikolsk> hi ryan, can I use a visitor from ann/visitors to update the parameters of a model?
< OleksandrNikolsk> In my case I want to update parameters of a multilayer neural network, while they are stored in a N-dim vector.
< OleksandrNikolsk> Assuming that an optimizer from ensmallen is able to optimize a network, it should be possible right?
< RishabhGargGitte> > `rcurtin (` either is fine, maybe on the PR is better :)
< RishabhGargGitte> Yeah, I was thinking the same. It is easier to keep track of thoughts there. Then I will continue our discussion there :-)
< OleksandrNikolsk> (edited) hi ryan, can ... => hi, can ...
< rcurtin[m]1> Oleksandr Nikolskyy that seems like it would work; also, you could just access the parameters matrix directly from the FFN class I suppose
< zoq> OleksandrNikolsk: Yes, you can use a visitor to get the network parameters and update them as well.
< OleksandrNikolsk> cool. Does it copy them?
< rcurtin[m]1> in that case, that would be copied---use a move constructor? `x.Weight() = std::move(w);`
< rcurtin[m]1> however, on second thought, it may not be a good idea to do that
< rcurtin[m]1> the reason is that the FFN and RNN classes lay out the memory for each layer very explicitly, so, inside of a network, `x.Weight()` will correspond to a very specific memory region
< rcurtin[m]1> in this case, it would be best to work directly on `x.Weight()` instead of copying another armadillo object into it, if possible
< OleksandrNikolsk> hm, did not get the last part.
< OleksandrNikolsk> explicitly in terms of what?
< OleksandrNikolsk> so I can't get around copying?
< NippunSharmaGitt> Hey @rcurtin, can you take a look at the draft proposal I submitted and provide some feedback? There are some things I have pointed out in the proposal that may cause issues while refactoring; for example, when multiple functions named mlpackMain (one from "fit_main.cpp" and the other from "predict_main.cpp") are wrapped in the same Cython file, that causes a redefinition error. I am not able to think of what
< NippunSharmaGitt> the best solution for this can be. We could have different functions for fit and predict: for fit, mlpackMainFit, and for predict, mlpackMainPredict. For this we would have to make some changes to the mlpack_main.hpp file. What do you think about this? Any particular solution that you have in mind?
< zoq> OleksandrNikolsk: You can start with a copy, it's just not as fast as it could be, once you have something that works, we can improve it.
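To illustrate the view-vs-copy point above, here is a minimal self-contained sketch (an analogy in Python, not mlpack code; mlpack's FFN does this in C++ with Armadillo matrices): the network's parameters live in one contiguous buffer, each layer's `Weight()` is a view into it, so writing through the view reaches the network while a detached copy does not.

```python
# Analogy (not mlpack code): a contiguous parameter buffer for the whole
# network, with per-layer views into it, standing in for how FFN lays out
# each layer's weights inside one parameters matrix.
from array import array

parameters = array('d', [0.0] * 6)    # the whole network's weights
view = memoryview(parameters)
layer1 = view[0:3]                    # "Weight()" of layer 1: a view
layer2 = view[3:6]                    # "Weight()" of layer 2: a view

layer1[0] = 1.5                       # in-place update: visible to the network
print(parameters[0])                  # -> 1.5

w = array('d', layer2)                # a detached copy of layer 2's weights
w[0] = 2.5                            # ...this change never reaches `parameters`
print(parameters[3])                  # -> 0.0
```

Working directly on the view (the in-place update) is what "work directly on `x.Weight()`" corresponds to; starting with a copy works too, just slower.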
< zoq> Roshan[m]: Just left some comments on the proposal.
< zoq> Roshan[m]: Nevermind, that was another one, I'll take a look at yours next.
< zoq> Btw. did you upload the proposal to the GSoC dashboard?
< Roshan[m]> No... I was waiting for the feedback; I thought I would upload it after making the changes
< zoq> Roshan[m]: You can upload drafts to the dashboard as well.
< zoq> Roshan[m]: Makes it easier for us to track who asked for feedback.
< AnmolpreetSinghG> Hi @zoq can you have a look at draft proposal submitted by me a couple of days ago on Adding Image Quality metrics. Hope to get feedback :)
< Roshan[m]> <zoq "swaingotnochill: You can upload "> By dashboard, do you mean the official GSoC dashboard? I didn't upload it because I thought there was a limit on how many times one can upload a proposal
< zoq> Roshan[m]: There is no limit, it's the same as posting a link to the google doc here.
< zoq> AnmolpreetSinghG: I'll take a look in a moment.
< zoq> Roshan[m]: Alright, left some high level comments, thanks for putting this together!
< zoq> AnmolpreetSinghG: Commented on the proposal as well; hope what I said is helpful.
< Roshan[m]> <zoq "swaingotnochill: Alright, left s"> zoq Thanks, that was valuable. If you have a couple of minutes: I have responded to your comments, as I have doubts about some of them. It would be really helpful if you could help me clear them up.
< zoq> Roshan[m]: Yep I have some minutes.
< zoq> Roshan[m]: Done.
< Roshan[m]> Thanks zoq
< NippunSharmaGitt> @zoq can you please provide some feedback on my draft proposal too? It would be great to have some reviews, to know if there are any major additions I need to make before the deadline.
< rcurtin[m]1> Nippun Sharma (Gitter): I'll try to take a look, sorry I have not gotten to it yet
< rcurtin[m]1> note that the reason it's not a problem that every function is currently called `mlpackMain()` in the Python bindings is that each one is compiled in its own translation unit, so you don't get multiple-definition errors
< rcurtin[m]1> so, e.g., you get ``, which contains an `mlpackMain()` function, but you also get ``, which contains a different `mlpackMain()` function
< rcurtin[m]1> then from Cython we use each generated .so individually
< rcurtin[m]1> so, we could preserve `mlpackMain()` by simply splitting a `train` and `predict` binding into completely separate `.so`s just like that
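The "same name, separate .so's" idea above can be sketched by analogy in plain Python (a hedged illustration, not the actual binding machinery): each compiled binding is its own namespace, so every one can define a function named `mlpackMain` without clashing, just as two Python modules can each define a same-named function. The module names below are illustrative stand-ins.

```python
import types

# Two throwaway modules stand in for two separately compiled bindings
# (e.g. a fit .so and a predict .so). Each defines its own mlpackMain.
fit_so = types.ModuleType("random_forest_fit")
exec("def mlpackMain():\n    return 'trained model'", fit_so.__dict__)

predict_so = types.ModuleType("random_forest_predict")
exec("def mlpackMain():\n    return 'predictions'", predict_so.__dict__)

# Same symbol name, no redefinition error: each lives in its own namespace.
print(fit_so.mlpackMain())      # -> trained model
print(predict_so.mlpackMain())  # -> predictions
```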
< NippunSharmaGitt> Okay, so you are suggesting that we keep everything as-is with respect to mlpackMain, generate separate .so's, and then call these .so's from a wrapper class's methods (fit, predict, etc.)?
< NippunSharmaGitt> I can see how that can solve this issue, we would have completely different pipelines for each and then we just add a wrapper at the end
< NippunSharmaGitt> did i get it right?
< NippunSharmaGitt> [here]( I have split the random forest bindings and generated each .so separately (named random_forest_fit and random_forest_predict), and then created a RandomForestPy class to wrap these... is this what you had in mind?
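A rough sketch of the wrapper-class idea being discussed (hypothetical code, not mlpack's actual API): the two stub functions below stand in for the separately compiled `random_forest_fit` and `random_forest_predict` bindings, and the class, hand-written here, is what would ideally be auto-generated.

```python
# Stubs standing in for two separately compiled Cython bindings;
# their names and signatures are illustrative, not mlpack's real API.

def random_forest_fit(training, labels, num_trees=10):
    # Stub: a real binding would train a forest and return a model object.
    return {"num_trees": num_trees, "n_samples": len(training)}

def random_forest_predict(model, test):
    # Stub: a real binding would run the trained model on `test`.
    return [0] * len(test)

class RandomForestPy:
    """Wrapper exposing a scikit-learn-style fit/predict interface
    over the two single-purpose bindings."""

    def __init__(self, num_trees=10):
        self.num_trees = num_trees
        self.model = None

    def fit(self, X, y):
        self.model = random_forest_fit(X, y, num_trees=self.num_trees)
        return self

    def predict(self, X):
        return random_forest_predict(self.model, X)

rf = RandomForestPy(num_trees=5).fit([[0.0], [1.0]], [0, 1])
print(rf.predict([[0.5]]))  # -> [0]
```

The wrapper only stores the model handle and delegates; that keeps each binding in its own pipeline while the user sees one class.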
< jonpsy[m]> zoq Hey, sorry for the delay. College assignments came up, so I couldn't address the comments, but I am making progress. Can I request one last review once I've fixed it? I'll ping you :D
< jonpsy[m]> Also, styled the NSGA2 PR.
< jonpsy[m]> traits PR is giving me a little trouble (nothing I can't fix) but I'll ping you in the PR when it's done.
< zoq> jonpsy[m]: no worries
< OleksandrNikolsk> zoq thx
< rcurtin[m]1> Nippun Sharma (Gitter): yeah, that seems reasonable, but ideally we wouldn't write the RandomForestPy class by hand---we would manage to auto-generate it too 😃
< NippunSharmaGitt> Yes, I understand; this is just for demonstration. We would have to write a function like PrintPYX() which prints this code... I was thinking of including this code in my proposal just as a proof of concept... will that be okay? I can also try to write a wrapper for Random Forest in other languages, just for demonstration... would that be a good idea?
< rcurtin[m]1> yeah, of course, that is just fine as a proof of concept :) it's probably a good idea for your proposal, though, to think through how you might automatically generate that class that holds both the bindings 👍️
< NippunSharmaGitt> Yes, I will spend some time to think about it and write my ideas in the proposal. Thanks for clearing up these doubts😀.
< rcurtin[m]1> 👍️
< jonpsy[m]> <zoq "jonpsy: no worries"> Thanks! Fixed both of them. Review them whenever :). I was thinking we should port @favre49's indicator code. This will help me in my test_function_tools.hpp PR.
< NippunSharmaGitt> @rcurtin I have added a wrapper for Random Forest in Julia similar to what I had shown for Python, I have added it in my proposal, you can take a look there. Hopefully I did it right, as it was working on my local system.
< OleksandrNikolsk> Hi, I see no tanh in the ann layers, would you mind if I add it and make a PR?
< zoq> OleksandrNikolsk: It's implemented as an activation function and uses the base layer.
< OleksandrNikolsk> okay, I see.
< OleksandrNikolsk> how can I connect an activation function to a linear layer?
< zoq> Either add BaseLayer<TanhFunction> after the Linear layer, or you can use TanHLayer instead which is an alias.