ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
ibtihaj has quit [Ping timeout: 260 seconds]
kyrre has quit [Quit: Connection closed for inactivity]
< Param-29Gitter[m]>
Hello, I wanted to work on "Profiling for Parallelization" for GSoC 2020. Are there any specific bugs, resources, or tasks I should work on to understand the requirements of this project? Also, if I could have the email address of this project's mentor, that would be really helpful.
< zoq>
Param-29: Hello, at the moment we don't have an open issue that is related, but what you could do is search the codebase and see if you can find a method that you think could be improved, e.g. by using OpenMP.
< zoq>
Param-29: About the mail, we like to communicate via public channels; that way the whole community can participate in the discussion, provide feedback, etc.: https://www.mlpack.org/mailman/listinfo/mlpack
AryanJ has joined #mlpack
AryanJ has quit [Remote host closed the connection]
< Param-29Gitter[m]>
@zoq I am just a beginner with OpenMP, so if there are any specific algorithms or first issues I could work on, please let me know.
ImQ009 has joined #mlpack
hunting-party104 has joined #mlpack
vancao82 has joined #mlpack
< hunting-party104>
Hi everyone, I was looking at `sequential.hpp` and `sequential_impl.hpp` and noticed that the constructor takes an argument `const bool model`. Could anyone please elaborate on what this argument does? The comment in the code, "Expose the all network modules.", isn't helping much. What does that statement mean?
< hunting-party104>
Does it mean that if it is set to false, the entire sequential layer itself acts as a single layer of some model?
< zoq>
Param-29: As an example you could look into the regularized SVD function; the SIMD code Ryan wrote for the decision tree might be interesting as well. Check the gini_gain.hpp file for some more details.
< zoq>
hunting-party104: Right. If true, the Sequential class acts as a layer and will expose every layer that was added, so e.g. the weight initialization will be done by the FFN class, since Model() will return all layers in this case. If set to false, none of the layers are exposed. This can be helpful if a layer is used multiple times and e.g. the weight initialization is already handled. Hope that makes sense.
< hunting-party104>
<zoq "hunting-party10: Right, if true "> Exposed as in all layers and their weights can be seen? So if a layer is being used again, will it just copy the pre-existing one?
< zoq>
We don't copy the weights or layers. Here is an example: LinearLayer layerA; SequentialLayer(true) layerB; layerB.Add(layerA); model.Add(layerA); model.Add(layerB).
< zoq>
In this case we added layerA to the model twice: directly via model.Add(layerA), and another time via model.Add(layerB). If model = true, the FFN class will go through the model and first initialize layerA and then layerB; since layerB wraps and exposes layerA, the weights are set again. It's the same (shared) layer, so this will change/overwrite the first initialization.
vancao82 has quit [Remote host closed the connection]
< hunting-party104>
zoq: Correct me if I'm wrong: when an FFN is created with the above configuration, it first sees the directly added layer A, then sees layer B, which contains layer A, and that indirect layer A points to the first one created?
< hunting-party104>
I'm not able to understand why this is helpful, since after training both layer A's would have different weights anyway, so won't they just overwrite each other?
< zoq>
They don't have different weights; they have the same weights, as initialization happens only once, before training. The configuration above isn't useful, just an example of what model is for. A useful example would be if you'd like to branch out; one example is the Inception model, where you would use multiple layers of the Sequential type, and you only want to initialize each layer once.
< Nakul[m]>
Hey zoq, I haven't changed anything in softmax_regression_test.cpp but it fails 2 tests. I'm just curious: was it fine before, or did I make some mistake? Since the callback is finally fixed :)
< zoq>
Nakul[m]: Will check later.
< hunting-party104>
zoq: I see, I'll check out the Inception model. Thanks a lot!
< hunting-party104>
At least that resolves a lot of the doubts I had
< Nakul[m]>
basically these 2 test suites: ```SoftmaxRegressionFitIntercept``` and ```SoftmaxRegressionOptimizerTrainTest```
< Nakul[m]>
> Nakul: Will check later.
< Nakul[m]>
fine
ibtihaj has joined #mlpack
< Nakul[m]>
Hey guys, which version is the ```models``` repo currently on? I just need it for writing the CMake.
< Nakul[m]>
I observed that when we pass ```fitIntercept```, the hypothesis is calculated and we get a dimensionality error, e.g. ```incompatible matrix dimensions: 2x6 and 5x1000```. That is why I thought to ask about it.
< Nakul[m]>
In both of these tests, ```SoftmaxRegressionFitIntercept``` and ```SoftmaxRegressionOptimizerTrainTest```, we are passing fitIntercept as true.
kyrre has joined #mlpack
< ibtihaj>
Hey, I am new to machine learning but really want to participate in GSoC for mlpack. Can anyone suggest what project I should choose for GSoC 2020?
< zoq>
ibtihaj: The advice I can give you is to go with the project you are excited about; we will add more ideas in the coming days, and you are also free to propose your own ideas.
< ibtihaj>
So can I propose my idea here, just to verify whether it is legitimate or not?
< zoq>
Here or on the mailing list works just fine.