ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
kay has joined #mlpack
kay is now known as Guest54148
Guest54148 has quit [Remote host closed the connection]
tanvi has joined #mlpack
tanvi has quit [Remote host closed the connection]
< birm[m]1>
Nakul: I've copied modified versions of the Azure Pipelines and Travis CI configurations on my forks to help some with that.
< SaraanshTandonGi>
Also, where can I find the implementation of the training procedure?
< SaraanshTandonGi>
I looked in ffn_impl.hpp but there is just the call to the optimizer.
< khimrajGitter[m]>
Hi @zoq @ShikharJ, I want to contribute to the GAN model under the Essential Deep Learning Modules project. I have gone through the mlpack code base and have a sufficient understanding of how it works. Could you please suggest some current issues in the GAN implementation so that I can work on them before submitting a proposal for GSoC 2020?
< PrinceGuptaGitte>
Hi @kartikdutt18, about refactoring the activation function code to remove `Fn()` and `Deriv()`: it is only for those which are implemented as a layer, like `elu.hpp`, right?
< PrinceGuptaGitte>
or is it also for those in the activation_functions folder?
< PrinceGuptaGitte>
because ISRU is implemented as a layer and SQNL like the normal ones in the activation_functions folder
< kartikdutt18Gitt>
Yes, it is for those activation functions implemented as layers. I think I left a comment on every PR asking to change to the newer implementation.
< PrinceGuptaGitte>
@kartikdutt18 should I remove the `Inv()` function as well? Having `Inv()` makes the `Deriv()` code clearer.
< SaraanshTandonGi>
Also, it might be helpful to see the implementation of the Train function. Can someone point me towards its location? I seem to be a bit lost. TIA
< kartikdutt18Gitt>
Hi @prince776, using Fn, Deriv, and Inv does make the code cleaner; however, implementing them directly in Forward and Backward saves space, and it also becomes easier for anyone to understand what the Forward and Backward functions do. As for the `inverse` function, since it's a private member function it doesn't have much use for a user, so we can implement it directly as part of Backward.
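To make the tradeoff concrete, here is a rough standalone sketch (not mlpack's actual layer code; the real signatures live in `elu.hpp` and the other layer headers) contrasting the `Fn()`/`Deriv()` helper style with the same ELU math written directly in `Forward()`/`Backward()`:

```cpp
#include <armadillo>
#include <cmath>

// Helper style: the math lives in small Fn()/Deriv() members.
struct HelperStyleELU
{
  double alpha = 1.0;

  double Fn(const double x) const
  { return x >= 0 ? x : alpha * (std::exp(x) - 1); }

  // Derivative written in terms of the layer's output y.
  double Deriv(const double y) const
  { return y >= 0 ? 1 : y + alpha; }

  void Forward(const arma::mat& input, arma::mat& output) const
  {
    output = input;
    output.transform([&](double x) { return Fn(x); });
  }

  void Backward(const arma::mat& output, const arma::mat& gy, arma::mat& g) const
  {
    arma::mat deriv = output;
    deriv.transform([&](double y) { return Deriv(y); });
    g = gy % deriv;  // elementwise product with the upstream gradient
  }
};

// Inlined style: the same expressions written directly in Forward()/Backward(),
// so a reader sees the whole computation in one place.
struct InlineStyleELU
{
  double alpha = 1.0;

  void Forward(const arma::mat& input, arma::mat& output) const
  {
    output = input;
    output.transform([&](double x)
        { return x >= 0 ? x : alpha * (std::exp(x) - 1); });
  }

  void Backward(const arma::mat& output, const arma::mat& gy, arma::mat& g) const
  {
    arma::mat deriv = output;
    deriv.transform([&](double y) { return y >= 0 ? 1 : y + alpha; });
    g = gy % deriv;
  }
};
```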
< PrinceGuptaGitte>
ok, I'll do it that way
< kartikdutt18Gitt>
@saraansh1999, can I have a look at the code (you could share a link using an online IDE)? I am not really sure what is causing the error.
< SaraanshTandonGi>
Nvm, I got it. Thanks anyways. :)
< kartikdutt18Gitt>
Great.
ImQ009 has joined #mlpack
< PrinceGuptaGitte>
Hi @kartikdutt18, I've fixed the comment style issue of `layer_names.hpp` in PR #2243
< kartikdutt18Gitt>
Great, Thanks.
prudhvi-hack has joined #mlpack
prudhvi-hack has quit [Remote host closed the connection]
< SaraanshTandonGi>
Is there any way to trace the function calls inside the mlpack library? I'm calling the Split function and getting a Mat error inside it somewhere, but gdb doesn't give a trace inside the function by default. It gives something like:
< SaraanshTandonGi>
Also, any suggestions on how to approach these problems so that I don't have to ask again and again would be appreciated.
< GauravSinghGitte>
Hey @saraansh1999, is the value of ITERATIONS_PER_CYCLE same as the number of columns in your dataset?
Omar93 has joined #mlpack
OmarWagih1Gitter has joined #mlpack
< OmarWagih1Gitter>
Hey all, just a heads up: I think the Slack channel link is broken on the community page.
Omar93 has quit [Remote host closed the connection]
< SaraanshTandonGi>
> Hey @saraansh1999, is the value of ITERATIONS_PER_CYCLE same as the number of columns in your dataset?
< SaraanshTandonGi>
No, I have around 30000 cols
< SaraanshTandonGi>
50 per batch
< rcurtin>
OmarWagih1Gitter: thanks for pointing that out, let me debug it :)
< rcurtin>
fixed! the docker container simply wasn't running :)
< Nakul[m]>
@rcurtin:matrix.org: would I be allowed to open a PR related to CMake in models, as part of refactoring, before GSoC?
< Nakul[m]>
@rcurtin:matrix.org: am I allowed to open a PR related to CMake as part of refactoring in the mlpack repo, before GSoC?
favre49 has joined #mlpack
< favre49>
Just out of curiosity, our site footer says that our copyright is 2007-2019. I know nothing about copyrights and licenses; why does that end in 2019?
< favre49>
Also, LICENSE.txt says the copyright is 2007-2018
< favre49>
I have no clue if this is something that matters at all or if I'm just being nitpicky here
favre49 has quit [Remote host closed the connection]
< rcurtin>
favre49: I think because nobody updated it to say 2020 :)
< rcurtin>
if you wanted to do that feel free! I just overlooked it
< zoq>
rcurtin: Looks like the auto approval bot isn't working ... maybe another API update
eadwu has joined #mlpack
< rcurtin>
blah, let me check on it
< eadwu>
For GSoC, what's the bare minimum cutoff of dumbness tolerated pre-proposal (since studying is a thing before the summer)? For reinforcement learning, I haven't done any of that; the deepest I've gone in neural networks is a simple and straightforward ANN layout whose values/weights were controlled by a genetic algorithm.
< zoq>
eadwu: We all have to start somewhere, so we don't expect a student to be an expert; however, strong knowledge of the topic is definitely helpful.
< rcurtin>
okay, I've fixed some errors with mlpack-bot, but it doesn't seem like it's doing the stale sweep...
< rcurtin>
just a wonderful reminder of how painful upgrading things in JS world is
< rcurtin>
:)
< rcurtin>
let's see if that fixes it... I guess we should know in a handful of hours
< zoq>
crazy
< zoq>
thanks for looking into it
< zoq>
so much fun every time there is an issue with the bot
< rcurtin>
yeah, really :)
< rcurtin>
the only problem with automation is fixing it when everything inevitably goes wrong :)
< rcurtin>
Nakul[m]: sure, I don't see any issue with it, but if you do part of your project before GSoC even starts make sure that there is still enough left in the timeline to fill the rest of the summer
< PrinceGuptaGitte>
Hi @zoq, about the `summary()` function in the neural network code: I now understand the concept of serialization/de-serialization with boost, thanks to the resources you shared. But I am confused about how and why to use that in the `summary()` function. We still have to manually access the member variables of the class after de-serializing, but that could be done using the specific visitors, as is done throughout the `FFN` or `RNN` class.
< PrinceGuptaGitte>
Another thing to do would be to just get all data members through de-serialization and log them, but that would be an unnecessary amount of data, especially because member variables like `delta` are not at all useful in summarizing the model.
< PrinceGuptaGitte>
Or am I missing something important?
< Nakul[m]>
> Nakul: sure, I don't see any issue with it, but if you do part of your project before GSoC even starts make sure that there is still enough left in the timeline to fill the rest of the summer
< Nakul[m]>
Well, if the work ends early I would love to work on (or help if someone is already working on) visualization, my favorite idea in the idea list.
< eadwu>
Is the BLAS requirement for Armadillo referring to OpenBLAS or BLAS?
< eadwu>
Oh, ignore that question; it was answered in the next few words of the README
< rcurtin>
eadwu: perfect, that means we did a good job with the README if it correctly predicted your next question :)
< SaraanshTandonGi>
I was trying to trace back a Train call in the codebase but I am stuck at DeterministicSetVisitor. I see that boost is applying the visitor to each layer, but where is the implementation of the visit or accept functions?
< zoq>
PrinceGuptaGitte: We don't have to access it; all the information is already there. Maybe a first step is to serialize a model and take a look at the output (txt, xml, etc.).
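A minimal sketch of that suggestion, assuming the mlpack 3.x model API (the layer choices here are arbitrary): serializing to XML gives a human-readable dump of every member that boost::serialization stores.

```cpp
#include <mlpack/core.hpp>
#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>

using namespace mlpack::ann;

int main()
{
  // Build a tiny model; the layers are arbitrary.
  FFN<> model;
  model.Add<Linear<>>(10, 5);
  model.Add<ReLULayer<>>();
  model.Add<Linear<>>(5, 2);
  model.ResetParameters();

  // Serialize via boost::serialization; open model.xml in an editor to see
  // exactly which members get stored for each layer.
  mlpack::data::Save("model.xml", "model", model, false);
}
```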
< SaraanshTandonGi>
I see this but where do I trace this back to?
< SaraanshTandonGi>
Where is the code which calls the forward visitor?
< zoq>
PrinceGuptaGitte: Also, you are right, not all of the data is useful; we would have to filter it, but we could also provide things like the weight variance for each layer, etc.
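For example, a whole-model weight variance only takes a couple of lines, assuming an initialized `FFN<>` named `model` as in the sketch above (a per-layer breakdown would need a visitor to pull out each layer's weights):

```cpp
// Parameters() holds all of the model's weights in one matrix.
const arma::mat& params = model.Parameters();
mlpack::Log::Info << "weight variance: "
    << arma::var(arma::vectorise(params)) << std::endl;
```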
< zoq>
SaraanshTandonGi: You mean the FFN Train function?
ImQ009 has quit [Quit: Leaving]
< zoq>
SaraanshTandonGi: I haven't had time yet to take a look at your other messages, so maybe you already answered the question, or maybe there is a simple solution for your problem.
< SaraanshTandonGi>
Now I know that the forward visitor calls the layer's Forward function
< SaraanshTandonGi>
@zoq I want to know this irrespective of the problem
< SaraanshTandonGi>
I'll try to figure out a solution on my own once I know this
< SaraanshTandonGi>
> Now I know that the forward visitor calls the layer's Forward function
< SaraanshTandonGi>
So how do we go from the DeterministicSetVisitor to the ForwardVisitor?
< SaraanshTandonGi>
Also, what exactly is the point of a DeterministicSetVisitor? Why isn't there a direct call to the forward visitor?
< zoq>
SaraanshTandonGi: Unfortunately you can't just access/modify a std::variant/boost::variant directly; you have to use a visitor to do so.
< zoq>
The visitor implementation (apply_visitor, etc.) is part of boost.
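A toy example of that mechanism, unrelated to mlpack's layer types: the visitor derives from `boost::static_visitor`, and `boost::apply_visitor` dispatches to the `operator()` overload matching whichever type the variant currently holds.

```cpp
#include <boost/variant.hpp>
#include <iostream>
#include <string>

// A visitor provides an operator() for every type the variant can hold.
struct PrintVisitor : public boost::static_visitor<void>
{
  void operator()(const int x) const { std::cout << "int: " << x << "\n"; }
  void operator()(const std::string& s) const
  { std::cout << "string: " << s << "\n"; }
};

int main()
{
  boost::variant<int, std::string> v = 42;
  boost::apply_visitor(PrintVisitor(), v);  // calls the int overload
  v = std::string("hello");
  boost::apply_visitor(PrintVisitor(), v);  // calls the string overload
}
```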
< SaraanshTandonGi>
So what exactly happens after this call?
< SaraanshTandonGi>
Or simply, how does the DeterministicSetVisitor ultimately lead to the Forward call?
< SaraanshTandonGi>
Any resources to help me through this?
< zoq>
You can search for boost::static_visitor for more details, but in the end apply_visitor calls the DeterministicSetVisitor on one layer, which calls LayerDeterministic, and that sets the deterministic value if that layer implements the method.
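mlpack's real check is built from SFINAE helpers in `sfinae_utility.hpp`; the following is a simplified, self-contained sketch of the "call the method only if the type has it" trick, with made-up `DropoutLike`/`LinearLike` types standing in for layers:

```cpp
#include <iostream>

// Hypothetical layers: one exposes Deterministic(), the other doesn't.
struct DropoutLike
{
  bool deterministic = false;
  bool& Deterministic() { return deterministic; }
};

struct LinearLike { };

// Chosen when T has a Deterministic() member (SFINAE on the decltype).
template<typename T>
auto SetDeterministic(T& layer, const bool value)
    -> decltype(layer.Deterministic(), void())
{
  layer.Deterministic() = value;
}

// Fallback for everything else: silently do nothing, like the visitor does.
template<typename... Args>
void SetDeterministic(Args&&...) { }

int main()
{
  DropoutLike d;
  LinearLike l;
  SetDeterministic(d, true);  // sets d.deterministic
  SetDeterministic(l, true);  // no-op
  std::cout << d.deterministic << std::endl;  // prints 1
}
```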
< PrinceGuptaGitte>
Thanks for your input @zoq, I'll see how the serialized model looks as a string and then try to filter out the important parts, or do something else depending on what it shows.
< zoq>
You are probably looking for ForwardVisitor, which calls the Forward function of a layer if the layer implements that function.
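A self-contained toy version of that dispatch, with made-up `Identity`/`Negate` layers standing in for mlpack's: the visitor simply forwards the call to whichever layer type each variant in the network holds, which is how one visitor can drive a whole forward pass.

```cpp
#include <boost/variant.hpp>
#include <armadillo>
#include <vector>

// Toy layers (not mlpack's); each implements Forward(input, output).
struct Identity
{
  void Forward(const arma::vec& input, arma::vec& output) { output = input; }
};

struct Negate
{
  void Forward(const arma::vec& input, arma::vec& output) { output = -input; }
};

// A ForwardVisitor-style visitor: the templated operator() calls Forward()
// on whichever layer type the variant currently holds.
struct ToyForwardVisitor : public boost::static_visitor<void>
{
  const arma::vec& input;
  arma::vec& output;

  ToyForwardVisitor(const arma::vec& in, arma::vec& out) :
      input(in), output(out) { }

  template<typename LayerType>
  void operator()(LayerType& layer) const { layer.Forward(input, output); }
};

int main()
{
  std::vector<boost::variant<Identity, Negate>> network{ Identity(), Negate() };

  arma::vec input = {1.0, 2.0};
  arma::vec output;
  for (auto& layer : network)
  {
    boost::apply_visitor(ToyForwardVisitor(input, output), layer);
    input = output;  // this layer's output feeds the next layer
  }
  input.print("network output");  // -1.0, -2.0
}
```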