ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< nishantkr18[m]>
Hey zoq, I had a question. If I've defined an FFN network and I want to call a function of a Layer defined in that network, would I have to use a visitor? Is there any other option?
< zoq>
nishantkr18[m]: If you know the type of the layer, you can cast it; otherwise you have to use a visitor.
< nishantkr18[m]>
zoq: Could you provide an example or a link where such a cast is used?
< zoq>
nishantkr18[m]: I don't really have an example; we've always used a visitor, but boost::get<TypeOfTheLayer>(&layer)->MyFunction(); should work.
< nishantkr18[m]>
zoq: Alright, no problem, I'll try that out. Thanks!
< rcurtin>
Any objection if I hit merge on #1884 (Go bindings)? :)
< jeffin143[m]>
rcurtin (@freenode_rcurtin:matrix.org): I was just about to say that; please merge it.
< jeffin143[m]>
It's been more than a week.
< jeffin143[m]>
jeffin143 (@jeffin143:matrix.org): Very excited :) to see it get merged.
< rcurtin>
jeffin143[m]: yeah, very exciting, that was a GSoC 2018 project :)
< jeffin143[m]>
2 years :)
< zoq>
nishantkr18[m]: Just tested it on a simple network -> std::cout << boost::get<Linear<>*>(model.Model()[0])->InputSize() << std::endl;
< nishantkr18[m]>
zoq: Exactly what I was looking for. Thanks a lot :)
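For reference, a minimal sketch built around zoq's one-liner above, assuming the boost::variant-based layer storage used by mlpack's FFN at the time; the exact headers, ReLULayer usage, and Linear<> accessors may differ between mlpack versions:

    #include <iostream>
    #include <mlpack/core.hpp>
    #include <mlpack/methods/ann/ffn.hpp>
    #include <mlpack/methods/ann/layer/layer.hpp>

    using namespace mlpack::ann;

    int main()
    {
      // Build a small feed-forward network.
      FFN<> model;
      model.Add<Linear<>>(10, 5);   // Linear layer: 10 inputs, 5 outputs.
      model.Add<ReLULayer<>>();

      // model.Model() exposes the layers as boost::variant layer pointers,
      // so boost::get<ConcreteType*>() recovers the layer when its concrete
      // type is known, avoiding a visitor.
      Linear<>* linear = boost::get<Linear<>*>(model.Model()[0]);
      std::cout << linear->InputSize() << std::endl;   // Prints 10.

      return 0;
    }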
< zoq>
rcurtin: If you can wait two more days, I'd like to just skim over the code.
< rcurtin>
zoq: no problem at all, that's why I asked, to check if anyone wanted to take a look :)
< abernauer[m]>
rcurtin: How was your vacation?
< rcurtin>
abernauer[m]: it was good, thanks for asking---nice to put things down for a while
< rcurtin>
I don't think I'm very good at relaxing but I tried my best :)
< abernauer[m]>
Taking breaks is important for maintaining long-term success and avoiding burnout regardless. Good to hear you enjoyed it, though.
< HimanshuPathakGi>
Hey @saksham189, I think we should not add a test for the full MNIST dataset with RBFN, because training on the full dataset takes too much time. What do you suggest?
< saksham189Gitter>
Yes, we cannot add that because it is too big. We will only have the test on the smaller dataset.
< HimanshuPathakGi>
Ok, I will try to fix other things and write a blog. Can you give a short review of my PR?
ImQ009 has quit [Quit: Leaving]
< zoq>
yashwants19[m]: Thanks for the update, interesting read.
< jeffin143[m]>
zoq (@freenode_zoq:matrix.org): you there?
< jeffin143[m]>
In the Azure builds, if I install protobuf every time, it takes unnecessary time on every triggered build.
< jeffin143[m]>
Isn't it possible to speed it up or cache it across all builds?
< jeffin143[m]>
I mean, don't start fresh.
< zoq>
jeffin143[m]: Why not use the Debian package?
< jeffin143[m]>
Debian package, as in?
< jeffin143[m]>
zoq (@freenode_zoq:matrix.org): change the image?
< zoq>
jeffin143[m]: I don't think you have to change the image; apt install protobuf should do the trick.
< jeffin143[m]>
Ohh, you meant that, ok.
< jeffin143[m]>
Just a sec, I will give it a try.
< jeffin143[m]>
I have no clue how it came preinstalled on my laptop.
< jeffin143[m]>
I have never installed protobuf.
< jeffin143[m]>
It was already there.
HeikoS has quit [Quit: Leaving.]
< RyanBirminghamGi>
I think TensorFlow installs protobuf too, right?
< shrit[m]>
Ryan Birmingham (Gitter): Yes, I did not follow the discussion, but TensorFlow definitely bundles protobuf; yesterday I was facing a protobuf issue, and each time I searched for protobuf, the TensorFlow source code was yelling at me with protobuf source code.