verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
chenzhe has quit [Remote host closed the connection]
chenzhe has joined #mlpack
chenzhe1 has joined #mlpack
chenzhe has quit [Ping timeout: 260 seconds]
chenzhe1 is now known as chenzhe
mikeling has joined #mlpack
chenzhe has quit [Ping timeout: 240 seconds]
chenzhe has joined #mlpack
s1998 has joined #mlpack
chenzhe has quit [Quit: chenzhe]
s1998_ has joined #mlpack
s1998_ has quit [Client Quit]
s1998 has quit [Remote host closed the connection]
s1998 has joined #mlpack
s1998 has quit [Quit: Page closed]
Trion has joined #mlpack
sgupta has quit [Ping timeout: 245 seconds]
Trion has quit [Quit: Have to go, see ya!]
Trion has joined #mlpack
Trion has quit [Quit: Have to go, see ya!]
Trion has joined #mlpack
shikhar has joined #mlpack
vivekp has quit [Ping timeout: 246 seconds]
Trion has quit [Quit: Have to go, see ya!]
vivekp has joined #mlpack
vivekp has quit [Ping timeout: 268 seconds]
vivekp has joined #mlpack
vivekp has quit [Ping timeout: 260 seconds]
vivekp has joined #mlpack
Trion has joined #mlpack
kris1 has joined #mlpack
kris1 has quit [Quit: kris1]
sgupta has joined #mlpack
shikhar has quit [Quit: WeeChat 1.7]
< zoq> rcurtin: any news about gekko?
< rcurtin> not yet, they said they are investigating it
shikhar has joined #mlpack
vivekp has quit [Ping timeout: 255 seconds]
vivekp has joined #mlpack
Trion has quit [Quit: Have to go, see ya!]
vpal has joined #mlpack
vivekp has quit [Ping timeout: 255 seconds]
vpal is now known as vivekp
kartik_ has joined #mlpack
< kartik_> here is a small test of the cmaes
< kartik_> it is able to find the minimum most of the time ..
< zoq> okay, I was going to ask about a test
< kartik_> im working to correct that
< zoq> sounds good, I'll go through the code and make comments
< kartik_> zoq .. maybe u can do that tomorrow
< kartik_> im planning to do some major changes
< kartik_> i think that the constraints to terminate are too heavy
< zoq> okay, sure if you think I should wait one more that's fine for me :)
< zoq> also, it looks like you do some c-style coding here and there, maybe you can take a look at other methods to adapt the style
< kartik_> ok
< zoq> I'm kinda picky about code style, so expect some probably pedantic comments :)
< kartik_> i was seeing that the test is already with the sgd optimizers ..
< kartik_> how is that ?
< zoq> not sure what you mean
< kartik_> the NumFunctions() .. what does that return
< kartik_> the dimensions ??
< cult-> why is boost test better than gtest? or vice versa?
< zoq> The number of points/functions, e.g. for a 2x4 matrix of coordinates, where each col is a sample input, it will return 4.
< kartik_> the total number of training data ?..
< zoq> yes, another example: let's say you have 10 images and you'd like to train a method on the 10 images, NumFunctions is 10
< kartik_> why is Evaluate calling the function defined in the test.cpp file with NumFunctions 3 ?
< zoq> take a look at the data: arma::mat("6; -45.6; 6.2")
kartik_ has quit [Ping timeout: 260 seconds]
< zoq> cult: I wouldn't say boost is better than gtest, both have pros and cons, but I think if you already have boost as a dependency it makes sense to stick to boost as the test framework.
kartik_ has joined #mlpack
< kartik_> that's right .. what im confused about is .. why is the test right inside this ?
< cult-> zoq: thanks
< kartik_> for the files in the methods folder we have a separate place..
< kartik_> ill add this way only ..
< zoq> It's not directly a test, it's just a simple task/function used to test the SGD optimizer, the actual test is in mlpack/tests/sgd_test.cpp, if someone would like to implement their own optimizer and just link against mlpack, they could reuse the function.
< zoq> You could also reuse the SGDTestFunction class for your test.
< zoq> Take a look at mlpack/tests/sgd_test.cpp.
kartik_ has quit [Ping timeout: 260 seconds]
KARTIK_ has joined #mlpack
< KARTIK_> right zoq .. that looks like a test
< KARTIK_> last question
< KARTIK_> then why did you write that test file
< zoq> You mean sgd_test.cpp?
s1998 has joined #mlpack
< KARTIK_> yes
< KARTIK_> no
< KARTIK_> inside the optimizer folder
< zoq> SGDTestFunction isn't a test, it's just a function we use for the actual test, there is no particular reason for putting SGDTestFunction inside the SGD folder.
< shikhar> Maybe to serve as a reference for future developers on how to write a separable cost function which SGD could optimize.
< zoq> yes, that's what I said above, if you write your own optimizer or whatever and you link against mlpack you can use that function.
shikhar has quit [Quit: WeeChat 1.7]
kris1 has joined #mlpack
kris1 has quit [Client Quit]
KARTIK_ has quit [Ping timeout: 260 seconds]
< rcurtin> confirmed that gekko now has Windows installed on it
< rcurtin> there is no particular reason given
< rcurtin> I might get more information on what happened in the future, but I'm not sure I'll get that today
kris1 has joined #mlpack
s1998 has quit [Quit: Page closed]
kris1 has quit [Quit: kris1]
kris1 has joined #mlpack
< zoq> rcurtin: would be nice to hear more, maybe in the future
< rcurtin> yeah, absolutely, why in the hell did somebody touch a system that wasn't theirs and go so far as to install windows on it
< rcurtin> definitely I am going to have to make sure to set up backups on each of these systems
sgupta has quit [Ping timeout: 240 seconds]
< rcurtin> it'll probably be a few days until I get gekko back up, probably can have it done by Monday
< zoq> Definitely frustrating, I wonder if that happens again, and if that's the case, is it your job to get the systems back? I guess it's safer; if they did it, they'd come up with some weird configuration like no swap ...
< rcurtin> yeah I don't want them to do the reinstall, I can do that even though it takes time
< rcurtin> I keep logs of the steps I did for the benchmark servers so it is easy to replicate
govg has joined #mlpack
govg has quit [Quit: leaving]
govg has joined #mlpack
govg has quit [Client Quit]
mikeling has quit [Quit: Connection closed for inactivity]
govg has joined #mlpack
govg has quit [Client Quit]
< kris1> Doesn’t LayerTypes provide pointers to the classes, so shouldn’t we use -> rather than . ? confused here
< lozhnikov> kris1: I misunderstood your question. Can you explain the question in more detail?
< kris1> LayerTypes concat = new Concat<>() is a pointer to the Concat type, right
< kris1> so why are we calling concat.Add()
< lozhnikov> no
< lozhnikov> LayerTypes is a boost::variant (if I remember right)
< kris1> Yes, i read about boost::variant but i did not understand why we use Concat<arma::mat, arma::mat>* there
< kris1> instead of Concat<arma::mat, arma::mat>
< lozhnikov> hmm... I still don't understand you. Can you send the link to the gist?
< kris1> Okay so for example boost::variant<int, double> v would allow v to be either int or double, right
< kris1> similarly boost::variant<Concat<arma::mat, arma::mat>,….> v would allow v to be of type Concat, but in layertypes.hpp we define boost::variant<Concat<arma::mat, arma::mat>*,…..> v, which would allow v to be of type Concat<…,…>*
govg has joined #mlpack
govg_ has joined #mlpack
govg_ has quit [Client Quit]
< lozhnikov> I think because LayerTypes is not a pointer
< lozhnikov> And you have to use a visitor if you define concat as LayerTypes
< kris1> LayerTypes is just a typedef, right: boost::variant<Concat<arma::mat, arma::mat>*>
< lozhnikov> right
< kris1> lozhnikov: just a quick question, can we only do boost::apply_visitor on a boost::variant type or can we do apply_visitor on a Classtype also
< lozhnikov> do you mean 'can Classtype be an arbitrary class?'?
< kris1> Yes
< lozhnikov> I think boost::apply_visitor is designed only for boost::variant
< kris1> Yes that’s what i thought too. So here is where i am stuck:
< kris1> LayerTypes concat = new Concat<>() would not allow access to the Add method in the Concat class unless we create a visitor for it
< lozhnikov> So, you have two options:
< lozhnikov> 1. LayerTypes concat = .........;
< lozhnikov> boost::apply_visitor(AddVisitor(.......), concat);
< lozhnikov> 2. Concat<> concat;
< lozhnikov> concat.Add(........);
< lozhnikov> kris1: yeah, boost::variant requires a visitor
< kris1> and Concat<>* concat = new Concat<>() will not allow apply_visitor()
< kris1> There is no AddVisitor right now
< kris1> i think
< lozhnikov> I think no. (but I didn't look through boost::apply_visitor source code)
< kris1> ok no, there is an AddVisitor
< lozhnikov> hmm... ann/visitor/visitor.hpp
< kris1> available, sorry about that
< lozhnikov> *add_visitor.hpp
< kris1> Concat<> concat; concat.Add() would not allow me to use the ResetVisitor. So i think it’s not a viable alternative
< kris1> right
< lozhnikov> There are no Reset() and Model() functions in the Concat class
< kris1> The reset function is called on the individual layers of the concat class
< kris1> concat has a Model() function
< kris1> Line 113 of concat.hpp
< lozhnikov> If you define concat as Concat<> you can just invoke concat.Reset()
< kris1> I don’t see a reset function in concat class
< lozhnikov> I mean you can do the same thing that the reset layer does
< lozhnikov> i.e. apply ResetVisitor to each sublayer
govg has quit [Quit: "switching PCs"]
< kris1> What do you think
< lozhnikov> Why don't you want to use a static model? Both visible and hidden layers are strictly defined for a particular RBM type.
< kris1> Hmmm… i am not sure i understand correctly… what do you mean by the static model
< kris1> You mean we define the LinearLayer and SigmoidLayer in the visible layer itself
< kris1> ?
< lozhnikov> yeah
< kris1> I guess we could do that, but concatenating the visible linear and sigmoid layers is also not a bad idea
< kris1> Does the static approach have a speed advantage over the concat solution
< lozhnikov> How do you want to use the concat layer?
< kris1> Inside the visible layer, concatenating the linear and sigmoid layers.
< kris1> I guess i can statically define the layers as well in the visible layer constructor
< kris1> there is no added advantage of using concat unless we are allowed to have more layers, which in a standard rbm is not the case.
< lozhnikov> hmm... In that case you'll obtain:
< lozhnikov> O1 = Linear(input)
< lozhnikov> O2 = Sigmoid(input)
< lozhnikov> return (O1, O2)
< lozhnikov> what is the sense?
< kris1> The visible layer could now use O1 and O2 in the forward pass by using the forward functions for O1 and O2.
< kris1> Ohh sorry, do you mean to say the concat solution
< kris1> in concat we would then just use the forward visitor on concat and we would get the preactivations
< kris1> Did you get what i am trying to do now
< lozhnikov> you should use the concat layer if you want to pass the output of one or more layers to another layer's input
< kris1> Hmmm… okay
< lozhnikov> Did you think something else?
< kris1> Well the only reason for not using statically defined layers was that i would have to call the forward functions of the layers individually. Meaning i would have to use a forward visitor for the linear layer and a forward visitor for the sigmoid layer. With concat i could have just used a forward visitor on concat. 2. Also the concat layer enables us to have any number of layers in the visible layer (which is really not needed right now)
< kris1> When you say statically define the LinearLayer and SigmoidLayer you mean to define them inside the VisibleLayer constructor, right?
< kris1> Also, on your comment regarding using the LayerTypes api, should i also then have the backward function and the other functions for the visible layer. They would have to be undefined but i would at least declare them
< lozhnikov> I think there is no need to define functions that you are not planning to use
< kris1> Hmmm okay.
< lozhnikov> The concat layer does not call the Forward() methods sequentially. It calls the methods simultaneously instead
< kris1> No, see here: boost::apply_visitor(ForwardVisitor(std::move(input), std::move(
< kris1> boost::apply_visitor(outputParameterVisitor, network[i]))),
< kris1> network[i]);
< lozhnikov> yes, each sublayer acts on the same input
< lozhnikov> and then all outputs are merged into the single one
< kris1> Oh i see
< kris1> Yes i get your point, i guess i thought that the concat layer calls the forward functions for the layers sequentially.
< kris1> Okay then i would define the layers statically inside the constructor and individually call the forward functions for them
< kris1> Though i think the definition of the concat layer is counterintuitive
< kris1> It should be called concat_horizontal or something
< lozhnikov> Why? I think 'concat ' means 'concatenation'
< kris1> Okay then i would define the layers statically inside the constructor and individually call the forward functions for them. Is this the correct way??
< kris1> lozhnikov
< lozhnikov> kris1: right, that's correct
< lozhnikov> It's too late, so I'll go sleep. goodnight
mentekid has quit [Quit: Leaving.]
mentekid has joined #mlpack
< kris1> Good night
kris2 has joined #mlpack
kris2 has quit [Client Quit]
mentekid has quit [Quit: Leaving.]
kris1 has quit [Quit: kris1]
kris1 has joined #mlpack
kris1 has quit [Quit: kris1]
kris1 has joined #mlpack
kris1 has quit [Quit: kris1]
kris1 has joined #mlpack