ChanServ changed the topic of #mlpack to: "Due to ongoing spam on freenode, we've muted unregistered users. See http://www.mlpack.org/ircspam.txt for more information, or also you could join #mlpack-temp and chat there."
petris has joined #mlpack
cjlcarvalho has joined #mlpack
caiojcarvalho has joined #mlpack
cjlcarvalho_ has quit [Ping timeout: 272 seconds]
davida has joined #mlpack
cjlcarvalho_ has joined #mlpack
caiojcarvalho has quit [Ping timeout: 268 seconds]
ayesdie has joined #mlpack
petris_ has joined #mlpack
< ayesdie> Heya, I updated #1546 and `zoq` restarted the Travis build for it, but it seems to have failed at the `sudo apt-key adv` part for one of the builds.
petris_ has quit [Client Quit]
petris_ has joined #mlpack
petris has quit [Quit: Bye bye.]
caiojcarvalho has joined #mlpack
cjlcarvalho_ has quit [Ping timeout: 268 seconds]
petris_ has quit [Quit: Bye bye.]
petris has joined #mlpack
cjlcarvalho has quit [Ping timeout: 246 seconds]
govg has quit [Ping timeout: 260 seconds]
cjlcarvalho_ has joined #mlpack
caiojcarvalho has quit [Ping timeout: 268 seconds]
govg has joined #mlpack
davida has quit [Remote host closed the connection]
davida has joined #mlpack
davida has quit [Remote host closed the connection]
davida has joined #mlpack
davida has quit [Remote host closed the connection]
davida has joined #mlpack
vedantbassi has joined #mlpack
< vedantbassi> Hi, for the FFN class, could someone explain the ordering of the mat parameters returned by the Parameters() method and how to make sense of it? Thanks :)
davida has quit [Ping timeout: 272 seconds]
davida has joined #mlpack
vedantbassi has quit [Ping timeout: 256 seconds]
govg has quit [Ping timeout: 268 seconds]
govg has joined #mlpack
vivekp has quit [Read error: Connection reset by peer]
vivekp has joined #mlpack
vedantbassi has joined #mlpack
govg has quit [Ping timeout: 252 seconds]
govg has joined #mlpack
vedantbassi has quit [Quit: Page closed]
cjlcarvalho has joined #mlpack
cjlcarvalho has quit [Ping timeout: 268 seconds]
cjlcarvalho has joined #mlpack
caiojcarvalho has joined #mlpack
caiocarvalho has joined #mlpack
cjlcarvalho has quit [Ping timeout: 240 seconds]
caiojcarvalho has quit [Ping timeout: 240 seconds]
caiocarvalho has quit [Ping timeout: 240 seconds]
< zoq> ayesdie: Let's restart the build one more time.
davida has quit [Quit: Leaving]
< ayesdie> I think it's built successfully now.
davida has joined #mlpack
< zoq> ayesdie: Agreed, looks good.
davida has quit [Remote host closed the connection]
davida has joined #mlpack
davida has quit [Remote host closed the connection]
govg has quit [Ping timeout: 264 seconds]
davida has joined #mlpack
caiocarvalho has joined #mlpack
caiocarvalho has quit [Ping timeout: 246 seconds]
govg has joined #mlpack
< davida> zoq: If I wanted to print a cost curve as I am training, I believe I need to run model.Evaluate(). If that's correct, my question is how do I pass in the data to evaluate, as there doesn't seem to be any parameter for the data in the function definition. This is particularly important with mini-batch gradient descent, since I would want to calculate the loss on the full dataset and not just the last batch I ran.
< zoq> davida: You have to manually run model.Evaluate(..) on the dataset; there is an open issue for adding a feature that does this automatically.
< davida> zoq: How do I manually run it and add the dataset? This is where I am a bit lost.
< zoq> davida: The FFN class implements: double Evaluate(arma::mat predictors, arma::mat responses);
< zoq> davida: Do you think that could work?
< davida> zoq: I don't see that function. I see 3 versions of Evaluate, all of which need parameters to be passed.
< zoq> FFN or RNN?
< davida> zoq: FFN for now. I just went into the code and I see the function you are referring to. It is not documented on the docs page, which is why I missed it.
< zoq> ahh, have to step out, back later
ImQ009 has joined #mlpack
< davida> zoq: I am at the ResNet stage of the Coursera examples. I have built the ResNet50 structure in Keras. I can build separate FFNs representing the identity blocks and the convolutional blocks, but I am not sure how to link them together in one model. In fact, I need to link about 17 blocks together with pre and post stages. What is the right way to build this in mlpack?
ImQ009 has quit [Quit: Leaving]