ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
akfluffy has joined #mlpack
< akfluffy> hey so I wanna make a custom fitness function for my NN. I want to have it evaluate the whole thing before determining fitness, so how would I do that? I plan to use the CNE optimizer
< akfluffy> because if I change Evaluate() I think it steps through every point. I want to have it just return the entire network's "fitness"
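A rough sketch of one way to do this, assuming the ensmallen-style CNE interface and an mlpack FFN model; the class name WholeNetworkFitness, the MeanSquaredError output layer, and the placeholder names model/trainX/trainY are illustrative, not from the discussion. The idea is to leave the network's own per-point Evaluate() alone and instead hand CNE a small functor whose Evaluate(parameters) scores the whole network on the full dataset at once.

    // Rough sketch, assuming ensmallen's CNE and mlpack's FFN interfaces.
    // The functor exposes only Evaluate(parameters), so CNE scores the whole
    // network in one shot instead of stepping through individual points.
    #include <mlpack/core.hpp>
    #include <mlpack/methods/ann/ffn.hpp>
    #include <mlpack/methods/ann/layer/layer.hpp>
    #include <ensmallen.hpp>

    using namespace mlpack::ann;

    class WholeNetworkFitness
    {
     public:
      WholeNetworkFitness(FFN<MeanSquaredError<>>& network,
                          const arma::mat& inputs,
                          const arma::mat& targets) :
          network(network), inputs(inputs), targets(targets) { }

      // Whole-network fitness; lower is better for CNE.
      double Evaluate(const arma::mat& parameters)
      {
        network.Parameters() = parameters;
        arma::mat predictions;
        network.Predict(inputs, predictions);
        // Example fitness: mean squared error over the full dataset.
        return arma::accu(arma::square(predictions - targets)) / targets.n_cols;
      }

     private:
      FFN<MeanSquaredError<>>& network;
      const arma::mat& inputs;
      const arma::mat& targets;
    };

    // Hypothetical usage (model, trainX, trainY are placeholder names):
    //   model.ResetParameters();                 // initialize the parameter matrix
    //   WholeNetworkFitness fitness(model, trainX, trainY);
    //   ens::CNE cne;                            // population size etc. via constructor
    //   arma::mat params = model.Parameters();
    //   cne.Optimize(fitness, params);
    //   model.Parameters() = params;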
akfluffy has quit [Remote host closed the connection]
pd09041999 has joined #mlpack
< johnsoncarl[m]> hey zoq
pd09041999 has quit [Ping timeout: 250 seconds]
seewishnew has joined #mlpack
seewishnew has quit [Ping timeout: 240 seconds]
pd09041999 has joined #mlpack
zhxj9823 has joined #mlpack
zhxj9823 has quit [Ping timeout: 256 seconds]
pd09041999 has quit [Quit: Leaving]
< jenkins-mlpack2> Project docker mlpack nightly build build #290: STILL UNSTABLE in 3 hr 34 min: http://ci.mlpack.org/job/docker%20mlpack%20nightly%20build/290/
jeffin143 has quit [Read error: Connection reset by peer]
jeffin143 has joined #mlpack
jeffin143 has quit [Read error: Connection reset by peer]
pd09041999 has joined #mlpack
pd09041999 has quit [Max SendQ exceeded]
pd09041999 has joined #mlpack
< ShikharJ> zoq: Oh okay, I'll open one then. Feel free to add details when you can :)
heisenbug_ has joined #mlpack
heisenbug_ has quit [Client Quit]
pd09041999 has quit [Ping timeout: 245 seconds]
hardikJ has joined #mlpack
< hardikJ> Hey, I am a beginner in mlpack. I installed mlpack earlier, when 2.2.5 was the latest version, but now I want to upgrade to the latest version. How can I do that?
< zoq> hardikJ: Remove the old libmlpack library from the mlpack install path and the mlpack headers from the include path.
< hardikJ> Thanks @zoq
hardikJ has quit []
yashMustak has joined #mlpack
hardikJ has joined #mlpack
hardikJ has quit [Ping timeout: 256 seconds]
yashMustak has quit []
pd09041999 has joined #mlpack
pd09041999 has quit [Max SendQ exceeded]
pd09041999 has joined #mlpack
Masstran_ has quit [Read error: Connection reset by peer]
pd09041999 has quit [Max SendQ exceeded]
vivekp has joined #mlpack
abhinavsagar has joined #mlpack
akfluffy has joined #mlpack
< akfluffy> hey, does doing model.Parameters() represent all of the weights and biases as well as connections? Optimizers only look at those, right?
riaash04 has joined #mlpack
riaash04 has quit [Quit: Page closed]
lozhnikov has quit [Ping timeout: 246 seconds]
akfluffy has quit [Remote host closed the connection]
lozhnikov has joined #mlpack
seewishnew has joined #mlpack
sreenik has joined #mlpack
seewishnew has quit [Remote host closed the connection]
seewishnew has joined #mlpack
< sreenik> akfluffy: model.Parameters() returns an arma matrix containing only the weights and biases, in a single column (or row, maybe) if I remember correctly, nothing else
sreenik has quit [Quit: Page closed]
seewishnew has quit [Ping timeout: 264 seconds]
abhinavsagar has quit [Quit: Connection closed for inactivity]
akfluffy has joined #mlpack
< akfluffy> sreenik: thank you. and does an optimizer optimize an arma::mat? are those what get passed to Evaluate()?
akfluffy has left #mlpack []
vivekp has quit [Ping timeout: 246 seconds]
pd09041999 has joined #mlpack
< zoq> akfluffy: Right, all the network parameters are combined in model.Parameters() and are passed to the Optimize function.
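A minimal sketch illustrating this, assuming mlpack's FFN API; the specific layers and sizes here are only an example, not from the discussion. Parameters() exposes one flat arma::mat holding every weight and bias, and that single matrix is what an optimizer receives and updates.

    // Minimal sketch with an assumed two-layer network; layer types and
    // sizes are only an example.
    #include <iostream>
    #include <mlpack/core.hpp>
    #include <mlpack/methods/ann/ffn.hpp>
    #include <mlpack/methods/ann/layer/layer.hpp>

    using namespace mlpack::ann;

    int main()
    {
      FFN<> model;
      model.Add<Linear<>>(10, 5);   // 10 * 5 weights + 5 biases
      model.Add<SigmoidLayer<>>();  // no trainable parameters
      model.Add<Linear<>>(5, 1);    // 5 * 1 weights + 1 bias

      model.ResetParameters();      // allocate and initialize the parameter matrix

      // One flat arma::mat with every weight and bias, nothing else; layer
      // connectivity lives in the layer objects, not in this matrix. This is
      // the matrix the optimizer's Evaluate()/Optimize() calls operate on.
      arma::mat& params = model.Parameters();
      std::cout << "Trainable parameters: " << params.n_elem << std::endl;  // 61
    }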
pd09041999 has quit [Quit: Leaving]
robertohueso has quit [Remote host closed the connection]
jeffin143 has joined #mlpack
jeffin143 has quit [Read error: Connection reset by peer]
jeffin143 has joined #mlpack
jeffin143 has quit [Ping timeout: 246 seconds]