ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< birm[m]> jeffin143 I'll take a look tonight or tomorrow!
ImQ009 has joined #mlpack
ImQ009 has quit [Ping timeout: 265 seconds]
ImQ009 has joined #mlpack
< jeffin143[m]> GPT-3 released by OpenAI
< jeffin143[m]> 175 billion parameters
ImQ009_ has joined #mlpack
ImQ009 has quit [Ping timeout: 272 seconds]
favre49 has joined #mlpack
< favre49> jeffin143[m]: wow, that's over a 10-fold increase
< favre49> There are some pretty fun twitter bots that use GPT-2 actually
< jeffin143[m]> favre49: yes, too much resource consumption in my view
favre49 has quit [Read error: Connection reset by peer]
jonpsy[m] has joined #mlpack
favre49 has joined #mlpack
< jeffin143[m]> Are we going to remove boost serialisation ??
< jeffin143[m]> Why did we try cereal ??
< jjb[m]> So, linking with boost is problematic for getting an _R_ package listed on CRAN because it doesn’t exist on the build systems for macOS. Hopefully, my PR to add `boost` to the CRAN macOS build system will be accepted: <https://github.com/R-macos/recipes/pull/12>
< shrit[m]> jeffin143 are you asking me??
< jeffin143[m]> Yes shrit :) sorry, I didn't tag you; I saw your PR
< shrit[m]> jeffin143 Yes, this might be a possibility, since boost serialization adds a lot to the binary size
< jeffin143[m]> Oh I see
< shrit[m]> I will open a discussion issue, but first I just want to see if it is possible and whether we are going to get some major gain
< shrit[m]> jeffin143 Thanks, this one is a good comparison
< shrit[m]> Cereal is header-only, which will reduce build complexity
< jeffin143[m]> Yes, I second that, header-only is good
< jeffin143[m]> But what about the time it takes to serialise?
< jeffin143[m]> I will read more about cereal
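For context on the discussion above, here is a minimal sketch of what cereal's header-only serialization looks like in C++. The Model class and its members are hypothetical illustrations, not mlpack code; the point is that only a single serialize() method is needed and no serialization library has to be linked.

    #include <fstream>
    #include <vector>
    #include <cereal/archives/binary.hpp>
    #include <cereal/types/vector.hpp>

    // Hypothetical class used only to illustrate cereal's interface.
    class Model
    {
     public:
      // cereal needs just one serialize() method; because cereal is
      // header-only, no library linking is required.
      template<typename Archive>
      void serialize(Archive& ar)
      {
        ar(weights, bias);
      }

     private:
      std::vector<double> weights;
      double bias = 0.0;
    };

    int main()
    {
      Model m;
      std::ofstream ofs("model.bin", std::ios::binary);
      cereal::BinaryOutputArchive ar(ofs);
      ar(m);  // Write the model to disk in binary form.
    }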
favre49 has quit [Remote host closed the connection]
< jeffin143[m]> SpaceX launch attempt :)
< jeffin143[m]> Wohhh done
< zoq> yes :)
< HimanshuPathakGi> Yeah cool :)
< jeffin143[m]> 19 hours ETA
ImQ009_ has quit [Quit: Leaving]
< rcurtin> yeah, the launch was awesome, really happy to see it was a success
< rcurtin> RyanBirminghamGi: nope, that looks accidental... want to remove it?
< HimanshuPathakGi> Hey, @zoq @saksham189 I tried changing the number of centres while training the RBFN, and after setting the value to 75 I got a classification error of 0.144 on my local computer (not sure about the online builds though). We can improve it further by tweaking parameters; in the paper they used 1000 centres for classifying the full MNIST dataset.
< zoq> HimanshuPathakGi: 0.144 sounds good, what is the training time on your local machine?
< HimanshuPathakGi> I was just talking about the sample 4/9 dataset, not the full dataset that I will train on. Sorry, my previous message was not quite clear.
< HimanshuPathakGi> Now I will try it with the full dataset.
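For reference, a rough conceptual sketch (plain Armadillo, not mlpack's actual RBFN code) of how the number of centres enters the picture: each centre contributes one Gaussian hidden unit, so more centres means a wider hidden layer and more parameters to fit. The function name and the beta parameter are illustrative assumptions.

    #include <armadillo>
    #include <cmath>

    // Compute Gaussian RBF activations for a batch of points given a set of
    // centres.  The number of centres (75 in the discussion above, 1000 in
    // the paper for full MNIST) determines the hidden layer width.
    arma::mat RBFActivations(const arma::mat& data,     // d x n data points
                             const arma::mat& centres,  // d x k centres
                             const double beta)         // kernel width parameter
    {
      arma::mat activations(centres.n_cols, data.n_cols);
      for (arma::uword j = 0; j < centres.n_cols; ++j)
      {
        for (arma::uword i = 0; i < data.n_cols; ++i)
        {
          // Squared Euclidean distance from point i to centre j.
          const double dist = arma::norm(data.col(i) - centres.col(j));
          activations(j, i) = std::exp(-beta * dist * dist);
        }
      }
      return activations;
    }

With 75 centres the centres matrix is d x 75; the paper's 1000-centre setting simply makes it wider, trading more computation and memory for a richer hidden representation.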