ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
jeffin143 has joined #mlpack
< jeffin143> lozhnikov: so let's decide on an API for the user. From your comments, what I understood is that we provide a template argument (a callback type), so before calling this function the user either builds a class which has operator() and passes that object as an argument, or provides a lambda function
< jeffin143> For example, what the STL does: for_each(start_ptr, end_ptr, callback_function)
< jeffin143> The callback function could be a class with operator(), or a lambda function
< jeffin143> So our endpoint will be dictencode(vector<string> input, callback function)..?
< jeffin143> The callback function here being the tokenizer
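A rough sketch of the interface being discussed, in the spirit of std::for_each (the names DictEncode and SplitByWhitespace are hypothetical illustrations, not an existing mlpack API):

```cpp
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical tokenizer: a callable class with operator(), splitting on spaces.
class SplitByWhitespace
{
 public:
  std::vector<std::string> operator()(const std::string& line) const
  {
    std::vector<std::string> tokens;
    size_t start = 0;
    while (start < line.size())
    {
      const size_t end = line.find(' ', start);
      if (end == std::string::npos)
      {
        tokens.push_back(line.substr(start));
        break;
      }
      if (end > start)
        tokens.push_back(line.substr(start, end - start));
      start = end + 1;
    }
    return tokens;
  }
};

// Hypothetical endpoint: encode each token as a dictionary index, accepting any
// callable (functor or lambda) as the tokenizer.
template<typename TokenizerType>
std::vector<std::vector<size_t>> DictEncode(const std::vector<std::string>& input,
                                            TokenizerType tokenizer)
{
  std::unordered_map<std::string, size_t> dictionary;
  std::vector<std::vector<size_t>> encoded;
  for (const std::string& line : input)
  {
    std::vector<size_t> row;
    for (const std::string& token : tokenizer(line))
    {
      // Assign the next free index the first time a token is seen.
      auto it = dictionary.emplace(token, dictionary.size() + 1).first;
      row.push_back(it->second);
    }
    encoded.push_back(std::move(row));
  }
  return encoded;
}
```

With this shape, a user could equally pass a lambda, e.g. DictEncode(lines, [](const std::string& s) { /* split s somehow */ return std::vector<std::string>{s}; });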
< rcurtin> Suryo: yeah, GetInitialPoint() isn't required for a FunctionType, it's mostly just a convenience for our own internal testing
jeffin143 has quit [Ping timeout: 246 seconds]
jeffin143 has joined #mlpack
jeffin143 has quit [Ping timeout: 244 seconds]
Suryo has joined #mlpack
jeffin143 has joined #mlpack
< Suryo> zoq: the functions that PSO will optimize would not require gradients. So in these 10 functions, would you like me to program the Gradient() function wherever it may exist?
< Suryo> Also, how would you like to test these test functions?
Suryo has quit [Quit: Page closed]
< lozhnikov> jeffin143: Yes, I think it's reasonable. Moreover I guess we should implement some standard tokenizers.
sreenik has joined #mlpack
jeffin143 has quit [Ping timeout: 252 seconds]
jeffin143 has joined #mlpack
< zoq> Suryo: Yes, let's add a Gradient function as well if possible.
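For reference, a minimal sketch of what such a test function might look like (the class name and the specific objective are placeholders; Evaluate()/Gradient() follow the usual FunctionType convention, and GetInitialPoint() is only the optional convenience mentioned above):

```cpp
#include <armadillo>

// Hypothetical test function: the sphere function f(x) = sum(x_i^2).
class SphereFunction
{
 public:
  // Objective value at the given coordinates.
  double Evaluate(const arma::mat& coordinates) const
  {
    return arma::accu(arma::square(coordinates));
  }

  // Gradient of the objective; not needed by PSO itself, but useful for
  // gradient-based optimizers and for testing.
  void Gradient(const arma::mat& coordinates, arma::mat& gradient) const
  {
    gradient = 2.0 * coordinates;
  }

  // Optional convenience starting point (not part of the FunctionType
  // requirements, as noted above).
  arma::mat GetInitialPoint() const { return arma::mat("1.0; 2.0; 3.0"); }
};
```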
< jenkins-mlpack2> Yippee, build fixed!
< jenkins-mlpack2> Project docker mlpack nightly build build #332: FIXED in 3 hr 59 min: http://ci.mlpack.org/job/docker%20mlpack%20nightly%20build/332/
saksham189 has joined #mlpack
< saksham189> ShikharJ: sorry for the late response, I was occupied with some practical exams.
< saksham189> My last exam is tomorrow, and then I'll be moving out of the hostel and will be free on the 23rd. So maybe we can discuss the plan then.
< saksham189> In the meantime I will try to set up an IRC bouncer as you suggested.
< ShikharJ> saksham189: Okay, please take your time.
< ShikharJ> saksham189: Our pre-planned work is mostly done in your case, so it's all good up till the coding phase.
saksham189 has left #mlpack []
jeffin143 has quit [Excess Flood]
jeffin143 has joined #mlpack
sreenik has quit [Quit: Page closed]
jeffin143 has quit [Ping timeout: 250 seconds]
jeffin143 has joined #mlpack
jeffin143 has quit [Ping timeout: 248 seconds]
jeffin143 has joined #mlpack
sreenik has joined #mlpack
robertoh1eso has joined #mlpack
jeffin143 has quit [Ping timeout: 264 seconds]
jeffin143 has joined #mlpack
jenkins-mlpack2 has joined #mlpack
jeffin143 has quit [Ping timeout: 252 seconds]
jeffin143 has joined #mlpack
Suryo has joined #mlpack
< Suryo> rcurtin and zoq: thanks for clearing my doubts.
Suryo has quit [Client Quit]
jeffin143 has quit [Ping timeout: 276 seconds]
jeffin143 has joined #mlpack
< rcurtin> Suryo: happy to help :)
jeffin143 has quit [Ping timeout: 250 seconds]
Toshal has joined #mlpack
jeffin143 has joined #mlpack
< Toshal> ShikharJ: Sorry for the late response. I am just busy with my exams.
KimSangYeon-DGU_ has joined #mlpack
< Toshal> Regarding the meeting, it would be great if we could meet on Friday. Please let me know your thoughts on that.
< KimSangYeon-DGU_> sumedhghaisas_: Hi, Sumedh, I've completed the 3D plotting of the probability space of the classical GMM, and I'll plot the QGMM's next
< KimSangYeon-DGU_> sumedhghaisas_: You can check this at https://github.com/KimSangYeon-DGU/GSoC-2019/tree/master/Research/Probability_Space
< KimSangYeon-DGU_> Thanks!! :)
< Toshal> Regarding my schedule, my Dual Optimizer PR is almost ready with GAN and WGAN. Some minor changes are required; I will fix those and then it should be good to go, and we could start testing it out. I am somewhat confused about WGANGP at the moment.
< Toshal> So I am currently thinking of working on it in a separate PR.
< sumedhghaisas_> KimSangYeon-DGU_: Hi Kim
< KimSangYeon-DGU_> sumedhghaisas_: After drawing them, as we discussed, let's compare the interference phenomena
< KimSangYeon-DGU_> Hi :)
< sumedhghaisas_> I am busy today till night. I will check it at night and get back to you. :)
< KimSangYeon-DGU_> Thanks :)
Toshal_ has joined #mlpack
Toshal has quit [Ping timeout: 258 seconds]
Toshal_ is now known as Toshal
< Toshal> ShikharJ: It would be great if we merge Saksham's #1823 and #1813, as they reflect some important changes.
jeffin143 has quit [Ping timeout: 252 seconds]
< Toshal> Also, I was thinking that I would finish the Dual Optimizer PR before the coding period begins, and then we could start with our own proposal. But I guess it may get delayed.
< Toshal> We could still keep on track, as mostly reviewing and merging remains. Let me know your thoughts regarding my schedule as well. And please review #1888 in your free time; if that gets merged, then we could start testing the Dual Optimizer for StandardGAN and WGAN. Thanks. :)
KimSangYeon-DGU_ has quit [Ping timeout: 256 seconds]
< Toshal> I was thinking of running testing and implementation somewhat in parallel, as we have a heavy-duty node. If you don't mind, could I get access to that node? I would like to start testing my StandardGAN implementation.
cult- has joined #mlpack
< cult-> ensmallen is awesome
Toshal has quit [Remote host closed the connection]
< rcurtin> cult-: great, glad it is useful for you :)
< rcurtin> I'm really excited about the callbacks and the templated Optimize() support that will be merged soon
< rcurtin> I think I'd like to implement a callback that sends me a text message every time an epoch is done ;)
< rcurtin> actually I wonder if anyone would find that useful...
< cult-> rcurtin, sounds good
< rcurtin> probably a month (more or less?) until that's all merged and released, we'll see
< rcurtin> Marcus has a prototype callback PR open, and I've got the finished templated Optimize() PR open, so we're pretty close
< cult-> how was the demo on https://vis.ensmallen.org/ created?
< rcurtin> Marcus wrote that, I think that every time the configuration is changed, ensmallen is called on the server and its results are plotted
< cult-> real nice
< cult-> can we have custom objectives, not just the local minimum?
Toshal has joined #mlpack
< rcurtin> cult-: hmm, do you mean like "terminate if the objective is less than X"?
< rcurtin> if so, that would be doable through the callbacks, since the callbacks will allow you to terminate the optimization
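For illustration, a minimal sketch of such an early-termination callback, assuming an interface along the lines of the open callback PR (the EndEpoch hook, its exact signature, and the return-true-to-terminate convention are assumptions here rather than a finalized API):

```cpp
#include <armadillo>
#include <cstddef>

// Hypothetical callback: request termination once the objective drops below
// a user-chosen threshold ("terminate if the objective is less than X").
class StopBelowObjective
{
 public:
  explicit StopBelowObjective(const double threshold) : threshold(threshold) { }

  // Assumed hook, called at the end of each epoch; returning true would ask
  // the optimizer to stop.
  template<typename OptimizerType, typename FunctionType, typename MatType>
  bool EndEpoch(OptimizerType& /* optimizer */,
                FunctionType& /* function */,
                const MatType& /* coordinates */,
                const size_t /* epoch */,
                const double objective)
  {
    return objective < threshold;
  }

 private:
  double threshold;
};
```

Usage would presumably look something like optimizer.Optimize(f, coordinates, StopBelowObjective(1e-3)), but how callbacks are attached depends on the final API.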
govg has quit [Ping timeout: 259 seconds]
travis-ci has joined #mlpack
< travis-ci> coatless/ensmallen#11 (code-style-warnings - 0ca775e : James Balamuta): The build has errored.
travis-ci has left #mlpack []
< ShikharJ> rcurtin: Can we provide savannah access to Toshal? In the meantime, I'll initiate the builds for Saksham. One of his PRs was recently running on Savannah, so I think that should be good to go, I just need to check the output.
< ShikharJ> Toshal: Hmm, can you let me know when you'll be free from your exams? I think we can plan better then. Saksham's pre-planned work is already done from his end; I only need to merge the changes in. But unfortunately, I'm really busy this week (have been the entire month, actually), so I might not get the time to cross-check the output.
< Toshal> ShikharJ: I will be done with my exams on the 28th of May, but I will be free on the 25th of May (Friday).
< Toshal> So it's up to you when to meet.
< zoq> rcurtin: I guess that depends on the task, not sure I like to end up with 10000 messages :)
< zoq> cult-: Ryan is right about the visualization demo. The 3D plot is a somewhat strange setup, as it runs a Matlab script to get the plot ...
pd09041999 has joined #mlpack
Toshal_ has joined #mlpack
pd09041999 has quit [Excess Flood]
Toshal has quit [Ping timeout: 258 seconds]
Toshal_ is now known as Toshal
Toshal has quit [Remote host closed the connection]
< ShikharJ> Toshal: Okay, this Friday, but I might have to cancel at the last moment due to my travels. I'll let you know if there is any change of plans.
< ShikharJ> rcurtin: Hmm, seems like I can't access my savannah account?
< zoq> ShikharJ: Because of a wrong password?
< ShikharJ> zoq: Yeah, but I'm pretty sure I remember my password alright.
< zoq> hm, I think I could reset the password, if you think that could resolve the issue
< ShikharJ> zoq: Please do that, I'll set it again.
< ShikharJ> Though I'm curious, did Symantec gift us the Savannah cluster?
< rcurtin> ShikharJ: no, but they still haven't turned the systems off
< rcurtin> my understanding was that we could use them until they repurposed them, and they haven't done that yet
< ShikharJ> rcurtin: Hmm, maybe they just forgot.
< rcurtin> it's completely possible...
jeffin143 has joined #mlpack
< sreenik> I want to define an STL map which will have some keys initialized and some uninitialized (to be initialized later; it is necessary to identify which ones have been left uninitialized). I am torn between using boost's optional and using a special value for later identification.
< sreenik> Here is a small snippet showing what I want to do https://github.com/sreenikSS/Testing/blob/master/MapTest.cpp
< sreenik> As this is regarding code related to mlpack, boost is already a dependency, so no worries there
< zoq> sreenik: What about using another vector (or bitset) that keeps track of initialized/uninitialized values? Or another solution might be to initialize with std::numeric_limits<Type>::max() and check against that value.
< sreenik> zoq: Thanks. I don't want to go for another vector, and maybe I'll leave boost out of this and go for either the max limit or NaN. My main concern here is programming practices and, to some extent, code readability.
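To make the two options concrete, here is a minimal sketch comparing boost::optional with a sentinel value (the map keys and values are made up for illustration):

```cpp
#include <boost/optional.hpp>
#include <limits>
#include <map>
#include <string>

int main()
{
  // Option 1: boost::optional makes "not yet set" explicit and readable.
  std::map<std::string, boost::optional<double>> optMap;
  optMap["alpha"] = 0.1;  // initialized
  optMap["beta"];         // default-constructed: boost::none, i.e. unset
  const bool betaSet = static_cast<bool>(optMap["beta"]);  // false

  // Option 2: a sentinel value such as std::numeric_limits<double>::max().
  // Cheaper and boost-free, but the sentinel must never be a legitimate value,
  // and a NaN sentinel would need std::isnan() rather than == for the check.
  const double unset = std::numeric_limits<double>::max();
  std::map<std::string, double> sentinelMap;
  sentinelMap["alpha"] = 0.1;
  sentinelMap["beta"] = unset;
  const bool betaSet2 = (sentinelMap["beta"] != unset);  // false

  (void) betaSet;
  (void) betaSet2;
  return 0;
}
```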
sreenik has quit [Quit: Page closed]