ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< KimSangYeon-DGU>
I'll draw the Quantum GMM later, when the exams are finished
< sumedhghaisas>
Yup. I checked. Good work. We will dedicate the first 2 weeks of the coding period to plotting the QGMM
< sumedhghaisas>
the classical GMM implementation shows nice graphs... maybe we should write an mlpack GMM version of the same in the models folder... just as a reference for others
< sumedhghaisas>
but we can worry about that later
< KimSangYeon-DGU>
Thanks :) Ah, Sumedh, my final exam week is 9 ~ 14 June
< sumedhghaisas>
Best of luck for rest of your exams :)
< KimSangYeon-DGU>
Thanks :)
< sumedhghaisas>
ahhh... I thought they were happening now...
< sumedhghaisas>
sorry for that confusion... would you like to take that week off?
< KimSangYeon-DGU>
Yes
< KimSangYeon-DGU>
Tomorrow and 9~14 June
< KimSangYeon-DGU>
My exam season
< sumedhghaisas>
ahh, that makes it clearer.
< sumedhghaisas>
Cool. Do you wanna catch up on Monday instead?
< sumedhghaisas>
let's set something up then, so you can concentrate on the exam tomorrow
< KimSangYeon-DGU>
Ah Thanks
< KimSangYeon-DGU>
Yeah
< KimSangYeon-DGU>
What about next Monday, same time?
< KimSangYeon-DGU>
GMT, 1:00 ~ 2:00
< KimSangYeon-DGU>
I'll send an email for our schedule
< KimSangYeon-DGU>
I sent it :)
< sumedhghaisas>
Got it. Thanks. :)
< KimSangYeon-DGU>
sumedhghaisas: I'm not sure I understood the meaning of "maybe we should write a Mlpack GMM version of the same in models folder"
< sumedhghaisas>
I mean we can write the same implementation that you did to plot the graphs, using the mlpack GMM code... just as a side project
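A minimal sketch of what plotting the classical GMM involves, in plain Python for illustration only (the models-folder version discussed above would use mlpack's C++ GMM class instead; the weights, means, and grid below are made-up toy values, not anything from the project):

```python
import math

def gaussian_pdf(x, mean, var):
    # Density of a univariate Gaussian N(mean, var).
    return math.exp(-(x - mean) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def gmm_pdf(x, weights, means, variances):
    # Mixture density: weighted sum of the component densities.
    return sum(w * gaussian_pdf(x, m, v)
               for w, m, v in zip(weights, means, variances))

# Toy two-component mixture with well-separated clusters.
weights = [0.5, 0.5]
means = [-2.0, 2.0]
variances = [1.0, 1.0]

# Evaluating the density on a grid gives the curve one would plot.
grid = [x / 10.0 for x in range(-50, 51)]
density = [gmm_pdf(x, weights, means, variances) for x in grid]
```

Feeding `grid` and `density` to any plotting tool reproduces the familiar two-bump mixture curve.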
< KimSangYeon-DGU>
Got it! Thanks
< KimSangYeon-DGU>
:)
sumedhghaisas has quit [Ping timeout: 256 seconds]
shikhar has joined #mlpack
< shikhar>
Toshal: saksham189: Are you there?
< shikhar>
I'll hang around for half an hour, if you guys wanna talk. Otherwise, let's catch up on Monday.
ShikharJ_ has joined #mlpack
shikhar has quit [Quit: Page closed]
Toshal has joined #mlpack
< Toshal>
ShikharJ: Hi.
< ShikharJ_>
Toshal: Hey
< Toshal>
Sorry once again for being late.
< ShikharJ_>
Nah, it's fine.
< Toshal>
Let's shift our timing to 9:30. My mess is somewhat messy :)
< ShikharJ_>
I'm guessing Saksham is going to miss this one as well, but yeah cool. 9.30.
< Toshal>
We can start from now
< ShikharJ_>
Go ahead.
< Toshal>
Do you have any schedule in your mind for me?
< ShikharJ_>
Yeah, I was planning on deciding the first phase tasks.
< ShikharJ_>
Let me open your proposals
< ShikharJ_>
Hmm, can't seem to access those via original website.
< Toshal>
Okay I can send you a copy of both.
< ShikharJ_>
Nevermind, found them.
< ShikharJ_>
Okay, I wish saksham were here, because tackling the features of the GAN toolkit was my main aim for the first code phase.
< Toshal>
Okay.
< ShikharJ_>
More importantly, label-smoothing, mini-batch discrimination and virtual batch normalization.
< ShikharJ_>
Apart from that, any additional tools that you think might be useful for the completeness of our GAN library?
< ShikharJ_>
And yes, inception scoring and frechet inception distance metrics.
< Toshal>
Okay.
< Toshal>
I know a bit about mini-batch discrimination. It would go in as a layer.
< Toshal>
I can go with label smoothing.
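For reference, the variant usually used for GAN discriminators is one-sided label smoothing (per Salimans et al.'s "Improved Techniques for Training GANs"), which softens only the real-class targets. A tiny sketch; the function name and the 0.1 default are chosen here for illustration and are not mlpack's API:

```python
def smooth_labels(labels, smoothing=0.1):
    # One-sided label smoothing: real targets (1.0) shrink to 1 - smoothing,
    # while fake targets (0.0) are deliberately left untouched.
    return [l * (1.0 - smoothing) if l == 1.0 else l for l in labels]

targets = [1.0, 1.0, 0.0, 0.0]   # discriminator targets: two real, two fake
print(smooth_labels(targets))    # -> [0.9, 0.9, 0.0, 0.0]
```

Smoothing only the real side avoids shifting the discriminator's optimum toward the generator's (possibly wrong) samples, which is why the fake targets stay at 0.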
< ShikharJ_>
Let's start with what you're most comfortable with?
< Toshal>
Yes.
< ShikharJ_>
So pick a topic from above, and work on it for the next week.
< ShikharJ_>
We'll try to get it merged by the end of the week itself. But no issues, if that gets dragged.
< ShikharJ_>
I'll ask saksham to pick a topic from above as well.
< Toshal>
Okay, I would go with label smoothing.
< ShikharJ_>
Take some time, think over it. We'll talk on Monday night regarding how you wish to implement it, and then you're free to go ahead.
< Toshal>
Okay.
< ShikharJ_>
So for the first code phase, we'll improve the usability of our existing framework, and then later, we'll expand our framework.
< ShikharJ_>
By then, at least, I would be able to merge the completed work in.
< Toshal>
Yes, I am fine with this.
< ShikharJ_>
Hmm, Saksham had also mentioned some other weight normalization and regularization techniques. That should be enough work for a month between the two of you :)
< Toshal>
ShikharJ: What about the heavy node access?
< ShikharJ_>
Umm, didn't Ryan talk to you about that? I believe he mentioned giving you access to savannah?
< Toshal>
Ah. I saw those messages on IRC. But they were not clear.
< Toshal>
But it's fine.
< ShikharJ_>
Talk to Ryan, he's a great guy :) Also, to get a better idea, learn about accessing cloud VM instances through SSH.
< Toshal>
Okay.
< ShikharJ_>
Once you know how to access the savannah node, it will be really simple to run your experiments on it using tmux. Learn about tmux as well.
< ShikharJ_>
Try installing tmux locally on your system and play with it a little.
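The remote-experiment workflow described above boils down to a few commands; the host name, user name, and session name below are placeholders, not the real savannah details:

```shell
# Placeholder address -- use the real savannah host and your own user name.
ssh user@savannah.example.org

# On the remote machine: start a named tmux session and launch the
# long-running experiment inside it.
tmux new -s experiment

# Detach with Ctrl-b d (or `tmux detach`); the job keeps running
# after the SSH connection closes.

# Later, reattach from a fresh SSH login to check on progress:
tmux attach -t experiment
```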
< Toshal>
Okay.
< Toshal>
Just one last thing. Have you had a chance to review #1888? What is your view regarding it?
< ShikharJ_>
I haven't as of yet. Sorry, I'm in the middle of shifting for three months to a different city, so can't find the time. I'll try to do so tomorrow.
< Toshal>
Okay. Where do you live now?
< ShikharJ_>
New Delhi for some part of the year. Mostly in Patna.
favre49 has joined #mlpack
< Toshal>
And where are you working?
< ShikharJ_>
Because my college is in Patna.
< Toshal>
or will be working?
< ShikharJ_>
I'm mostly into research. So I'll be working at University of Southern California as a researcher for the next few months, before heading back to Patna.
< Toshal>
Oh that's great.
< Toshal>
I will keep the rest of my questions for the next meeting.
< ShikharJ_>
Okay, then I'll be off for now. See you on Monday.
< Toshal>
Yes. Bye.
< favre49>
zoq: Hey, I think we have covered most of the important points in our email discussion, is there anything else we need to discuss before the coding period begins?
< favre49>
I was wondering what my goal should be for the first phase
ShikharJ_ has quit [Quit: Leaving]
< favre49>
Also i think you missed my last email, please get back to me whenever you're free
< favre49>
Thanks :)
favre49 has quit [Quit: Page closed]
< zoq>
favre49: Just sent you an email.
Toshal_ has joined #mlpack
Toshal has quit [Ping timeout: 246 seconds]
Toshal_ is now known as Toshal
< Toshal>
rcurtin: Hi sir, I want to test one of my Dual Optimizer PR implementations. For that, can I get access to savannah?
Toshal_ has joined #mlpack
Toshal has quit [Read error: Connection reset by peer]
Toshal_ is now known as Toshal
Toshal has quit [Ping timeout: 248 seconds]
sreenik has quit [Quit: Page closed]
sreenik has joined #mlpack
zoq has quit [Quit: leaving]
sreenik has quit [Ping timeout: 256 seconds]
zoq has joined #mlpack
zoq has quit [Quit: leaving]
jeffin143 has quit [Ping timeout: 244 seconds]
zoq has joined #mlpack
ShikharJ_ has joined #mlpack
< ShikharJ_>
rcurtin: zoq: Are you there?
< zoq>
ShikharJ_: yes
< ShikharJ_>
zoq: Savannah is giving me the following error on running make on the latest master: make[2]: *** No rule to make target '../src/mlpack/core/arma_extend/fn_ccov.hpp', needed by 'src/mlpack/cotire/mlpack_CXX_prefix.hxx.gch'. Stop.
< ShikharJ_>
I guess the minimum required Armadillo version was bumped up recently?
< ShikharJ_>
Is there a way you might be able to update the installed packages on Savannah?
< zoq>
hm, is this on a clean build?
< zoq>
7.600 is installed
< ShikharJ_>
Yeah, ran make clean before running make, and deleted CMakeCache.txt.
< zoq>
What version does cmake pick up?
< zoq>
let's see if I can reproduce the issue
< zoq>
"Found Armadillo: /usr/lib/libarmadillo.so (found suitable version "7.600.2", minimum required is "6.500.0")"
< zoq>
on Savannah
< ShikharJ_>
For me, it's just Armadillo libraries: /usr/lib/libarmadillo.so
< ShikharJ_>
On running cmake .
< zoq>
right, can you remove the build folder and try again?
< ShikharJ_>
Okay, just a minute.
< zoq>
my build looks fine so far, at 20%
< ShikharJ_>
Okay, now it works. That's weird, given I ran a model just a week ago and make ran fine then.
< ShikharJ_>
Thanks zoq!
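For the record, the clean rebuild that cleared the stale-target error looks roughly like this, run from the mlpack source root (assuming the usual out-of-source `build/` directory):

```shell
# Wipe every cached CMake artifact, including CMakeCache.txt and the
# generated cotire/precompiled-header rules that referenced the old file.
rm -rf build
mkdir build && cd build

# Reconfigure from scratch; cmake should now report the Armadillo
# version it found, then build as usual.
cmake ..
make        # or make -j4 to use four cores
```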
< zoq>
strange
< ShikharJ_>
zoq: Can it handle make -j4?
< zoq>
pretty sure it does
< ShikharJ_>
Hmm, my system (with 4 cores and hyper-threading) usually goes borderline Windows 98 if I build mlpack with -j6 :) It's mostly the memory that threatens to run out, usually while building the RNN code
< zoq>
hm, might be a good idea to run htop
< ShikharJ_>
zoq: Yeah, either that or system monitor.