verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
< robertohueso> Thank you zoq and rcurtin, I'll take a look at it tomorrow
robertohueso has left #mlpack []
marioloko has quit [Ping timeout: 260 seconds]
Vss has quit [Quit: Connection closed for inactivity]
govg has joined #mlpack
Rithesh has joined #mlpack
Rithesh has quit [Client Quit]
daivik has joined #mlpack
dhoulihan has quit [Ping timeout: 265 seconds]
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
daivik has joined #mlpack
Fen has joined #mlpack
Fen has quit [Ping timeout: 260 seconds]
ASamir has joined #mlpack
ASamir is now known as Samir
sanghaisubham has joined #mlpack
sumedhghaisas2 has joined #mlpack
sumedhghaisas has quit [Read error: Connection reset by peer]
sumedhghaisas2 has quit [Read error: Connection reset by peer]
sumedhghaisas2 has joined #mlpack
Samir_ has joined #mlpack
Samir has quit [Ping timeout: 248 seconds]
Samir_ has quit [Ping timeout: 240 seconds]
sanghaisubham has quit [Ping timeout: 260 seconds]
Samir has joined #mlpack
sumedhghaisas2 has quit [Read error: Connection reset by peer]
sumedhghaisas has joined #mlpack
vivekp has quit [Ping timeout: 256 seconds]
vivekp has joined #mlpack
qwert123 has joined #mlpack
< qwert123> hi, I wanted to know: are all the project proposals on the mlpack GSOC ideas page open -- even the ones without any open tickets?
Samir has quit [Ping timeout: 240 seconds]
Samir has joined #mlpack
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
Samir has quit [Ping timeout: 248 seconds]
Samir has joined #mlpack
Samir_ has joined #mlpack
Samir has quit [Ping timeout: 245 seconds]
ShikharJ has joined #mlpack
< ShikharJ> qwert123: Yes. Apart from that, you are free to propose your own ideas.
vivekp has quit [Ping timeout: 276 seconds]
witness has joined #mlpack
Samir_ has quit [Remote host closed the connection]
Samir has joined #mlpack
Samir has quit [Remote host closed the connection]
Samir has joined #mlpack
< luffy1996> @zoq, did you go through my PR?
vivekp has joined #mlpack
qwert123 has quit [Ping timeout: 260 seconds]
nikhilgoel1997 has joined #mlpack
ShikharJ has quit [Ping timeout: 260 seconds]
nikhilgoel1997_ has joined #mlpack
Samir has quit []
nikhilgoel1997 has quit [Ping timeout: 260 seconds]
nikhilgoel1997_ is now known as nikhilgoel1997
samidha has joined #mlpack
qi has joined #mlpack
qi is now known as Guest5039
< samidha> @manish7294: Sorry for the extremely late reply, I fell sick. Here's the very simple code that fails to execute for me, along with the error: https://pastebin.com/6QKByTvu
< samidha> If anyone else could help me with this, it'd be highly appreciated!
samidha has quit [Quit: Page closed]
Trion has joined #mlpack
< Guest5039> Why is there no label parameter in k-nearest-neighbor?
Guest5039 has quit [Quit: Page closed]
witness has quit [Quit: Connection closed for inactivity]
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#4193 (master - bdbe654 : Ryan Curtin): The build has errored.
travis-ci has left #mlpack []
ImQ009 has joined #mlpack
< caladrius[m]> @rcurtin Okay
avtansh has joined #mlpack
Trion has quit [Quit: Entering a wormhole]
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#4194 (master - 6021e08 : Ryan Curtin): The build passed.
travis-ci has left #mlpack []
< dk97[m]> zoq: rcurtin I have a few questions...
< rcurtin> dk97[m]: we can try and answer them
< dk97[m]> is there any separate loss module?
< rcurtin> I don't follow the question, can you provide some more context please? mlpack is a large library...
< dk97[m]> Like there are separate files for activation functions and initialization - are there separate files for loss functions like cross entropy and KL divergence?
nikhilgoel1997 has quit [Quit: Connection closed for inactivity]
< dk97[m]> rcurtin:
vivekp has quit [Ping timeout: 240 seconds]
vivekp has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#4195 (master - 9f69eb6 : Ryan Curtin): The build passed.
travis-ci has left #mlpack []
qi has joined #mlpack
qi is now known as Guest30413
< Guest30413> How do I set labels while training k-nearest-neighbours?
govg has quit [Ping timeout: 260 seconds]
< rcurtin> dk97[m]: I guess you are talking about the neural network code
< rcurtin> have you looked in src/mlpack/methods/ann/ at all?
< rcurtin> the files are all laid out there
< rcurtin> Guest30413: the mlpack KNN code solves k-nearest-neighbor search, but it is not a KNN classifier, so labels are not applicable
< Guest30413> Is there an alternative classifier in mlpack?
Guest30413 has quit [Quit: Page closed]
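For context: mlpack's KNN code answers nearest-neighbor queries only, so a classifier has to add the voting step itself. Below is a minimal sketch of a majority-vote classifier built on mlpack's NeighborSearch; the function name and variables are illustrative, not part of mlpack's API.

    #include <mlpack/methods/neighbor_search/neighbor_search.hpp>

    using namespace mlpack::neighbor;

    // referenceData: d x n training points; labels: length-n class labels.
    arma::Row<size_t> KnnClassify(const arma::mat& referenceData,
                                  const arma::Row<size_t>& labels,
                                  const arma::mat& queryData,
                                  const size_t k,
                                  const size_t numClasses)
    {
      // Build the search structure on the training set (defaults to a kd-tree).
      NeighborSearch<NearestNeighborSort> knn(referenceData);
      arma::Mat<size_t> neighbors;
      arma::mat distances;
      knn.Search(queryData, k, neighbors, distances);

      // Majority vote over the k nearest neighbors of each query point.
      arma::Row<size_t> predictions(queryData.n_cols);
      for (size_t i = 0; i < queryData.n_cols; ++i)
      {
        arma::vec votes(numClasses, arma::fill::zeros);
        for (size_t j = 0; j < k; ++j)
          votes[labels[neighbors(j, i)]] += 1.0;
        predictions[i] = votes.index_max();
      }
      return predictions;
    }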
sumedhghaisas has quit [Read error: Connection reset by peer]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Read error: No route to host]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Read error: Connection reset by peer]
sumedhghaisas has joined #mlpack
daivik has joined #mlpack
sumedhghaisas has quit [Read error: Connection reset by peer]
sumedhghaisas has joined #mlpack
< caladrius[m]> zoq: Hey! I tried implementing the Atrous convolution layer. I have a couple of doubts regarding that. First, I couldn't figure out exactly how backpropagation is working in this. But I guess BackwardConvolution and GradientConvolution take care of that. Please point out if I am wrong. Second, I cannot figure out how things are handled in the gradient calculations if we want to have padding. Could you please help me out
< caladrius[m]> with that? I have opened a PR for the same. All the code is there.
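For reference, in an atrous (dilated) convolution with dilation d, a kernel of width k has effective width k + (k - 1)(d - 1), so for input width i, padding p, and stride s the output width is (standard formula, not mlpack-specific):

    o = floor((i + 2p - k - (k - 1)(d - 1)) / s) + 1

Backpropagation mirrors the ordinary convolution case: the gradient with respect to the input is a transposed convolution with the same dilation, which is presumably what the BackwardConvolution and GradientConvolution rules mentioned above would compute.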
< dk97[m]> rcurtin: Yes, but I could not find a separate file for losses like KL divergence; rather, it is implemented directly in the Sparse Autoencoder.
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
daivik has joined #mlpack
< zoq> dk97[m]: You are right, there is no KL Divergence implemented for the ann code. Not sure it's possible to merge the Sparse Autoencoder code with the existing ann code.
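For context, a loss in mlpack's ann code is a small class exposing Forward() and Backward(). A minimal sketch of a KL-divergence loss in that style (this file does not exist in mlpack; the names and signatures here are illustrative, following the convention of losses like MeanSquaredError at the time):

    // Hypothetical kl_divergence.hpp, modeled on the existing ann losses.
    template<typename InputDataType = arma::mat,
             typename OutputDataType = arma::mat>
    class KLDivergence
    {
     public:
      KLDivergence() { /* Nothing to do here. */ }

      // Forward: KL(target || input) = sum_i target_i * (log target_i - log input_i).
      template<typename InputType, typename TargetType>
      double Forward(const InputType&& input, const TargetType&& target)
      {
        return arma::accu(target % (arma::log(target) - arma::log(input)));
      }

      // Backward: d loss / d input_i = -target_i / input_i.
      template<typename InputType, typename TargetType, typename OutputType>
      void Backward(const InputType&& input,
                    const TargetType&& target,
                    OutputType&& output)
      {
        output = -target / input;
      }
    };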
< zoq> caladrius[m]: Sure, I'll take a look at the code once I get a chance.
< dk97[m]> Yes, it isn't right now.
< dk97[m]> Also, what exactly will the framework look like? Is it like a model?
< dk97[m]> I am talking about the VAE GSoC project
< zoq> dk97[m]: Ah, yeah it will implement the same interface as the FFN class.
< dk97[m]> Okay
< zoq> dk97[m]: Which implements the Add function to add a new model/layer to an existing model.
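For context, the interface zoq describes looks roughly like this (a minimal usage sketch of mlpack's FFN class; the layer sizes and data variables are made up):

    #include <mlpack/methods/ann/ffn.hpp>
    #include <mlpack/methods/ann/layer/layer.hpp>

    using namespace mlpack::ann;

    // Stack layers onto a feedforward model with Add().
    FFN<NegativeLogLikelihood<>> model;
    model.Add<Linear<>>(10, 64);   // 10 inputs -> 64 hidden units.
    model.Add<SigmoidLayer<>>();
    model.Add<Linear<>>(64, 3);    // 64 hidden units -> 3 classes.
    model.Add<LogSoftMax<>>();

    // trainData is d x n; trainLabels is 1 x n with class indices in 1..3.
    model.Train(trainData, trainLabels);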
< dk97[m]> Also, is mlpack available on GPU?
< dk97[m]> I could not find any instructions for it.
< zoq> dk97[m]: You could use NVBLAS to get GPU acceleration for BLAS functions.
< zoq> dk97[m]: Take a look at the armadillo documentation for further information.
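For reference, NVBLAS intercepts standard BLAS calls and offloads large enough ones to the GPU, so no mlpack or Armadillo code changes are needed. A typical setup looks like this (the paths are examples; see the NVBLAS documentation for the exact options):

    # nvblas.conf
    # CPU BLAS library to fall back on for small problems:
    NVBLAS_CPU_BLAS_LIB /usr/lib/libopenblas.so
    # Use all visible GPUs:
    NVBLAS_GPU_LIST ALL
    NVBLAS_LOGFILE nvblas.log

    # Run an mlpack (Armadillo-based) program with NVBLAS preloaded:
    NVBLAS_CONFIG_FILE=/path/to/nvblas.conf \
      LD_PRELOAD=/usr/local/cuda/lib64/libnvblas.so ./my_mlpack_program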
< dk97[m]> Ah okay! I will have a look.
< dk97[m]> Thanks a lot.
< dk97[m]> Also, I was thinking of implementing the deconv layer to get familiar with the full workflow of making a layer and writing tests in mlpack.
< zoq> dk97[m]: Sounds good.
< dk97[m]> Is it alright to do so?
< zoq> dk97[m]: Yes, I think this is a great way to get familiar with the codebase.
< dk97[m]> Also, could you have a look at the Dropout PR as well as the alpha dropout PR when you have time? The builds are passing.
< zoq> dk97[m]: Sure, I'll take a look at the PR once I get a chance.
< dk97[m]> Great then, I will let you know if I face any troubles. 🙂
< dk97[m]> Thanks for helping out!
< zoq> dk97[m]: Here to help :)
ShikharJ has joined #mlpack
manish7294 has joined #mlpack
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
< ShikharJ> dk97[m]: Would deconv layers be required in the VAE project?
< dk97[m]> No, they won't be required.
< dk97[m]> I just want to take up the task so that I can understand the code base and flow, and get familiar with making a layer.
< dk97[m]> ShikharJ:
< ShikharJ> Ah okay, actually I mentioned this because deconv layers would be required in one of the ideas that I'm proposing on GANs, and I thought maybe you could delay that until the projects are announced.
< ShikharJ> Mentioning this because I'm not too sure of the exact API that would be required.
sumedhghaisas2 has joined #mlpack
< ShikharJ> So I might have to refactor the code again.
sumedhghaisas has quit [Ping timeout: 252 seconds]
< ShikharJ> zoq: What should be done in your opinion?
< dk97[m]> Okay, sure. I will look for another layer to implement. 🙂
< ShikharJ> dk97[m]: I'm sorry for being a pain.
daivik has joined #mlpack
< dk97[m]> It's alright, not a problem really. I had not started it.
< zoq> ShikharJ: Ideally, the VAE project follows the existing interface, so I'm not sure you have to refactor the code, but if dk97[m] is happy to switch to another layer, that's fine for me as well.
daivik has quit [Client Quit]
< manish7294> zoq: rcurtin: Since LMNN is based on optimizing the accuracy of a KNN classifier, is it a good idea to propose a KNN classifier along with LMNN, as the project will definitely need it?
daivik has joined #mlpack
< zoq> manish7294: Yes, that sounds reasonable; if I remember right, there was a discussion about a KNN classifier based on the existing code.
< manish7294> zoq: So, we may use the existing neighbor search solver, right?
< zoq> manish7294: here you go http://www.mlpack.org/irc/mlpack.20171120.html
ShikharJ has quit [Ping timeout: 260 seconds]
ShikharJ has joined #mlpack
< daivik> rcurtin: zoq: With regard to issue #356 (Hierarchical Clustering Methods), I'd like to implement BIRCH clustering (it's quite popular - next to kmeans, dbscan and a few others...and also scikit-learn has an implementation for it). I've read the paper (https://www.cs.sfu.ca/CourseCentral/459/han/papers/zhang96.pdf); and it doesn't look too difficult
< daivik> to implement - maybe about a week to implement and write tests. I haven't given the exact implementation and what sort of functions to expose a lot of thought right now - just wanted opinions on how good of an idea this is (worth pursuing? worth binning? any other inputs?)
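For context, the core of BIRCH (per the Zhang et al. paper linked above) is the clustering feature CF = (N, LS, SS): the point count, the linear sum, and the sum of squared norms of a subcluster. CFs are additive, which is what makes the CF-tree incremental. A minimal sketch of the idea (illustrative only, not mlpack code):

    #include <algorithm>
    #include <cmath>
    #include <armadillo>

    // Clustering feature for a subcluster of points x_1 .. x_N.
    struct ClusteringFeature
    {
      size_t n = 0;     // N: number of points.
      arma::vec ls;     // LS: sum of the points.
      double ss = 0.0;  // SS: sum of squared norms of the points.

      void Insert(const arma::vec& x)
      {
        if (n == 0) ls.zeros(x.n_elem);
        ++n;
        ls += x;
        ss += arma::dot(x, x);
      }

      // CF additivity: CF1 + CF2 = (N1 + N2, LS1 + LS2, SS1 + SS2).
      void Merge(const ClusteringFeature& other)
      {
        n += other.n;
        ls += other.ls;
        ss += other.ss;
      }

      // Centroid and radius are recoverable from the CF alone.
      arma::vec Centroid() const { return ls / double(n); }
      double Radius() const
      {
        const double nd = double(n);
        return std::sqrt(std::max(0.0, ss / nd - arma::dot(ls, ls) / (nd * nd)));
      }
    };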
< manish7294> zoq: Thanks for providing the link. So, I think using the existing neighbor search solver here should simplify the implementation.
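For reference, the connection being described is visible in the LMNN objective (in Weinberger & Saul's formulation): learn a linear transform L minimizing

    eps(L) = sum_{j ~> i} ||L(x_i - x_j)||^2
           + c * sum_{j ~> i} sum_l (1 - y_il) [1 + ||L(x_i - x_j)||^2 - ||L(x_i - x_l)||^2]_+

where j ~> i means x_j is a target neighbor of x_i, y_il = 1 iff x_i and x_l share a label, and [.]_+ is the hinge. Both the target-neighbor and impostor terms come from k-nearest-neighbor queries, which is why the existing neighbor search solver fits naturally.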
< dk97[m]> zoq: is it alright if I implement quantized fully connected layers?
< dk97[m]> This is the paper
< dk97[m]> The idea can then be extended to conv layers
< dk97[m]> Also, what is your take on having a separate module for losses?
sumedhghaisas2 has quit [Read error: Connection reset by peer]
sumedhghaisas has joined #mlpack
< dk97[m]> It could contain the MSE, absolute error, and KL divergence losses.
< dk97[m]> Also, I think caladrius had done something on Huber loss. It could all go in one folder.
sumedhghaisas2 has joined #mlpack
sumedhghaisas has quit [Ping timeout: 256 seconds]
sumedhghaisas2 has quit [Ping timeout: 240 seconds]
ShikharJ has quit [Quit: Page closed]
sumedhghaisas has joined #mlpack
< zoq> dk97[m]: I'll take a closer look at the paper later today.
< dk97[m]> Okay, do let me know your views on the paper, as well as on the separate module for losses.
< zoq> daivik: I'm not familiar with BIRCH, so I can't help you right now; maybe Ryan is, and I'm sure he will respond once he has a chance.
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
sumedhghaisas has quit [Read error: Connection reset by peer]
sumedhghaisas has joined #mlpack
< rcurtin> daivik: BIRCH would be nice, but there is not time for me to review it now or anytime soon, unfortunately
sumedhghaisas2 has joined #mlpack
sumedhghaisas has quit [Read error: Connection reset by peer]
djhoulihan has joined #mlpack
daivik has joined #mlpack
manish7294 has quit [Remote host closed the connection]
< daivik> rcurtin: That's too bad, I thought it would have been a nice little self-contained thing to work on. I know the issue is old - but I guess it's still open for a reason? Is there some particular deliverable you expect from it?
< rcurtin> what you described is a good contribution, but what I am saying is that even if you do what you plan to, I don't have time anytime soon to review it
< rcurtin> that said, I don't think it would necessarily hurt to close the issue; I had it open as a general entrypoint for contribution, but the GSoC load is now too much for me to keep up with effectively
< daivik> sorry for flogging the idiomatic dead horse, but I really don't mind waiting for a review. And having seen the paper recently, I am itching to code it up. So is it a firm "no, don't do this" from your end, or can I work on this and maybe in the distant future (possibly after the current GSoC session ends) you'll review it?
< rcurtin> I think that is reasonable, but set your expectations for "a year or more"
< rcurtin> it's not hard for me to quickly review simple contributions, but for more complex contributions where I need to read the paper, convince myself I am familiar with it, etc., this is very time-consuming
< daivik> right, I completely understand. I guess I'll go ahead with it (with my expectations set to "a year or more") .. and if it is to be it will eventually be :)
< daivik> On the topic of simple contributions, I was writing tests for the mlpack_hmm_generate CLI binding -- there appears to be something wrong with it. When I run $ mlpack_hmm_generate --model_file model-file.xml --length 3; I get this:
< daivik> [FATAL] Attempted to access parameter --model as type N6mlpack3hmm8HMMModelE, but its true type is PN6mlpack3hmm8HMMModelE!
< rcurtin> hm, I think this will mean that somewhere in hmm_generate_main.cpp there is a CLI::GetParam<HMMModel>(...) where it should be CLI::GetParam<HMMModel*>(...)
< rcurtin> try making that change, I bet that will fix it
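For reference, the change rcurtin suggests would look something like this inside hmm_generate_main.cpp (an illustrative before/after; the surrounding code is paraphrased):

    // Before: requests the model by value, but serializable models are
    // registered with the CLI layer as pointers, hence the type mismatch.
    HMMModel& model = CLI::GetParam<HMMModel>("model");

    // After: request the pointer type the parameter was registered with.
    HMMModel* model = CLI::GetParam<HMMModel*>("model");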
< daivik> okay.. will do. Thanks for your inputs.
< rcurtin> let me know if that doesn't work
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
< dk97[m]> zoq: I think that before the quantized fully-connected paper, we could have a losses module that contains the definitions of different loss functions. That would be more helpful in the general case. Do let me know your thoughts.
ImQ009 has quit [Quit: Leaving]
travis-ci has joined #mlpack
< travis-ci> mlpack/models#22 (master - b0b627f : yash sharan): The build passed.
travis-ci has left #mlpack []