verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
shubhamagarwal92 has joined #mlpack
shubhamagarwal92 has quit [Quit: Page closed]
trapz has joined #mlpack
trapz has quit [Quit: trapz]
sumedhghaisas has joined #mlpack
trapz has joined #mlpack
aashay has quit [Quit: Connection closed for inactivity]
sumedhghaisas has quit [Quit: Ex-Chat]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Client Quit]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Ping timeout: 240 seconds]
trapz has quit [Quit: trapz]
sumedhghaisas has joined #mlpack
Nax has joined #mlpack
Trion has joined #mlpack
Trion has quit [Ping timeout: 268 seconds]
Nax has quit [Ping timeout: 260 seconds]
kartik_ has quit [Ping timeout: 260 seconds]
etotientz has joined #mlpack
etotientz has quit [Client Quit]
etotientz_ has joined #mlpack
etotientz_ has quit [Client Quit]
etotientz_ has joined #mlpack
salil has joined #mlpack
salil has quit [Client Quit]
manirao has joined #mlpack
mikeling has joined #mlpack
vinayakvivek has joined #mlpack
< etotientz_> I'm getting a segmentation fault when running mlpack_kmeans --help:
< etotientz_> Segmentation fault (core dumped)
etotientz_ has quit [Ping timeout: 246 seconds]
etotientz_ has joined #mlpack
etotientz_ has quit [Client Quit]
Trion has joined #mlpack
Trion has quit [Ping timeout: 258 seconds]
manirao has quit [Ping timeout: 260 seconds]
rajeev has joined #mlpack
rajeev has quit [Client Quit]
diehumblex has joined #mlpack
tejank10 has joined #mlpack
thyrix has joined #mlpack
govg has quit [Ping timeout: 264 seconds]
< mikeling> zoq: ping
< mikeling> rcurtin: ping
< mikeling> hi zoq and rcurtin, I'm confused about this part of the code https://github.com/mlpack/mlpack/blob/master/src/mlpack/methods/decision_tree/best_binary_numeric_split_impl.hpp#L29-L32 and how it works. I guess it's about sorting all the labels, right? Which means if we have a label list like "2 1 5 3 8 9 2 0 1", we would want orderedLabels like "0 1 1 2 2 3 5 8 9", isn't it?
< mikeling> If so, then we may have an issue, because it seems to fail to do that. Please read this https://gist.github.com/MikeLing/b6c7d0a38ce6079c42e4717582d322cc for more detail. Thank you!
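For reference, a minimal Armadillo sketch of the label-ordering pattern under discussion: reorder the labels to follow the sort order of one data dimension rather than sorting the labels themselves. The variable names and values are illustrative, not taken from the mlpack source.

    #include <armadillo>

    int main()
    {
      // One dimension of the data (one value per point) and the class labels.
      arma::rowvec dimension = {0.3, 2.1, 0.7, 1.5, 0.1};
      arma::Row<size_t> labels = {2, 1, 5, 3, 8};

      // Indices that put the dimension values in ascending order.
      const arma::uvec sortedIndices = arma::sort_index(dimension);

      // Labels reordered to match the sorted dimension (not sorted themselves).
      arma::Row<size_t> orderedLabels = labels.cols(sortedIndices);

      orderedLabels.print("orderedLabels");
    }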
govg has joined #mlpack
vinayakvivek has quit [Quit: Connection closed for inactivity]
trion has joined #mlpack
< thyrix> mikeling: I think you are right :)
< mikeling> thyrix: oh thank you! Do you think I need to file an issue and make a PR for it now? Or wait for zoq and rcurtin's reply?
temp has joined #mlpack
temp has quit [Client Quit]
< thyrix> mikeling: I'm not sure. I think you can start work now, and they can reply in the issue.
< mikeling> thyrix: thank you! I will do it ;)
< mikeling> thyrix: BTW, are you applying for a GSoC project in mlpack? Which project interests you? ;)
< thyrix> mikeling: yes, I plan to apply for "better benchmark", wbu?
< mikeling> thyrix: I plan to apply for the "Parallel stochastic optimization methods" project ;)
tejank10 has quit [Quit: Page closed]
mikeling is now known as mikeling|brb
trion has quit [Quit: trion]
divyansh has joined #mlpack
divyansh has quit [Client Quit]
< thyrix> mikeling: cool!
Trion has joined #mlpack
Trion has quit [Remote host closed the connection]
jayant has joined #mlpack
sanch has joined #mlpack
zoro_ has joined #mlpack
< sanch> hi! I would like to contribute to mlpack, but I'm not clear on what exactly the standard procedure is
Trion has joined #mlpack
sanch has quit [Quit: Page closed]
trapz has joined #mlpack
Nax has joined #mlpack
zoro_ has quit [Ping timeout: 260 seconds]
trapz has quit [Quit: trapz]
trapz has joined #mlpack
Trion has quit [Quit: Leaving.]
Trion has joined #mlpack
trapz has quit [Quit: trapz]
Trion has quit [Quit: CYA]
Trion has joined #mlpack
Trion has quit [Quit: CYA]
Trion has joined #mlpack
Trion has quit [Quit: CYA]
Trion has joined #mlpack
trapz has joined #mlpack
Trion has quit [Quit: CYA]
Trion has joined #mlpack
temp has joined #mlpack
trapz has quit [Quit: trapz]
thyrix has quit [Ping timeout: 260 seconds]
mikeling|brb is now known as mikeling
Nax has quit [Ping timeout: 260 seconds]
arishabh has joined #mlpack
trapz has joined #mlpack
sumedhghaisas has quit [Ping timeout: 240 seconds]
< arishabh> Hey all
arishabh has quit [Quit: Page closed]
Nax has joined #mlpack
temp has quit [Ping timeout: 260 seconds]
sicko has quit [Ping timeout: 264 seconds]
vivekp has quit [Ping timeout: 240 seconds]
vivekp has joined #mlpack
Trion has quit [Quit: CYA]
govg has quit [Ping timeout: 240 seconds]
shubhamagarwal92 has joined #mlpack
shubhamagarwal92 has quit [Client Quit]
sicko has joined #mlpack
Nax has quit [Ping timeout: 260 seconds]
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#2013 (master - 5767a5f : Ryan Curtin): The build is still failing.
travis-ci has left #mlpack []
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Ping timeout: 240 seconds]
sumedhghaisas has joined #mlpack
Naveen has joined #mlpack
< Naveen> hey, how can I contribute?
< zoq> Naveen: Hello and welcome, have you seen www.mlpack.org/involved.html and mlpack.org/gsoc.html?
< Naveen> thanks zoq, but I also want to know how to submit an idea
< zoq> A GSoC project idea, or a proposal to work on one of the ideas that are listed on the wiki?
< sumedhghaisas> zoq: Hey Marcus, so I took a look at the Neural Turing Machine paper as well as the DQN paper. For my master's I am sort of doing my thesis on DQN, so do you think layer compression + batch normalization with double DQN is enough work?
< sumedhghaisas> DQN will require a lot of testing
Naveen has quit [Ping timeout: 260 seconds]
< zoq> sumedhghais: Hello, yeah that is definitely enough work. In fact, I think you could even work the whole time on the DQN code since there are some interesting directions that could be explored. Anyway, I'm not sure it's a good idea to work on two different topics, but if you feel comfortable with that, I'm fine with the decision.
ShangtongZhang has joined #mlpack
< sumedhghaisas> zoq: hmm... is there some existing DQN code? Or will I be implementing the base framework?
< sumedhghaisas> I am going to use DQN in my dissertation; I hope to use my own implementation :)
< zoq> sumedhghais: Sort of, I'm sure you can reuse parts of the ann code :)
ShangtongZhang has quit [Quit: Page closed]
govg has joined #mlpack
< sumedhghaisas> zoq: okay. I will try to fit a nice timeline, with specific models assigned to weeks. If I can't, I will try to reduce the work.
< sumedhghaisas> I will have to first figure out how exactly to reuse the current code
< zoq> sumedhghais: Sounds good, we can discuss the timeline in the proposal.
< sumedhghaisas> zoq: yeah... I was thinking the layer merge work will involve merging linear with bias, which can also be merged with nonlinearities, and also with batch normalization
< sumedhghaisas> hence I will implement batch normalization first
< zoq> sumedhghais: Yeah, I agree, what do you think about the timeline for that?
benchmark has joined #mlpack
benchmark has quit [Client Quit]
< mikeling> zoq: hi, did you see the message I sent you on irc? :)
sagar has joined #mlpack
sagar is now known as Guest4928
< zoq> mikeling: I saw your message but I haven't had time yet to look into it. I'll take a look later today.
< mikeling> zoq: sure, thank you! :)
< Guest4928> Hi, I am new to open source contribution, but am quite experienced with ML and C++. My question is, is mlpack a good place to start?
< zoq> Guest4928: Hello there, I think so, but I'm biased :)
thyrix has joined #mlpack
< Guest4928> I noticed inconsistent coding/indentation style between softmax and logistic regression. Is fixing that a good start or is it suggested to dive into issues directly?
< zoq> Guest4928: That is a good start; you can learn a lot about how this whole process works (opening a PR, discussing issues, etc.).
< Guest4928> zoq: Sure then, that is great. I will look into it :)
< zoq> Guest4928: Great, looking forward to taking a look at the PR.
< sumedhghaisas> zoq: So I took a quick look at the framework. Implementation of batch normalization will depend on how generic we want it to be.
< sumedhghaisas> I propose a single batch normalization layer which a user can use everywhere.
< sumedhghaisas> I mean it will take input of any dimensionality and can also accept the dimensions along which to normalize
< sumedhghaisas> this would help in using batch normalization in convolutional layers
< sumedhghaisas> but I am not sure whether implementing separate batch normalization for the linear layer and the convolutional layer would be faster or not
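To make the idea concrete, here is a rough sketch of a batch normalization forward pass for the linear (fully-connected) case in plain Armadillo; this is not mlpack's API, and the epsilon and parameter handling are simplified assumptions. For the convolutional case the same statistics would be computed per channel over the batch and spatial dimensions.

    #include <armadillo>

    // Batch normalization forward pass (training mode) sketch.
    // Each column of `input` is one data point; rows are features.
    // `gamma` and `beta` are the learnable per-feature scale and shift.
    void BatchNormForward(const arma::mat& input,
                          const arma::vec& gamma,
                          const arma::vec& beta,
                          arma::mat& output,
                          const double eps = 1e-5)
    {
      // Per-feature mean and (biased) variance over the batch.
      const arma::vec mean = arma::mean(input, 1);
      const arma::vec variance = arma::var(input, 1, 1);

      // Normalize each point, then scale and shift.
      output = input;
      output.each_col() -= mean;
      output.each_col() /= arma::sqrt(variance + eps);
      output.each_col() %= gamma;
      output.each_col() += beta;
    }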
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#2014 (master - fa0af83 : Marcus Edel): The build is still failing.
travis-ci has left #mlpack []
< sumedhghaisas> in any case, implementing and testing it should not take more than a week... probably even less, assuming I go through the current code thoroughly in the bonding period, which I plan to do
zoro_ has joined #mlpack
thyrix has quit [Ping timeout: 260 seconds]
< zoro_> hi everybody! I just want to know whether it's mandatory to submit a PR for GSoC.
< zoq> sumedhghais: I have to think about it, but we could just test if implementing a special linear version is faster.
< zoq> zoro_: Hello there, no, it's not mandatory.
jayant has quit [Ping timeout: 260 seconds]
< zoro_> thanks
< sicko> Allo people. For a logistic regression case we can have any number of parameters, right?
arunreddy has quit [Quit: WeeChat 1.4]
govg has quit [Quit: leaving]
Trion has joined #mlpack
vivekp has quit [Ping timeout: 264 seconds]
vivekp has joined #mlpack
< Trion> Will the Gym API become part of mlpack afterwards? Or is it just for testing the agent's learning algorithm, and only the learning algorithm will be added to the mlpack source?
govg has joined #mlpack
< zoq> Trion: It's meant to be used for testing different methods; but I think it could be a dependency for a specific model for the models repository. So if we implement the A3C algorithm, we could provide a pre-trained model or a script that builds all dependencies and provides an easy way to interact with the algorithm. Here is more information about the models repo idea: https://github.com/mlpack/mlpack/issues/870
vivekp has quit [Ping timeout: 260 seconds]
< zoq> sicko: I'm not sure what you mean.
< sicko> zoq, In logistic regression the output is either 0 or 1. There can be any number of decision variables, right?
< zoq> sicko: yes
vivekp has joined #mlpack
< sicko> So for example, if I want to know when an egg will hatch, I can use logistic regression... where the factors can be heat, light, longitude, latitude, etc.?
< sicko> where "etc." can be another 10 variables
< Trion> Pretrained models would be great! And I agree, these should be hosted separately on the website/separate repositories. In Python, the pickle library stores pre-trained models as a binary file, which can then be imported into the next program. Maybe Boost's serialization functionality can make it happen in C++?
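As a rough illustration of that idea (not mlpack's actual model format), Boost.Serialization can round-trip an object's parameters through a binary archive; TinyModel below is a made-up stand-in for a trained model.

    #include <fstream>
    #include <vector>
    #include <boost/archive/binary_oarchive.hpp>
    #include <boost/archive/binary_iarchive.hpp>
    #include <boost/serialization/vector.hpp>

    // Hypothetical model holding trained parameters.
    struct TinyModel
    {
      std::vector<double> weights;

      template<typename Archive>
      void serialize(Archive& ar, const unsigned int /* version */)
      {
        ar & weights;
      }
    };

    int main()
    {
      TinyModel model{{0.5, -1.2, 3.4}};

      // Save the trained model to a binary file.
      {
        std::ofstream ofs("model.bin", std::ios::binary);
        boost::archive::binary_oarchive oa(ofs);
        oa << model;
      }

      // Later: load it back into a fresh object.
      TinyModel loaded;
      std::ifstream ifs("model.bin", std::ios::binary);
      boost::archive::binary_iarchive ia(ifs);
      ia >> loaded;
    }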
< Trion> sicko: yes
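A toy Armadillo sketch of why the number of factors is arbitrary: the logistic model is just sigma(w'x + b), so w simply grows with the number of factors (heat, light, longitude, ...). The sizes and values below are made up.

    #include <armadillo>
    #include <cmath>
    #include <iostream>

    int main()
    {
      // One observation with, say, 12 factors (heat, light, longitude, ...).
      const arma::vec x = arma::randu<arma::vec>(12);

      // One weight per factor, plus a bias term; any dimension works.
      const arma::vec w = arma::randu<arma::vec>(12);
      const double b = 0.1;

      // The logistic (sigmoid) function gives the probability of class 1.
      const double p = 1.0 / (1.0 + std::exp(-(arma::dot(w, x) + b)));

      // Threshold at 0.5 for the 0/1 prediction.
      const int prediction = (p >= 0.5) ? 1 : 0;
      std::cout << "p = " << p << ", prediction = " << prediction << std::endl;
    }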
< sicko> Awesome :) thanks mates
< Trion> :D Happy coding!
< sicko> Hehe. No coding. I'm trying to learn machine learning properly, so I'm wrapping my head around different concepts.
< sicko> ML is just probability and stats rather than anything futuristic. Makes you go full meh once you think about it.
vivekp has quit [Ping timeout: 256 seconds]
< Trion> Haha, ML will reach the science-fiction level; it just needs more time and more data scientists to make it happen
Guest4928 has quit [Ping timeout: 260 seconds]
vivekp has joined #mlpack
< sicko> Trion, true, it's really amazing stuff. And to think that the majority of the world doesn't understand the beauty of science and the brilliant minds behind it all. (the sonder effect)
Trion has quit [Quit: Have to go, see ya!]
mikeling has quit [Quit: Connection closed for inactivity]
dineshraj01 has joined #mlpack
zoro_ has quit [Quit: Page closed]
sicko has quit [Quit: Leaving]
kris1 has joined #mlpack
govg has quit [Ping timeout: 260 seconds]
govg has joined #mlpack
trapz has quit [Quit: trapz]
trapz has joined #mlpack
ironstark has joined #mlpack
diehumblex has quit [Quit: Connection closed for inactivity]
trapz has quit [Quit: trapz]
trapz has joined #mlpack
trapz has quit [Quit: trapz]
trapz has joined #mlpack
trapz has quit [Quit: trapz]
trapz has joined #mlpack
trapz has quit [Ping timeout: 260 seconds]
JoshuaTang has joined #mlpack
JoshuaTang has quit [Client Quit]
< kris1> zoq: ParameterVisitor is always returning an m*1 vector; how can I change it to return a matrix?
< zoq> kris1: You could reshape the parameter, if you need (m, n) instead of (m x n, 1).
< kris1> Yes, but unfortunately you don't always know (m, n).
< zoq> Couldn't you use the InSize/OutSize parameter?
< kris1> yes, and check that the layer actually has an attribute called InSize
< kris1> right
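A small sketch of the reshape suggestion in plain Armadillo; the inSize/outSize values stand in for whatever the layer's InSize()/OutSize() report, and the (rows, cols) orientation here is only an assumption.

    #include <armadillo>

    int main()
    {
      const arma::uword inSize = 3, outSize = 4;

      // Flat (m * n) x 1 parameter matrix, as returned for the layer.
      const arma::mat flat = arma::randu<arma::mat>(inSize * outSize, 1);

      // Reinterpret it as an (outSize x inSize) weight matrix instead.
      const arma::mat weights = arma::reshape(flat, outSize, inSize);

      weights.print("weights");
    }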
arunreddy has joined #mlpack
trapz has joined #mlpack
dineshraj01 has quit [Ping timeout: 256 seconds]