verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
palashahuja has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
palashahuja has joined #mlpack
< zoq> palashahuja: There are a couple of things we have to do to make it work right.
< zoq> palashahuja: 1. Forward the InputParameter() and OutputParameter() to the baseLayer.
< zoq> palashahuja: 2. Add the void Gradient(const arma::Mat<eT>& d, GradientDataType& g) function.
< zoq> palashahuja: 3. 'Denoise' the weights: in line 94 you set weights randomly to zero, which is fine, but we have to fall back to the original weights once we have calculated the gradient.
< zoq> There is another problem that has nothing to do with your code, but rather with the way the optimizer works.
< zoq> The main problem is that the drop connect layer acts differently in training mode and in prediction mode. The optimizer doesn't know that, so it's always in training mode.
< zoq> Here comes the problem: SGD or RMSprop starts by evaluating the complete dataset, which results in an empty weight matrix, because every time Forward is called we set some weights to zero, right? So when we actually start to train the network, we start with an empty weight matrix.
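A rough sketch of the three points above, written as a simplified DropConnect-style linear layer; the class, the member names, and the interface are illustrative assumptions, not the actual mlpack ann layer API:

    #include <mlpack/core.hpp>

    // Simplified sketch: weights are zeroed at random in Forward() (the
    // "line 94" behaviour), and the original weights are restored after the
    // gradient has been computed (point 3, the 'denoise' step).
    template<typename eT = double>
    class SimpleDropConnect
    {
     public:
      SimpleDropConnect(const size_t inSize,
                        const size_t outSize,
                        const double p = 0.5) :
          weights(arma::randu<arma::Mat<eT> >(outSize, inSize)),
          p(p)
      { }

      void Forward(const arma::Mat<eT>& input, arma::Mat<eT>& output)
      {
        // Keep the original ("denoised") weights so we can fall back to them.
        denoisedWeights = weights;

        // Randomly drop connections: zero each weight with probability p.
        arma::Mat<eT> mask = arma::randu<arma::Mat<eT> >(weights.n_rows,
                                                         weights.n_cols);
        mask.transform([this](eT val) { return val > p ? eT(1) : eT(0); });
        weights %= mask;

        lastInput = input;
        output = weights * input;
      }

      // Point 2: the Gradient() function; d is the backpropagated error.
      void Gradient(const arma::Mat<eT>& d, arma::Mat<eT>& g)
      {
        // Gradient with respect to the masked weights of a linear layer ...
        g = d * lastInput.t();

        // ... then fall back to the original weights (point 3).
        weights = denoisedWeights;
      }

     private:
      arma::Mat<eT> weights, denoisedWeights, lastInput;
      double p;
    };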
chick_ has quit [Quit: Connection closed for inactivity]
< palashahuja> zoq, what is the proposed solution for the empty weight matrix?
< palashahuja> Should we try manipulating it somehow using layer traits ?
< zoq> palashahuja: I modified the RMSprop optimizer to test the code. That's pretty easy, because all networks implement a third parameter for the Evaluation function to set the state. But we can't just modify all optimizers to set the state. Maybe there is another solution ... I'll need to think about that.
< zoq> palashahuja: If you like I can send you the code, so that you can test everything.
< palashahuja> zoq, Yes please send the gist if you'd like ..
< zoq> palashahuja: I removed some lines from the code you sent, just to make testing easier: https://gist.github.com/zoq/4fc4c7061a698f82cf47
Divyam has joined #mlpack
Divyam has quit [Client Quit]
< zoq> Oops ... I guess everybody who subscribed to the commit mailing list likes to get mails.
deadpool_ has quit [Ping timeout: 244 seconds]
palashahuja has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
palashahuja has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#600 (master - 1a9c41a : Marcus Edel): The build was broken.
travis-ci has left #mlpack []
deadpool_ has joined #mlpack
palashahuja has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
deadpool_ has quit [Ping timeout: 276 seconds]
rohitpatwa has joined #mlpack
rohitpatwa has quit [Ping timeout: 246 seconds]
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#601 (master - 74593ce : Ryan Curtin): The build was fixed.
travis-ci has left #mlpack []
rohitpatwa has joined #mlpack
sumedhghaisas has quit [Ping timeout: 252 seconds]
rohitpatwa has quit [Ping timeout: 240 seconds]
praveench has joined #mlpack
< praveench> /msg NickServ VERIFY REGISTER praveench kffwjdatpsfi
Rishabh has joined #mlpack
praveench has quit [Quit: Page closed]
ranjan1234 has joined #mlpack
rohitpatwa has joined #mlpack
wasiq has quit [Ping timeout: 276 seconds]
praveench has joined #mlpack
praveench has quit [Client Quit]
deadpool_ has joined #mlpack
wasiq has joined #mlpack
Rishabh has quit [Remote host closed the connection]
anveshi has joined #mlpack
deadpool_ is now known as McCathy
rohitpatwa has quit [Ping timeout: 244 seconds]
McCathy has quit [Ping timeout: 246 seconds]
anveshi has quit [Quit: Page closed]
McCathy has joined #mlpack
will__ has joined #mlpack
McCathy has quit [Ping timeout: 240 seconds]
ank_95_ has joined #mlpack
will__ has left #mlpack []
Rishabh has joined #mlpack
McCathy has joined #mlpack
jayy has joined #mlpack
jayy has quit [Client Quit]
leo__ has joined #mlpack
leo__ has quit [Client Quit]
aacr has joined #mlpack
< aacr> Hi, I'm a GSoC student and interested in Neuroevolution algorithms
< aacr> I followed the tutorial here https://github.com/zoq/nes
< aacr> almost done, but I have a problem compiling lua-gd
< aacr> here's the error
< aacr> ?
< aacr> why can't I copy and paste?
kirizaki has joined #mlpack
< aacr> can not be used when making a shared object
< aacr> recompile with -fPIC
< aacr> /tmp/cc1IqKXb.o: error adding symbols: Bad value
< aacr> thanks!
< aacr> /usr/bin/ld: /tmp/cc1IqKXb.o: relocation R_X86_64_32 against `.rodata.str1.1' can not be used when making a shared object; recompile with -fPIC /tmp/cc1IqKXb.o: error adding symbols: Bad value collect2: error: ld returned 1 exit status Makefile:70: recipe for target 'gd.so' failed make: *** [gd.so] Error 1
< ranjan1234> Log::Info: where is the logfile stored?
< kirizaki> Log::Info / Warn / Fatal / Assert just gives You a message inside the program
< kirizaki> it's not writing any log to a file
< kirizaki> to see Log::Info You have to run Your program with the -v flag
< kirizaki> -v or --verbose
< kirizaki> ranjan1234: let me know if it helps
< kirizaki> aacr: for better readability, copy Your log to www.pastebin.com
< ranjan1234> ohhk . got it ! thanks :)
< kirizaki> aacr: and You should wait for #zoq, he's gonna help You with this for sure
< kirizaki> #ranjan1234: happy to help You ;)
< ranjan1234> I am running a test, and the test calls another function which has Log::Info; when I run the test program with --verbose it is not showing the expected message.
< ranjan1234> #kirizaki ping
< kirizaki> ok
< kirizaki> because the test is not taking the -v flag
< aacr> ok. thanks
< kirizaki> it doesn't have CLI::ParseCommandLine
< kirizaki> better use Log::Warn
< kirizaki> in this situation
< kirizaki> it will always be shown
< kirizaki> "Log::Warn is always shown, and Log::Fatal will throw a std::runtime_error exception, when a newline is sent to it"
< kirizaki> and moreover, the doc I linked You to
< kirizaki> is not updated: Log::Fatal also prints some sort of backtrace
< kirizaki> but only if You build mlpack with debugging symbols: cmake -DDEBUG=ON ../
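A small example of what kirizaki describes; it assumes the command-line setup used by mlpack executables at the time (CLI::ParseCommandLine), so treat it as a sketch rather than a drop-in test:

    #include <mlpack/core.hpp>

    using namespace mlpack;

    int main(int argc, char** argv)
    {
      // Without this call the -v / --verbose flag is never parsed, which is
      // why Log::Info stays silent inside a plain test program.
      CLI::ParseCommandLine(argc, argv);

      Log::Info << "Only visible when run with -v / --verbose." << std::endl;
      Log::Warn << "Always visible." << std::endl;

      // Log::Fatal throws a std::runtime_error once a newline is sent to it,
      // so it is left commented out here.
      // Log::Fatal << "This would abort the program." << std::endl;

      return 0;
    }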
niks has joined #mlpack
< niks> Hi! I am Nikhil and I read your projects list for GSoC 2016. I know C/C++ and am good at data structures. Please guide me so that I can contribute.
< aacr> Hi! I'm a GSoC student too. Perhaps you can first have a look at this page
< niks> Yes, I have been viewing this for several days and I find these projects really interesting and worth doing.
< aacr> yeah,
< niks> But what to do next?
< aacr> I have no idea, but maybe you can first compile mlpack
< aacr> and I'm waiting for zoq about a compile error
< kirizaki> #niks: Hi there!
< kirizaki> You can start here: http://mlpack.org/involved.html
< kirizaki> and if You have any questions You can ask via the mailing list, GitHub, or here ;)
< kirizaki> "We don't respond instantly... but we will respond. Give it a few minutes. Or hours"
niks has quit [Quit: Page closed]
ranjan1234 has quit [Ping timeout: 252 seconds]
aacr has quit [Quit: Page closed]
rohitpatwa has joined #mlpack
rohitpatwa has quit [Ping timeout: 252 seconds]
rohitpatwa has joined #mlpack
rohitpatwa has quit [Ping timeout: 246 seconds]
mrbean has joined #mlpack
McCathy has quit [Ping timeout: 240 seconds]
mrbean1 has quit [Ping timeout: 240 seconds]
McCathy has joined #mlpack
McCathy has quit [Quit: Leaving]
wasiq has quit [Ping timeout: 268 seconds]
Nilabhra has joined #mlpack
FRossi has joined #mlpack
rebeka has joined #mlpack
< rebeka> Hi! Anyone online? :)
< FRossi> Hi! I'm Federico
< FRossi> I am a university student and I want to apply in this project for GSOC
< zoq> rebeka: Hello, we are always online :)
< zoq> FRossi: You are welcome to do so.
< rebeka> Hi. I'm Rebeka, I'm also interested to apply to mlpack for GSoC.
< rebeka> I have read the list of ideas on the website, some of which I have already studied and worked on before. I just wanted to know more about some of the projects to help me decide which one to go for?
< rcurtin> hi Rebeka, you can take a look through the mailing list archive... there is a lot of extra information about the projects that has already been written
wasiq has joined #mlpack
< rebeka> Thanks
< rebeka> Could you please also give me tips so that I can increase my chance of being selected? I am completely new to this.
< zoq> rebeka: Take a look at: http://write.flossmanuals.net/gsocstudentguide/what-is-google-summer-of-code/. Writing a good proposal is a good start.
< FRossi> I'm in the same situation. I have studied and worked with some of the methods included in the project. Now I'm installing mlpack
< FRossi> Thanks for help!
rohitpatwa has joined #mlpack
< FRossi> I have a compile-time error with the isnan and isinf macros; I was looking for info about it in the GitHub issues but didn't find anything
< FRossi> Anyone know anything about its causes?
< rcurtin> FRossi: I seem to remember a github issue about this at one point...
< rcurtin> are you using the git master branch?
< FRossi> sorry, I found it
< FRossi> I will try that solution
< rcurtin> yeah, I think it's fixed in the latest master branch
pkgupta has joined #mlpack
archange_ has joined #mlpack
ank_95_ has quit [Quit: Connection closed for inactivity]
ank_95_ has joined #mlpack
LimeTheCoder has joined #mlpack
rebeka has quit [Ping timeout: 252 seconds]
rohitpatwa has quit [Ping timeout: 244 seconds]
LimeTheCoder has quit [Ping timeout: 252 seconds]
tafodinho has joined #mlpack
< tafodinho> hello everyone, I would love to work on the project to implement tree types; how do I get started, please?
anveshi has joined #mlpack
christie has joined #mlpack
cache-nez has joined #mlpack
< christie> hi, I downloaded and built mlpack from source today. After installing, I tried some of the command-line executables mlpack provides, but I'm not able to run any of them
< christie> it shows this error "error while loading shared libraries: libmlpack.so"
< christie> although the libraries are there in the lib folder (inside the build folder)
< christie> I mean the .so files
< christie> can anybody please help?
< christie> @zoq ?
< kirizaki> hi
< kirizaki> did You try to add the -lmlpack flag while compiling Your program?
< zoq> christie: Take a look at https://github.com/mlpack/mlpack/blob/master/README.md and search for "error while loading shared librar"
< christie> thanks , will try it !
< zoq> tafodinho: There's been a lot of interest in the project to implement different types of trees. One good way to get started with that project might be to take a look at this other mailing list reply: https://mailman.cc.gatech.edu/pipermail/mlpack/2016-March/000760.html
< tafodinho> zoq: so which other project can you advise me to try?
< rcurtin> tafodinho: what other projects are you interested in?
< rcurtin> realistically we can't tell you what the best project for you is, because that depends on your interests
< tafodinho> ok, then I will take a look at the other projects, because I based my interest on the implementation of different types of trees
< rcurtin> it might be useful to browse the mailing list archive for more information; there are lots of emails exchanged there:
< tafodinho> I am also interested in Approximate Nearest Neighbor Search
< Rishabh> hi, can anyone confirm that the text/code in this webpage - http://mlpack.org/docs/mlpack-git/doxygen.php?doc=trees.html - is overlapping? Also, the images seem broken. Not sure if it's happening only in my browser.
< christie> everything seems to be working just fine :) thanks guys
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#602 (master - c75652b : marcus): The build was broken.
travis-ci has left #mlpack []
anveshi has quit [Ping timeout: 252 seconds]
yvtheja has joined #mlpack
yvtheja is now known as Guest90152
Guest90152 has quit [Client Quit]
yvtheja_ has joined #mlpack
< rcurtin> Rishabh: it's not just you, I need to fix the CSS
< rcurtin> every time the doxygen version changes the stylesheets change...
mizari has left #mlpack []
christie has quit [Quit: Page closed]
rohitpatwa has joined #mlpack
Nilabhra has quit [Remote host closed the connection]
tafodinho has quit [Ping timeout: 240 seconds]
kirizaki has quit [Ping timeout: 244 seconds]
kirizaki has joined #mlpack
yvtheja_ has quit [Quit: Leaving]
yvtheja has joined #mlpack
< kirizaki> rcurtin: with the mlpack::backtrace PR I changed the docs about "mlpack Input and Output", but they're still not updated on the website; could You check whether they're proper?
< kirizaki> of course in the meantime ;) no rush
ach_ has quit [Quit: Connection closed for inactivity]
kalingeri has joined #mlpack
ranjan123 has joined #mlpack
vineet has joined #mlpack
< kalingeri> Hi, I find the project on neuroevolution algorithms to be extremely interesting. I set up the NES emulator and understood how it interfaces with mlpack. I also ran an FFN on iris and Kaggle data. I am going through the papers on the topic. Since the tickets related to this are already fixed, are there any warmup tasks I can do? I found it a little hard initially to set up the neural network component compared to other modules, would it be a good idea to
< rcurtin> kalingeri: your message was too long, it got cut off after "would it be a good idea to" :)
< kalingeri> Oh :). I just wanted to know if it would be a good idea to write a tutorial on it with example code ?
< rcurtin> have you seen this wiki page? https://github.com/mlpack/mlpack/wiki/Neural-Network
< rcurtin> I think that eventually when the ANN code is stable a tutorial will get written and added to the list of tutorials, but that wiki page should be helpful for now
Eloitor has joined #mlpack
< kalingeri> Yes, I used the same page to set up a simple network as well. I was thinking of an example-centric one, but it's better that I wait, then. Any other tasks I can get my hands on?
< rcurtin> did you take a look at the github issues list?
< rcurtin> I think there are some issues marked "easy" that relate to the ANN code
< kalingeri> I will spend my time there then, thanks :)
< rcurtin> sure, glad I could help
palashahuja has joined #mlpack
< palashahuja> zoq, hi
< palashahuja> for dropconnect we could simply transfer the attributes to a temporary layer ..
tsathoggua has joined #mlpack
Stellar_Mind2 has joined #mlpack
< Stellar_Mind2> Hi! I am a 4th-year undergraduate student at BITS Pilani, Goa campus, pursuing Electrical and Electronics Engineering. I am mighty interested in the neuroevolution algorithm project idea. I had already planned to implement this over the summer; the fact that it is a part of mlpack makes it sweeter. I was previously referring to this video and the links in the description: https://www.youtube.com/watch?v=qv6UVOQ0F44.
< Stellar_Mind2> I have previously developed a high-speed neural network classifier for classification and regression of online sequential data with incremental classes as an intern at NTU Singapore. The results have been submitted to IEEE IJCNN 2016. I would like to contribute to this project; can anyone guide me on the best way to showcase my proficiency?
travis-ci has joined #mlpack
< travis-ci> awhitesong/mlpack#12 (LeakyReLULayer - ea703b9 : Dhawal Arora): The build failed.
< travis-ci> Change view : https://github.com/awhitesong/mlpack/compare/e671aa8479cb^...ea703b9e80bf
travis-ci has left #mlpack []
albert has joined #mlpack
albert is now known as Guest43995
Guest43995 has quit [Client Quit]
archange_ has quit [Quit: My Mac has gone to sleep. ZZZzzz…]
chick_ has joined #mlpack
palashahuja has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
< zoq> Stellar_Min: Hello, that sounds great. A good start would be to compile mlpack and explore the source code, especially the neural network code and the code to communicate with the emulator. You can find the code used to communicate with the NES emulator here: https://github.com/zoq/nes. Also take a look at the mailing list archive to get more information: https://mailman.cc.gatech.edu/pipermail/mlpack/2016-March/thread.html
pkgupta_ has joined #mlpack
pkgupta has quit [Ping timeout: 244 seconds]
pkgupta_ is now known as pkgupta
palashahuja has joined #mlpack
Eloitor has quit [Ping timeout: 252 seconds]
< zoq> palashahuja: I'm not sure what you mean.
< palashahuja> What I meant was to think of dropconnect as a layer
< palashahuja> So my idea is to assign the weights attribute to the dropconnect class itself and not to the base layer
< palashahuja> and so on and so forth
< zoq> palashahuja: I'm not sure why we should do that, the code I sent you yesterday works fine.
< zoq> palashahuja: Maybe I missed something?
< palashahuja> zoq, for the Adam optimizer, are there any papers that you could recommend?
< palashahuja> never mind I found it
< zoq> palashahuja: The problem is there is no way we can solve the optimizer problem inside the layer. I'll have to think about the problem, there is definitely a solution. What you could do is to open a pull request using the code I sent you yesterday and test it with the modified optimizer.
< zoq> palashahuja: The paper "Adam: A Method for Stochastic Optimization" by D. Kingma is pretty good.
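For reference, the per-parameter update rule from that paper, where g_t is the current gradient and the paper's defaults are \beta_1 = 0.9, \beta_2 = 0.999, \epsilon = 10^{-8}:

    m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t
    v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2
    \hat{m}_t = m_t / (1 - \beta_1^t), \qquad \hat{v}_t = v_t / (1 - \beta_2^t)
    \theta_t = \theta_{t-1} - \alpha \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)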
< palashahuja> or the idea that I suggested earlier ..
< rcurtin> I found a really cool feature of the Boost Unit Test Framework today:
< rcurtin> thought it might be useful to post here
< rcurtin> now I can do something like BOOST_AUTO_TEST_CASE_TEMPLATE(KDTreeTest, ElemType, boost::mpl::list<float, double>)
< rcurtin> and then the KDTreeTest will be run with both float and double as ElemType
< rcurtin> I think I'll probably use that for some optimizer tests
< zoq> rcurtin: neat
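A minimal sketch of that feature; it assumes a test module is defined elsewhere (as in mlpack's test suite), and the suite and test names here are placeholders:

    #include <boost/test/unit_test.hpp>
    #include <boost/mpl/list.hpp>

    // The test body below is instantiated once per type in this list.
    typedef boost::mpl::list<float, double> ElementTypes;

    BOOST_AUTO_TEST_SUITE(TypedTreeTest);

    BOOST_AUTO_TEST_CASE_TEMPLATE(KDTreeTest, ElemType, ElementTypes)
    {
      // ElemType is float on the first run and double on the second.
      const ElemType value = static_cast<ElemType>(1.5);
      BOOST_REQUIRE_CLOSE(static_cast<double>(value), 1.5, 1e-5);
    }

    BOOST_AUTO_TEST_SUITE_END();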
< zoq> palashahuja: The idea to implement DropConnect as a connection, so without using a base layer?
< palashahuja> yes
ranjan123 has quit [Quit: Page closed]
rohitpatwa has quit [Ping timeout: 264 seconds]
pkgupta has quit [Quit: Going offline, see ya! (www.adiirc.com)]
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#604 (master - e6f7ffe : Marcus Edel): The build is still failing.
travis-ci has left #mlpack []
< Stellar_Mind2> Hi @Zoq. I am on it! That NES emulator is really cool. Are you by any chance sethbling on Youtube? (Just some trivia I wanted to confirm)
vineet has quit [Ping timeout: 246 seconds]
< zoq> palashahuja: You could do that, but it wouldn't solve the optimizer issue.
< zoq> Let me try to explain the issue: once we call the Train(..) function, the optimizer (e.g. SGD or RMSprop) is called. So the first step of the optimizer is to calculate the initial objective, by calling the Evaluation function for each sample of the dataset.
< zoq> The network implements that Evaluation function and runs the forward step for each sample.
< zoq> And here comes the problem, once the Forward function of the DropConnect layer is called, it doesn't know in what state it is. It could be in the training state or in the predicting state.
< zoq> So let's say state = train. We randomly set weights to zero, and we do that for all samples, which ends up in a matrix filled with zeros (not necessarily, but in most cases).
< palashahuja> hmm .. okay
< zoq> The only solution is to tell DropConnect the state of the current optimization process. So for the first n samples prediction mode, for the next x samples training mode and for the last m samples prediction mode.
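One way to express that state, sketched as a free function; the deterministic flag and the (1 - p) scaling used in prediction mode are illustrative assumptions, not mlpack code:

    #include <mlpack/core.hpp>

    // deterministic == true:  prediction/evaluation mode (no dropping).
    // deterministic == false: training mode (randomly drop connections).
    void DropConnectForward(const arma::mat& input,
                            const arma::mat& weights,
                            const double p,
                            const bool deterministic,
                            arma::mat& output)
    {
      if (deterministic)
      {
        // Keep every connection, scaled by the keep probability so the
        // expected output matches the training-time output.
        output = (1.0 - p) * (weights * input);
        return;
      }

      // Zero each weight with probability p for this forward pass only; the
      // stored weights themselves are left untouched.
      arma::mat mask = arma::randu<arma::mat>(weights.n_rows, weights.n_cols);
      mask.transform([p](double val) { return val > p ? 1.0 : 0.0; });
      output = (weights % mask) * input;
    }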
< zoq> Stellar_Min: No, sorry!
< Stellar_Mind2> Wow, there are so many interesting projects in mlpack! Is there a priority list for the project ideas?
palashahuja has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
< zoq> Stellar_Min: There aren't any specific priorities for the projects.
< Stellar_Mind2> okay!
travis-ci has joined #mlpack
< travis-ci> awhitesong/mlpack#13 (master - e6f7ffe : Marcus Edel): The build was fixed.
travis-ci has left #mlpack []
Stellar_Mind2 has quit [Ping timeout: 276 seconds]
kalingeri has quit [Ping timeout: 276 seconds]
cesc_folch has joined #mlpack
Eloitor has joined #mlpack
FRossi has quit [Remote host closed the connection]
chick_ has quit [Quit: Connection closed for inactivity]
palashahuja has joined #mlpack
< kirizaki> guys, I'm already working on a simple fuzzy logic API
< kirizaki> but I understand it's not so novel at all
< rcurtin> what do you mean?
< kirizaki> because there are a lot of Fuzzy Logic implementations in C++
< rcurtin> it's okay; maybe with Armadillo (which is fast) and template metaprogramming, we can make something that's a bit faster :)
< kirizaki> and due to my programming experience it's gonna be very difficult for me to make it fast
< rcurtin> it's okay, we can go back and forth about how to make it faster :)
< kirizaki> that is why I'm gonna try to do ANFIS
< rcurtin> yeah, we can see how it performs
palashahuja has quit [Ping timeout: 268 seconds]
< rcurtin> if it's designed well, it should be pretty easy to add more fuzzy logic algorithms
< kirizaki> ok
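As one example of what "fast with Armadillo" can look like, here is a hypothetical vectorized triangular membership function that evaluates a whole batch of crisp inputs without an explicit loop; the function is illustrative only and not part of any existing mlpack API:

    #include <mlpack/core.hpp>

    // Evaluate a triangular membership function with corners (a, b, c) on a
    // vector of inputs at once, using Armadillo's element-wise operations.
    arma::vec TriangularMembership(const arma::vec& x,
                                   const double a,
                                   const double b,
                                   const double c)
    {
      const arma::vec rising = (x - a) / (b - a);
      const arma::vec falling = (c - x) / (c - b);
      return arma::clamp(arma::min(rising, falling), 0.0, 1.0);
    }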
Eloitor has quit [Quit: Page closed]
cesc_folch has quit [Quit: Page closed]
cache-nez has quit [Ping timeout: 248 seconds]
LimeTheCoder has joined #mlpack