verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
sonudoo has joined #mlpack
< rcurtin>
MystikNinja: that is the correct paper, be sure to look through the references also
sumedhghaisas2 has quit [Ping timeout: 264 seconds]
sumedhghaisas has joined #mlpack
< sonudoo>
Hello everyone!
sonudoo has quit [Ping timeout: 260 seconds]
sonudoo has joined #mlpack
s1998_ has quit [Ping timeout: 260 seconds]
sonudoo has quit [Ping timeout: 260 seconds]
sonudoo has joined #mlpack
sonudoo has quit [Quit: Page closed]
s1998_ has joined #mlpack
thepro has joined #mlpack
s1998_ has quit [Ping timeout: 260 seconds]
manthan has joined #mlpack
bhuwnesh has quit [Quit: Connection closed for inactivity]
sumedhghaisas has quit [Ping timeout: 264 seconds]
< manthan>
daivik: I was going through the IRC logs today and saw your concern about working on ANN support for the CLI. Actually, I didn't start working on it, since many people had already claimed the issue, so I shifted my focus to pruning algorithms for decision trees.
< manthan>
However, I will take up that issue after the pruning algorithm is in place.
< manthan>
Is there any specific reason that the attention layer is not implemented, while recurrent attention is?
karan_ has joined #mlpack
< karan_>
Hello everyone, my name is Karan and I am a third-year student at GGSIPU. I have good experience with C++ and Python, and I have been working with neural networks for the past month. I would like to contribute to the Essential Deep Learning Modules project. Could someone tell me about the next steps, or help me connect with potential mentors?
sumedhghaisas has quit [Read error: Connection reset by peer]
manish7294 has quit [Ping timeout: 245 seconds]
MK_18 has quit [Quit: Leaving]
manish7294 has joined #mlpack
K4k has quit [Read error: Connection reset by peer]
ctsevsp has joined #mlpack
ctsevsp has quit [Client Quit]
rf_sust2018 has joined #mlpack
karan_ has quit [Quit: Page closed]
Trion_ has quit [Remote host closed the connection]
sumedhghaisas has joined #mlpack
sumedhghaisas2 has quit [Read error: Connection reset by peer]
Trion has joined #mlpack
sumedhghaisas2 has joined #mlpack
sumedhghaisas has quit [Ping timeout: 246 seconds]
manthan has quit [Ping timeout: 260 seconds]
daniel__ has joined #mlpack
< daniel__>
Hey guys, this is Daniel Li from the University of Edinburgh. I have gone through the idea list and found the reinforcement learning project quite appealing. I have been participating in coding competitions for years and write fluent C++ and Python code. I've also taken relevant courses in ML and NLP. Currently I am setting up mlpack on my PC, and will go over the codebase after that.
< daniel__>
Hopefully I will submit the first version of my proposal over the weekend and fix some easy bugs on GitHub along the way. Cheers.
daniel__ has quit [Quit: Page closed]
< zoq>
daniel__: Welcome, glad you like the idea, and let us know if we should clarify anything.
yamidark has joined #mlpack
bhuwnesh has joined #mlpack
Trion has quit [Remote host closed the connection]
mrcode has joined #mlpack
yashsharan has joined #mlpack
yamidark has quit [Quit: Page closed]
K4k has joined #mlpack
ImQ009 has joined #mlpack
Trion has joined #mlpack
< yashsharan>
Hello. I was going through the gym_tcp_api example and I am having trouble understanding how we are importing the gym environment. Specifically, in which file has the gym namespace been declared?
manish7294 has quit [Ping timeout: 245 seconds]
manish7294 has joined #mlpack
< yashsharan>
Also, I had a doubt: if I'm trying to make my own RL model, how can I call this gym_tcp_api in my code?
mrcode has quit [Quit: Leaving.]
manish7294 has quit [Ping timeout: 248 seconds]
manish7294 has joined #mlpack
< yashsharan>
Never mind, I just saw that mlpack already has methods for reinforcement learning available. Thanks anyway.
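[For context on the gym_tcp_api question above: the gym environment runs in a Python process, and the agent connects to it over TCP — the `gym` namespace is declared in the client-side C++ headers of the gym_tcp_api repository. A toy loopback sketch of that request/response architecture, in Python with an invented JSON message format (this is NOT the actual gym_tcp_api wire protocol, just an illustration of the idea):]

```python
# Toy illustration of a gym_tcp_api-style bridge: an "environment" process
# answers each action with (observation, reward, done), and an agent drives
# it over TCP. Message format here is invented for the example.
import json
import socket
import threading

def toy_environment_server(sock):
    """Accept one agent connection; answer each action with a made-up
    (observation, reward, done) message, ending after three steps."""
    conn, _ = sock.accept()
    with conn:
        f = conn.makefile("rwb")
        for step in range(3):
            request = json.loads(f.readline())       # e.g. {"action": 1}
            response = {"observation": [0.0, 0.1 * step],
                        "reward": 1.0,
                        "done": step == 2}
            f.write((json.dumps(response) + "\n").encode())
            f.flush()

def run_agent(port):
    """Connect to the environment, take a fixed action each step, and
    accumulate reward until the environment reports done."""
    total_reward = 0.0
    with socket.create_connection(("127.0.0.1", port)) as conn:
        f = conn.makefile("rwb")
        done = False
        while not done:
            f.write((json.dumps({"action": 1}) + "\n").encode())
            f.flush()
            msg = json.loads(f.readline())
            total_reward += msg["reward"]
            done = msg["done"]
    return total_reward

server = socket.socket()
server.bind(("127.0.0.1", 0))       # let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
thread = threading.Thread(target=toy_environment_server, args=(server,))
thread.start()
reward = run_agent(port)
thread.join()
server.close()
print(reward)  # three steps, 1.0 reward each
```

[In the real gym_tcp_api, the agent side is C++ (hence the `gym` namespace in its headers) and the environment side wraps OpenAI Gym, but the request/response loop is the same shape.]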
manish7294 has quit [Ping timeout: 245 seconds]
luffy1996_ has joined #mlpack
Trion has quit [Quit: Entering a wormhole]
luffy1996_ is now known as luffy1996
manish7294 has joined #mlpack
mrcode has joined #mlpack
rf_sust2018 has quit [Quit: Leaving.]
yashsharan has quit [Ping timeout: 260 seconds]
rf_sust2018 has joined #mlpack
bhuwnesh has quit [Quit: Connection closed for inactivity]
spatodia has joined #mlpack
spatodia has quit [Quit: Page closed]
csoni has joined #mlpack
rf_sust2018 has quit [Quit: Leaving.]
mrcode has quit [Quit: Leaving.]
rf_sust2018 has joined #mlpack
daivik has joined #mlpack
< daivik>
manthan: Sorry for the late reply, I think akhandait has already picked up the issue (he replied on the issue) - so you can maybe talk to him about collaborating on it
< farabhi>
Greetings, everyone. Farabhi is here. I'm completing my Master's in Data Science and am really interested in contributing to mlpack through GSoC 2018. I have a good background in math, specifically in convex functions and Hermitian matrices. Naturally, I loved the SDP and MVU ideas from the list. But also, recently I took a class in high performance computing and studied benchmark profiling and C optimization interfaces a lot (OpenMP and MPI). So 'Al
< farabhi>
'Profiling for Parallelization' also seems attractive to me. My question to the mentors: based on all the requests and suggestions you have seen from potential candidates recently, which would you suggest as the most promising/available of the four projects? In other words, since I cannot choose myself, which would you say is the most important to the mlpack community of the four I listed? Thanks in advance.
< rcurtin>
farabhi: hello there
< rcurtin>
each project on the list is important to mlpack, otherwise it wouldn't be on the list
< rcurtin>
but the projects that will be chosen will be the strongest proposals across the projects, so I think you should not worry so much about which one you want to choose
< rcurtin>
and instead focus on writing a proposal that shows you have a clear plan and understand the project and have a realistic timeline
< farabhi>
rcurtin: sorry, the way I asked the question was confusing. I see that a number of people have already committed to different topics; for example, MystikNinja is already looking at MVU. So I was curious whether you think any of those four are 'underrepresented', i.e. don't have potential contributors, or have only a few compared to other ideas.
< farabhi>
Since I'm equally interested in all four, I would gladly focus on the one that has received the least attention from others.
< rcurtin>
farabhi: so, the list was actually cut off because of IRC line length limits, so I am not sure what the four projects are
< rcurtin>
but what I will say is, it's hard to know how much attention each project will get before the application period is actually over
< rcurtin>
I know for sure the deep learning modules project will get lots of applications because it always does, since it is in a popular field
< rcurtin>
but still, at the end of the day, the strongest proposals are most likely to get the spot, regardless of the project chosen
davidinouye has joined #mlpack
< davidinouye>
Does anyone know when/if libmlpack-dev will be updated in the apt repositories? Currently, the version available via apt-get seems to be 2.0.2, even though the current stable version is 2.2.5.
csoni has quit [Quit: Connection closed for inactivity]
< rcurtin>
davidinouye: I think I filed an ubuntu bug asking them to update both libarmadillo-dev and libmlpack-dev, but it has not been done yet
< rcurtin>
I agree, 2.0.2 is far too out of date
< rcurtin>
you could send the maintainer an email (I think it is Barak Pearlmutter? not sure)
< rcurtin>
but we are also going to release 3.0.0 very soon, and at that point I will push on the debian repo admins to get a new version in
< farabhi>
rcurtin: got it, I think I misunderstood the selection process and now it's a bit clearer. Thanks!
< rcurtin>
farabhi: no problem, glad to clarify. every organization does it differently :)
< davidinouye>
rcurtin: Thanks for the info! Do you know if mlpack_det has changed significantly in terms of speed or interface since 2.0.2? If not, I might just rework my Python wrapper so that it doesn't require building mlpack from source. Thanks!
< rcurtin>
davidinouye: hmmm, so if you are willing to build from source, you could build the new python bindings which have a det() function that has the same functionality as the command line
< rcurtin>
maybe that is good enough, maybe not
< rcurtin>
but I seem to remember mentioning that before to you actually (but my memory is hazy)
< rcurtin>
I think there has been at least a small change to the DET code since 2.0.2, but I don't believe that it is major
< rcurtin>
certainly not an entire refactoring or anything; at most, maybe a template parameter was added between 2.0.2 and 2.2.5, and maybe some of the function signatures changed slightly
< davidinouye>
rcurtin: Yeah, I needed to extract and manipulate the internal tree, so the current bindings wouldn't work.
< rcurtin>
right, now I remember a little better, yeah
< davidinouye>
rcurtin: Yeah, I think the templating is the main thing. I'll try to remove that in my code and see what happens.
< rcurtin>
sure, good luck, and I hope soon mlpack 3 will be in the apt repositories which should make your life easier
< daivik>
rcurtin: I know you asked me to consider extensions/improvements to the existing codebase -- rather than trying to propose new implementations (as it would just mean more code to maintain). I still feel that I would end up with a stronger proposal going with the clustering algorithms - and it would really be something I would genuinely be interested in working on (as opposed to selecting something from the ideas page that I'm not 100% a-okay with).
< daivik>
So here is my plan of action; let me know what you think. I'll draft a proposal for my clustering algorithms + ensemble methods, as best as I can. And if you think it looks completely horrible after that (hopefully you'll have a chance to review it before the deadline), then I'll pick an idea from the ideas page to go with.
< rcurtin>
daivik: that sounds reasonable to me, but I should point out it's not likely I'll have much of a chance to review proposals in too much detail before the deadline
< daivik>
Completely understandable. Thanks for the heads-up, and sorry for not heeding your (arguably good) advice.
< rcurtin>
no, it's fine, not all of my advice is the best :)