ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< Mina>
Hi, is this the right channel to discuss GSoC projects?
pd09041999 has quit [Remote host closed the connection]
KimSangYeon-DGU has quit [Quit: Page closed]
aman_p has joined #mlpack
Mina has quit [Quit: Page closed]
aman_p76 has joined #mlpack
< aman_p>
hi!
< aman_p>
can anyone help with how to get started on the TensorFlow translators?
< aman_p>
i just want a little idea to get a kickstart. thank you :)
aman_p76 has quit [Ping timeout: 245 seconds]
aman_p has quit [Ping timeout: 250 seconds]
aman_p has joined #mlpack
aman_p76 has joined #mlpack
aman_p has quit [Remote host closed the connection]
aman_p has joined #mlpack
aman_p_ has joined #mlpack
< aman_p_>
hi
< aman_p_>
just checking irc clients for the first time
< aman_p>
hey
< aman_p>
well check it
< aman_p_>
doesn't work the way i thought
< aman_p_>
bye
kinshuk has joined #mlpack
aman_p76 has quit [Ping timeout: 245 seconds]
aman_p has quit [Ping timeout: 240 seconds]
aman_p_ is now known as aman_p
kinshuk has quit [Ping timeout: 250 seconds]
kinshuk has joined #mlpack
kinshuk_ has joined #mlpack
kinshuk has quit [Ping timeout: 255 seconds]
kinshuk_ has quit [Ping timeout: 245 seconds]
KimSangYeon-DGU has joined #mlpack
aman_ has joined #mlpack
aman_p has quit [Quit: Connection closed for inactivity]
aman_ has quit [Remote host closed the connection]
aman_p76 has quit [Remote host closed the connection]
aman_p76 has joined #mlpack
aman_p has quit [Read error: Connection reset by peer]
aman_p76 has quit [Ping timeout: 255 seconds]
Mina has joined #mlpack
rathod_sahaab has joined #mlpack
Aman_ has joined #mlpack
Aman_ has quit [Ping timeout: 256 seconds]
rathod_sahaab has quit [Ping timeout: 256 seconds]
aman_p has joined #mlpack
aman_p has quit [Read error: Connection reset by peer]
aman_p has joined #mlpack
aman_p has quit [Read error: Connection reset by peer]
Mina has quit [Ping timeout: 256 seconds]
aman_p has joined #mlpack
aman_p has quit [Read error: Connection reset by peer]
aman_p has joined #mlpack
aman_p has quit [Read error: Connection reset by peer]
aman_p has joined #mlpack
aman_p has quit [Ping timeout: 250 seconds]
picklerick has joined #mlpack
KimSangYeon-DGU has quit [Quit: Page closed]
rathod_sahaab has joined #mlpack
rathod_sahaab has quit [Client Quit]
KimSangYeon-DGU has joined #mlpack
Suryo has joined #mlpack
< Suryo>
zoq, rcurtin: I have a couple of questions regarding the GSoC application.
< Suryo>
1. I have familiarized myself with the codebase, tweaked the code, got it to build on my system, included the PSO module, and passed the tests (so far, on my system). Would it be okay to mention all of these things in my application?
< Suryo>
2. There have definitely been some efforts, both past and ongoing, to introduce PSO to ensmallen and mlpack, and some of them coincide with the stated objectives of the project.
< Suryo>
To keep the application strong and avoid doing any redundant work, would it be okay to mention variants of PSO that are not asked for on the ideas page?
< Suryo>
3. In the past, we have spoken about parallelization, and one of the suggestions was that I look at the codebase of Armadillo. Briefly stated, the parallelization that I think is appropriate comes in two different forms:
< Suryo>
(i) parallelization of the PSO code in ensmallen, and (ii) parallelization of the matrix operations done by Armadillo.
< Suryo>
However, at present, I have not gone through the Armadillo codebase thoroughly (I did look at it a couple of times to understand the flow, but my efforts weren't substantial).
< Suryo>
Would it be okay to mention parallelization (within ensmallen as well as Armadillo) as a kind of secondary objective in the proposal?
< Suryo>
4. Can I submit two proposals? I'm kind of interested in the Quantum Gaussian Mixtures idea as well (but I haven't studied it). I'm familiar with Gaussian mixture models, though (I covered a bit of them in class and I've used Gaussian mixtures in my work in the past).
< Suryo>
5. zoq: Related to constraints in PSO: At present, I have not introduced constraints in the PSO implementations. However, I have some ideas.
< Suryo>
Basically, I think that every constraint for a problem should be a function, and that an array (or vector) of functions should be passed as a parameter to the PSO class.
< Suryo>
I can describe this using pseudocode. However, what we do in practice might change later on when the actual implementation is done. Assuming that my proposal is accepted, is it okay if something I mention in the proposal goes through changes during the implementation phase?
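A minimal sketch of the constraint idea Suryo describes, assuming a hypothetical ConstrainedPSO class; the class name and interface are illustrative only, not ensmallen's actual API:

    #include <armadillo>
    #include <functional>
    #include <vector>

    // Each constraint maps a candidate position to a violation measure;
    // a return value <= 0 means the constraint is satisfied.
    using Constraint = std::function<double(const arma::vec&)>;

    int main()
    {
      std::vector<Constraint> constraints;
      // Example: require x(0) + x(1) <= 1 and x(0) >= 0.
      constraints.push_back([](const arma::vec& x) { return x(0) + x(1) - 1.0; });
      constraints.push_back([](const arma::vec& x) { return -x(0); });

      // A constrained PSO could accept the vector at construction and,
      // e.g., penalize violating particles during fitness evaluation:
      //   ConstrainedPSO optimizer(constraints);
      //   optimizer.Optimize(objective, coordinates);
    }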
< Suryo>
Related to query 2: the variants of PSO I would be interested in essentially operate on integer and binary search spaces, not just real-valued ones.
< Suryo>
That's a lot of queries, I suppose, but kindly let me know :) I'll begin working on my proposal(s) once I'm clear about the above.
< Suryo>
Thank you
Suryo has quit [Quit: Page closed]
Suryo has joined #mlpack
picklerick has quit [Ping timeout: 268 seconds]
< Suryo>
Last thing: Related to query 1: I've also submitted a pull request recently. Would it be okay to mention that as well?
Suryo has quit [Client Quit]
nisarg_ has joined #mlpack
< nisarg_>
Hello sir, I am Nisarg Shah. I wish to send a proposal for GSoC 2019. I am interested in ML algorithms and their optimization to get better results. Could you guide me on how to start contributing and what the first steps should be?
nisarg_ has quit [Ping timeout: 256 seconds]
nisarg has joined #mlpack
nisarg has quit [Ping timeout: 256 seconds]
abc__ has joined #mlpack
abc__ has left #mlpack []
vivekp has quit [Ping timeout: 245 seconds]
vivekp has joined #mlpack
Robb9 has joined #mlpack
< Robb9>
What happened to Bang Liu’s NEAT method? Did it ever get merged?
vivekp has quit [Read error: Connection reset by peer]
vivekp has joined #mlpack
Robb9 has quit [Ping timeout: 256 seconds]
Robbb has joined #mlpack
< rcurtin>
Robb9: Robbb: no, it didn't get merged, but it was most of the way there
Robbb has quit [Ping timeout: 256 seconds]
< rcurtin>
Suryo: I'll do my best to answer but we are still a month away from the application period being open :)
< rcurtin>
1) Sure, of course it's reasonable to mention what you've done
< rcurtin>
2) yes, definitely it is okay to go above and beyond what's suggested on the ideas page
< rcurtin>
3) yeah, parallelization as a secondary objective is reasonable, but be sure that you include details of what you're planning. specifically, OpenBLAS provides parallelism for a lot of what Armadillo does
< rcurtin>
so it would be important to be sure that the things you wanted to provide parallelism for aren't already covered by OpenBLAS
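To illustrate the first form Suryo mentioned (parallelism in the PSO code itself, as opposed to the BLAS-level parallelism OpenBLAS already provides for Armadillo's matrix operations), here is a minimal sketch; the function and parameter names are illustrative, not ensmallen's actual PSO internals:

    #include <armadillo>

    // Sketch: evaluate all particles of one PSO iteration in parallel with
    // OpenMP (compile with -fopenmp). The particle evaluations within one
    // iteration are independent, so ObjectiveType::Evaluate() only needs to
    // be safe to call from multiple threads.
    template<typename ObjectiveType>
    void EvaluateParticles(ObjectiveType& objective,
                           const arma::mat& positions, // one particle per column
                           arma::vec& fitness)
    {
      #pragma omp parallel for
      for (arma::uword i = 0; i < positions.n_cols; ++i)
        fitness(i) = objective.Evaluate(positions.col(i));
    }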
< rcurtin>
4) you can submit two proposals, but often it's better to focus on one proposal instead of spreading yourself a bit thin between two (up to you what you'd like to do there)
< rcurtin>
hope that is helpful :)
vbsinha has joined #mlpack
vbsinha has quit [Ping timeout: 256 seconds]
Suryo has joined #mlpack
< Suryo>
rcurtin: thank you for your response, that was very helpful. I've made a note of the things you've said and I'll work on them. I just want to make sure that I can write a proposal that can be followed as closely as possible :)
Suryo has quit [Quit: Page closed]
robb_ has joined #mlpack
< robb_>
I have a question... I thought each neuron automatically had a bias, but Bang Liu's NEAT treats biases as separate "bias neurons"?
< robb_>
Why is that exactly?
< robb_>
I thought each neuron automatically had one, but it seems that I am wrong
< rcurtin>
I can't answer that question; it's possible that Bang's implementation has issues (remember that it was not merged at the end of the summer because it was incomplete)
vivekp has quit [Read error: Connection reset by peer]
< rcurtin>
I also didn't work directly on that project so maybe others have better input
< robb_>
Got it
robb_ has quit [Quit: Page closed]
vivekp has joined #mlpack
robb1 has joined #mlpack
rajiv_ has joined #mlpack
< zoq>
robb_: You could include the bias in the neuron instead; there is no particular advantage either way.
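For context on zoq's point: a NEAT-style bias neuron is just a node whose activation is fixed at 1, so its outgoing connection weights play exactly the role of per-neuron bias terms, and the two formulations are numerically equivalent. A tiny illustrative computation (not Bang Liu's code; the numbers are arbitrary):

    #include <armadillo>

    int main()
    {
      arma::vec input = {0.5, -0.2};
      arma::rowvec w = {0.3, 0.7};  // weights from the two inputs
      double b = 0.1;               // per-neuron bias term

      // Formulation 1: explicit bias term on the neuron.
      double pre1 = arma::as_scalar(w * input) + b;

      // Formulation 2: NEAT-style "bias neuron" with constant activation 1.0,
      // connected with weight b; the pre-activation is identical.
      arma::vec inputWithBias = {0.5, -0.2, 1.0};
      arma::rowvec wWithBias = {0.3, 0.7, 0.1};
      double pre2 = arma::as_scalar(wWithBias * inputWithBias);

      // pre1 == pre2 == 0.11
      return 0;
    }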
rajiv_ has quit [Ping timeout: 256 seconds]
rajiv_ has joined #mlpack
< rajiv_>
zoq: I have added Forward() to Dense blocks... It would be great if you could help me out with the Backward() function... The description of the problem is in the PR :)
< zoq>
rajiv_: Will take a look once I have a chance.
< rajiv_>
Thanks!
< rajiv_>
zoq: I am interested in contributing to the idea "Essential Deep Learning Modules" in GSoC 2019. I want to work on "Graph Neural Networks", as they perform really well on non-Euclidean data. I feel that they would be a great addition to the mlpack algorithms. Please let me know what you think... I have also sent a mail to the mailing list :)
rajiv_ has quit [Quit: Page closed]
< zoq>
Right, I saw the mail, but I haven't had a chance to get to it yet.
rajiv_ has joined #mlpack
< rajiv_>
Oh cool!
rajiv_ has quit [Client Quit]
vijaysingh has joined #mlpack
robb1 has quit [Quit: Page closed]
vijaysingh has quit [Quit: Page closed]
kinshuk has joined #mlpack
robb_ has joined #mlpack
< robb_>
hey, what is an IdentityLayer<>? Doesn't that just mean Linear?
unitrix1999 has joined #mlpack
< robb_>
for example, it's used in the RNN tutorial
< zoq>
robb_: It does nothing but forward the input.
< robb_>
Then why not just leave the layer out?
< zoq>
in this case it's used to avoid an unnecessary computation of the gradient
< robb_>
oh
< robb_>
thank you
< zoq>
we don't need to perform the backward step for the first layer, because there is no layer we would have to pass the error to
unitrix1999 has quit [Client Quit]
< robb_>
what happened to the input layer?
< robb_>
by that I mean is Add<> the first actual layer?
< zoq>
Hold on, what I said is the other way around: if the first layer were e.g. a linear layer, we wouldn't perform the backward step, since there is no layer we would have to pass the error to.
< zoq>
But since we use a recurrent layer, which holds other layers like Add<>, we have to put an empty layer in front.
< zoq>
Sorry for the confusion.
< robb_>
oh ok that makes sense now
< zoq>
There was a plan to detect whether we have to add an IdentityLayer automatically.
robb_ has quit [Quit: Page closed]
< zoq>
Basically, if a layer implements the Model() function and is the first layer, we have to add an IdentityLayer.
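A condensed sketch in the spirit of the RNN tutorial being discussed (mlpack 3.x API; the layer sizes and rho value are illustrative), showing IdentityLayer<> placed in front of the recurrent part of the network:

    #include <mlpack/core.hpp>
    #include <mlpack/methods/ann/rnn.hpp>
    #include <mlpack/methods/ann/layer/layer.hpp>

    using namespace mlpack::ann;

    int main()
    {
      const size_t rho = 10;  // steps to unroll through time (illustrative)
      RNN<MeanSquaredError<> > model(rho);

      // IdentityLayer<> only forwards its input; it sits in front of the
      // recurrent layer (which holds inner layers such as Add<>) so that
      // no error has to be passed back beyond the first layer.
      model.Add<IdentityLayer<> >();
      model.Add<Linear<> >(7, 10);     // layer sizes are illustrative
      model.Add<LSTM<> >(10, 10, rho);
      model.Add<Linear<> >(10, 7);
      model.Add<SigmoidLayer<> >();
    }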
Mina has joined #mlpack
< zoq>
Mina: Hello there, this is the right channel for questions :)
< Mina>
zoq: Thanks for your reply. I've chosen a project I'm interested in contributing to: NEAT. I was a successful GSoC student last year and have almost everything prepared for this project. I wanted to know if there is a qualification task or anything else required for this project.
< zoq>
There is no actual requirement; if you like, you could see if you can extend an existing method or add something to the ensmallen framework to get familiar with the codebase, but don't feel obligated.
< Mina>
So other than that, I can start with the application? If so, should I share a draft with the mentor? Last year I actually prepared the proposal with the mentor, but I thought to ask first since every organization has its own rules.
< zoq>
Usually, we do our best to give general feedback once applications can be shared via the dashboard.
< Mina>
By that I mean I would prepare a full proposal and iterate on it according to the mentor's feedback.
< Mina>
Okay, so basically all I can do until that stage is get to know the code more.
< zoq>
Getting familiar with the codebase and the community around the project is definitely helpful.
< Mina>
Okay, thanks a lot for your help. I'll most probably work on a patch for an issue or an extension before applications open.
< zoq>
Sounds good, let me know if I should clarify anything.
< Mina>
Thanks, will do. You've already done a great job with the wiki.