ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< jenkins-mlpack2> Project docker mlpack weekly build build #98: NOW UNSTABLE in 5 hr 55 min: http://ci.mlpack.org/job/docker%20mlpack%20weekly%20build/98/
xiaohong has joined #mlpack
xiaohong_ has joined #mlpack
xiaohong has quit [Ping timeout: 256 seconds]
< rcurtin> birm[m]1: for now we added an azure workaround in #2306, so if branches merge from master then they should trigger azure for a rebuild
< jeffin143[m]> rcurtin: how do we create variable-length columns in Armadillo?
< rcurtin> jeffin143[m]: that's not possible with arma::mat, to do that would require the use of arma::field
< rcurtin> (alternately, consider 'std::vector<arma::vec>' or something?)
< rcurtin> if this is about the column-major string encoding, I do think that whatever you're doing now can just be transposed with no issue
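A rough sketch of the two options mentioned above (arma::field or a std::vector of Armadillo vectors), with made-up lengths and values purely for illustration:

```cpp
#include <armadillo>
#include <vector>

// Option 1: arma::field lets every element have its own size.
arma::field<arma::vec> encoded(3);
encoded(0) = arma::vec({1.0, 5.0, 2.0});           // length 3
encoded(1) = arma::vec({4.0, 1.0});                // length 2
encoded(2) = arma::vec({7.0, 3.0, 3.0, 9.0, 2.0}); // length 5

// Option 2: a plain standard container of Armadillo vectors.
std::vector<arma::vec> encoded2;
encoded2.push_back(arma::vec({1.0, 5.0, 2.0}));
encoded2.push_back(arma::vec({4.0, 1.0}));
```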
< jeffin143[m]> Because in string encoding we give the output without padding, which means variable-length rows, but how do we do that with vectors?
< jeffin143[m]> Ok I will just take a look
< jeffin143[m]> Then
< rcurtin> hmm, I think we would need padding in this case
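To illustrate the padding idea: zero-pad the shorter sequences to the longest length so everything fits in a single column-major arma::mat (one sequence per column). This is only a sketch, not the actual output format of mlpack's string encoding code:

```cpp
#include <armadillo>
#include <vector>

std::vector<arma::vec> seqs = { arma::vec({1.0, 5.0, 2.0}),
                                arma::vec({4.0, 1.0}),
                                arma::vec({7.0, 3.0, 3.0, 9.0, 2.0}) };

// Find the longest sequence.
size_t maxLen = 0;
for (const arma::vec& s : seqs)
  if (s.n_elem > maxLen)
    maxLen = s.n_elem;

// One column per sequence; unused entries stay zero (the padding).
arma::mat padded(maxLen, seqs.size(), arma::fill::zeros);
for (size_t i = 0; i < seqs.size(); ++i)
  padded.submat(0, i, seqs[i].n_elem - 1, i) = seqs[i];
```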
< jeffin143[m]> Can we follow this approach: merge string encoding as it is now and then refactor? If you don't mind
< rcurtin> in this case I think I'd rather wait, since the current string encoding is "backwards" and wouldn't give results that are ready to use with other mlpack code
< rcurtin> really there's no hurry---we can easily release another one as soon as those are in
< rcurtin> (I say this a lot, so I hope it still means something, but I really would like to have significantly more frequent releases for mlpack)
< rcurtin> (perhaps not quite every PR, which is basically the ensmallen release cycle, but certainly every couple weeks. the 3.3.0 release has been slow since it was held up on Julia)
< jeffin143[m]> Ok , then I will take time and see into this issue
< rcurtin> yeah, there is no hurry, don't worry :)
< rcurtin> anyway, it's late here---I'm going to head to bed. 'night!
< jeffin143[m]> Good night
Khatna has joined #mlpack
Khatna has quit [Remote host closed the connection]
ayushwashere has joined #mlpack
ayushwashere has quit [Remote host closed the connection]
< jenkins-mlpack2> Project docker mlpack nightly build build #644: STILL UNSTABLE in 3 hr 14 min: http://ci.mlpack.org/job/docker%20mlpack%20nightly%20build/644/
< jeffin143[m]> One of the best bits I have read
ImQ009 has joined #mlpack
toluschr has joined #mlpack
AnjishnuGitter[m has joined #mlpack
witness has joined #mlpack
favre49 has joined #mlpack
< favre49> I really like arch, but what's not fun is spending your afternoon fixing some obscure bug that unfortunately breaks some part of your system :)
< favre49> Well I think it's fixed... I've probably jinxed it now though
favre49 has quit [Remote host closed the connection]
< hemal[m]> I've shared my proposal draft for review through GSoC portal. @zoq @rcurtin please provide your reviews whenever you get time.
< hemal[m]> Proposal Topic: Graph Convolutional Network (GCN)
< hemal[m]> Thanks.
< hemal[m]> jeffin143: thanks for sharing the article!
xiaohong_ has quit [Remote host closed the connection]
< rcurtin> hemal[m]: thanks for letting us know, but I'll just be honest and say that I'm not going to have any time to provide a review
< rcurtin> there's no way, given the huge amount of open PRs and other requests
< PrinceGuptaGitte> (edited) ... `model.Add<Convolution<>>(3, 64, 7, ... => ... `model.Add<Convolution<>>(3, 16, 7, ...
< PrinceGuptaGitte> (edited) ... `model.Add<Convolution<>>(3, 16, 7, ... => ... `model.Add<Convolution<>>(3, 16 *here changed*, 7, ...
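For context on the edits above: assuming the mlpack 3.x ann::Convolution<> constructor, the leading arguments are (inSize, outSize, kernelWidth, kernelHeight, ...), so the change swaps the number of output feature maps from 64 to 16. A hypothetical sketch of such a call (the 224x224 input size and the surrounding model are made up, not taken from the discussion):

```cpp
#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>

using namespace mlpack::ann;

FFN<> model;
// Convolution<>(inSize, outSize, kernelWidth, kernelHeight,
//               strideWidth, strideHeight, padW, padH,
//               inputWidth, inputHeight)
model.Add<Convolution<>>(3, 16, 7, 7, 1, 1, 0, 0, 224, 224);
model.Add<ReLULayer<>>();
```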
< pickle-rick[m]> rcurtin: Should I send an email to the mailing list with a summary of my proposal? Or should we just submit the proposal through the GSoC portal, and hope for the best, since everyone will be busy with reviewing PRs and stuff?
< hemal[m]> rcurtin: thanks for letting me know
< hemal[m]> I'll review some PRs to reduce some of the workload. 🙂
< zoq> pickle-rick[m]: You can send it directly through the GSoC portal; when we have time we can provide feedback over there.
< zoq> hemal[m]: Please feel free, any help is very much appreciated.
< pickle-rick[m]> <zoq "pickle-rick: You can directly se"> zoq: Thanks! Good to know.
< zoq> PrinceGuptaGitte: Maybe the kernel size is wrong; did you build with DEBUG=ON? It also might be good to step through the code with gdb.
< PrinceGuptaGitte> I didn't build with DEBUG=ON, should I do that while building mlpack?
< zoq> PrinceGuptaGitte: Yes, that enables bounds checks.
< PrinceGuptaGitte> Thanks, i'll try that
< kartikdutt18Gitt> Hi @prince776, it might be related to this [comment](https://github.com/mlpack/models/issues/63#issuecomment-599199730). Not really sure though. Hope it helps.
bisakh[m] has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> shrit/models#12 (digit - f897bf9 : shrit): The build is still failing.
travis-ci has left #mlpack []
< zoq> So with #2278 merged, the macOS build should be fixed; great work Yashwant!
< rcurtin> agreed, thank you!
SlackIntegration has quit [Ping timeout: 245 seconds]
< himanshu_pathak[> zoq (@freenode_zoq:matrix.org): I have sent a message to the mailing list; can you please reply?
< zoq> himanshu_pathak[: Will do later.
< himanshu_pathak[> OK
LayMann has joined #mlpack
< LayMann> hey, are there qualification rounds or should I debug and PR something?
< zoq> LayMann: Hello, what do you mean by qualification rounds?
< LayMann> I meant that sometimes organizations give tasks to filter the students
< zoq> We don't require a contribution to be considered for GSoC.
< zoq> There are some open issues; if you find something interesting, please feel free to work on it.
< LayMann> so i should just write a proposal then
< LayMann> ?
< zoq> Also there are open PRs waiting for review, so if you'd like to help with a review, that's another great way to get familiar with mlpack.
< zoq> You can submit your proposal right away, yes; it's certainly a great idea to get familiar with the codebase as well.
< zoq> One question we often ask is how the proposal integrates into the codebase; that's difficult to answer if you haven't looked into the project.
< LayMann> reinforcement learning is still an active project right?
< zoq> LayMann: yes
toluschr has quit [Remote host closed the connection]
mikhail has joined #mlpack
lozhnikov[m] has joined #mlpack
lozhnikov[m] is now known as lozhnikov[m][m]
lozhnikov_ is now known as lozhnikov
mikhail has quit [Quit: mikhail]
M_slack_mlpack13 has joined #mlpack
< M_slack_mlpack13> I'm interested in working on the reinforcement learning project and would like to contribute the ACKTR and Persistent Advantage Learning DQN algorithms. In the proposal, should I mention all the technicalities after going through the required code base, or should I give a brief workflow of how I'm going to implement them in the 12 weeks?
togo has joined #mlpack
Random14 has joined #mlpack
ImQ009 has quit [Quit: Leaving]
< zoq> M_slack_mlpack13: Hello there, ideally you can find a good balance; perhaps you can point out some details that are important to know for the project, or what key element makes the method stand out, etc.; no need to explain everything in detail.
togo_ has joined #mlpack
Random14 has quit [Ping timeout: 240 seconds]
togo has quit [Quit: Leaving]
togo_ has quit [Quit: Leaving]