ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< jenkins-mlpack2>
* birm: documentation specific for this registry
< jenkins-mlpack2>
* Marcus Edel: Use the actual error message instead of a default one.
< jenkins-mlpack2>
* Ryan Curtin: Rename files and mark +x.
< jenkins-mlpack2>
* Ryan Curtin: Add first attempt at ensmallen memory check script.
< rcurtin>
vigsterkr[m]: yep, I'm here, just digging out from the weekend :)
< rcurtin>
robertohueso: just scrolled back and saw your message, the situation seems tough in Spain, stay safe! (and sane, if you're stuck inside all day :-O)
< vigsterkr[m]>
rcurtin: what's the status of the Azure Pipelines?
< vigsterkr[m]>
I think we've been dead since last Friday :(
< rcurtin>
ohh, it says right there. I did not read closely enough
< rcurtin>
thank you :)
< vigsterkr[m]>
which literally contradicts their docs
< vigsterkr[m]>
of course
< vigsterkr[m]>
;)
< vigsterkr[m]>
as that should be the default behaviour :P
< vigsterkr[m]>
zoq: thnx for the link!
< Saksham[m]>
Hi Ryan Curtin, I was interested in the CMA-ES project. I have gone through the original paper and tutorial, and through the codebase. Can you provide some more details I could include in my proposal? Also, does the project have to be about CMA-based strategies? I read about natural evolution strategies (PEPG) and they seemed interesting to me.
< rcurtin>
zoq[m]: do you think we should open a PR with the workaround, and then ask everyone who is having trouble to merge master into their branch once the workaround is committed?
< rcurtin>
Saksham[m]: I don't know much about the CMA-ES project, unfortunately, so I can't give much input about it that would be more helpful than anything you might have already read :(
< Saksham[m]>
Thanks a lot! I see zoq is listed as a potential mentor; he might be able to give me a better idea
< Saksham[m]>
Hi zoq, can you go through my previous message about CMA-ES based optimizers and help me a little?
favre49 has joined #mlpack
favre49 has quit [Remote host closed the connection]
favre49 has joined #mlpack
favre49 has quit [Remote host closed the connection]
favre49 has joined #mlpack
< favre49>
rcurtin You're right, I figured out what was up but it's weird
< favre49>
when I open it in a half-width window, it doesn't show, but it does in a full-width window
< favre49>
And if I switch from full width to half width
< rcurtin>
oh, interesting, let's see if I can reproduce it
< rcurtin>
huh, can't seem to reproduce... stretched my window width from ~400px (small enough for the mobile version) to 6400px (I have a big resolution :))
< rcurtin>
the formulas seemed to display just fine the full time
< PrinceGuptaGitte>
https://www.mlpack.org/doc/mlpack-3.1.0/doxygen/anntutorial.html#layer_api_anntut in this page's Layer API section, to make our own Layer we need to write a getter for the gradient matrix: `OutputDataType& Gradient() { return gradient; }`. But in the FFN class I didn't notice its use. Is it integral for FFN to work?
< PrinceGuptaGitte>
Same for : `OutputDataType& Parameters() { return weights; }`
< PrinceGuptaGitte>
I didn't see their usage anywhere
< chopper_inbound[>
<toluschr "As you can see, the file libarma"> You can try removing armadillo and remaking the latest version, it worked for me for similar error
< LakshyaOjhaGitte>
what's going on these days: Azure, GitHub... even corona
< LakshyaOjhaGitte>
@prince776 I am not sure, but I think the weights are used differently between layers.
< rcurtin>
toluschr: can you tell me more about the chroot environment you're using?
< rcurtin>
assuming the chroot is correctly in masterdir/, the make error doesn't seem to make sense
< rcurtin>
maybe the symbolic link fails when inside the chroot or something?
< favre49>
rcurtin Hmm I guess we'll never know. I assume you use Firefox too
< kartikdutt18Gitt>
Hey @prince, Parameters() is used to set the weights. You can refer to the LSTM tests; I think I saw various params used in the tests for Transposed Convolution.
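(For context, a minimal sketch of a custom layer in the style of the linked Layer API tutorial, assuming the mlpack 3.x API; "MyLinear" is a hypothetical example, not an actual mlpack class. FFN never calls Parameters() or Gradient() by name; its visitor classes do: the optimizer writes updated weights through Parameters(), and the backward pass stores the computed gradient through Gradient(), so a trainable layer needs both getters.)

    #include <mlpack/prereqs.hpp>

    // Sketch of a custom layer following the tutorial's Layer API (mlpack 3.x).
    template<typename InputDataType = arma::mat,
             typename OutputDataType = arma::mat>
    class MyLinear
    {
     public:
      MyLinear(const size_t inSize, const size_t outSize) :
          weights(outSize, inSize, arma::fill::randn) { }

      template<typename eT>
      void Forward(const arma::Mat<eT>& input, arma::Mat<eT>& output)
      { output = weights * input; }

      template<typename eT>
      void Backward(const arma::Mat<eT>& /* input */, const arma::Mat<eT>& gy,
                    arma::Mat<eT>& g)
      { g = weights.t() * gy; }

      template<typename eT>
      void Gradient(const arma::Mat<eT>& input, const arma::Mat<eT>& error,
                    arma::Mat<eT>& gradient)
      { gradient = error * input.t(); }

      // These are the getters in question: FFN's visitors read and write
      // through them, even though FFN itself never names them directly.
      OutputDataType& Parameters() { return weights; }
      OutputDataType& Gradient() { return gradient; }

     private:
      OutputDataType weights, gradient;
    };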
< toluschr>
rcurtin It's the Void Linux build system. I installed `libarmadillo-devel` and `boost-devel` as dependencies. I don't think the symlink is the problem, as it is the official package.
< toluschr>
I can send you a log of all commands executed if you want me to
< rcurtin>
favre49: yeah, I also tested in chromium
< rcurtin>
toluschr: hm, interesting, but I guess I'm not sure what happens when something is symlinked outside the chroot; what happens in the chroot if you run `ls -lh /usr/lib/libarmadillo.so`?
favre49 has quit [Remote host closed the connection]
< jeffin143[m]>
rcurtin: you there??
< rcurtin>
jeffin143[m]: yeah, I'm still here, I have not gotten up in the past 30 seconds :-D
< jeffin143[m]>
:-p, you left a comment about implementing column-major in the string encoding
< rcurtin>
jeffin143[m]: I haven't gotten back to that PR yet, I'll probably handle it later today if I can
< rcurtin>
toluschr: ok, I guess, I was just worried that the symlink was to a destination *outside* the chroot
< jeffin143[m]>
But I was doubtful about how we will handle the case without padding
< rcurtin>
but it looks like that is not the case
< rcurtin>
toluschr: I'm a bit at a loss for this one, my thoughts for directions to investigate are:
< rcurtin>
- is this a cross-compilation situation or anything? maybe libarmadillo.so is compiled for the wrong architecture and the compiler can't use it?
< rcurtin>
- is the chroot of make actually the exact same as what you're thinking? like, is it possible that libarmadillo.so does actually not exist in whatever environment 'make' is running in?
< toluschr>
I'm not cross compiling, but I can try adding libarmadillo as a host dependency
< rcurtin>
yeah, maybe worth a shot? honestly I don't have too many clear ideas here
< toluschr>
Still doesn't work. I'm not sure what's going on here. Probably a stupid mistake I made.
< toluschr>
I'm thinking of just removing the check for libarmadillo.so
< rcurtin>
toluschr: removing from the Makefile directly? I guess that's one strategy that could work :)
< toluschr>
Well now that is really strange. g++ can't find it while linking either.
< toluschr>
rcurtin It has nothing to do with the chroot environment. It even fails on my host system.
< toluschr>
The Void Linux package is broken.
< toluschr>
The symlink points to nothing, because armadillo is not a dependency of armadillo-devel
< rcurtin>
toluschr: well, that could be the reason :-D
< toluschr>
Works now
< toluschr>
I spent an hour on nothing...
< rcurtin>
hehe
< rcurtin>
I know the feeling :)
< jeffin143[m]>
Hehe that was bad
< zoq>
rcurtin: Will open a PR with a quick fix for the CI later.
< PrinceGuptaGitte>
Just a side question: in your LeNet model, why did you use Sequential<> instead of FFN<>?
< rcurtin>
zoq: one of two :)
favre49 has joined #mlpack
< favre49>
zoq I went ahead and merged it. That's a pretty big problem fixed :)
< rcurtin>
just merged it into #2305, seems to work just fine
toluschr has quit [Quit: toluschr]
< favre49>
Ah, so people will need to merge the new master into their branches for it to work
< rcurtin>
yeah, but I don't feel the need to do a big blast of comments on all the PRs :) I was planning on only mentioning it on ones that I was reviewing
< favre49>
Wouldn't it be best if you could have mlpack-bot do announcements across PRs?
< favre49>
This kind of situation isn't normal (I don't remember something like this happening last year), but in case it does, it saves you a headache :)
< kartikdutt18Gitt>
@prince776, Sequential allows me to think of it as a layer, so we can add something before the model too, in case the model is a subpart of some other model. I don't think we can do model1.Add(model2) if both are FFNs, so using Sequential gives the user some flexibility.
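(To make that concrete, a rough sketch of the composition, assuming the mlpack 3.x API; the layer sizes here are made up for illustration:)

    #include <mlpack/methods/ann/ffn.hpp>
    #include <mlpack/methods/ann/layer/layer.hpp>

    using namespace mlpack::ann;

    int main()
    {
      FFN<NegativeLogLikelihood<>> model;

      // A feature block built as a Sequential<>: it behaves like a single
      // layer, so it can be nested inside the outer model (or inside yet
      // another Sequential<>).
      Sequential<>* features = new Sequential<>();
      features->Add<Convolution<>>(1, 6, 5, 5, 1, 1, 0, 0, 28, 28);
      features->Add<ReLULayer<>>();

      model.Add(features);                  // a Sequential block is a valid layer
      model.Add<Linear<>>(6 * 24 * 24, 10);
      model.Add<LogSoftMax<>>();

      // By contrast, model.Add(someOtherFFN) would not compile: FFN<> is the
      // outer container, not part of the layer variant type.
    }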
favre49 has quit [Remote host closed the connection]
tip has joined #mlpack
tip is now known as Guest7503
Guest7503 has quit [Remote host closed the connection]
zoso_floyd has joined #mlpack
zoso_floyd has quit [Client Quit]
karyam has joined #mlpack
erdem has joined #mlpack
< erdem>
Hello, I wonder how to submit a proposal to mlpack?
< himanshu_pathak[>
kartikdutt18 (Gitter): Currently, in my PR for the Neural Turing Machine, I am trying to change FFN so that it can be used as a layer. Right now I am stuck due to some bug; once I find a workaround it will be a nice thing for us to use
erdem has quit [Remote host closed the connection]
< kartikdutt18Gitt>
Yes, I think that would be great. :)
< PrinceGuptaGitte>
I see. It's a nice idea.
< himanshu_pathak[>
Yes, but my only problem with it is that every time I find a workaround, a new error comes my way.
< PrinceGuptaGitte>
If you get stuck at something for too long, you can always ask here, we are happy to help :)
karyam has quit [Remote host closed the connection]
< himanshu_pathak[>
Yes, I am stuck with a problem; it is as follows:
< himanshu_pathak[>
In memory_head_impl.hpp I am trying to do `gy += prevDW`; gy is of type ErrorType, and before the rvalue refactor it was working, but now I am getting an error: note: ‘arma::vec {aka arma::Col<double>}’ is not derived from ‘const arma::Op<T1, op_type>’
< himanshu_pathak[>
prevDW is of type arma::vec; I don't know why it's not working after the rvalue refactor
< PrinceGuptaGitte>
I don't know how exactly rvalue references work, I'll have to do some reading I guess.
< himanshu_pathak[>
Maybe Ryan has a better answer for this.
< PrinceGuptaGitte>
yes
< rcurtin>
himanshu_pathak[: I'm in a meeting right now so my answer will be kind of short but I hope it helps...
< rcurtin>
my guess is that gy was previously inferred as arma::vec, but now there is some type that's being inferred incorrectly
< rcurtin>
so, e.g., if you are doing Forward(a, b + c), where that "b + c" is an Armadillo expression, the type of the second parameter gets inferred as some arma::Op<...> intermediate type
< rcurtin>
the way to fix it might be to form a temporary vector or something for each of the parameters to Forward()
< rcurtin>
or something of this general idea; I hope the input helps
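(A distilled, standalone version of that failure mode and the suggested workaround; the variable names are taken from the discussion above, and the rest is hypothetical:)

    #include <armadillo>

    int main()
    {
      arma::vec b(10, arma::fill::randu);
      arma::vec c(10, arma::fill::randu);
      arma::vec prevDW(10, arma::fill::zeros);

      // An expression like (b + c) has an expression-template type
      // (arma::eGlue<...>/arma::Op<...>), not arma::vec. If a function
      // parameter's type is deduced from it, later operations such as
      // `gy += prevDW` can fail to resolve, producing notes like the
      // "is not derived from 'const arma::Op<T1, op_type>'" one above.

      // Workaround: evaluate into a concrete vector before passing it to
      // Forward()/Backward(), so the deduced type is arma::Col<double>.
      arma::vec gy = b + c;  // forces evaluation of the expression template
      gy += prevDW;          // now resolves against arma::vec
      return 0;
    }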
< himanshu_pathak[>
Thanks for the help, rcurtin: I will try to add temporary vectors; maybe this will work.
< zoq>
TrinhNgo[m]: If you have any question you can ask here, it's generally not a great idea to contact mentors directly (private message).
< zoq>
TrinhNgo[m]: To answer your question, we don't require contributions to be considered for GSoC. But any contribution is great, because it shows you can work with a larger project.