ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< ArunavShandeelya> <abhisaphire[m] "Hey Arunav Shandeelya: "> Thank you, I am looking into it.
jenkins-mlpack2 has quit [Ping timeout: 265 seconds]
jenkins-mlpack2 has joined #mlpack
ImQ009 has joined #mlpack
< PrinceGuptaGitte> Hi @kartikdutt18, can you tell me how you made the `LSTM_MULTIVARIATE_TIME_SERIES.ipynb` file in your PR in the examples repo (did you use Jupyter or something else)?
< PrinceGuptaGitte> I was thinking if it'll be a good idea to add some basic CV examples in notebook format.
< kartikdutt18Gitt> Yes it would be. I used the xeus-cling kernel for Jupyter notebooks. However, I faced some issues with the current implementation, so zoq will be opening a PR with some examples as well; I think that would be a better reference than what I did. Also, a CV notebook would be even better, since we could plot some images, etc.
< kartikdutt18Gitt> Here is the GitHub link for [xeus-cling](https://github.com/jupyter-xeus/xeus-cling).
< PrinceGuptaGitte> Oh nice, then I'll wait till zoq opens a PR. Thanks for the info.
< PrinceGuptaGitte> I'll also try with this notebook though, it seems cool
< kartikdutt18Gitt> You can try running the examples with it till then; maybe it will work on your system. It's really cool compared to normal cpp files.
< himanshu_pathak[> Yeah, nice idea. I tried to run your LSTM example, kartikdutt18 (Gitter); I thought there was a problem with my setup
< himanshu_pathak[> Every time it runs a new shell, it disconnects and deletes the previous history; I don't know why
< himanshu_pathak[> *When I run code cell
< kartikdutt18Gitt> I am not sure why either, on my machine it only runs one epoch. The cpp version works fine so I don't know why this is happening.
< himanshu_pathak[> Maybe somewhere in the mlpack code we are using a feature that the xeus-cling compiler doesn't support (not sure about this), or maybe we are doing something silly; not sure
< kartikdutt18Gitt> I am pretty sure it's the latter; I don't know what I am missing.
< himanshu_pathak[> Yeah that one has higher probability of happening. I will get back to you if I find something.
< LakshyaOjhaGitte> @favre you online?
< kartikdutt18Gitt> himanshu_pathak, Great.
favre49 has joined #mlpack
< favre49> LakshyaOjhaGitte: now I am
< LakshyaOjhaGitte> wanted to know if a fold layer would be a good addition?
< LakshyaOjhaGitte> @sreenik also, what do you suggest? Should I proceed with this?
< favre49> Hmmm while I guess feature parity with pytorch is a good thing, I'm not sure I understand the utility of their fold layer?
< LakshyaOjhaGitte> Actually I thought the same, had a talk with @kartikdutt18 also regarding this but I thought I should ask first. I think @sreenik will be able to give a good insight also
< LakshyaOjhaGitte> (edited) ... insight also => ... insight also.
< sreenik[m]> favre49: Yes even I didn't get a very good idea of its use. I believe this has something to do with 4-d tensors
< Saksham[m]> Has it been used in the literature somewhere?
< kartikdutt18Gitt> Hey sreenik, favre49, as far as I understood, the unfold portion returns a reshaped input so you can perform an operation along an axis. I think for this purpose we can simply change inputW/inputH for the next layer, since we pass matrices between layers. What do you think?
< kartikdutt18Gitt> Does that make sense?
< jeffin143[m]> :)
< sreenik[m]> kartikdutt18: that looks like an adaptation of the torch layer but given mlpack's structure its importance/usage also needs to be weighed.
< LakshyaOjhaGitte> Yes, valid point.
< LakshyaOjhaGitte> Found something on MATLAB that might help with this discussion
< LakshyaOjhaGitte> they say to apply the convolution operation individually at each time step
< kartikdutt18Gitt> > kartikdutt18: that looks like an adaptation of the torch layer but given mlpack's structure its importance/usage also needs to be weighed.
< kartikdutt18Gitt> That makes sense.
< LakshyaOjhaGitte> I think it is better used in pytorch because it has n dimensional data.
< AbishaiEbenezerG> hi @rcurtin. With reference to #2347, I checked my CMakeLists and it was pretty different compared to the one posted on the issue. In fact, I had gotten the same error a few weeks ago, where the preprocessor was complaining about armadillo
< AbishaiEbenezerG> and it was corrected with a simple make install
< rcurtin> AbishaiEbenezerG: I think the bug report isn't about the CMakeLists.txt in the mlpack repository, it looks to me like the user is building mlpack into a downstream application with its own CMake configuration (which is the one that was posted)
< AbishaiEbenezerG> oh ok
< AbishaiEbenezerG> but how do you know it's a downstream application @rcurtin
< AbishaiEbenezerG> and also, i could not find FindMLPACK.cmake file in the version of mlpack i have...
< AbishaiEbenezerG> I'm just kinda curious to know why there are differences...
< jeffin143[m]> rcurtin (@freenode_rcurtin:matrix.org): done with the image binding, do take a look before you release
< jeffin143[m]> And also the shuffle data split
< AbishaiEbenezerG> sorry @rcurtin totally my bad. I found the FindMLPACK.cmake file
favre49 has quit [Remote host closed the connection]
< rcurtin> AbishaiEbenezerG: no worries :)
< rcurtin> AbishaiEbenezerG: I knew it was downstream because of the first sentence of the issue: "Hello, I have a problem with building my package that uses mlpack." :)
< rcurtin> jeffin143[m]: awesome, thanks---we decided to push #1366 to a future release so probably #2019 (and the shuffle data split, I think that is close) are close to the last things
< himanshu_pathak[> Hey rcurtin: can you check #2324? I think that one (adding the copy constructor) is also necessary to merge before the release
< himanshu_pathak[> You mentioned this in #2326
< jeffin143[m]> rcurtin : #1366 would probably be a great addition
< jeffin143[m]> I am following it closely :)
< rcurtin> jeffin143[m]: yeah, agreed, it can help clean things a lot
< rcurtin> himanshu_pathak[: sure, I agree, the copy constructors working correctly would be nice. I'll try to review #2324 today, and if it's close, we can hold up the release for it, but if it looks like there's still a long way to go, maybe not worth it (that could be a part of a 3.3.1 release)
< jeffin143[m]> rcurtin (@freenode_rcurtin:matrix.org): when are you planning the release?
< rcurtin> hehe, three weeks ago? :-D
< rcurtin> I think it can be done this week. but probably we shouldn't trust my estimates anymore, or ever :)
< rcurtin> I made some nice steps forward with the automated ensmallen release process, so I'm hoping maybe I can do the same for mlpack---then, later, we won't need me to do these big manual releases, and it will be much easier to have quicker releases
< metahost> rcurtin: how do you decide on the release schedule? (As in when to release, what to include etc.)
favre49 has joined #mlpack
favre49 has quit [Quit: Lost terminal]
< rcurtin> metahost: ...it's basically arbitrary :) one of the things that I have always thought is that I've always wanted to release mlpack more often, but it's tricky because there's often so much in motion
< rcurtin> I think that, if I can automate release scripts, we can adopt a policy of "release every handful of weeks"
< rcurtin> I've made some attempts to have scripts do this, but they're not perfect (...yet :))
< rcurtin> time is really the limiting factor though
< LakshyaOjhaGitte> What is the scenario at your place?
< jeffin143[m]> Asia largest slum reports first death due to covid
< LakshyaOjhaGitte> just heard US numbers reached 2 lakh (200,000)
< jeffin143[m]> 7 lakh people in 2.1 sq km
< LakshyaOjhaGitte> pretty bad :(
< rcurtin> I live in Atlanta and here there is a shelter-in-place order, which basically means "no non-essential trips", so, I'm only going out to buy groceries every two weeks or so
< rcurtin> I don't currently know anyone who is infected; the most tests are being done in New York, not where I am in Georgia. I think there are far more infections than are reported here, but that's just a guess
< rcurtin> for me life is not changing much, I already worked from home many days, now I work from home every day :)
< abhisaphire[m]> Same scenario in india too rcurtin
< rcurtin> yeah, it's very unfortunate, and the worst thing is that the worst is yet to come :(
< jeffin143[m]> rcurtin (@freenode_rcurtin:matrix.org): exactly, less testing is being done than expected, and hence fewer positive cases
< jeffin143[m]> Yes, I hope someone comes up with a vaccine or something
< PrinceGuptaGitte> I live in Delhi, India, and there is a known community outbreak now. I'm not going out anymore
< LakshyaOjhaGitte> it might take months to curb this outbreak, and a vaccine is not even expected before a year from now.
< PrinceGuptaGitte> Worst part about this virus is, it is very infectious, with a high mortality rate of around 30% and a convenient incubation period of over a week (for it to spread)
< LakshyaOjhaGitte> fact: the fastest vaccine on record took 5 years to develop
< rcurtin> one good thing is that lack of widespread testing means that the reported mortality rate is likely a pretty loose upper bound
< rcurtin> (it's still far higher than anyone would hope though!)
< PrinceGuptaGitte> Indeed.
< jeffin143[m]> I wish it were as easy as writing a simple snippet of code
< jeffin143[m]> I guess after everything comes back to normal, there will be a shift in the software industry
< jeffin143[m]> More people will be looking for work-from-home perks
< rcurtin> yeah, that's actually something I'm really hopeful about. it might help with traffic and air pollution :)
< abhisaphire[m]> Indeed the only good side of this lockdown is rejuvenation of nature
< LakshyaOjhaGitte> People living in and around delhi can see the real difference in Air pollution.
< LakshyaOjhaGitte> People here are happy with an AQI of 158, which means unhealthy XD
< LakshyaOjhaGitte> Just to lighten up the mood, here's a funny video https://www.youtube.com/watch?v=BK0KcFu1Dtk
< LakshyaOjhaGitte> XD
< PrinceGuptaGitte> Hi @kartikdutt18, sorry to ping you again, but can you tell me how can i link mlpack in the xeus-cling notebook?'
< PrinceGuptaGitte> (edited) ... xeus-cling notebook?' => ... xeus-cling notebook?
< PrinceGuptaGitte> (edited) ... you tell ... => ... you please tell ...
< zoq> PrinceGuptaGitte: See the pragma on the top of the notebook.
< PrinceGuptaGitte> I can't find it
< LakshyaOjhaGitte> Hey @zoq wanted to ask you, is there any good in implementing the fold/unfold layer in mlpack?
< LakshyaOjhaGitte> I think it is only good for n-dimensional data, but I am not sure.
< LakshyaOjhaGitte> sorry to interrupt your conversation @prince776
< PrinceGuptaGitte> It's okay :)
< PrinceGuptaGitte> Thanks
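(For anyone reading the log: the pragmas zoq refers to are cling directives at the top of the notebook's first cell that tell xeus-cling where to find mlpack's headers and library. A minimal sketch follows; the paths are assumptions and depend on where mlpack is installed on your system.)

```cpp
// First cell of a xeus-cling notebook: point cling at mlpack,
// then load the shared library so subsequent cells can use it.
// Paths below are illustrative, not universal.
#pragma cling add_include_path("/usr/local/include")
#pragma cling add_library_path("/usr/local/lib")
#pragma cling load("mlpack")
```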
< zoq> LakshyaOjhaGitte: What would that layer do?
< zoq> LakshyaOjhaGitte: And do you have a method in mind that would use that layer?
< LakshyaOjhaGitte> I think it is just used to reshape the input matrix and generate more layers/slices in it.
< LakshyaOjhaGitte> https://pytorch.org/docs/stable/nn.html#unfold , I think it is not useful in mlpack because of dimension limitation
< zoq> LakshyaOjhaGitte: In mlpack slices are encoded as cols, so at first glance I think you could use the subview layer to get the same result.
< kartikdutt18Gitt> I agree; since layers pass matrices, we can simply use subview or change the input params for the next layer (if possible).
< LakshyaOjhaGitte> So in the end it is not advisable to implement this layer here.
< LakshyaOjhaGitte> It was good that I asked, got everything clarified.
< zoq> LakshyaOjhaGitte: At least I don't see a need for the layer right now, maybe there is some need in the future.
togo has joined #mlpack
ImQ009 has quit [Quit: Leaving]
< LakshyaOjhaGitte> Okay
< LakshyaOjhaGitte> Thanks
< metahost> rcurtin: Interesting! Versioning is a mysterious task. :P
< rcurtin> metahost: yeah, more work than I would hope, but it's a necessary evil :)
louisway has joined #mlpack
< metahost> Haha
louisway has quit [Remote host closed the connection]
togo has quit [Quit: Leaving]