ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
gitter_jacob-ear has joined #mlpack
gitter_jacob-ear is now known as jacob-earleGitte
< jacob-earleGitte>
I updated to Python 3.8 today and noticed that pip couldn't install mlpack. It did work with older versions though. When will the Python 3.8 compatible package be posted on pypi.org?
< rcurtin>
jacob-earleGitte: hang on a second, I can make that happen as soon as it's built... :) just gotta modify a config setting
< GauravSinghGitte>
Hey @jeffin143 @kartikdutt18 @prince776, sorry, the pull request I created got deleted because of a mishap on my side. I will create a new PR in an hour.
< RohitKartikGitte>
How long does this build usually take?
< rcurtin>
a couple hours or so, I think?
< rcurtin>
also, sorry about the windows build failures---this was due to a slight server misconfiguration on masterblaster.mlpack.org. I've got that fixed now, so I've restarted many of the PR builds
< RohitKartikGitte>
What's the main focus of "Application of ANN Algorithms Implemented in mlpack"? To help new users get familiar with mlpack and see its capabilities, right?
< RohitKartikGitte>
does this sound like something you would like me to work on this summer?
< kartikdutt18Gitt>
@gaurav-singh1998, no worries.
< simplesit[m]>
whoa, this place is really active!
< RohitKartikGitte>
Welcome to the community :)
< TanayMehtaGitter>
@OO7kartik The idea seems great! Do you happen to know of any good visualization libraries in C++?
< hemal[m]>
Tanay Mehta (Gitter): A pull request is open for the issue. I am not sure whether the functionality is implemented, since the PR seems stale (over a month without any new activity). Perhaps you could have a look at the PR mentioned above.
< TanayMehtaGitter>
@hemal I was just looking at the PR; it failed the memory checks. Also, looking at the old comments, it seems that the person who took it was trying to replicate a function to guess the file type, similar to what has been done in the Armadillo internals. There might've been a slight disagreement there, I guess.
Guest47384 has quit [Remote host closed the connection]
< SaraanshTandonGi>
#2236 is taking very long. Is that normal?
basuki has joined #mlpack
< basuki>
hi
basuki has quit [Remote host closed the connection]
< KhizirSiddiquiGi>
hi basuki
< PrinceGuptaGitte>
Hi @khizirsiddiqui, I've implemented the model.summary() function; can you have a look at it when you're free? (Right now I'm using std::cout instead of Log::Info because Log::Info was causing unknown problems; I'll fix that soon.) Thanks.
< PrinceGuptaGitte>
PR #2237
< PrinceGuptaGitte>
Can we not chain `<<` in `Log::Info` like we can with `std::cout`?
< rcurtin>
zoq: I don't seem to have permission to cancel appveyor jobs; could I ask you to cancel all mlpack-wheels builds except 1.0.142 please? :)
< zoq>
rcurtin: ahh the permission issue
< rcurtin>
I dunno, maybe we can ping appveyor support, but it doesn't usually make too much of a difference for me
< rcurtin>
anyway, thanks for canceling them all :)
< zoq>
done
< zoq>
happy to help :)
< rcurtin>
:)
< rcurtin>
I think I am getting close to getting this Windows wheel build fixed... but I shouldn't say that, because I'm not actually sure...
< zoq>
HimanshuPathak: Sure, it's on my list.
< zoq>
I wonder what the electric bill looks like
< zoq>
"We will no longer be sending tshirts for GSoC programs (to students or mentors). " ... that is a bummer :(
< himanshu_pathak[>
> I think I am getting close to getting this Windows wheel build fixed... but I shouldn't say that, because I'm not actually sure...
< himanshu_pathak[>
rcurtin: If we get the Windows wheel build fixed, then we can also use it for the conda build.
< himanshu_pathak[>
It will be helpful for that too.
< rcurtin>
yeah, perhaps
< rcurtin>
it seemed like the conda build failure was a little bit different... I don't know how related the failures are
< rcurtin>
I'll try and take a look at the current status of the conda build soon, but it is definitely not easy
< rcurtin>
one of the things that really helped me out with the wheel build is that AppVeyor lets you remote desktop into the node after it's finished building, so I was able to interactively explore what was wrong
< rcurtin>
do you think there is something similar for the conda build? it could be helpful
< himanshu_pathak[>
Can you provide a link to the Windows wheel build, so that I can see whether there are any similarities between the two?
< himanshu_pathak[>
In my case, conda was not able to build adaboost because of boost::program_options; it's possible this error is also because of the same linking problem, but I am not sure.
< himanshu_pathak[>
And in your case we are also getting this error: LINK : fatal error LNK1181: cannot open input file 'adaboost.cp36-win32.pyd'
< metahost>
rcurtin, zoq: Is the CI for ensmallen down?
< zoq>
metahost: No, we just have to wait for the monthly build to finish; it's using almost all of the machines.
< metahost>
Ah okay, thanks! :)
< rcurtin>
yeah, the build queue is down to "only" 169 from over 1000 yesterday :-D
< metahost>
Whoa!
< rcurtin>
himanshu_pathak[: I think the LNK1181 error you found is actually from some manual debugging that I was doing, not from an actual build failure
< rcurtin>
yeah, when I worked at Symantec they were very generous about the build systems they were willing to provide, and when I left, they said I could keep using them until they turned them off...
< rcurtin>
...and 1.5 years later they are still running
< metahost>
:))
< metahost>
That's amazing!
< zoq>
LakshyaOjha: What PR?
< LakshyaOjhaGitte>
Oh, sorry, I forgot to mention it. But I think you found them anyway. Actually, it's very late at night; I'm just going to make some changes right away. The rest I will take up in the morning. Thanks for having a look at them :)
< zoq>
LakshyaOjha: Right, it's already pretty late, but I think there's no need to rush anything :)
< zoq>
I guess most of the people here are night owls :)
< GauravSinghGitte>
Oh sorry!
< GauravSinghGitte>
If I disturbed you
< zoq>
Nahh, I just thought it's super late for you.
< zoq>
Isn't it like 2am for you?
< GauravSinghGitte>
Well, actually 5 am
< zoq>
okay ... I think we can count that as late :)
< zoq>
or early in the morning
< GauravSinghGitte>
Yeah, I think I should go back to bed now. Sorry once again to disturb you @zoq
< zoq>
No worries, for me it's still early, especially compared to you.
< zoq>
Will take a look at the PR in just a few minutes, so that you have a review once you wake up.
< GauravSinghGitte>
Okay thank you 😊
< jeffin143[m]>
It's 4:30
< jeffin143[m]>
I guess, at least by my watch
< jeffin143[m]>
Which part of India are you in, for the time difference to be this much? :)
< zoq>
and you are still up as well?
< sreenik[m]>
India has only one time zone, I thought...
< sreenik[m]>
For me I've just woken up
< zoq>
Here it's 0:15
< zoq>
crazy
< jeffin143[m]>
zoq: I have to submit an early release this morning 😜
< jeffin143[m]>
Office atrocities
< zoq>
Well best of luck with the release, I guess :)
< jeffin143[m]>
sreenik: someone wanted to get in touch with you here, if I remember it right
< jeffin143[m]>
Just go through the logs once
< jeffin143[m]>
zoq: just wanted to let everyone know, I am currently building a POC on my weekends: mlpack-board (a visualization dashboard), just like TensorBoard
< zoq>
Ahh, most of the people I know start with a coffee, but I think reading mlpack messages is also a good way to start the day :)
< zoq>
Ohh nice!
< zoq>
If you have anything to show, I'm happy to take a look at it.
< sreenik[m]>
jeffin143[m]: Thanks for pointing out. I'll go update the project description now to something clearer
< sreenik[m]>
jeffin143[m]: Sounds great!
< jeffin143[m]>
Umm, the flow is messy for a developer, but I wanted to follow @rcurtin's approach of not adding any dependency, so I resorted to boost::asio to send data to the frontend
< jeffin143[m]>
Once I am ready with the initial things, I will get in touch with you
< jeffin143[m]>
Actually, I have thought of many solutions, but this was the one I thought would be apt
< jeffin143[m]>
I have others up my sleeve; I have penned them down. I just now have to check which may be the best; I will definitely get in touch with you when I am somewhat ready with the POC
< zoq>
Sounds good, I guess there are multiple solutions to get the data out, but we just have to start with one.