rcurtin_irc changed the topic of #mlpack to: mlpack: a scalable machine learning library (https://www.mlpack.org/) -- channel logs: https://libera.irclog.whitequark.org/mlpack -- NOTE: messages sent here might not be seen by bridged users on matrix, gitter, or slack
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
braceletboy has quit [Ping timeout: 246 seconds]
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
texasmusicinstru has quit [Remote host closed the connection]
CaCode- has joined #mlpack
texasmusicinstru has joined #mlpack
CaCode_ has quit [Ping timeout: 256 seconds]
braceletboy has joined #mlpack
<zoq[m]> Looks like a sourceforge connection issue, I restarted the failed build again, and now it looks like the nodes successfully downloaded armadillo.
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
texasmusicinstru has quit [Remote host closed the connection]
<heisenbuugGopiMT> It's okay @shrit:matrix.org
<heisenbuugGopiMT> Thank you @marcusedel:matrix.org
texasmusicinstru has joined #mlpack
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
CaCode_ has joined #mlpack
CaCode- has quit [Ping timeout: 256 seconds]
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
CaCode has joined #mlpack
CaCode_ has quit [Ping timeout: 256 seconds]
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
CaCode_ has joined #mlpack
CaCode has quit [Ping timeout: 250 seconds]
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
<jeffin143[m]> Did you all see the updated gsoc program ??
<jeffin143[m]> Now there is no constraint of being a school/University student
<jeffin143[m]> Anyone above 18 can participate
<jeffin143[m]> Also 2 types of project 175h and 350h
<jeffin143[m]> I think the new revamp would be great, it will give both mentor and contributor enough time; they could take a break and spread the project over up to 6 months
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
CaCode- has joined #mlpack
CaCode_ has quit [Ping timeout: 256 seconds]
CaCode_ has joined #mlpack
CaCode- has quit [Ping timeout: 268 seconds]
CaCode_ has quit [Quit: Leaving]
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
<zoq[m]> I guess for us nothing really changes, we have done the 175h and 350h projects. The one thing that worries me a little bit is that the project's timeline is dynamic (can be stretched, paused), so the mentor and the contributor (they don't call it student anymore, but did they say contributor?) have to make sure they are available over that period of time.
<zoq[m]> <jeffin143[m]> "Anyone above 18 can participate..." <- I guess we will only see that in the number of applications, we also have to rank applications now before requesting the slots (we already did that).
<shrit[m]> hopefully it will not get converted to some kind of consulting where we will see people with 25 years of experience
<shrit[m]> Or we will even see more retired people asking to join
texasmusicinstru has quit [Remote host closed the connection]
braceletboy has quit [Ping timeout: 256 seconds]
<zoq[m]> Personally I don't see a problem with that, at the end it's about introducing people to open-source.
braceletboy has joined #mlpack
texasmusicinstru has joined #mlpack
CaCode has joined #mlpack
CaCode has quit [Remote host closed the connection]
CaCode has joined #mlpack
CaCode_ has joined #mlpack
CaCode has quit [Ping timeout: 264 seconds]
<jonpsy[m]> I don't think there will be many changes compared to the past, at least not immediately. Most professionals are too busy with their jobs to have time for GSoC. Also this pause feature might backfire, causing a lot of unfinished projects, but I guess any kind of progress is good
<zoq[m]> I guess with the new structure the program is interesting for a person with a full-time job as well, because you can now say I can't work 8 hours a day, can we stretch it and I work 2-4 hours per day.
<yugansharora01yu> an error occurred while building mlpack
<yugansharora01yu> string_encoding_test.obj : error LNK2019: unresolved external symbol "__declspec(dllimport) public: static class boost::unit_test::unit_test_log_t & __cdecl boost::unit_test::unit_test_log_t::instance(void)" (__imp_?instance@unit_test_log_t@unit_test@boost@@SAAEAV123@XZ) referenced in function "void __cdecl boost... (full message at https://libera.ems.host/_matrix/media/r0/download/libera.chat/d0bec6738cff490595a6824f580a593abcdcaecd)
braceletboy has quit [Ping timeout: 264 seconds]
<zoq[m]> Now I'm confused, didn't you build the master branch?
<shrit[m]> boost tests, this is a very old branch
<jonpsy[m]> Hm, I guess we should have at least one person who has Windows as their dev environment
<shrit[m]> jonpsy: You have windows right?
braceletboy has joined #mlpack
<zoq[m]> You're volunteering? :)
<jonpsy[m]> Heavens, no
<jonpsy[m]> I was however successful at one point, building it in Windows using Visual Studio
<jonpsy[m]> But it's such a drag.. Having to install nuget and all. I did it just because Visual studio has nice fonts :)
<zoq[m]> I mean, we know it works on Windows, because of the CI (which fails right now due to build time).
<jonpsy[m]> Appveyor?
<zoq[m]> The nice thing is that you don't need nuget anymore.
<shrit[m]> oh no, you chose Windows because it has good fonts
<jonpsy[m]> shrit[m]: Lol yes
<shrit[m]> no Appveyor is worse
<zoq[m]> You can just use the auto-downloader.
<jonpsy[m]> Afaik, we had Appveyor as our CI right....?
<jonpsy[m]> In ensmallen at least
<shrit[m]> Yeah, and we moved away from there for a proper reason
<zoq[m]> Yes, some time ago we switched to azure pipelines, because they have 6h build time and more memory.
<shrit[m]> jonpsy: use Hack, it is a good font, much nicer than what you can get in Windows
<jonpsy[m]> shrit[m]: I can sense you have a strong distaste towards Windows :))
<zoq[m]> shrit[m]: Still using Fira Code, maybe I should switch.
<jonpsy[m]> Same, Fira Code.
<zoq[m]> Looks like Hack builds on Fira Code?
<yugansharora01yu> @marcusedel:matrix.org all other projects build fine, just this one is giving an error
texasmusicinstru has quit [Remote host closed the connection]
<zoq[m]> yugansharora01yu: I see, but I can see it builds `mlpack_catch_test.exe` which is the replacement for `boost::unit_test`, so something happened in between because `boost::unit_test` was removed in favor of `catch2`.
CaCode_ has quit [Quit: Leaving]
texasmusicinstru has joined #mlpack
<zoq[m]> So either you built mlpack from a specific commit, because at some point between versions we supported both catch and boost unit test, or you used some old CMake file to build a newer mlpack version.
<yugansharora01yu> Any idea what I should do?
<zoq[m]> Like you use the CMake config from some stable version to build the master branch.
<zoq[m]> I guess I'm missing some important information, so my recommendation would be to remove the mlpack code you are currently using and download https://github.com/mlpack/mlpack/archive/refs/heads/master.zip or clone the latest master branch: https://github.com/mlpack/mlpack.
<zoq[m]> Then skip the nuget/dependency part and stick with the auto-downloader.
<yugansharora01yu> So i have to do the compilation again?
<zoq[m]> The cmake process will output some information, maybe you can post that here.
<zoq[m]> I don't see an easy solution to recover your current build.
<yugansharora01yu> I haven't copied the output from command prompt
<yugansharora01yu> From cmake
<zoq[m]> Because as I mentioned the CMake file doesn't match the version you are currently using, and I'm not sure how that happened.
<zoq[m]> You can also upload the CMakeCache file.
<yugansharora01yu> where can i find this
<zoq[m]> It's in the build folder.
<yugansharora01yu> where can i send you that file
<shrit[m]> Here in the chat
<yugansharora01yu> how ?? there is no file attach option....
<jonpsy[m]> Just drag and drop.
<yugansharora01yu> i downloaded mlpack from https://www.mlpack.org/
<yugansharora01yu> cmake version that i have is 3.21.3 (just in case that is required)
<yugansharora01yu> did every step as it is here https://www.mlpack.org/doc/mlpack-3.4.2/doxygen/build_windows.html
<yugansharora01yu> for armadillo i ran this command cmake -G "Visual Studio 16 2019" -A x64 -DBLAS_LIBRARY:FILEPATH="D:/dev/mlpack/packages/OpenBLAS.0.2.14.1/lib/native/lib/x64/libopenblas.dll.a" -DLAPACK_LIBRARY:FILEPATH="C:/dev/mlpack/packages/OpenBLAS.0.2.14.1/lib/native/lib/x64/libopenblas.dll.a" ..
<yugansharora01yu> for mlpack cmake -G "Visual Studio 16 2019" -A x64 -DBLAS_LIBRARIES:FILEPATH="C:/dev/mlpack/packages/OpenBLAS.0.2.14.1/lib/native/lib/x64/libopenblas.dll.a" -DLAPACK_LIBRARIES:FILEPATH="C:/dev/mlpack/packages/OpenBLAS.0.2.14.1/lib/native/lib/x64/libopenblas.dll.a" -DARMADILLO_INCLUDE_DIR="C:/dev/mlpack/armadillo/include" -DARMADILLO_LIBRARY:FILEPATH="C:/dev/mlpack/armadillo/build/Debug/armadillo.lib"
<yugansharora01yu> -DBOOST_INCLUDEDIR:PATH="C:/dev/boost/" -DBOOST_LIBRARYDIR:PATH="C:/dev/boost/lib64-msvc-14.2" -DBUILD_JULIA_BINDINGS=OFF -DBUILD_GO_BINDINGS=OFF -DBUILD_PYTHON_BINDINGS=OFF -DBUILD_R_BINDINGS=OFF -DDEBUG=OFF -DPROFILE=OFF ..
<yugansharora01yu> pls help
<jonpsy[m]> It'd be convenient for you to use the CMake GUI app rather than typing all these commands
<jonpsy[m]> Also, do you have the Windows Subsystem for Linux? It's a Windows feature you can enable, and it'll greatly simplify things for you after you install it
<yugansharora01yu> Actually i dont mind using command prompt
<yugansharora01yu> But the web page seemed to prefer the command line
<zoq[m]> From the Cache file I can see that the boost_unit lib was found.
<zoq[m]> `C:/dev/boost/lib64-msvc-14.2/boost_unit_test_framework-vc142-mt-x64-1_77.lib`
<heisenbuugGopiMT> I am working on the NCF one, I think we can merge that by the weekend as well...
<zoq[m]> But looks like it's not linking against the test target.
<shrit[m]> heisenbuug (Gopi M Tatiraju): thanks, I am merging the math PR once it passes the tests
<heisenbuugGopiMT> Yuppp...
<heisenbuugGopiMT> Do we have a deadline for mlpack 4?
<shrit[m]> no, but we prefer to release mlpack 4 at the start of the next year
<heisenbuugGopiMT> we will still have 9 open issues...
<zoq[m]> So we could either try to figure out why it's not linking correctly, which will trigger a new build, or switch to another version.
<zoq[m]> Let me know what you prefer.
<heisenbuugGopiMT> I think we can easily complete these by then...
<heisenbuugGopiMT> And also @shrit:matrix.org thank you soo much for i3...
<heisenbuugGopiMT> I was very frustrated with my machine for the past week but now it's working well.
<heisenbuugGopiMT> I think I might shift to arch completely...
<heisenbuugGopiMT> Also using i3 makes you use CLI more and I am loving it, for me it increases my working speed...
<yugansharora01yu> Whatever will give me a working build
<yugansharora01yu> Is good for me
<shrit[m]> heisenbuug (Gopi M Tatiraju): The idea is to get rid of the mouse, since it is a slowing factor for your work
<heisenbuugGopiMT> Yes, I totally agree...
<shrit[m]> especially if you are on a laptop
<heisenbuugGopiMT> first vim now i3...
<heisenbuugGopiMT> I can see a significant increase in speed...
<heisenbuugGopiMT> It also reduces stress on RAM...
<shrit[m]> However, the only thing that is sometimes hard to get right with a tiling window manager is Zoom meeting screen sharing
<shrit[m]> Yeah, basically all GUI additions are for normal users, usually we do not need any of these
texasmusicinstru has quit [Remote host closed the connection]
<yugansharora01yu> @marcusedel:matrix.org so which one will give me a working build with the highest probability of success
<heisenbuugGopiMT> Which browser do you use?
<zoq[m]> yugansharora01yu: Okay, so can you download mlpack from this url - https://github.com/mlpack/mlpack/archive/refs/heads/master.zip
<heisenbuugGopiMT> Chrome uses a lot of RAM
<shrit[m]> Using firefox
<shrit[m]> but it also uses a lot of RAM
<zoq[m]> zoq[m]: And use `cmake -DDOWNLOAD_DEPENDENCIES=ON ..` instead of the previous command line.
<TarunKumar[m]> heisenbuugGopiMT: I use brave
<heisenbuugGopiMT> We need a CLI for browsing...
<shrit[m]> there are
<zoq[m]> zoq[m]: The make step is the same.
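Putting that suggestion together as one sequence, it would look roughly like the following; the Visual Studio 2019 generator and the C:\dev\mlpack path are carried over from the commands quoted earlier in the log, so treat this as a sketch of the suggested steps rather than the exact commands anyone ran.
```
:: Sketch only: assumes master.zip was extracted to C:\dev\mlpack.
cd C:\dev\mlpack
mkdir build
cd build
cmake -G "Visual Studio 16 2019" -A x64 -DDOWNLOAD_DEPENDENCIES=ON ..
cmake --build . --config Release
```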
<shrit[m]> Lynx
<heisenbuugGopiMT> Even i tried it
<heisenbuugGopiMT> Have you tried them?
texasmusicinstru has joined #mlpack
<shrit[m]> still I have never found a solution for browsing with limited resources
<heisenbuugGopiMT> Somehow firefox and brave were giving me DRM issues when I was using sites like udemy
<shrit[m]> probably because the web is bloated
<heisenbuugGopiMT> Chrome is kinda dominating the space and somehow they are not thinking about optimization...
<yugansharora01yu> One request pls , can i do it tomorrow pls
<yugansharora01yu> Its night time now
<shrit[m]> Integrate more RAM in your laptop to resolve all issues; today we need at least 10 GB of RAM in any working device
<zoq[m]> yugansharora01yu: Sure, you are also welcome to join our mlpack meeting 17 UTC tomorrow.
<heisenbuugGopiMT> They are pushing us to do that without letting us utilize the full power of existing hardware...
<heisenbuugGopiMT> It's the same with high-end Android phones...
<yugansharora01yu> How can I join? And also I don't know anything, so should I join it??
<heisenbuugGopiMT> Samsung makes it soo hard for users to root their phones
<shrit[m]> Just use Open source phone
<heisenbuugGopiMT> What are the options other than Android for phones?
<heisenbuugGopiMT> I have no idea about this space...
<zoq[m]> yugansharora01yu: Don't feel obligated, just wanted to point out that we have a meeting tomorrow (see the details in the "Video Meetup" section on https://www.mlpack.org/community.html).
<say4n[m]> yugansharora01yu: you’ll find the link to join at: https://mlpack.org/community.html :)
<zoq[m]> zoq[m]: We don't have an agenda, sometimes we try to solve build issues.
<heisenbuugGopiMT> I am willing to experiment with phone as well...
<heisenbuugGopiMT> I have Samsung S10...
<heisenbuugGopiMT> And the phone started lagging soo much after recent updates...
<heisenbuugGopiMT> I am regretting buying such an expensive phone now
<heisenbuugGopiMT> All I do is listen to songs and use it for navigation or Uber...
<shrit[m]> I only buy the cheapest phone I can get
<shrit[m]> it has two destinies: either dying with updates or getting broken
<heisenbuugGopiMT> I have an old phone, I just need to replace the battery. I am planning to shift.
<shrit[m]> or a dead non-replaceable battery, if it has a chance to survive that long
<heisenbuugGopiMT> agreed...
<heisenbuugGopiMT> Also about windows, the only way I see myself using windows is for Gaming
<jonpsy[m]> <shrit[m]> "Just use Open source phone" <- They make those? Where can i get one...?
<yugansharora01yu> Thanks
<say4n[m]> heisenbuugGopiMT: Steam’s (well Valve’s) proton is great, try it out on Linux. :)
<say4n[m]> jonpsy[m]: librem5 by purism, fairphone etc…
<heisenbuugGopiMT> I have tried Valve, but I think NVIDIA needs to step up and do more things for open-source...
<heisenbuugGopiMT> There are a lot of issues when using GTX 10 series GPUs with linux
<jonpsy[m]> say4n[m]: It costs more than iPhone....
<shrit[m]> this one is cheap 😆 https://www.crowdsupply.com/arsenijs/zerophone
<say4n[m]> heisenbuugGopiMT: I totally agree; here's an interesting video from LTT on daily-driving Linux from a Windows user's perspective: https://www.youtube.com/watch?v=0506yDSgU7M
<shrit[m]> just kidding
<say4n[m]> haha
<say4n[m]> shrit[m]: there was this other: https://www.thenophone.com
<jonpsy[m]> shrit[m]: Looks like Nokia 3310 on steroids 😂
<shrit[m]> Yeah, definitely
<shrit[m]> say4n[m]: This is perfect for kids and teenagers, as a gift
<jonpsy[m]> <heisenbuugGopiMT> "I have tried valve, but I..." <- This reminds me of Linus Torvalds famous "bless you" finger to NVidia
<say4n[m]> shrit[m]: imagine finding it under the Christmas tree as a teenager
<heisenbuugGopiMT> LTT is one of the best on yt
<jonpsy[m]> <say4n[m]> "there was this other: https://..." <- Wait... They even pitched this on Shark Tank 😆
<heisenbuugGopiMT> hahahaaa
<heisenbuugGopiMT> Yes, saw that video
<heisenbuugGopiMT> I don't understand why NVIDIA does this. Now that GPUs are central to ML, they need to step up
<say4n[m]> aside: does anyone know of pytorch's progress with supporting AMD GPUs?
<heisenbuugGopiMT> ```
<heisenbuugGopiMT> f/mlpack/src/mlpack/methods/cf/ncf_main.cpp:35:18: error: expected constructor, destructor, or type conversion before ‘(’ token
<heisenbuugGopiMT>    35 | BINDING_USER_NAME("Neural Collaborative Filtering");
<heisenbuugGopiMT> ```
<heisenbuugGopiMT> any idea about this error?
<jonpsy[m]> say4n[m]: Sup about that?
<jonpsy[m]> I know that AMD's been destroying NVidia and Intel
<say4n[m]> jonpsy[m]: just came across news about Pytorch working on supporting the M1 lineup of SoCs, so was wondering if they finished their work on bringing torch to AMD GPUs. :p
<jonpsy[m]> Oh... sounds cool. Would be nice to see GPU support that's not NVIDIA. GPU has become synonymous with NVIDIA and not many people like that
<jonpsy[m]> Fancy.
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
<heisenbuugGopiMT> Hey @zoq I have some doubts regarding NCF.
<heisenbuugGopiMT> I think I need to rewrite the binding...
<heisenbuugGopiMT> The names of the functions we have now and how it's structured are pretty different from what we had earlier.
<heisenbuugGopiMT> Like we are currently using `void BINDING_FUNCTION` but earlier we were maybe using `mlpackMain()`
<shrit[m]> heisenbuug (Gopi M Tatiraju): the NCF code is old, so do not hesitate to adapt whatever is needed to bring back the feature, and we can merge it soon.
<heisenbuugGopiMT> Yea, never written bindings before, so I guess this will be interesting.
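For context, a current-master binding main file roughly follows the shape sketched below. The macro names mirror other mlpack binding files; the NCF-specific parameter and body here are placeholders, not the real ncf_main.cpp, and the exact BINDING_FUNCTION signature should be double-checked against the branch being used.
```cpp
// Hypothetical skeleton of a post-mlpackMain() binding; NCF details are placeholders.
#include <mlpack/core.hpp>

#undef BINDING_NAME
#define BINDING_NAME ncf

#include <mlpack/core/util/mlpack_main.hpp>

using namespace mlpack;

BINDING_USER_NAME("Neural Collaborative Filtering");
BINDING_SHORT_DESC("Placeholder short description.");
BINDING_LONG_DESC("Placeholder long description.");

// Hypothetical parameter; the real binding defines its own PARAM_* set.
PARAM_MATRIX_IN("training", "Training ratings matrix.", "t");

// The old `void mlpackMain()` entry point is replaced by BINDING_FUNCTION().
void BINDING_FUNCTION(util::Params& params, util::Timers& /* timers */)
{
  arma::mat dataset = std::move(params.Get<arma::mat>("training"));
  // ... model construction and training would go here ...
}
```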
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
<shrit[m]> any thoughts on this error that I am facing:... (full message at https://libera.ems.host/_matrix/media/r0/download/libera.chat/965e5182d12eb81651a589ab890bcff328ed667e)
<zoq[m]> Are you using `arma::cube` instead of `arma::mat`?
<zoq[m]> Just glanced over the log, so not sure yet what this is.
<shrit[m]> I am using arma::Mat<unsigned int>
<shrit[m]> The error is long, this was part of it
<shrit[m]> I think there is a bug somewhere, either in LSH or in the norm
<shrit[m]> if I change the dataset type to `arma::Mat<double>` the error disappears
<shrit[m]> Even so, I do not want to convert my data type to double
<shrit[m]> this makes LSH usable with only the double type instead of any type
<shrit[m]> I faced the same error if the dataset type was char, unsigned int, or uword
<rcurtin[m]> is `arma::norm()` implemented for `arma::Mat<unsigned int>`?
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
<shrit[m]> I think it should take any type, right?
<rcurtin[m]> I'm not sure, maybe? it's worth a test program to see, then we can know if the issue is in Armadillo or in mlpack's usage of it
<rcurtin[m]> but I glanced at the code (at least in `master`), and I can't see a quick reason for the problem you are having
<shrit[m]> neither can I, I glanced over it too.
<zoq[m]> `arma::norm` should print an error if the type isn't supported - https://gitlab.com/conradsnicta/armadillo-code/-/blob/10.7.x/include/armadillo_bits/fn_norm.hpp#L99
<shrit[m]> it is on the mlpack side then
<zoq[m]> Maybe something changed between versions, so I guess still a good idea to do a quick test.
<shrit[m]> Hmm, I do not see that we have any test for LMetric; it is used inside other tests, but it is not tested individually
<zoq[m]> I think this is what rcurtin proposed: write a simple individual test, just a does-it-compile test.
<shrit[m]> it does not compile
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
<jonpsy[m]> Can you try arma::uword and see if it works. Just trying to isolate the possibilities...
<jonpsy[m]> Instead of unsigned int
<shrit[m]> already tried in a different code,
<jonpsy[m]> hm, same error?
<shrit[m]> gave the same results
<shrit[m]> you can try just copy the code directly
<shrit[m]> yeah
<jonpsy[m]> aight, now that I'm awake, let's debug this
<rcurtin[m]> shrit: isn't it tested but maybe under a different name in `metric_test.cpp`? also, my suggestion was just to call `arma::norm()` directly 👍️
<jonpsy[m]> same error with arma::norm directly
<shrit[m]> I think it is from armadillo side
<jonpsy[m]> ok, so it works with `arma::vec` and fails only with `uvec`
<shrit[m]> Yeah
<shrit[m]> only works with double
<jonpsy[m]> double/float too i guess
<jonpsy[m]> so basically it's getting template-substituted out
<shrit[m]> I would open an issue in gitlab for it
<shrit[m]> at the armadillo repository
<jonpsy[m]> ok here's something interesting
<jonpsy[m]> note: candidate: ‘template<class T1> typename arma::enable_if2<arma::is_arma_type<T1>::value, typename T1::pod_type>::result arma::norm(const T1&, arma::uword, const typename arma::arma_real_or_cx_only<typename T1::elem_type>::result*)’
<jonpsy[m]> notice: `arma_real_or_cx_only`
<jonpsy[m]> need to check in the repo what the available types are here. That'll solve the Scooby-Doo mystery
<shrit[m]> this does not correspond with the error there
<jonpsy[m]> got it!
<jonpsy[m]> check this
<jonpsy[m]> template<> struct arma_real_or_cx_only< float > { typedef float result; };
<jonpsy[m]> template<> struct arma_real_or_cx_only< std::complex<float> > { typedef std::complex<float> result; };
<jonpsy[m]> template<> struct arma_real_or_cx_only< double > { typedef double result; };
<jonpsy[m]> template<> struct arma_real_or_cx_only< std::complex<double> > { typedef std::complex<double> result; };
<shrit[m]> I mean there exists a norm for non-double values, right?
<jonpsy[m]> okay, well at least we've found the culprit
<shrit[m]> or I have forgotten mathematics completely
<jonpsy[m]> I think their logic is: whatever we do, the norm is a float value, so they're expecting the input as float/double as well
texasmusicinstru has quit [Remote host closed the connection]
<jonpsy[m]> <jonpsy[m]> "i think their logic is, whatever..." <- https://tenor.com/view/it-is-what-it-is-it-is-eet-eet-iz-iz-gif-17896182
texasmusicinstru has joined #mlpack
<rcurtin[m]> yeah, my guess is that Conrad restricted to using only float/double because he could call directly to LAPACK
<rcurtin[m]> (or really BLAS)
<rcurtin[m]> norm could be defined for integer types, but the return value would still be floating-point since there could be a sqrt (or other-based root) involved
CaCode has joined #mlpack
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack
<jonpsy[m]> <rcurtin[m]> "yeah, my guess is that Conrad..." <- this makes sense because `arma::is_blas_type` also has the same SFINAE
<shrit[m]> hmm, okay, in this case, when the norm is used, we need to convert the type to float
<rcurtin[m]> I do think it's worth opening an issue
<shrit[m]> why not cast the return type to whatever the input type is?
<rcurtin[m]> even if we do, in the end, need to convert to float
<shrit[m]> I will open an issue for this
<rcurtin[m]> jonpsy: sorry, maybe I misunderstood---I thought the code from lines 1-3 that you wrote simply did not compile?
<rcurtin[m]> 👍️ got it
<jonpsy[m]> yep, I'm talking about a hypothetical case
<jonpsy[m]> i.e. if uvec was allowed and the answers were cast down to the input type
<rcurtin[m]> when I suggest opening an issue, it's more about whether arma::norm should allow uvec at all---personally, I think it should, and if we put the idea in front of Conrad, we can see what he thinks
<jonpsy[m]> <rcurtin[m]> "norm could be defined for..." <- basically elaborating your point
<jonpsy[m]> I, for one, stand in disagreement. But I think Conrad would be a better judge
<rcurtin[m]> :)
<rcurtin[m]> yeah, in the worst case, we can use `conv_to` internally if needed 👍️
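As a reference for the does-it-compile check and the `conv_to` workaround discussed above, here is a minimal standalone sketch (illustrative only, not code that was pasted in the channel):
```cpp
// Minimal sketch: arma::norm() with integer vs. floating-point element types.
#include <armadillo>

int main()
{
  arma::vec  v = {3.0, 4.0};
  arma::uvec u = {3, 4};

  const double a = arma::norm(v, 2);  // fine: double satisfies arma_real_or_cx_only

  // const double b = arma::norm(u, 2);  // fails to compile: no arma_real_or_cx_only
  //                                     // specialization for arma::uword

  // Workaround mentioned above: convert to a floating-point vector first.
  const double b = arma::norm(arma::conv_to<arma::vec>::from(u), 2);

  return (a == 5.0 && b == 5.0) ? 0 : 1;
}
```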
texasmusicinstru has quit [Remote host closed the connection]
<shrit[m]> jonpsy: Yeah, the main issue with this is that it breaks the logic behind the usage of templates for this case and rules out a large number of cases; in addition, it adds an extra step for the user that is not really necessary. I opened an issue, let us see the reasoning behind it
texasmusicinstru has joined #mlpack
texasmusicinstru has quit [Remote host closed the connection]
texasmusicinstru has joined #mlpack