ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< heisenbuugGopiMT> Okay, I will work it out accordingly and will let you know if I have any more queries.
mindlifter has quit [Remote host closed the connection]
mindlifter has joined #mlpack
mindlifter has quit [Remote host closed the connection]
mindlifter has joined #mlpack
mindlifter has quit [Quit: Leaving.]
< jonpsy[m]> zoq, say4n: Hey, it looks like all of our callbacks in `callback.hpp` need to accept a tuple of objective functions? I'm not sure how `callbacks_test.cpp` handles this. Can you shed some light?
UmarJ has quit [Ping timeout: 260 seconds]
ImQ009 has joined #mlpack
< zoq> jonpsy[m]: You can already pass a tuple, because all callbacks use a template for the function type.
< zoq> jonpsy[m]: What you do in the function itself is totally up to you, so you can unpack the tuple if needed.
< jonpsy[m]> I suppose FunctionType acts as a `std::tuple<...>` of the multiple objective functions, correct?
< zoq> Yes, it's the same type you passed to Optimize(...)
< jonpsy[m]> Hmm, so I would be able to access each of the objectives?
< jonpsy[m]> Ahh, right, that should work.
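A minimal, self-contained sketch of what zoq describes here: the callback's methods take the function type as a template parameter, so when a tuple of objectives is passed to Optimize(...), the same tuple arrives at the callback and can be unpacked with std::get. The objective structs and the stand-in optimizer below are made up for illustration, and the StepTaken signature is only an approximation of the ensmallen callback interface, not a copy of it.

```cpp
#include <armadillo>
#include <iostream>
#include <tuple>

// Two toy objectives, purely for illustration.
struct ObjectiveA
{
  double Evaluate(const arma::mat& x) { return arma::accu(x % x); }
};

struct ObjectiveB
{
  double Evaluate(const arma::mat& x) { return arma::accu(arma::abs(x)); }
};

struct PrintObjectivesCallback
{
  // FunctionType is whatever was handed to Optimize(); for a multi-objective
  // optimizer that is the std::tuple of objective functions, so it can be
  // unpacked here with std::get.
  template<typename OptimizerType, typename FunctionType, typename MatType>
  void StepTaken(OptimizerType& /* optimizer */,
                 FunctionType& objectives,
                 MatType& coordinates)
  {
    std::cout << "f1 = " << std::get<0>(objectives).Evaluate(coordinates)
              << ", f2 = " << std::get<1>(objectives).Evaluate(coordinates)
              << std::endl;
  }
};

int main()
{
  std::tuple<ObjectiveA, ObjectiveB> objectives;
  arma::mat coordinates(3, 1, arma::fill::randu);
  int optimizerStandIn = 0;  // Stand-in for the optimizer object.

  PrintObjectivesCallback callback;
  callback.StepTaken(optimizerStandIn, objectives, coordinates);
}
```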
< zoq> If we wanted to make the existing callbacks work for multi-function optimization, we would have to specialize each method inside the callback class.
< jonpsy[m]> On that note...
< zoq> To make sure we select the correct method for the correct optimizer type.
< jonpsy[m]> so I created this [callback](https://pastebin.com/Wcu6DBk5)
< zoq> Looks good to me.
< jonpsy[m]> The problem is that paretoFront is only filled at the end of the optimization. So, no, you can't track the optimization progress unless we loop over each generation to fill the bestFront/bestSet.
< zoq> But you could use Evaluate in combination with calculatedObjectives, right?
< zoq> To calculate the front inside the callback in each iteration.
< jonpsy[m]> Yeah...
< jonpsy[m]> I think I see what you're getting at, but I'm not quite there.
< jonpsy[m]> Inside `StepTaken()`?
< zoq> Yes, you would have to pass population, since that isn't exposed otherwise.
< zoq> objectives is the function; you already pass that.
< zoq> Hm, the cached calculatedObjectives is also not exposed.
< zoq> I guess in this case it makes sense to add another function to the callback pool that is meant to be used for multi-objective optimization.
< zoq> Unless you think you can calculate the front inside the callback class, by exposing the necessary parameters in the NSGA-II class.
< zoq> I guess another solution would be to calculate the front in each iteration and just use it the way you currently do.
< zoq> Not sure how expensive that would be if we do that for each iteration.
< jonpsy[m]> > I guess another solution would be to calculate the front in each iteration and just use it the way you currently do.
< jonpsy[m]> I thought of this, but I was wondering if it would be costly.
< jonpsy[m]> > Not sure how expensive that would be if we do that for each iteration.
< jonpsy[m]> YES, EXACTLY!
< zoq> Yeah, that's why I thought let's do it in the callback, since then we only add that additional cost if the user is interested in that information.
< jonpsy[m]> > I guess in this case it makes sense to add another function to the callback pool that is meant to be used for multi-objective optimization.
< jonpsy[m]> I think this is the best option of all. What are your thoughts on merging the current notebook as-is, then working on the callback, and then getting back to restructuring the notebook?
< zoq> Yes, let's do that.
< jonpsy[m]> Alrighty!
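Following up on the idea of a dedicated multi-objective callback hook, here is a rough sketch of what such a function could look like. The hook name (GenerationalStepTaken), its parameters (population, calculatedObjectives), and the naive non-domination count are assumptions made for illustration, not existing ensmallen API; the point is that the per-generation front computation is only paid for when the user attaches the callback.

```cpp
#include <armadillo>
#include <cstddef>
#include <iostream>
#include <vector>

struct FrontTrackingCallback
{
  // Hypothetical hook: called once per generation with the current population
  // and the cached objective values of each population member.
  template<typename OptimizerType, typename FunctionType, typename MatType>
  void GenerationalStepTaken(OptimizerType& /* optimizer */,
                             FunctionType& /* objectives */,
                             MatType& /* coordinates */,
                             const std::vector<MatType>& population,
                             const std::vector<arma::vec>& calculatedObjectives)
  {
    // Naive O(n^2) non-domination count; this is the per-generation cost that
    // is only incurred if the user actually attaches this callback.
    size_t frontSize = 0;
    for (size_t i = 0; i < population.size(); ++i)
    {
      bool dominated = false;
      for (size_t j = 0; j < population.size() && !dominated; ++j)
      {
        dominated = (i != j) &&
            arma::all(calculatedObjectives[j] <= calculatedObjectives[i]) &&
            arma::any(calculatedObjectives[j] <  calculatedObjectives[i]);
      }
      if (!dominated)
        ++frontSize;
    }
    std::cout << "current front size: " << frontSize << std::endl;
  }
};
```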
< shrit[m]> Would someone remind me where the ensmallen-latest tarball is stored, and on which website?
< shrit[m]> just to add it to the CI
< rcurtin[m]> which CI? it would be better to use a package manager probably
< shrit[m]> ah
< shrit[m]> is there an ensmallen package on Ubuntu?
< rcurtin[m]> yes, `libensmallen-dev`
< shrit[m]> I see what you mean; I thought I needed to download it like Armadillo.
< rcurtin[m]> it should also be possible to install Armadillo via apt
< shrit[m]> Yeah, we are only keeping 8.400 as the minimum for compatibility
< rcurtin[m]> ah, sure, that is a good point; I forgot that we use 8.400 specifically to ensure that code works with the oldest version of Armadillo we support
< shrit[m]> But there is no minimum version requirement for ensmallen?
< shrit[m]> I mean, if I use `libensmallen-dev`, it should compile perfectly with mlpack
< rcurtin[m]> yes, I think that is correct
< shrit[m]> And on the Windows CI, can I use this command directly in PowerShell: `vcpkg install ensmallen:x64-windows`? Or do I have to use NuGet?
< rcurtin[m]> I don't know, maybe try it and find out?
< shrit[m]> I will give it a try
< Aakash-kaushikAa> Hey, so we have two models in our models repo, YOLO and DarkNet, and DarkNet has several versions; right now, to create the different versions you have to pass the version as a template parameter. I have been thinking that we could create a typedef for every type of model that we have. Also, is there a way for the user to import only one file and have all the models available, while only the code of the model that is
< Aakash-kaushikAa> actually used is dragged into compilation, and not the whole file?
< Aakash-kaushikAa> Something like the layer.hpp that mlpack has, but a bit simpler.
< zoq> Aakash-kaushikAa: The typedef idea sounds good to me, makes it easier to use.
< zoq> Aakash-kaushikAa: Not sure I get the second question.
< Aakash-kaushikAa> Hi @zoq, https://github.com/mlpack/mlpack/blob/master/src/mlpack/methods/ann/layer/layer.hpp, basically something like this but for models.
< zoq> shrit[m]: ensmallen is header-only, so if it's not available it's easy to just download it and point CMake to the correct path.
< Aakash-kaushikAa> So I could include something named `models.hpp` and then create any model I want, rather than having to include resnet to create a ResNet.
< shrit[m]> Yeah, I know; I need to avoid hard-coding the version number in the download and installation step on the CI
UmarJ has joined #mlpack
< shrit[m]> so we do not have to touch it in the future
< shrit[m]> zoq: would you guide me through enabling the embedded (aarch64) job for all mlpack pull requests?
< zoq> Aakash-kaushikAa: Yes, that would be possible; the issue I see is that in this case each .cpp file would now include everything, which might have a negative effect on the build time.
< Aakash-kaushikAa> Also, the structure is just odd: a folder for YOLO, a folder for DarkNet, and so on, a folder for every model. We could just have a single models folder in the repo where all the files reside.
< Aakash-kaushikAa> @zoq Yes, exactly; because of that I asked if there is a solution so that a model's code is not dragged into compilation if that model is not used.
< zoq> shrit[m]: http://ensmallen.org/files/ensmallen-latest.tar.gz downloads the latest ensmallen version.
< shrit[m]> Thanks I will bookmark the page
< Aakash-kaushikAa> By the way @zoq, doesn't this file: https://github.com/mlpack/mlpack/blob/master/src/mlpack/methods/ann/layer/layer.hpp include every layer and have a bad impact too?
< zoq> Aakash-kaushikAa: yes and yes; that's one thing we would like to solve with the vtable branch.
< Aakash-kaushikAa> Damn. The first yes was for all files in a single model folder, right?
< zoq> Aakash-kaushikAa: yes, layer.hpp includes every layer, and yes, it will impact the build time.
< zoq> Aakash-kaushikAa: About the restructuring, let's open an issue for that.
< zoq> Aakash-kaushikAa: https://github.com/mlpack/examples/issues/61 might be interesting to go over as well.
< zoq> Aakash-kaushikAa: Totally open to improve it, and I can see some benefits in what you proposed above.
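A sketch of the two ideas being discussed, with made-up names (the enum, class, and aliases below are illustrative, not the actual models-repo API): version-specific typedefs so users don't have to spell out template arguments, plus a single aggregate header, which trades convenience for the build-time cost zoq mentions.

```cpp
// darknet.hpp (hypothetical)
enum class DarkNetVersion { V19, V53 };

template<DarkNetVersion Version>
class DarkNet
{
  // ... layer definitions selected at compile time based on Version ...
};

// Convenience aliases: users write DarkNet19 instead of
// DarkNet<DarkNetVersion::V19>.
using DarkNet19 = DarkNet<DarkNetVersion::V19>;
using DarkNet53 = DarkNet<DarkNetVersion::V53>;

// models.hpp (hypothetical aggregate header): one include gives access to
// every model, but every .cpp that includes it now pulls in all of them,
// which is the build-time cost being discussed.
// #include "darknet.hpp"
// #include "yolo.hpp"
// #include "resnet.hpp"
```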
< zoq> shrit[m]: You can pretty much copy the config from http://ci.mlpack.org/job/pull-requests-mlpack-static-code-analysis/configure
< shrit[m]> Thanks 👍️
< zoq> shrit[m]: You have a two-step config, right? You cross-build on x86 first and then run the tests on ARM?
< zoq> shrit[m]: Or do you build everything on ARM?
< shrit[m]> Yeah, the second one is triggered automatically with the first one
< shrit[m]> at the end of the first one
< shrit[m]> do I need to configure the second one too?
< zoq> That is already working right?
< shrit[m]> I triggered one now to check the CI modifications for the auto-downloader
< shrit[m]> Yeah the one that is working now
< zoq> So we already have all the jobs configured, but what is left is the GitHub integration?
< shrit[m]> Exactly; we only need it to be triggered automatically on each commit for all mlpack pull requests
< shrit[m]> the same thing for the static analysis
< zoq> What is the name of the first job?
< shrit[m]> aarch64
< zoq> ahh, okay, might be a good idea to rename that job.
< zoq> Okay, so in the Git section you have to update "Branch Specifier (blank for 'any')".
< zoq> You can just use the same config as for the static one
< zoq> enable "Use github hooks for build triggering"
< zoq> That should do the trick.
< shrit[m]> Thanks, I will give it a try
< shrit[m]> I have updated the config; let's see when the first one will be triggered
< Aakash-kaushikAa> Hi @freenode_zoq:matrix.org, I opened the issue here: https://github.com/mlpack/models/issues/60
< Aakash-kaushikAa> I would be glad to receive feedback from others too.
[mass quit: bridged Matrix/Slack users time out and the bridge terminates on SIGTERM]
shrit[m] has joined #mlpack
< shrit[m]> Maybe, but it would not be causing a kernel panic at boot, unless Jenkins is somehow modifying the kernel
sdev_7211[m] has joined #mlpack
M7Ain7Soph77Ain7 has joined #mlpack
rcurtin[m] has joined #mlpack
< rcurtin[m]> that's true, the kernel panic must be unrelated
say4n has joined #mlpack
< say4n> Maybe a failing microSD card?
< shrit[m]> The card is new; I used it for the first time when I installed Jenkins on the Raspberry Pi, so it is 4 months old at most
< shrit[m]> When you install a fresh version of Ubuntu 20, everything goes fine for a while
< say4n> Very weird!
< rcurtin[m]> yep, I was about to send a reminder
< rcurtin[m]> thanks jjb
< rcurtin[m]> it'll be in this zoom room: https://zoom.us/j/3820896170
< rcurtin[m]> password is 'mlpack' :)
jjb[m] has joined #mlpack
< jjb[m]> I think the GSOC 2021 kick-off meeting starts in about 10 minutes or so?
[mass rejoin: bridged Matrix/Slack users reconnect after the bridge restart]
< rcurtin[m]> here are the slides I used, please let me know if there are any questions or anything: https://docs.google.com/presentation/d/11OBWuGs0AwNTnJ7bLopfGZA0_P6bdde_ENfHK6mFpps/edit?usp=sharing
< rcurtin[m]> thanks everyone for joining! it was great to meet all of you :)
< rcurtin[m]> and if anyone has any music recommendations, I am all ears 😃 personally I have been listening to a lot of Wilco lately
< jonpsy[m]> Nice meeting y'all :)!!
< jjb[m]> Great meeting everyone as well 🙂
< heisenbuugGopiMT> When we make a PR, GitHub runs checks on it, like memory checks, style checks, etc. How can I run those checks on my local machine, i.e., before making the PR?
< zoq> heisenbuugGopiMT: To run the style check you can use - https://github.com/mlpack/jenkins-conf/tree/master/linter
< zoq> heisenbuugGopiMT: To run the memory checks you can use - https://github.com/mlpack/jenkins-conf/tree/master/memory
< zoq> heisenbuugGopiMT: To run the static analysis - https://github.com/mlpack/jenkins-conf/tree/master/static-code-analysis
< heisenbuugGopiMT> okay, thank you.
< zoq> You can use https://hub.docker.com/r/mlpack/static-code-analysis; it has all the packages.
< heisenbuugGopiMT> I will try this...
< heisenbuugGopiMT> I will be sure to listen to them...
< heisenbuugGopiMT> I generally listen to Bach when I work, or some lo-fi music lately, maybe...
< rcurtin[m]> nice, that is good working music :)
< zoq> Since you mentioned lofi, I really like beowulf - https://soundcloud.com/rareartpoetry/popular-tracks
< jonpsy[m]> <heisenbuugGopiMT "I generally listen to Bach when "> If you're interested in classics I totally recommend Chopin's Nocturne s there so "classy". There's also a whole new genre of modern classic, for ex: La valse da Amelie or Comptine dun autre ete by Yann Tiersen they perfectly capture the spirit of France(my french is not that good).
< zoq> Some really great stuff
< heisenbuugGopiMT> I will check them...
< heisenbuugGopiMT> Getting to know new music while you discuss code, that's awesome, a great start already...
ImQ009 has quit [Quit: Leaving]
< heisenbuugGopiMT> I made some changes to the code, but it looks like I am getting an error when I run the `make` command; am I calling the functions wrong?
< shrit[m]> For a music recommendation, I like Ludovico Einaudi's music
< shrit[m]> I am also a fan of most neoclassical and minimalist music in general
< shrit[m]> heisenbuug (Gopi M Tatiraju): what is the error you are getting when running make?
< jonpsy[m]> <shrit[m] "For music recommendation I like "> OMG Same! I'm actually learning Nuvole Bianche maybe I should post someday 😄. You should definitely check out Yann Tiersen though
< heisenbuugGopiMT> A "function not found" error, but I solved it by adding the template parameter before the function call. Is it compulsory to add that? I don't see it in arma's implementation...
< shrit[m]> Sorry, you put the error at the end of the gist; I did not notice it
< shrit[m]> let me check
< heisenbuugGopiMT> Yes...
< shrit[m]> That is normal; the template parameter is the element type of the Armadillo matrix
< shrit[m]> so you need to tell the compiler explicitly what the type is
< shrit[m]> unless you have defined a default type in the template parameter itself.
< shrit[m]> You should not use just `eT`; you can replace it with `double` and it will be fine
< shrit[m]> have a look at line 63
< heisenbuugGopiMT> So should I declare double as the default type?
< heisenbuugGopiMT> And then I won't need to add `<...>` at the time of the function call?
< shrit[m]> I do not think you need to change the template type in the code. However, when you use these functions, you need to define the type of the Armadillo matrix that is going to be used,
< shrit[m]> and then the compiler will deduce the type in this case
< heisenbuugGopiMT> okay, got it...
< heisenbuugGopiMT> I need to do more reading on templates, it never feels enough...
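A minimal sketch of the three options being discussed above; the function name LoadCSV and its body are made up for illustration, and the real mlpack parser code will differ.

```cpp
#include <armadillo>
#include <string>

// (1) Without a default, the caller must spell out the element type
//     explicitly, e.g. LoadCSV<double>(...), which is the "<...>" that had
//     to be added before the call.
// (2) A default template argument (typename eT = double) lets the caller
//     simply write LoadCSV(...).
template<typename eT = double>
bool LoadCSV(const std::string& filename, arma::Mat<eT>& matrix)
{
  return matrix.load(filename, arma::csv_ascii);
}

// (3) If the matrix is a parameter (as above), the compiler can also deduce
//     eT from the argument itself, so no explicit type is needed either way.
int main()
{
  arma::Mat<double> m;
  LoadCSV("data.csv", m);             // eT deduced as double from `m`.
  // LoadCSV<float>("data.csv", f);   // or specified explicitly for an arma::Mat<float> f.
}
```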
< shrit[m]> You can also change the code style to follow the mlpack style 👍️
< shrit[m]> it will be easier to read
< heisenbuugGopiMT> Okay, I will make changes accordingly...
< shrit[m]> jonpsy: Yeah, I know Yann Tiersen music
< shrit[m]> heisenbuug (Gopi M Tatiraju): Also, you can change the name from `new_parser` to `csv_parser`, since that's what it does 😀
< heisenbuugGopiMT> Yea, I will make that change as well.
< shrit[m]> There are really few composers in France with the same musical style as Tiersen. However, there are many more in Germany.
< heisenbuugGopiMT> Haven't watched the anime yet, but I love the music...
< jonpsy[m]> You're missing out a ton if you haven't seen it. It's one of the best (and it has a lot more beautiful music)
< heisenbuugGopiMT> Yea, I haven't watched the latest season actually, I was planning on reading the manga...
< jonpsy[m]> The 4th season has soo many plot twists. Please don't read the manga though, it'll spoil your anime experience ;)