rcurtin_irc changed the topic of #mlpack to: mlpack: a scalable machine learning library (https://www.mlpack.org/) -- channel logs: https://libera.irclog.whitequark.org/mlpack -- NOTE: messages sent here might not be seen by bridged users on matrix, gitter, or slack
<AnaB[m]> <OmSurase[m]> "zoq I was looking through the "..." <- Same here, if possible, please :)
<AnaB[m]> > <@anablaz:gitter.im> Hello everyone! My name is Ana Blaž and I'm an undergraduate computer science student from the University of Ljubljana, Slovenia. I'm interested especially in two of the projects in the Ready to use Models in mlpack section - Adding ready to use state of the art deep learning models and Showcasing... (full message at
krushia has quit [Remote host closed the connection]
krushia has joined #mlpack
batyousefx[m] has joined #mlpack
Guest209 has joined #mlpack
Guest209 has quit [Client Quit]
robobub has quit [Quit: Connection closed for inactivity]
LIERO has joined #mlpack
LIERO has quit [Changing host]
akhunti1[m] has joined #mlpack
<vaibhavp[m]> Hey zoq! I was digging a little deeper into the DAG Network and I had a few ideas and improvements for the API that I have been working on (though it's not completely working at the moment 🥲). But I thought I should follow up with you on what I have been working on; I think it's looking great and wanted your opinion on it. I have created a prototype of a residual block (from the ResNet architecture):
<vaibhavp[m]> [ResidualBlock](https://github.com/mrdaybird/mlpack-dag-network/blob/devel/ResidualBlock.cpp). It's looking a bit rough, but I am excited about the capabilities of the API. Please share your thoughts! Thank you!
<akhunti1[m]> Hi all,
<akhunti1[m]> I am using this mlpack CLI command: [ ./mlpack4/mlpack_random_forest --input_model_file /prod_model/mlpack4_cli_rf_model.bin --test_file /prod_model/np_test.csv --test_labels_file /prod_model/test_labels.csv --probabilities_file /prod_model/test_probabilites.csv --verbose ] to execute the mlpack model in Databricks, and it is working as expected.
<akhunti1[m]> but now I want to load the same model in C++ and do inferencing:
<akhunti1[m]> for that I have written code like this: mat dataset;... (full message at <https://libera.ems.host/_matrix/media/v3/download/libera.chat/b4b5f42632fb600c1877dae34a7729151b2083db>)
<akhunti1[m]> could you help me with how to write the C++ code to load the same model for inferencing?
<rcurtin[m]> akhunti1: sorry for this slightly confusing bit. the model serialized by the `mlpack_random_forest` program does not have type `RandomForest`, it instead has type `RandomForestModel`. however, that `RandomForestModel` class is only in `random_forest_main.cpp`. so you could copy the definition into your code: https://github.com/mlpack/mlpack/blob/master/src/mlpack/methods/random_forest/random_forest_main.cpp#L138-L158
<rcurtin[m]> and then, you could load like this:... (full message at <https://libera.ems.host/_matrix/media/v3/download/libera.chat/42c7bd86cadbae83070e8ed41b88aa6a7d43c39c>)
<rcurtin[m]> this is an ugly bit of the code, sorry for that inconvenience. I hope to be able to clean it up soon
<IWNMWEIWNMWE[m]> Hey rcurtin, is there any checklist or methodology mlpack follows for benchmarking models?
<IWNMWEIWNMWE[m]> for example, boosting algorithms like AdaBoost and XGBoost
<akhunti1[m]> #include "seldon/SeldonModel.hpp"... (full message at <https://libera.ems.host/_matrix/media/v3/download/libera.chat/45da261bf481b873e147dcab8bcc46e39400ab03>)
<akhunti1[m]> Hi rcurtin
<akhunti1[m]> I am using it like this:
<akhunti1[m]> #include "seldon/SeldonModel.hpp"... (full message at <https://libera.ems.host/_matrix/media/v3/download/libera.chat/beed8c91e29e9e2fd96dc9de8707a51500fa7c93>)
<akhunti1[m]> but getting the error: invalid declaration of member template in local class
<rcurtin[m]> your loading code doesn't appear to be in a function; it is just in the class itself. However, once you make it correct C++, that is the correct way to load the model you trained from the command line
<akhunti1[m]> ohh ok, thanks
<akhunti1[m]> Hi rcurtin, thanks, it is working fine.
<akhunti1[m]> Hi rcurtin, as per the CLI command [ ./mlpack4/mlpack_random_forest --input_model_file /prod_model/mlpack4_cli_rf_model.bin --test_file /prod_model/np_test.csv --test_labels_file /prod_model/test_labels.csv --probabilities_file /prod_model/test_probabilites.csv --verbose ]
<akhunti1[m]> I need to load these 2 files [ np_test.csv, test_labels.csv ] for inferencing, and it will automatically generate the output [ test_probabilites.csv ]
<akhunti1[m]> for that I have written code like this:... (full message at <https://libera.ems.host/_matrix/media/v3/download/libera.chat/fb978259f2e5c242b560e72d1ac5c4856b51a0ed>)
<akhunti1[m]> but could you please help me to add the predict method here?
krushia has quit [Quit: Konversation terminated!]
<akhunti1[m]> but getting the error: [ "RandomForest::Classify(): no random forest trained!" ]
<akhunti1[m]> any idea please why I am getting this error?
krushia has joined #mlpack
<akhunti1[m]> Hi rcurtin, in that case I just wanted to know how it is working with this CLI command [ ./mlpack4/mlpack_random_forest --input_model_file /prod_model/mlpack4_cli_rf_model.bin --test_file /prod_model/np_test.csv --test_labels_file /prod_model/test_labels.csv --probabilities_file /prod_model/test_probabilites.csv --verbose ] in Databricks. Could you please help me understand?
<vaibhavp[m]> <akhunti1[m]> "Any idea pls why i am getting..." <- The random forest model has to be trained for it to classify. So I think the input model (i.e. your .bin file) is not already trained. Either train this model after loading, or get a pre-trained model.
<zoq[m]> vaibhavp[m]: correct
<akhunti1[m]> Thanks vaibhavp and zoq for your help.
<akhunti1[m]> vaibhavp: I have written this code:
<akhunti1[m]> Row<size_t> testPredictions;
<akhunti1[m]> RandomForest<>& rf = m.rf;
<akhunti1[m]> rf.Classify(testDataset, testPredictions);
<akhunti1[m]> just wanted to know: is it [ rf.Classify(testDataset, testPredictions); ] or [ m.rf.Classify(testDataset, testPredictions); ]
<akhunti1[m]> which one is correct?
<rcurtin[m]> either of those should be the same since rf is just a reference to m.rf
<akhunti1[m]> Hi all, just wanted to know: I executed this CLI command [ ./mlpack4/mlpack_random_forest --labels_file "/prod_model/train_labels.csv" --num_trees 600 --maximum_depth 10 --training_file "/prod_model/np_train.csv" --verbose -M "/prod_model/mlpack4_cli_rf_model.bin" --print_training_accuracy --seed 250783 ] to train the model
<akhunti1[m]> and it executed fine. Then I loaded the same model [ mlpack4_cli_rf_model.bin ] in C++ for inferencing, so how is it throwing the error [ RandomForest::Classify(): no random forest trained! ]?
<rcurtin[m]> can you use the same model from the command-line successfully? e.g. in a second command to mlpack_random_forest?
<akhunti1[m]> [INFO ] Loading '/prod_model/np_train.csv' as CSV data. Size is 304 x 7279796.... (full message at <https://libera.ems.host/_matrix/media/v3/download/libera.chat/4edd623dd11e4d65e77aced641b836c05144c0b3>)
<akhunti1[m]> This is the log file generated when the CLI command was executed in Databricks for the same model.
<rcurtin[m]> 👍️ that looks good, but what if you run a second time to compute predictions on a test set? (or even the training set again?) I just want to see if the command-line program also reports that no model is trained
<akhunti1[m]> no, the command-line program is not throwing the error [ no random forest trained! ]
<akhunti1[m]> only when I downloaded the model and loaded it for inferencing did it throw the error.
<akhunti1[m]> only when I executed it like this:... (full message at <https://libera.ems.host/_matrix/media/v3/download/libera.chat/d7172aab4ecc1199dbe48a31fb06db4d84f5e9fd>)
<akhunti1[m]> this time it is throwing the error [ no random forest trained! ]
<vaibhavp[m]> > <@akhunti1:matrix.org> [INFO ] Loading '/prod_model/np_train.csv' as CSV data. Size is 304 x 7279796.... (full message at <https://libera.ems.host/_matrix/media/v3/download/libera.chat/b076835f7f9df9deb23a116677c5dd2b58d8b9c0>)
<akhunti1[m]> no no, sorry for the confusion, it is the same
<akhunti1[m]> I checked, and I am still getting this error
<rcurtin[m]> did loading succeed? `data::Load()` returns a boolean, so you can check to see if that boolean is true, or, you can throw an exception if loading fails by using `data::Load("mlpack4_cli_rf_model.bin", "model", m, true)`
<akhunti1[m]> sure, I am checking
<akhunti1[m]> sorry team, for the confusion; it was my mistake, a naming issue.
<akhunti1[m]> <vaibhavp[m]> "> <@akhunti1:matrix.org> [INFO ]..." <- Thanks, you're correct.
<vaibhavp[m]> akhunti1: No worries! Been there. Happens to all of us. 🙂
<rcurtin[m]> glad you got it worked out 👍️👍️