<Guest8362>
hi, I have been working essentially all day trying to get KFold cross-validation of the perceptron model to compile, and I can't. Is there someone here who can help me?
rcurtin_matrixor has joined #mlpack
<ShikharJ[m]>
Nice, thanks. It sounded pretty impressive that they were doing this 5 years back. There are a lot of startups now trying to enter the same space and tackle front-end bug fixing and QA problems.
<ShikharJ[m]>
It's amazing they had the foresight to see this as a PL-for-ML kind of problem too. These people were clearly pushing on a frontier.
Guest8362 has quit [Quit: Client closed]
mohamedAdhamcmoh has quit [Quit: You have been kicked for being idle]
rcurtin_matrixor has quit [Quit: You have been kicked for being idle]
Guest8328 has joined #mlpack
<Guest8328>
is this a place I can get help implementing something simple in mlpack?
rcurtin_matrixor has joined #mlpack
<Guest8328>
I spent all day yesterday trying to get perceptron to work and couldn't.
_slack_mlpack_31 has joined #mlpack
_slack_mlpack_31 is now known as BenYarn[m]
<BenYarn[m]>
I moved from the Libera channel to here. Hoping someone can help me.
<zoq[m]>
I have seen the Perceptron issue you opened, it's on my list for today.
<rcurtin[m]>
I doubt I have time to look into it today, but I did wonder if the issue here is that the hyperparameter tuning framework is expecting a `numClasses` parameter to `Train()`, but `Perceptron` does not have it because perceptrons are binary classifiers. But, my memory of the hyperparameter tuning system is not great, so I'm not sure if that's actually what is going on here. Ben Yarn, sorry that you are having trouble with it!
<rcurtin[m]>
by the way, the Libera channel is bridged to here, but the bridge is... flaky? I see your second message in this channel, but not the first. there hasn't been time to figure out what is going on
<BenYarn[m]>
Thanks. I tried passing 1, 2, and 3 size_ts to Evaluate, but it gave the same error. I'm more and more starting to think it is a bug, but since this is my first not-copied-and-pasted mlpack model, I feel it is much more likely I am misunderstanding something. After doing a deep dive, the error is emitted by this line:
<BenYarn[m]>
`"The given MLAlgorithm is not constructible from the passed arguments");`
<BenYarn[m]>
(full message at https://libera.ems.host/_matrix/media/r0/download/libera.chat/fe5436b6b0820307f1541c9bb4fe46a9dc55547f)
<rcurtin[m]>
yes, definitely, more information is helpful 😄
<rcurtin[m]>
now just as an off-the-cuff suggestion, maybe you could try with `DecisionTree` instead of `Perceptron`? `DecisionTree` has a number of classes parameter, and perhaps it will work?
<BenYarn[m]>
DecisionTree compiled. I used these lines:
<BenYarn[m]>
The only thing I changed was the first template parameter of KFoldCV from Perceptron to DecisionTree.
<rcurtin[m]>
👍️ I see, then it seems probable that the issue here is a lack of support for binary classifiers: the hyperparameter tuner must expect a `numClasses` parameter, and doesn't know what to do when it's not there
<rcurtin[m]>
if you really want to get your hands dirty, I suspect that if you added a `const size_t numClasses` parameter to `Perceptron::Train()` in the same position as `DecisionTree::Train()`'s `numClasses` parameter (even if it's just plain ignored in the body of the function), then compilation would succeed
<rcurtin[m]>
(I would get my hands dirty but today is a bad meeting day, so the only thing I have bandwidth to do is dump ideas here 😢)
<zoq[m]>
rcurtin[m]: Not sure yet if I'll be able to join the meeting today.
<rcurtin[m]>
yeah unfortunately I am not going to be able to today :(
<rcurtin[m]>
oh, wait! `Perceptron` *does* have a `numClasses` parameter. I didn't realize this
<rcurtin[m]>
I think you may be right; I see that the `Perceptron::Train()` function does take an `instanceWeights` parameter, and it looks like it is in the same position as for `DecisionTree`
<rcurtin[m]>
so at least immediately to me I am not sure what the issue is, but I do at least think you are on the right track
<rcurtin[m]>
(again I hope to be able to dig deeper, I just can't manage that today)
<BenYarn[m]>
thank you for trying!
rcurtin_matrixor has quit [Quit: You have been kicked for being idle]
<BenYarn[m]>
Ok, I have a bit more info. I was focused on the overloads of TrainModel above. I don't think that is the concern here. There are two overloads of TrainAndEvaluate starting on line 235 of k_fold_cv_impl.hpp. The difference between the signatures of these overloads occurs on lines 240 and 267.... (full message at https://libera.ems.host/_matrix/media/r0/download/libera.chat/05c3c7e2bdde17110c3854e8039ff9498aa80151)
<BenYarn[m]>
The answer to the question "where did weights come from?" is "It is a member of KFoldCV, it just has size 0."
<BenYarn[m]>
In conclusion, I'm calling this a bug.
<rcurtin[m]>
thanks for all the digging! this should make it easier to come up with a solution