LinearSVC doesn’t have predict_proba
Here is a quick solution
Hi everyone, this post is for anyone looking for the probability of the predicted classes from a LinearSVC model.
What am I talking about?
If you want the probability distribution over the classes in a multiclass classification problem, the easiest way is to call classifier.predict_proba, which returns the probability distribution, as shown below:
model.predict_proba()
The sample below is taken from the sklearn documentation:
>>> from sklearn.datasets import load_iris
>>> from sklearn.linear_model import LogisticRegression
>>> X, y = load_iris(return_X_y=True)
>>> clf = LogisticRegression(random_state=0).fit(X, y)
>>> clf.predict(X[:2, :])
array([0, 0])
>>> clf.predict_proba(X[:2, :])
array([[9.8...e-01, 1.8...e-02, 1.4...e-08],
[9.7...e-01, 2.8...e-02, ...e-08]])
>>> clf.score(X, y)
0.97...
But what if your model is LinearSVC?
Suppose you are working on a multilabel classifier, and OneVsRestClassifier wrapping LinearSVC gave you the best score. Unfortunately, sklearn does not provide predict_proba for LinearSVC.
Yes, I searched for it too. But the good news: here is the solution.
predict_proba_dist = clf.decision_function(X_test)
You will get an array of raw decision scores with shape (n_samples, n_classes); in my case it was a 6-class multilabel classifier.
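As a minimal sketch of this step (using the iris dataset, which has 3 classes rather than the 6 in my project):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import LinearSVC

X, y = load_iris(return_X_y=True)
clf = LinearSVC(random_state=0).fit(X, y)

# decision_function returns one raw (uncalibrated) score per class
# for each sample, instead of probabilities
scores = clf.decision_function(X[:2])
print(scores.shape)  # (2, 3): 2 samples, 3 classes
```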
Now we can apply softmax to these scores to turn them into a proper probability distribution.
import numpy as np

def softmax(x):
    """Compute softmax values for each set of scores in x."""
    e_x = np.exp(x - np.max(x))
    return e_x / e_x.sum(axis=0)
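A quick sanity check that the output behaves like a probability distribution (the function is repeated here so the snippet runs on its own; the example scores are made up):

```python
import numpy as np

def softmax(x):
    """Compute softmax values for each set of scores in x."""
    e_x = np.exp(x - np.max(x))
    return e_x / e_x.sum(axis=0)

probs = softmax(np.array([2.0, 1.0, 0.1]))
print(probs)        # the largest score gets the largest probability
print(probs.sum())  # the values sum to 1
```

Subtracting np.max(x) before exponentiating does not change the result; it just keeps np.exp from overflowing on large scores.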
Now that we have the array of scores, we just need to loop through it:

pred_probability = []
for scores in predict_proba_dist:
    pred_probability.append(softmax(scores))
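Putting it all together as one runnable sketch (again on iris instead of my 6-class data; the row-wise softmax here does the same job as the loop above, without the loop):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import LinearSVC

def softmax(x):
    """Row-wise softmax for a 2-D array of scores."""
    e_x = np.exp(x - x.max(axis=1, keepdims=True))
    return e_x / e_x.sum(axis=1, keepdims=True)

X, y = load_iris(return_X_y=True)
clf = LinearSVC(random_state=0).fit(X, y)

scores = clf.decision_function(X)    # shape (n_samples, n_classes)
pred_probability = softmax(scores)   # each row now sums to 1

print(pred_probability[:2])
print(pred_probability.sum(axis=1)[:2])  # each row sums to 1
```

One caveat worth knowing: these are softmax-normalized decision scores, not calibrated probabilities like the ones LogisticRegression's predict_proba returns, so treat them as relative confidences.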
DONE!
You now have the probabilities, just like predict_proba().
Thanks for reading this far.
If you liked the article, please give it a clap, and follow me for more projects and articles on my GitHub and Medium profiles.
Don't forget to check out the end-to-end deployment of a deep learning project with Android application development.
Thanks! Please comment below with any questions.