This article walks through the scikit-learn example on feature importances with forests of trees; hopefully it is a useful reference for developers working on this kind of problem.
This example shows the use of a forest of trees to evaluate the importance of features on an artificial classification task. The red bars are the feature importances of the forest, with error bars showing their inter-tree variability.
As expected, the plot suggests that 3 features are informative, while the remaining ones are not.
import numpy as np
import matplotlib.pyplot as plt

from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier

# Build a classification task using 3 informative features
X, y = make_classification(n_samples=1000, n_features=10, n_informative=3,
                           n_redundant=0, n_repeated=0, n_classes=2,
                           random_state=0, shuffle=False)

# Build a forest and compute the feature importances
forest = ExtraTreesClassifier(n_estimators=250, random_state=0)
forest.fit(X, y)
importances = forest.feature_importances_
std = np.std([tree.feature_importances_ for tree in forest.estimators_], axis=0)
indices = np.argsort(importances)[::-1]

# Print the feature ranking
print("Feature ranking:")
for f in range(X.shape[1]):
    print("%d. feature %d (%f)" % (f + 1, indices[f], importances[indices[f]]))

# Plot the feature importances of the forest
plt.figure()
plt.title("Feature importances")
plt.bar(range(X.shape[1]), importances[indices],
        color="r", yerr=std[indices], align="center")
plt.xticks(range(X.shape[1]), indices)
plt.xlim([-1, X.shape[1]])
plt.show()
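As an aside (not part of the original example), the forest-level feature_importances_ attribute is, to my understanding, essentially the impurity-based importances of the individual trees averaged over the ensemble, i.e. the mean of the same per-tree values used above to compute std. A minimal sanity check, assuming the forest fitted above:

# Aside, not in the original script: the forest-level importances should
# match the average of the per-tree impurity-based importances.
mean_importances = np.mean(
    [tree.feature_importances_ for tree in forest.estimators_], axis=0)
print(np.allclose(mean_importances, forest.feature_importances_))  # should print True

The console output of the full script is: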
Feature ranking:
1. feature 1 (0.295902)
2. feature 2 (0.208351)
3. feature 0 (0.177632)
4. feature 3 (0.047121)
5. feature 6 (0.046303)
6. feature 8 (0.046013)
7. feature 7 (0.045575)
8. feature 4 (0.044614)
9. feature 9 (0.044577)
10. feature 5 (0.043912)
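A note on reading this ranking: because make_classification was called with shuffle=False, the 3 informative features occupy the first columns (indices 0, 1, 2), so the three top-ranked features above (1, 2, 0) are exactly the informative ones, while the remaining seven share small, near-uniform importances. A quick check (a small addition, not in the original script):

# With shuffle=False the informative features are columns 0-2, so the
# three highest-ranked indices should be exactly those columns.
print(sorted(indices[:3].tolist()))  # expected: [0, 1, 2]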

That concludes this article on feature importances with forests of trees in scikit-learn; hopefully it is of some help.