How to Use the Naive Bayes Technique in R
This article explains how to use the naive Bayes technique in R. The steps are simple and practical, so feel free to follow along and try them yourself.
Install the package:
> install.packages("e1071")
Load e1071:
> library(e1071)
Load a dataset:
> data(iris)
> head(iris)
  Sepal.Length Sepal.Width Petal.Length Petal.Width Species
1          5.1         3.5          1.4         0.2  setosa
2          4.9         3.0          1.4         0.2  setosa
3          4.7         3.2          1.3         0.2  setosa
4          4.6         3.1          1.5         0.2  setosa
5          5.0         3.6          1.4         0.2  setosa
6          5.4         3.9          1.7         0.4  setosa
Sepal and Petal refer to the flower's sepals and petals. The iris dataset contains 150 observations: the first four columns are sepal and petal measurements, and the fifth column is the corresponding species label. We can use this dataset to train a naive Bayes classifier.
First, take a look at the summary of the dataset:
> summary(iris)
  Sepal.Length    Sepal.Width     Petal.Length    Petal.Width          Species  
 Min.   :4.300   Min.   :2.000   Min.   :1.000   Min.   :0.100   setosa    :50  
 1st Qu.:5.100   1st Qu.:2.800   1st Qu.:1.600   1st Qu.:0.300   versicolor:50  
 Median :5.800   Median :3.000   Median :4.350   Median :1.300   virginica :50  
 Mean   :5.843   Mean   :3.057   Mean   :3.758   Mean   :1.199                  
 3rd Qu.:6.400   3rd Qu.:3.300   3rd Qu.:5.100   3rd Qu.:1.800                  
 Max.   :7.900   Max.   :4.400   Max.   :6.900   Max.   :2.500                  
Train the classifier and inspect the fitted model:
> classifier <- naiveBayes(iris[, 1:4], iris[, 5])
> classifier

Naive Bayes Classifier for Discrete Predictors

Call:
naiveBayes.default(x = iris[, 1:4], y = iris[, 5])

A-priori probabilities:
iris[, 5]
    setosa versicolor  virginica 
 0.3333333  0.3333333  0.3333333 

Conditional probabilities:
            Sepal.Length
iris[, 5]     [,1]      [,2]
  setosa     5.006 0.3524897
  versicolor 5.936 0.5161711
  virginica  6.588 0.6358796

            Sepal.Width
iris[, 5]     [,1]      [,2]
  setosa     3.428 0.3790644
  versicolor 2.770 0.3137983
  virginica  2.974 0.3224966

            Petal.Length
iris[, 5]     [,1]      [,2]
  setosa     1.462 0.1736640
  versicolor 4.260 0.4699110
  virginica  5.552 0.5518947

            Petal.Width
iris[, 5]     [,1]      [,2]
  setosa     0.246 0.1053856
  versicolor 1.326 0.1977527
  virginica  2.026 0.2746501

> classifier$apriori
iris[, 5]
    setosa versicolor  virginica 
        50         50         50 

> classifier$tables
$Sepal.Length
            Sepal.Length
iris[, 5]     [,1]      [,2]
  setosa     5.006 0.3524897
  versicolor 5.936 0.5161711
  virginica  6.588 0.6358796

$Sepal.Width
            Sepal.Width
iris[, 5]     [,1]      [,2]
  setosa     3.428 0.3790644
  versicolor 2.770 0.3137983
  virginica  2.974 0.3224966

$Petal.Length
            Petal.Length
iris[, 5]     [,1]      [,2]
  setosa     1.462 0.1736640
  versicolor 4.260 0.4699110
  virginica  5.552 0.5518947

$Petal.Width
            Petal.Width
iris[, 5]     [,1]      [,2]
  setosa     0.246 0.1053856
  versicolor 1.326 0.1977527
  virginica  2.026 0.2746501
In the classifier object:
A-priori probabilities:
iris[, 5]
    setosa versicolor  virginica 
 0.3333333  0.3333333  0.3333333 
is easy to understand: it is simply the prior probability of each class.
And this part:
$Petal.Width
            Petal.Width
iris[, 5]     [,1]      [,2]
  setosa     0.246 0.1053856
  versicolor 1.326 0.1977527
  virginica  2.026 0.2746501
is the conditional distribution of the feature Petal.Width. In this naive Bayes implementation, numeric features (which here take fractional values) are assumed to follow a Gaussian density within each class. For example, given the class setosa, Petal.Width is modeled as a Gaussian with mean 0.246 and standard deviation 0.1053856.
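As a quick check (a minimal sketch, assuming the table simply stores the per-class sample mean and standard deviation), these numbers can be recomputed from the data, and dnorm() then gives the likelihood of any Petal.Width value under that class:
> # Per-class mean and standard deviation of Petal.Width for setosa (matches the table above)
> mean(iris$Petal.Width[iris$Species == "setosa"])   # 0.246
> sd(iris$Petal.Width[iris$Species == "setosa"])     # 0.1053856
> # Density of a Petal.Width value of 0.2 under that Gaussian
> dnorm(0.2, mean = 0.246, sd = 0.1053856)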
Prediction:
Predict the first observation in the iris dataset:
> predict(classifier, iris[1, -5])
[1] setosa
Levels: setosa versicolor virginica
iris[1, -5] selects the first row with the fifth column (Species) dropped, i.e. the four feature columns of the first observation.
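If you want to see the class probabilities behind this decision, predict() for naiveBayes models also accepts type = "raw" and returns one posterior probability per class instead of just the winning label:
> # Posterior class probabilities for the first observation
> predict(classifier, iris[1, -5], type = "raw")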
Now check how the classifier performs on the whole dataset:
> table(predict(classifier, iris[, -5]), iris[, 5], dnn = list('predicted', 'actual'))
            actual
predicted    setosa versicolor virginica
  setosa         50          0         0
  versicolor      0         47         3
  virginica       0          3        47
The classification results are quite good: only 6 of the 150 observations are misclassified (keep in mind this is measured on the same data the model was trained on).
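For instance, the overall accuracy can be computed directly from the confusion matrix (a quick sanity check, not a proper hold-out evaluation):
> conf <- table(predict(classifier, iris[, -5]), iris[, 5])
> sum(diag(conf)) / sum(conf)
[1] 0.96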
Construct a new observation and predict its class:
> new_data = data.frame(Sepal.Length = 7, Sepal.Width = 3, Petal.Length = 6, Petal.Width = 2)
> predict(classifier, new_data)
[1] virginica
Levels: setosa versicolor virginica
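To see what the classifier actually computes, here is a minimal by-hand sketch that reproduces this prediction from the stored tables, assuming per-feature Gaussian likelihoods and the uniform class priors shown earlier:
> # Multiply the per-feature Gaussian likelihoods by the class prior, then normalize
> x <- c(Sepal.Length = 7, Sepal.Width = 3, Petal.Length = 6, Petal.Width = 2)
> post <- sapply(levels(iris$Species), function(cls) {
+   lik <- prod(mapply(function(f, v)
+     dnorm(v, mean = classifier$tables[[f]][cls, 1], sd = classifier$tables[[f]][cls, 2]),
+     names(x), x))
+   lik / 3   # uniform class prior 1/3
+ })
> post / sum(post)   # normalized posteriors; virginica has the largest value, matching predict()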
If a feature is missing (only three of the four features are supplied), prediction still works: the classifier simply ignores the absent column and uses the features that are present:
> new_data = data.frame(Sepal.Length = 7, Sepal.Width = 3, Petal.Length = 6)
> predict(classifier, new_data)
[1] virginica
Levels: setosa versicolor virginica
Next, let's look at how this package handles nominal (categorical) features.
Here is the data:
> model = c("H", "H", "H", "H", "T", "T", "T", "T")
> place = c("B", "B", "N", "N", "B", "B", "N", "N")
> repairs = c("Y", "N", "Y", "N", "Y", "N", "Y", "N")
> dataset = data.frame(model, place, repairs)
> dataset
  model place repairs
1     H     B       Y
2     H     B       N
3     H     N       Y
4     H     N       N
5     T     B       Y
6     T     B       N
7     T     N       Y
8     T     N       N
Train a naive Bayes classifier on it:
> classifier <- naiveBayes(dataset[, 1:2], dataset[, 3])
> classifier

Naive Bayes Classifier for Discrete Predictors

Call:
naiveBayes.default(x = dataset[, 1:2], y = dataset[, 3])

A-priori probabilities:
dataset[, 3]
  N   Y 
0.5 0.5 

Conditional probabilities:
            model
dataset[, 3]   H   T
           N 0.5 0.5
           Y 0.5 0.5

            place
dataset[, 3]   B   N
           N 0.5 0.5
           Y 0.5 0.5
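For a nominal feature, the conditional probability table is essentially a row-normalized contingency table of the class against that feature. A sketch of the equivalent calculation for model:
> # Rows: class (repairs); columns: levels of model; every cell is 0.5, matching the table above
> prop.table(table(dataset$repairs, dataset$model), margin = 1)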
Now make a prediction:
> new_data = data.frame(model = "H", place = "B")
> predict(classifier, new_data)
[1] N
Levels: N Y
Note that with this perfectly symmetric toy data both classes are equally likely, so the classifier simply returns the first factor level, N.
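You can confirm the tie by asking predict() for the raw posterior probabilities (both should come out as 0.5):
> predict(classifier, new_data, type = "raw")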
One more note: if some observations are missing a feature, the missing values can be recorded as NA:
> model = c("H", "H", "H", "H", "T", "T", "T", "T")
> place = c("B", "B", "N", "N", "B", "B", NA, NA)
> repairs = c("Y", "N", "Y", "N", "Y", "N", "Y", "N")
> dataset = data.frame(model, place, repairs)
> dataset
  model place repairs
1     H     B       Y
2     H     B       N
3     H     N       Y
4     H     N       N
5     T     B       Y
6     T     B       N
7     T  <NA>       Y
8     T  <NA>       N
> classifier <- naiveBayes(dataset[, 1:2], dataset[, 3])
> classifier

Naive Bayes Classifier for Discrete Predictors

Call:
naiveBayes.default(x = dataset[, 1:2], y = dataset[, 3])

A-priori probabilities:
dataset[, 3]
  N   Y 
0.5 0.5 

Conditional probabilities:
            model
dataset[, 3]   H   T
           N 0.5 0.5
           Y 0.5 0.5

            place
dataset[, 3]         B         N
           N 0.6666667 0.3333333
           Y 0.6666667 0.3333333
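As the place table shows, the NA values are simply left out when the conditional probabilities are estimated: only the six non-missing values contribute, giving the 2/3 vs. 1/3 split. A sketch of the equivalent calculation (table() excludes NA by default):
> prop.table(table(dataset$repairs, dataset$place), margin = 1)   # reproduces the 0.6666667 / 0.3333333 rows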
That covers the basics of using naive Bayes in R. Now go try it on your own data!