Solved – K-fold repeated cross-validation for classification accuracy in caret

classification, cross-validation, machine learning, r

I am new to cross-validation and I have a data set called LDA.scores with 12 measured call-type parameters. I am trying to run k-fold repeated cross-validation (10 folds, 3 repeats) with a naive Bayes model. The grouping factor is Family, since I am trying to assess whether the call-type parameters differ between the two families. I am trying to run this code:

 library(caret)
 # 10-fold cross-validation, repeated 3 times
 train_control <- trainControl(method = "repeatedcv", number = 10, repeats = 3)
 # naive Bayes model: Family is the outcome, the 12 parameters are the predictors
 model <- train(Family ~ ., data = LDA.scores, trControl = train_control, method = "nb")
 # predict on the 12 predictor columns and tabulate against the true classes
 predictions <- predict(model, LDA.scores[, 2:13])
 confusionMatrix(predictions, LDA.scores$Family)
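(For completeness: as far as I understand, caret's "nb" method wraps NaiveBayes() from the klaR package, so I also made sure that package was installed before calling train(); this is the check I ran, in case it matters.)

 # method = "nb" in caret calls klaR::NaiveBayes() internally,
 # so klaR must be installed for train() to fit the model.
 if (!requireNamespace("klaR", quietly = TRUE)) {
   install.packages("klaR")
 }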

I keep on getting these error messages:

 Error in train.default(x, y, weights = w, ...) : 
 wrong model type for regression

I do not understand what I am doing wrong. How can I get this code to run and produce a confusion matrix for the naive Bayes model? Any advice would be deeply appreciated; I have tried everything I can think of as a novice, and words cannot describe my gratitude if anyone has a solution. Here is a portion of my data frame:

      Family SBI.max.Part.1 SBI.max.Part.2 SBI.min.Part.1 SBI.min.Part.2
1         G8    -0.48055680   -0.086292700   -0.157157188   -0.438809944
2         G8     0.12600625   -0.074481895    0.057316151   -0.539013927
3         G8     0.06823834   -0.056765686    0.064711783   -0.539013927
4         G8     0.67480139   -0.050860283    0.153459372   -0.539013927
5         G8     0.64591744   -0.050860283    0.072107416   -0.472211271
6         G8     0.21265812   -0.068576492    0.057316151   -0.071395338
7         G8    -0.01841352   -0.068576492   -0.053618335   -0.071395338
8         G8     0.12600625    0.055436970    0.012942357    0.296019267
9         G8    -0.22060120    0.114491000   -0.038827070    0.563229889
10        G8     0.27042603   -0.021333268    0.049920519   -0.037994010
11        G8     0.03935439   -0.044954880    0.012942357    0.195815284
12        G8    -0.45167284    0.008193747   -0.075805232   -0.171599321
13        G8    -0.04729748   -0.056765686    0.035129254   -0.305204632
14        G8    -0.10506539    0.008193747   -0.046222702    0.062209973
15        G8     0.09712230    0.037720761    0.109085578   -0.104796666
16        G8    -0.07618143    0.014099150   -0.038827070    0.095611301
17        G8     0.29930998    0.108585597    0.057316151    0.028808645
18        G8     0.01047043   -0.074481895    0.020337989   -0.071395338
19        G8    -0.24948516    0.002288344    0.035129254    0.329420595
20        G8    -0.04729748    0.049531567    0.057316151    0.296019267
21        G8    -0.01841352    0.043626164    0.005546724   -0.171599321
22        G8    -0.19171725    0.049531567   -0.016640173   -0.071395338
23        G8    -0.48055680    0.020004552   -0.142365923    0.596631217
24        G8     0.01047043    0.008193747    0.220020063    0.062209973
25        G8    -0.42278889    0.025909955   -0.149761556    0.028808645
26        G8    -0.45167284    0.031815358   -0.134970291   -0.138197994
27        G8    -0.30725307    0.049531567    0.042524886    0.095611301
28        G8     0.24154207   -0.039049477    0.072107416   -0.104796666
29        G8     1.45466817   -0.003617059    0.064711783    0.296019267
30        G8    -0.01841352    0.002288344    0.020337989    0.028808645
31        G8     0.38596185    0.084963985    0.049920519   -0.037994010
32        G8     0.15489021   -0.080387298    0.020337989   -0.338605960
33        G8    -0.04729748    0.067247776    0.138668107    0.129012629
34        V4     0.27042603    0.031815358    0.049920519    0.195815284
35        V4    -0.07618143    0.037720761    0.020337989   -0.037994010
36        V4    -0.10506539    0.025909955   -0.083200864    0.396223251
37        V4    -0.01841352    0.126301805   -0.024035805    0.362821923
38        V4     0.01047043    0.031815358   -0.016640173   -0.138197994
39        V4     0.06823834    0.037720761   -0.038827070    0.262617940
40        V4    -0.16283329   -0.050860283   -0.038827070   -0.405408616
41        V4    -0.01841352   -0.039049477    0.005546724   -0.205000649
42        V4    -0.39390493   -0.003617059   -0.090596497    0.129012629
43        V4    -0.04729748    0.008193747   -0.009244540    0.195815284
44        V4     0.01047043   -0.039049477   -0.016640173   -0.205000649
45        V4     0.01047043   -0.003617059   -0.075805232   -0.004592683
46        V4     0.06823834    0.008193747   -0.090596497   -0.205000649
47        V4    -0.04729748    0.014099150    0.012942357   -0.071395338
48        V4    -0.22060120   -0.015427865   -0.075805232   -0.171599321
49        V4    -0.16283329    0.020004552   -0.061013967   -0.104796666
50        V4    -0.07618143    0.031815358   -0.038827070   -0.138197994
51        V4    -0.22060120    0.020004552   -0.112783394   -0.104796666
52        V4    -0.19171725   -0.033144074   -0.068409599   -0.071395338
53        V4    -0.16283329   -0.039049477   -0.090596497   -0.104796666
54        V4    -0.22060120   -0.009522462   -0.053618335   -0.037994010
55        V4    -0.13394934   -0.003617059   -0.075805232   -0.004592683
56        V4    -0.27836911   -0.044954880   -0.090596497   -0.238401977
57        V4    -0.04729748   -0.050860283    0.064711783    0.028808645
58        V4     0.01047043   -0.044954880    0.012942357   -0.305204632
59        V4     0.12600625   -0.068576492    0.042524886   -0.305204632
60        V4     0.06823834   -0.033144074   -0.061013967   -0.271803305
61        V4     0.06823834   -0.027238671   -0.061013967   -0.037994010
62        V4     0.32819394   -0.068576492    0.064711783   -0.372007288
63        V4     0.32819394    0.014099150    0.175646269    0.095611301
64        V4    -0.27836911    0.002288344   -0.068409599    0.195815284
65        V4     0.18377416    0.025909955    0.027733621    0.162413956
66        V4     0.55926557   -0.009522462    0.042524886    0.229216612
67        V4    -0.19171725   -0.009522462   -0.038827070    0.229216612
68        V4    -0.19171725    0.025909955   -0.009244540    0.396223251
69        V4     0.01047043    0.155828820    0.027733621    0.630032545
70        V4    -0.19171725    0.002288344   -0.031431438    0.463025906
71        V4    -0.01841352   -0.044954880   -0.046222702    0.496427234
72        V4    -0.07618143   -0.015427865   -0.031431438    0.062209973
73        V4    -0.13394934    0.008193747   -0.068409599   -0.071395338
74        V4    -0.39390493    0.037720761   -0.120179026    0.229216612
75        V4    -0.04729748    0.008193747    0.035129254   -0.071395338
76        V4    -0.27836911   -0.015427865   -0.061013967   -0.071395338
77        V4     0.70368535   -0.056765686    0.397515240   -0.205000649
78        V4     0.29930998    0.079058582    0.138668107    0.229216612
79        V4    -0.13394934   -0.056765686    0.020337989   -0.305204632
80        V4     0.21265812    0.025909955    0.035129254    0.396223251

 'data.frame':    80 obs. of  13 variables:
  $ Family           : Factor w/ 2 levels "G8","V4": 1 1 1 1 1 1 1 1 1 1 ...
  $ SBI.max.Part.1   : num  -0.4806 0.126 0.0682 0.6748 0.6459 ...
  $ SBI.max.Part.2   : num  -0.0863 -0.0745 -0.0568 -0.0509 -0.0509 ...
  $ SBI.min.Part.1   : num  -0.1572 0.0573 0.0647 0.1535 0.0721 ...
  $ SBI.min.Part.2   : num  -0.439 -0.539 -0.539 -0.539 -0.472 ...

Best Answer

You should check out http://topepo.github.io/caret/Bayesian_Model.html

Right now you have a target variable that caret is treating as continuous, and you are trying to apply a classification-only algorithm to it; that is what the "wrong model type for regression" error means. Instead, you should use something that supports regression, such as brnn or bartMachine.
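A minimal sketch of both routes, assuming LDA.scores and Family are as in the question (the nb_model / reg_model object names are just illustrative): first check how the outcome column is actually stored, then either coerce it to a factor if classification with naive Bayes is what you want, or keep it numeric and switch to a regression-capable method.

 library(caret)

 # 1. Check how the outcome is stored; method = "nb" only accepts factors.
 class(LDA.scores$Family)

 # 2a. If you want classification, make sure the outcome is a factor
 #     before calling train().
 LDA.scores$Family <- as.factor(LDA.scores$Family)
 train_control <- trainControl(method = "repeatedcv", number = 10, repeats = 3)
 nb_model <- train(Family ~ ., data = LDA.scores,
                   trControl = train_control, method = "nb")

 # 2b. If the target really is continuous, use a regression-capable method
 #     instead (e.g. "brnn" or "bartMachine"; those packages must be installed).
 # reg_model <- train(Family ~ ., data = LDA.scores,
 #                    trControl = train_control, method = "brnn")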
