predict.rpart.Rd: Predictions from a Fitted Rpart Object

Returns a vector of predicted responses from a fitted `rpart` object.
Usage

```r
## S3 method for class 'rpart'
predict(object, newdata,
        type = c("vector", "prob", "class", "matrix"),
        na.action = na.pass, ...)
```
Arguments

| Argument | Description |
|---|---|
| `object` | fitted model object of class `"rpart"`, i.e. the result of a call to the `rpart` function. |
| `newdata` | data frame containing the values at which predictions are required. The predictors referred to in the right side of `formula(object)` must be present by name in `newdata`. If missing, the fitted values are returned. |
| `type` | character string denoting the type of predicted value returned. If the `rpart` object is a classification tree, the default is to return `prob` predictions, a matrix whose columns are the probability of the first, second, etc. class; otherwise a vector result is returned. |
| `na.action` | a function to determine what should be done with missing values in `newdata`. The default is to pass them down the tree using surrogates; other possibilities are `na.omit` and `na.fail`. |
| `...` | further arguments passed to or from other methods. |
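As a hedged sketch of the `na.action` argument (the two-predictor model and the introduced `NA` below are illustrative, not from the package examples): with the default `na.pass`, a row with a missing predictor is still sent down the tree via surrogate splits (or the majority direction), while `na.omit` drops it from the result.

```r
library(rpart)

## fit on two predictors so that surrogate splits are available
fit <- rpart(Mileage ~ Weight + Type, data = car.test.frame)

nd <- car.test.frame[1:3, ]
nd$Weight[1] <- NA               # illustrative missing value in a predictor

predict(fit, newdata = nd)                        # na.pass: all three rows predicted
predict(fit, newdata = nd, na.action = na.omit)   # the incomplete row is dropped
```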
Details

A new object is obtained by dropping `newdata` down the fitted object. For factor predictors, if an observation contains a level not used to grow the tree, it is left at the deepest possible node and `frame$yval` at that node is the prediction.
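For example (a minimal sketch using the `car.test.frame` data shipped with rpart), each new observation simply inherits the `frame$yval` of the leaf it lands in:

```r
library(rpart)
z.auto <- rpart(Mileage ~ Weight, data = car.test.frame)

## each hypothetical Weight is dropped down the tree and receives the
## mean Mileage (frame$yval) of the leaf it ends up in
predict(z.auto, newdata = data.frame(Weight = c(2000, 2500, 3500)))
```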
If type = "vector"
:
vector of predicted responses.
For regression trees this is the mean response at the node, for Poisson
trees it is the estimated response rate, and for classification trees
it is the predicted class (as a number).
If type = "prob"
:
(for a classification tree) a matrix of class probabilities.
If type = "matrix"
:
a matrix of the full responses
(frame$yval2
if this exists, otherwise frame$yval
). For
regression trees, this is the mean response, for Poisson trees it is
the response rate and the number of events at that node in the fitted
tree, and for classification trees it is the concatenation of at least
the predicted class, the class counts at that node in the fitted tree,
and the class probabilities (some versions of rpart may contain
further columns).
If type = "class"
:
(for a classification tree) a factor of classifications based on the
responses.
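The four return types can be compared side by side; a minimal sketch on the `kyphosis` data shipped with rpart (the column layout noted for `"matrix"` follows the description above and may vary by rpart version):

```r
library(rpart)
fit <- rpart(Kyphosis ~ Age + Number + Start, data = kyphosis)

head(predict(fit, type = "vector"))  # class as a number: 1 = "absent", 2 = "present"
head(predict(fit, type = "prob"))    # matrix of class probabilities; rows sum to 1
head(predict(fit, type = "class"))   # factor with levels "absent" and "present"

## full response matrix: predicted class, per-class counts at the node,
## per-class probabilities, and (in recent rpart versions) the node probability
head(predict(fit, type = "matrix"))
```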
This function is a method for the generic function `predict` for class `"rpart"`. It can be invoked by calling `predict` for an object of the appropriate class, or directly by calling `predict.rpart` regardless of the class of the object.
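A minimal illustration of the dispatch, assuming rpart is attached; note that in current rpart versions the method is registered but may not be exported, so the direct call may need the `:::` operator:

```r
library(rpart)
fit <- rpart(Kyphosis ~ Age + Number + Start, data = kyphosis)

p1 <- predict(fit)                  # generic dispatches on class "rpart"
p2 <- rpart:::predict.rpart(fit)    # direct call to the same method
identical(p1, p2)
```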
Examples

```r
z.auto <- rpart(Mileage ~ Weight, car.test.frame)
predict(z.auto)               # fitted mean Mileage per leaf (output elided)

fit <- rpart(Kyphosis ~ Age + Number + Start, data = kyphosis)
predict(fit, type = "prob")   # class probabilities (default; output elided)
predict(fit, type = "vector") # level numbers (output elided)
predict(fit, type = "class")  # factor (output elided)
predict(fit, type = "matrix") # level number, class frequencies, probabilities
                              # (output elided)

sub <- c(sample(1:50, 25), sample(51:100, 25), sample(101:150, 25))
fit <- rpart(Species ~ ., data = iris, subset = sub)
fit   # tree below is from one random draw of `sub`
#> n= 75 
#> 
#> node), split, n, loss, yval, (yprob)
#>       * denotes terminal node
#> 
#> 1) root 75 50 setosa (0.33333333 0.33333333 0.33333333)  
#>   2) Petal.Length< 2.45 25  0 setosa (1.00000000 0.00000000 0.00000000) *
#>   3) Petal.Length>=2.45 50 25 versicolor (0.00000000 0.50000000 0.50000000)  
#>     6) Petal.Width< 1.75 27  3 versicolor (0.00000000 0.88888889 0.11111111) *
#>     7) Petal.Width>=1.75 23  1 virginica (0.00000000 0.04347826 0.95652174) *

table(predict(fit, iris[-sub, ], type = "class"), iris[-sub, "Species"])
#>             
#>              setosa versicolor virginica
#>   setosa         25          0         0
#>   versicolor      0         25         2
#>   virginica       0          0        23
```