Run PCA on a dataset, then use it in a neural network model

pcaNNet(x, ...)

# S3 method for formula
pcaNNet(formula, data, weights, ..., thresh = 0.99,
  subset, na.action, contrasts = NULL)

# S3 method for default
pcaNNet(x, y, thresh = 0.99, ...)

# S3 method for pcaNNet
print(x, ...)

# S3 method for pcaNNet
predict(object, newdata, type = c("raw", "class",
  "prob"), ...)

Arguments

x

matrix or data frame of x values for examples.

...

arguments passed to nnet, such as size, decay, etc.

formula

A formula of the form class ~ x1 + x2 + ...

data

Data frame from which variables specified in formula are preferentially to be taken.

weights

(case) weights for each example; if missing, defaults to 1.

thresh

a threshold for the cumulative proportion of variance to capture from the PCA. For example, to retain enough components to capture 95 percent of the variation, set thresh = 0.95.

subset

An index vector specifying the cases to be used in the training sample. (NOTE: If given, this argument must be named.)

na.action

A function to specify the action to be taken if NAs are found. The default action is for the procedure to fail. An alternative is na.omit, which leads to rejection of cases with missing values on any required variable. (NOTE: If given, this argument must be named.)

contrasts

a list of contrasts to be used for some or all of the factors appearing as variables in the model formula.

y

matrix or data frame of target values for examples.

object

an object of class pcaNNet as returned by pcaNNet.

newdata

matrix or data frame of test examples. A vector is considered to be a row vector comprising a single case.

type

Type of output: one of "raw", "class", or "prob".

Value

For pcaNNet, an object of class "pcaNNet" or "pcaNNet.formula". Items of interest in the output are:

pc

the output from preProcess

model

the model generated from nnet

names

if any predictors had only one distinct value, this is a character vector of the names of the remaining columns. Otherwise a value of NULL

Details

The function first runs principal component analysis on the data. The cumulative proportion of variance is computed for each principal component. The function uses the thresh argument to determine how many components must be retained to capture at least that proportion of the variance in the predictors.

The principal components are then used in a neural network model.
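The component-selection step can be sketched in base R with prcomp and the built-in USArrests data. This illustrates the idea behind the thresh argument; it is not caret's actual implementation:

```r
# Sketch of the thresh logic using base R's prcomp (not caret's exact code).
pr <- prcomp(USArrests, scale. = TRUE)

# Cumulative proportion of variance explained by each component.
cum_var <- cumsum(pr$sdev^2) / sum(pr$sdev^2)

# Smallest number of components capturing at least 95 percent of the variance.
n_comp <- which(cum_var >= 0.95)[1]

# The retained scores would then be fed to the neural network.
scores <- pr$x[, 1:n_comp, drop = FALSE]
```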

When predicting new samples, the data are first transformed using the PCA loadings estimated from the training data, and the transformed values are then passed to the neural network.
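That projection step can be sketched with prcomp as well: predict.prcomp applies the centering, scaling, and rotation estimated on the training rows to new rows (again, an illustration of the idea rather than caret's code):

```r
# Fit PCA on "training" rows, then project held-out rows onto the same axes.
train <- USArrests[1:40, ]
test  <- USArrests[41:50, ]

pr <- prcomp(train, scale. = TRUE)

# predict.prcomp reuses the training center, scale, and rotation matrix.
test_scores <- predict(pr, newdata = test)
```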

Because the variance of each predictor is used in the PCA, the code does a quick check to make sure that each predictor has at least two distinct values. If a predictor has only one unique value, it is removed prior to the analysis.
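The single-value screen amounts to dropping any column with fewer than two distinct values. A minimal sketch (illustrative only; the column names are made up):

```r
# Hypothetical data with one constant predictor.
x <- data.frame(
  constant = rep(1, 5),         # only one distinct value: dropped
  varying  = c(2, 4, 6, 8, 10)  # at least two distinct values: kept
)

# Keep columns with more than one unique value.
keep <- vapply(x, function(col) length(unique(col)) > 1, logical(1))
x_clean <- x[, keep, drop = FALSE]
```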

References

Ripley, B. D. (1996) Pattern Recognition and Neural Networks. Cambridge.

See also

nnet, preProcess

Examples

data(BloodBrain)
modelFit <- pcaNNet(bbbDescr[, 1:10], logBBB, size = 5, linout = TRUE, trace = FALSE)
modelFit
#> Neural Network Model with PCA Pre-Processing
#> 
#> Created from 208 samples and 10 variables
#> PCA needed 9 components to capture 99 percent of the variance
#> 
#> a 9-5-1 network with 56 weights
#> options were - linear output units
predict(modelFit, bbbDescr[, 1:10])
#>             [,1]
#> 1    0.935339458
#> 2   -0.157597462
#> 3    0.238524168
#> 4    0.238524168
#> 5    0.238524168
#> 6    0.142903174
#> 7    0.113959662
#> 8    1.055009021
#> 9    1.026065509
#> 10   1.026065509
#> ...
#> 207 -0.974082315
#> 208 -0.842850885