leaps(x, y, wt=<<see below>>, int=T, method="Cp", keep.int=T, keep=<<see below>>, nbest=10, names=<<see below>>, df=nrow(x), dropint=T)
The best-known criterion for regression is the coefficient of determination (R-squared). It has definite limitations in the context of the leaps function, since the largest R-squared is always attained by the full set of explanatory variables. To take account of the number of parameters being fit, an adjusted R-squared can be used. The higher the adjusted R-squared (which, by the way, can be negative), the better. It has been noted, however, that the adjusted R-squared tends to favor large regressions over smaller ones.
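For example, the adjusted R-squared criterion can be requested directly. The following is a minimal sketch, assuming (as in the usual leaps interface) that method="adjr2" is accepted and that the returned list then contains an adjr2 component alongside which:

a <- leaps(x, y, method="adjr2")
best <- a$which[which.max(a$adjr2), ]   # subset with the largest adjusted R-squared
lsfit(x[, best], y)                     # refit that subset by least squares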
Another method of selecting regressions is Mallows' Cp. Small values of Cp, close to or less than $p$ (the number of parameters in the submodel, including the intercept), are good.
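For reference, Mallows' Cp for a submodel with $p$ parameters is $RSS_p / s^2 - n + 2p$, where $RSS_p$ is the submodel's residual sum of squares, $s^2$ is the residual mean square from the full model, and $n$ is the number of observations. The sketch below computes it by hand for one candidate subset (the first two columns of x, chosen purely for illustration) and is only meant to show the arithmetic:

full <- lsfit(x, y)
s2 <- sum(full$residuals^2) / (nrow(x) - ncol(x) - 1)   # residual mean square, full model
sub <- lsfit(x[, 1:2], y)                               # an illustrative two-variable submodel
p <- 3                                                  # two slopes plus the intercept
Cp <- sum(sub$residuals^2) / s2 - nrow(x) + 2 * p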
Seber, G. A. F. (1977). Linear Regression Analysis. Wiley, New York.
Weisberg, S. (1985). Applied Linear Regression (second edition). Wiley, New York.
r <- leaps(x, y)
lsfit(x[, r$which[3,]], y)   # regression corresponding to third subset
longley.wt <- lmsreg(longley.x, longley.y)$wt             # weights from a resistant (LMS) fit
longley.leap <- leaps(longley.x, longley.y, longley.wt,
                      names=c("D", "G", "U", "A", "P", "Y"))
# Cp plot: label each subset by the variables it contains
plot(longley.leap$size, longley.leap$Cp, type="n", ylim=c(0, 15))
text(longley.leap$size, longley.leap$Cp, longley.leap$label)
abline(0, 1)                                              # reference line Cp = p
legend(2, 15, pch="DGUAPY", legend=dimnames(longley.x)[[2]])
title(main="Cp Plot for Longley Data")