Model II multiple regression


Viewing 5 posts - 1 through 5 (of 5 total)
  • #2904

    Dear Real Statisticians (and I do mean psychologists): I am a Canadian archaeologist (BSc in Chemistry) researching the past human use of European Atlantic shellfish.  After two decades of practice I am finally getting an MA. I am investigating whether the habitat or size of harvested mussels (Mytilus edulis) can be reconstructed from measurements of the umbo (the only bit that survives well in archaeological deposits), using log-transformed measurements (i.e. allometry: relationships between dimensions are more likely exponential than linear). 

     
    Of course, multiple regression in most statistics packages (Minitab, SPSS, SAS) assumes you are trying to predict one variable from all the others (a Model I regression), and uses ordinary least squares to fit the regression line.  For organismal dimensions this makes little sense, since all the dimensions are (at least in theory) free to change their mutual proportions during growth.  So there is no predictor and predicted: mutual variation of all the dimensions is the response.  The fitted regression line must give equal weight to all the dimensions (a Model II regression).  Common methods are major axis (perpendicular distances between the line and all the points are minimised, in a principal-component-analysis way) and reduced major axis, also called standard major axis (perpendicular distances between the standardised points and the line are minimised, then converted back to the original variables).
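    For the bivariate case the three fits described above reduce to simple formulas, which can be sketched in base R on simulated (entirely hypothetical) log-dimension data:

```r
# Minimal bivariate sketch of the three line-fitting methods.
set.seed(1)
x <- rnorm(50)                     # e.g. log umbo width (simulated)
y <- 2 * x + rnorm(50, sd = 0.5)   # e.g. log shell length (simulated)

b_ols <- cov(x, y) / var(x)                # Model I: ordinary least squares
b_rma <- sign(cor(x, y)) * sd(y) / sd(x)   # reduced/standard major axis
e     <- eigen(cov(cbind(x, y)))$vectors[, 1]
b_ma  <- e[2] / e[1]                       # major axis (first principal axis)
```

    Because |r| <= 1, the OLS slope is always shallower in magnitude than the RMA slope; the question below is whether any package does this for more than two variables at once.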
     
    Is it possible to carry out major-axis or reduced-major-axis fitting in multiple linear regressions in SPSS, SAS or Systat (I know that it can’t be done in Minitab)? If so, how?
    If not, are there applications in R that carry out this type of analysis?
    Greg Campbell, The Naive Chemist
     
    #2908
    Anonymous
    Inactive

    The R package lmodel2 seems to be what you should be looking for… what is true of R tends to be true of SAS as well, so I'm sure either they implement it or there's a macro/proc out there that can do it… SPSS certainly does not have it as a default but, once again, with the appropriate syntax I'm sure you can make it work…
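    A minimal lmodel2 call for the bivariate case might look like this (simulated data; the package must first be installed from CRAN, and as noted in the reply below it only handles two variables):

```r
# install.packages("lmodel2")   # once, from CRAN
library(lmodel2)

set.seed(2)
dat <- data.frame(logx = rnorm(40))
dat$logy <- 1.5 * dat$logx + rnorm(40, sd = 0.3)

fit <- lmodel2(logy ~ logx, data = dat, nperm = 0)
fit$regression.results   # one row each for the OLS, MA and SMA fits
```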

    #2907

    Thanks Oscar (got my BSc at UBC), but lmodel2 does bivariate fitting, and the SAS fixes also work on the bivariate case, not on multiple regression. 

    Any not-too-painful errors-in-variables regression programs?

    Greg

    #2906
    Anonymous
    Inactive

    Really? Well, that's pretty kewl!!! It's nice to have fellow UBCers around… heh.

     

    Anyways, so unfortunately I tend not to work with errors-in-variables models very often, because a lot of the work I do is through structural equation modeling, so I can parse out the error there… it's a pretty flexible set of techniques which I think could be useful in your case if you suspect the error may come by and screw your results up. Googling around a li'l bit, I found there's the 'decon' package in R that, although it doesn't do Model II regression, works through kernel density estimation instead to separate the distributions… worse comes to worst, I think there are enough functions in the R packages to code it yourself… perhaps you can do an extreme dimension reduction (PCA-like approach) and run the regression on that? Or, going through the documentation, I found they provide an interesting citation (Legendre & Legendre 1998) on how to compute ranged (not reduced, though) major axis regression… it doesn't seem too horrible to code yourself, I believe…
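    The PCA suggestion above can be sketched in base R — reduce the correlated predictor block to its first principal-component score, then run an ordinary regression on that score. All names and data here are hypothetical:

```r
# Hypothetical example: three correlated umbo measurements predicting size.
set.seed(42)
n      <- 60
common <- rnorm(n)                          # shared "size" factor
umbo   <- cbind(common + rnorm(n, sd = 0.2),
                common + rnorm(n, sd = 0.2),
                common + rnorm(n, sd = 0.2))
size   <- drop(umbo %*% c(1, 0.5, 0.25)) + rnorm(n, sd = 0.3)

pc     <- prcomp(umbo, scale. = TRUE)   # PCA of the predictor block
score1 <- pc$x[, 1]                     # first principal-component score
fit    <- lm(size ~ score1)             # ordinary regression on the score
summary(fit)$r.squared
```

    Note this treats the measurement error symmetrically on the predictor side only; it is a pragmatic workaround, not a true Model II fit.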

    #2905

    Hi Greg

    This function in R computes a multivariate Model II regression; the two alternative methods are selected via the method argument.

    Hope this is of help.

    multireg2 <- function(x, dep, method)
    {
    # x is the matrix containing the data (cases in rows)
    # dep is the column of that matrix that holds the dependent variable
    # method = 1: major axis
    # method = 2: reduced major axis (standard major axis)

    x <- as.matrix(x)
    nf <- nrow(x)
    nc <- ncol(x)

    # MA is fitted on the covariance matrix, RMA on the correlation matrix
    if (method == 1) S <- cov(x) else S <- cor(x)

    # the fitted hyperplane is normal to the eigenvector belonging to the
    # smallest eigenvalue of S (last column of v; current R's svd() has
    # no LINPACK argument)
    jaco <- svd(S)$v[, nc]

    mX <- colMeans(x)      # mean() no longer gives column means of a matrix
    sX <- apply(x, 2, sd)  # likewise for sd()

    Bcoeffs <- matrix(NA, nc, 1)
    if (method == 1) {
    Bcoeffs[2:nc, 1] <- -jaco[-dep] / jaco[dep]
    Bcoeffs[1, 1] <- mX[dep] + sum(jaco[-dep] / jaco[dep] * mX[-dep])
    } else {
    # back-transform the standardised slopes to the original scale
    Bcoeffs[2:nc, 1] <- -jaco[-dep] / jaco[dep] * sX[dep] / sX[-dep]
    Bcoeffs[1, 1] <- mX[dep] + sum(jaco[-dep] / jaco[dep] * mX[-dep] * sX[dep] / sX[-dep])
    }

    xx <- cbind(rep(1, nf), x[, -dep])
    Yh <- xx %*% Bcoeffs
    r2 <- 1 - sum((x[, dep] - Yh)^2) / sum((x[, dep] - mean(x[, dep]))^2)

    list(r2 = r2, Bcoeffs = Bcoeffs)
    }
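    A quick usage sketch on simulated (hypothetical) data, assuming the multireg2 function above has been sourced:

```r
# Simulated example: column 1 depends on columns 2 and 3.
set.seed(7)
x <- matrix(rnorm(300), 100, 3)
x[, 1] <- x[, 2] + 0.5 * x[, 3] + rnorm(100, sd = 0.2)

ma  <- multireg2(x, dep = 1, method = 1)   # major axis
rma <- multireg2(x, dep = 1, method = 2)   # reduced major axis
ma$Bcoeffs    # intercept, then slopes for columns 2 and 3
ma$r2
```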
