A^k (matrix power) in R?

If A is diagonalizable, you could use an eigenvalue decomposition:

matrix.power <- function(A, n) {   # only works for diagonalizable matrices
   e <- eigen(A)
   M <- e$vectors   # matrix for changing basis
   d <- e$values    # eigenvalues
   return(M %*% diag(d^n) %*% solve(M))   # A^n = M D^n M^{-1}
}
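
As a quick sanity check (my own addition, using an arbitrary diagonalizable example matrix), the result can be compared with direct repeated multiplication; note that if A has complex eigenvalues the result comes back as a complex matrix with negligible imaginary parts, which you may want to strip with Re():

A <- matrix(c(2, 1, 0, 3), 2, 2)   # diagonalizable: eigenvalues 2 and 3
matrix.power(A, 3)
A %*% A %*% A                      # should agree up to rounding error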

When A is not diagonalizable, the matrix M of eigenvectors returned by eigen() is singular. For example, calling the function with A = matrix(c(0,1,0,0),2,2) gives Error in solve.default(M) : system is computationally singular.
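
To illustrate with that matrix (a nilpotent example), the power itself is still well defined and easy to get by direct multiplication; the for-loop and expm approaches further down also handle non-diagonalizable matrices:

A <- matrix(c(0, 1, 0, 0), 2, 2)   # not diagonalizable
# matrix.power(A, 2)               # errors: the eigenvector matrix M is singular
A %*% A                            # direct multiplication still works (the zero matrix here)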


The expm package has an %^% operator:

library("sos")
findFn("{matrix power}")
install.packages("expm")
library("expm")
?matpow
set.seed(10);t(matrix(rnorm(16),ncol=4,nrow=4))->a
a%^%8
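
As a check (my addition, not part of the original answer), the operator's result can be compared against chaining %*% by hand:

all.equal(a %^% 8, Reduce("%*%", rep(list(a), 8)))   # expected TRUE (up to rounding)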

Although Reduce is more elegant, a for-loop solution is faster and seems to be roughly as fast as expm's %^%:

m1 <- matrix(1:9, 3)
mlist <- list(m1, m1, m1)
m0 <- diag(1, nrow = 3, ncol = 3)

system.time(replicate(1000, Reduce("%*%", mlist)))
#   user  system elapsed 
#  0.026   0.000   0.037 
system.time(replicate(1000, for (i in 1:3) { m0 <- m0 %*% m1 }))
#   user  system elapsed 
#  0.013   0.000   0.014 

library(expm)   # expm depends on (and will also load) the Matrix package
system.time(replicate(1000, m0 %^% 3))
#   user  system elapsed 
#  0.011   0.000   0.017 

On the other hand, the matrix.power solution is much, much slower:

system.time(replicate(1000, matrix.power(m1, 4)))
#   user  system elapsed 
#  0.677   0.013   1.037 

@BenBolker is correct (yet again): the for-loop's run time appears to grow linearly with the exponent, whereas expm's %^% appears to scale even better than log(exponent).

> m0 <- diag(1, nrow = 3, ncol = 3)
> system.time(replicate(1000, for (i in 1:400) { m0 <- m0 %*% m1 }))
   user  system elapsed 
  0.678   0.037   0.708 
> system.time(replicate(1000, m0 %^% 400))
   user  system elapsed 
  0.006   0.000   0.006 
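
The roughly logarithmic scaling is what you would expect if %^% computes the power by repeated squaring (binary exponentiation), which needs only about log2(k) matrix multiplications instead of k - 1. A plain-R sketch of that idea (mat_pow_sq is my own name, not the package's implementation):

mat_pow_sq <- function(A, k) {
  out  <- diag(1, nrow(A))                 # start from the identity (covers k = 0)
  base <- A
  while (k > 0) {
    if (k %% 2 == 1) out <- out %*% base   # this bit of k is set: multiply it in
    base <- base %*% base                  # square for the next bit
    k <- k %/% 2
  }
  out
}

all.equal(mat_pow_sq(m1, 5), m1 %*% m1 %*% m1 %*% m1 %*% m1)   # expected TRUE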

Tags: r, matrix