Is there a 3-dimensional "matrix" by "matrix" product?

The general procedure is called tensor contraction. Concretely it's given by summing over various indices. For example, just as ordinary matrix multiplication $C = AB$ is given by

$$c_{ij} = \sum_k a_{ik} b_{kj}$$

we can contract by summing across any index. For example, we can write

$$c_{ijlm} = \sum_k a_{ijk} b_{klm}$$

which gives a $4$-tensor ("$4$-dimensional matrix") rather than a $3$-tensor. One can also contract twice, for example

$$c_{il} = \sum_{j,k} a_{ijk} b_{kjl}$$

which gives a $2$-tensor.
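To make the index bookkeeping concrete, the double contraction above can be computed by a direct loop over both repeated indices. A sketch in MATLAB/Octave, with arbitrary dimension sizes:

```matlab
% double contraction: c_il = sum over j,k of a_ijk * b_kjl
ni=2; nj=3; nk=4; nl=5;          % arbitrary dimension sizes
a = randn(ni,nj,nk);
b = randn(nk,nj,nl);
c = zeros(ni,nl);
for i = 1:ni
    for l = 1:nl
        for j = 1:nj
            for k = 1:nk
                c(i,l) = c(i,l) + a(i,j,k)*b(k,j,l);
            end
        end
    end
end
```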

The abstract details shouldn't matter much unless you explicitly want to implement mixed variance, which, as far as I know, nobody who writes algorithms for manipulating matrices does.


Sorry to revive the thread, but what I found might answer the original question and help others who stumble into this in the future. This came up for me when I wanted to avoid for-loops and instead do one big multiplication on 3-D arrays.

So first, let's look at how matrix multiplication works. Say you have A[m,n] and B[n,p]. One requirement is that the number of columns of A must match the number of rows of B. Then all you do is iterate over the rows of A (i), the columns of B (j), and the common dimension of both (k) (MATLAB/Octave example):

m=2;n=3;p=4;A=randn(m,n);B=randn(n,p);
C=zeros(m,p);
for i = 1:m
    for j = 1:p
        for k = 1:n
             C(i,j) = C(i,j) + A(i,k)*B(k,j);   
        end
    end
end
C-A*B %to check the code: should be zeros, up to floating-point round-off

So the common dimension n got "contracted" I believe (Qiaochu Yuan's answer made so much sense once I started coding it).

Now, suppose you want something similar to happen in the 3-D case, i.e. one common dimension to contract. Assume you have A[l,m,n] and B[n,p,q]. The requirement on the common dimension is still there: the last dimension of A must equal the first dimension of B. Then, informally (this is just one way to do it that makes sense to me, no deeper foundation claimed), n cancels in L×M×N × N×P×Q and what you get is L×M×P×Q. The result is not even the same kind of creature: it is not 3-dimensional, it grew to 4-D (just as Qiaochu Yuan pointed out, by the way). But how would you compute it? Just append two more for-loops to iterate over the new dimensions:

l=5;m=2;n=3;p=4;q=6;A=randn(l,m,n);B=randn(n,p,q);
C=zeros(l,m,p,q);
for h = 1:l
    for i = 1:m
        for j = 1:p
            for g = 1:q
                for k = 1:n
                    C(h,i,j,g) = C(h,i,j,g) + A(h,i,k)*B(k,j,g);
                end
            end
        end
    end
end

At the heart of it, it is still a row-by-column kind of operation (hence only one dimension "contracts"), just over more data.
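Incidentally, the five loops can be collapsed into a single ordinary matrix product by flattening the non-contracted dimensions with reshape. A sketch, relying on MATLAB's column-major element ordering, with the same sizes as above:

```matlab
l=5; m=2; n=3; p=4; q=6;
A = randn(l,m,n); B = randn(n,p,q);

% flatten A to (l*m)-by-n and B to n-by-(p*q), do one matrix product,
% then fold the result back into an l-by-m-by-p-by-q array
C = reshape(reshape(A, l*m, n) * reshape(B, n, p*q), [l, m, p, q]);
```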

Now, my real problem was actually A[m,n] times B[n,p,q], where the operands come from different worlds (2-D times 3-D), but it is doable nonetheless (after all, matrix times vector is 2-D times 1-D). So for me the result is C[m,p,q]:

m=2;n=3;p=4;q=5;A=randn(m,n);B=randn(n,p,q);
C=zeros(m,p,q);Ct=C;
for i = 1:m
    for j = 1:p
        for g = 1:q
            for k = 1:n
                C(i,j,g) = C(i,j,g) + A(i,k)*B(k,j,g);
            end
        end
    end
end

which checks out against computing each slice with an ordinary matrix product:

for j = 1:p
    for g = 1:q
        Ct(:,j,g) = A*B(:,j,g); %"true", but still uses for-loops
    end
end
C-Ct %should be zeros, up to round-off

but it doesn't achieve my initial goal of just calling some built-in MATLAB function to do the work for me. Still, it was fun to play with this.
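For completeness: the same reshape trick handles the 2-D-times-3-D case in one line, and newer MATLAB releases do ship a built-in for exactly this contraction, tensorprod (introduced around R2022a, I believe; check your version's docs):

```matlab
m=2; n=3; p=4; q=5;
A = randn(m,n); B = randn(n,p,q);

% contract n by flattening B's trailing dimensions into one column block
C = reshape(A * reshape(B, n, p*q), [m, p, q]);

% on recent MATLAB versions, tensorprod contracts dimension 2 of A
% with dimension 1 of B and returns the same m-by-p-by-q array:
% C2 = tensorprod(A, B, 2, 1);
```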