Block-diagonalizing an antisymmetric matrix
Without knowing what level of understanding you're looking for, I'm going to respond with some high-level remarks. Feel free to ask for clarification if something strikes your interest.
A widely referenced work for eigenvalue and decomposition algorithms is Golub and Van Loan's Matrix Computations (3rd ed., 1996), Chapter 7 ("The Unsymmetric Eigenvalue Problem") in particular. In Sec. 7.4 you would read about the real Schur decomposition, in which a real matrix $A$ can be factored as $Q^T T Q$, where $Q$ is orthogonal and $T$ is quasi-triangular (block upper triangular with diagonal blocks that are either $1\times1$ or $2\times2$). Hereafter I'll refer to sections of this book by prefixing "GvL" to the section numbers.
In your case there is special structure in that $A$ is real(?) antisymmetric (or skew-symmetric, as many prefer), so its eigenvalues occur in purely imaginary conjugate pairs. Thus the matrix is orthogonally similar to a block diagonal form with $2\times2$ blocks like the following (plus a $1\times1$ zero block when the dimension is odd):
$$ \begin{pmatrix} 0 & \sigma_i \\ -\sigma_i & 0 \end{pmatrix} $$
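In fact, because a real skew-symmetric matrix is normal, its real Schur form (GvL Sec. 7.4 again) is, up to rounding, exactly this block diagonal form, so off-the-shelf software exposes it directly. A minimal check in Octave/Matlab, reusing the small test matrix that appears later in this answer:

A = [0, 2, -1; -2, 0, -4; 1, 4, 0];  % the 3x3 test matrix from below
[U, S] = schur(A);                   % real Schur form: A = U*S*U'
disp(S)                              % up to rounding, a 2x2 block
                                     % [0, s; -s, 0] with s = sqrt(21),
                                     % plus a 1x1 zero block (odd dimension)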
It is natural to ask if structure-preserving algorithms will have any special advantage in speed or accuracy, and the answer seems to be yes.
Note that the related matrix $iA$ is Hermitian, and by standard direct algorithms (GvL 8.3.1 and Problem P8.3.5 at the end of Sec. 8.3) is unitarily similar to a real tridiagonal matrix. This suggests there should exist a real orthogonal $Q$ such that $Q^T A Q = T$ is antisymmetric and tridiagonal (and thus has all zeros on the diagonal). Such a direct computation is the first problem in the course assignment linked in the code comments below (see also GvL Problem P8.3.8).
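As a quick concrete check of the Hermitian connection (again using the test matrix from below):

A = [0, 2, -1; -2, 0, -4; 1, 4, 0];
H = 1i*A;               % (1i*A)' = -1i*A' = -1i*(-A) = 1i*A, so H is Hermitian
norm(H - H', 'fro')     % exactly zero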
As it happens, M. Wimmer, in Efficient numerical computation of the Pfaffian for dense and banded skew-symmetric matrices (2011), connects the tridiagonalization of an even-dimensional real antisymmetric matrix with its ("real" Jordan) canonical form (Sec. I.B). Appendix A spells out the details of this connection, which requires the SVD of a bidiagonal matrix easily derived from $T$. Sec. II compares several ways of tridiagonalizing a real antisymmetric matrix, including via the Householder transformations required by the references above.
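I won't reproduce Wimmer's code, but on my reading the Appendix A connection amounts to roughly the following sketch (assuming an even dimension $n$ and the tridiagonal $T$ produced by the routine below): a perfect shuffle permutes $T$ into $\begin{pmatrix} 0 & B \\ -B^T & 0 \end{pmatrix}$ with $B$ bidiagonal, and the SVD of $B$ supplies the $\sigma_i$:

n = size(T,1);              % assume n even, T from SkewReduce below
p = [1:2:n, 2:2:n];         % perfect shuffle: odd indices, then even
Tp = T(p,p);                % Tp = [0, B; -B', 0] with B bidiagonal
B = Tp(1:n/2, n/2+1:n);     % the bidiagonal factor
[U, S, V] = svd(B);         % singular values of B are the sigma_i;
                            % blkdiag(U,V)'*Tp*blkdiag(U,V) = [0, S; -S, 0],
                            % which un-shuffles to the 2x2-block form above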
Here is my quick-and-dirty version of tridiagonalizing real antisymmetric matrices:
#{
Skew Symmetric Tridiagonalization
Real skew symmetric matrix A can be reduced by direct
orthogonal (Householder) transformations to a similar
tridiagonal skew symmetric matrix T.
The implementation below using Octave (Matlab) syntax is
motivated by this course assignment, though lacking its
thoughtful performance optimizations:
http://www.cs.cornell.edu/courses/CS6210/2010fa/A6/A6.pdf
See also Golub & van Loan (3rd ed.) as in my answer here:
http://math.stackexchange.com/a/167191/3111
Copyright Chip Eastham, July 2012, and dual licensed under
GNU GPLv3, see http://www.gnu.org/licenses/gpl-3.0.html
or CC-Wiki, see http://creativecommons.org/licenses/by-sa/3.0/
with attribution/share-alike required per StackExchange policy.
#}
function [Q, T] = SkewReduce (A)
# usage: [Q, T] = SkewReduce (A)
#
# A : real skew symmetric (square) matrix
# returning:
# Q : real orthogonal matrix s.t. A = Q*T*Q' where
# T : tridiagonal real skew symmetric matrix
  n = size(A,1);     % nothing to do if n <= 2 (loop below is empty)
  Q = eye(n);        % initialize to accumulate product of reflectors
  T = A;             % copy of A to be tridiagonalized
  for k = 1:n-2
    v = T(k+1:n,k);  % entries of column k below the diagonal
    r = norm(v);
    if r == 0        % column already in tridiagonal form; avoid 0/0
      continue
    endif
    v(1) = v(1) + r; % Householder vector taking v to -r*e1
    q = v' * v;
    P = blkdiag(eye(k), eye(n-k) - (2/q)*(v*v'));
    Q = Q*P;         % accumulate on the right so that A = Q*T*Q'
    T = P*T*P;       % because P is symmetric orthogonal, P' = P
  endfor
endfunction
A brief test case is this:
octave:3> A = [0,2,-1;-2,0,-4;1,4,0]
A =
0 2 -1
-2 0 -4
1 4 0
octave:4> [Q,T] = SkewReduce(A)
Q =
1.00000 0.00000 0.00000
0.00000 0.89443 -0.44721
0.00000 -0.44721 -0.89443
T =
0.00000 2.23607 -0.00000
-2.23607 -0.00000 4.00000
0.00000 -4.00000 0.00000
Note that this tridiagonalization does not depend on the matrix dimension being even.
The above procedure is equivalent to Hessenberg reduction: for a skew-symmetric matrix the Hessenberg form is automatically tridiagonal, so the same result may be achieved in Matlab or Octave using
[Q, T] = hess(A)
with A = Q*T*Q'.
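Either routine is easy to sanity-check numerically; both residuals below should be on the order of machine epsilon:

A = [0, 2, -1; -2, 0, -4; 1, 4, 0];
[Q, T] = SkewReduce(A);             % or [Q, T] = hess(A)
norm(A - Q*T*Q', 'fro')             % reconstruction residual
norm(Q'*Q - eye(size(A,1)), 'fro')  % orthogonality of Q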