Column Vectors orthogonal implies Row Vectors also orthogonal?

Recall that two vectors are orthogonal if and only if their inner product is zero. You are incorrect in asserting that if the columns of $Q$ are orthogonal to each other then $QQ^T = I$: this follows if the columns of $Q$ form an orthonormal set (equivalently, an orthonormal basis for $\mathbb{R}^n$); orthogonality alone is not sufficient. Note that "$Q$ is an orthogonal matrix" is not equivalent to "the columns of $Q$ are pairwise orthogonal".

With that clarification, the answer is that if you only ask that the columns be pairwise orthogonal, then the rows need not be pairwise orthogonal. For example, take $$A = \left(\begin{array}{ccc}1& 0 & 0\\0& 0 & 1\\1 & 0 & 0\end{array}\right).$$ The columns are orthogonal to each other: the middle column is orthogonal to everything (being the zero vector), and the first and third columns are orthogonal. However, the rows are not orthogonal, since the first and third rows are equal and nonzero.
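This is easy to verify numerically; here is a quick sketch using NumPy (the Gram matrix $A^TA$ collects the pairwise inner products of the columns, and $AA^T$ those of the rows):

```python
import numpy as np

# The example matrix: the middle column is zero, and the first
# and third rows are equal and nonzero.
A = np.array([[1, 0, 0],
              [0, 0, 1],
              [1, 0, 0]])

# Off-diagonal entries of A^T A are the pairwise inner products of
# the columns; they are all zero, so the columns are orthogonal.
col_gram = A.T @ A
assert col_gram[0, 1] == col_gram[0, 2] == col_gram[1, 2] == 0

# The (0, 2) entry of A A^T is the inner product of rows 1 and 3;
# it is nonzero, so the rows are not pairwise orthogonal.
row_gram = A @ A.T
assert row_gram[0, 2] != 0
```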

On the other hand, if you require that the columns of $Q$ be an orthonormal set (pairwise orthogonal, and the inner product of each column with itself equals $1$), then it does follow: precisely as you argue. That condition is equivalent to "the matrix is orthogonal", and since $I = Q^TQ = QQ^T$ and $(Q^T)^T = Q$, it follows that if $Q$ is orthogonal then so is $Q^T$, hence the columns of $Q^T$ (i.e., the rows of $Q$) form an orthonormal set as well.
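For a concrete check of this fact, here is a small NumPy sketch using a rotation matrix, whose columns are orthonormal by construction (the angle $0.7$ is an arbitrary choice):

```python
import numpy as np

theta = 0.7  # arbitrary angle; any value works
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Q^T Q = I says the columns are orthonormal; since a one-sided
# inverse of a square matrix is a two-sided inverse, Q Q^T = I
# as well, i.e. the rows are orthonormal too.
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.allclose(Q @ Q.T, np.eye(2))
```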


Even if $A$ is non-singular, orthogonality of the columns by itself does not guarantee orthogonality of the rows. Here is a $3\times 3$ example: $$ A = \left( \begin{matrix} 1 & \;2 & \;\;5 \\ 2 & \;2 & -4 \\ 3 & -2 & \;\;1 \end{matrix} \right) $$ The column vectors are pairwise orthogonal, but the row vectors are not.
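The same Gram-matrix check applies here; a quick NumPy sketch:

```python
import numpy as np

A = np.array([[1,  2,  5],
              [2,  2, -4],
              [3, -2,  1]])

col_gram = A.T @ A  # diagonal: the columns are pairwise orthogonal
row_gram = A @ A.T  # not diagonal: the rows are not

# Columns orthogonal: A^T A equals its own diagonal part.
assert np.allclose(col_gram, np.diag(np.diag(col_gram)))
# Rows not orthogonal: A A^T has nonzero off-diagonal entries.
assert not np.allclose(row_gram, np.diag(np.diag(row_gram)))
```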

On the other hand, orthonormality of columns guarantees orthonormality of rows, and vice versa.

As a footnote, one form of Hadamard's inequality bounds the absolute value of the determinant of a matrix by the product of the norms of its column vectors, with equality exactly when those vectors are orthogonal. For the matrix above, since the columns are orthogonal, the bound is attained: the column norms are $\sqrt{14}$, $2\sqrt{3}$ and $\sqrt{42}$ respectively, their product is $84$, and indeed $\det(A) = -84$, so $84$ is the maximum possible absolute value of the determinant for column vectors with those norms.

Although $\det(A) = \det(A^T)$, Hadamard's inequality implies neither orthogonality of the rows of $A$ nor that the absolute value of the determinant is maximal for the given norms of the row vectors ($\sqrt{30}$, $2\sqrt{6}$ and $\sqrt{14}$ respectively; their product is $12\sqrt{70} \approx 100.4$).
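These numbers can be confirmed directly; here is a quick NumPy sketch of both sides of Hadamard's inequality for this matrix:

```python
import numpy as np

A = np.array([[1,  2,  5],
              [2,  2, -4],
              [3, -2,  1]])

det = np.linalg.det(A)                 # -84
col_norms = np.linalg.norm(A, axis=0)  # sqrt(14), 2*sqrt(3), sqrt(42)
row_norms = np.linalg.norm(A, axis=1)  # sqrt(30), 2*sqrt(6), sqrt(14)

# Columns orthogonal => Hadamard's bound is attained: |det A| = 84.
assert np.isclose(abs(det), np.prod(col_norms))
# Rows not orthogonal => the bound over row norms is strict:
# 84 < 12*sqrt(70), roughly 100.4.
assert abs(det) < np.prod(row_norms)
```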


This condition says that $Q^{-1} = Q^T$, which means that $$Q^TQ = QQ^T = I.$$
Yes, if the rows form an orthonormal set (equivalently, an orthonormal basis), then so do the columns.