Is a matrix multiplied with its transpose something special?
The main thing is presumably that $AA^T$ is symmetric. Indeed, $(AA^T)^T=(A^T)^TA^T=AA^T$. For symmetric matrices one has the Spectral Theorem, which says that there is an orthonormal basis of eigenvectors and every eigenvalue is real.
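As a minimal numpy sketch (the matrix $A$ below is random, purely for illustration), `np.linalg.eigh` applied to $AA^T$ returns real eigenvalues and an orthonormal eigenbasis, exactly as the Spectral Theorem promises:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
B = A @ A.T                             # symmetric 4x4 matrix

eigvals, eigvecs = np.linalg.eigh(B)    # eigh assumes symmetric/Hermitian input
print(np.allclose(B, B.T))              # True: B is symmetric
print(eigvals)                          # all real (and non-negative up to rounding)
print(np.allclose(eigvecs.T @ eigvecs, np.eye(4)))   # True: orthonormal eigenbasis
```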
Moreover, if $A$ is invertible, then $AA^T$ is also positive definite: for every $x\neq 0$ we have $A^Tx\neq 0$, hence $$x^TAA^Tx=(A^Tx)^T(A^Tx)=\|A^Tx\|^2> 0.$$
Then we have: a matrix is positive definite if and only if it is the Gram matrix of a linearly independent set of vectors.
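A quick numerical illustration of the positive-definiteness claim (a sketch only; the square $A$ below is random, and such a matrix is invertible with probability one):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))   # random square A: invertible almost surely
B = A @ A.T

# x^T B x = ||A^T x||^2 > 0 for every nonzero x, so every eigenvalue is positive
print(np.linalg.eigvalsh(B).min() > 0)   # True
# equivalently, a Cholesky factorization exists exactly for positive definite matrices
L = np.linalg.cholesky(B)
print(np.allclose(L @ L.T, B))           # True
```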
Last but not least, if one is interested in how much the linear map represented by $A$ changes the norm of a vector, one can compute
$$\sqrt{\left<Ax,Ax\right>}=\sqrt{\left<A^TAx,x\right>}$$
which, for an eigenvector $x$ of $A^TA$ with eigenvalue $\lambda$, simplifies to
$$\sqrt{\left<Ax,Ax\right>}=\sqrt \lambda\sqrt{\left<x,x\right>}.$$
The determinant of $A^TA$ is just the product of these eigenvalues.
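Both observations are easy to check numerically; here is a small sketch (the matrix $A$ is random and chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))
M = A.T @ A

eigvals, eigvecs = np.linalg.eigh(M)
lam = eigvals[-1]
x = eigvecs[:, -1]                      # eigenvector of A^T A for eigenvalue lam

# ||Ax|| = sqrt(lambda) * ||x||
print(np.isclose(np.linalg.norm(A @ x), np.sqrt(lam) * np.linalg.norm(x)))   # True
# det(A^T A) is the product of its eigenvalues
print(np.isclose(np.linalg.det(M), np.prod(eigvals)))                        # True
```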
$AA^T$ is positive semi-definite, and in the case where $A$ is a single column, $AA^T$ is a rank-1 matrix with only one non-zero eigenvalue, which equals the scalar $A^TA$; its corresponding eigenvector is $A$ itself. The remaining eigenvectors (for eigenvalue $0$) are the vectors $v$ in the null space of $A^T$, i.e. those with $v^TA = 0$.
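A sketch of this rank-one case, with $A$ taken to be a single $3\times 1$ column (the particular entries are arbitrary):

```python
import numpy as np

A = np.array([[1.0], [2.0], [3.0]])     # A is a single column
B = A @ A.T                             # 3x3, positive semi-definite, rank 1

print(np.linalg.matrix_rank(B))         # 1
eigvals, eigvecs = np.linalg.eigh(B)
print(np.isclose(eigvals[-1], (A.T @ A).item()))    # nonzero eigenvalue is A^T A = 14
v = eigvecs[:, -1]                      # unit eigenvector for that eigenvalue
print(np.isclose(abs(v @ A.ravel()), np.linalg.norm(A)))   # True: v is parallel to A
# the other two eigenvalues are 0; their eigenvectors satisfy v^T A = 0
print(np.allclose(eigvecs[:, :2].T @ A, 0))          # True
```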
Indeed, independent of the size of $A$, there is a useful relation between the eigenvectors of $AA^T$ and those of $A^TA$, based on the property that $\operatorname{rank}(AA^T)=\operatorname{rank}(A^TA)$. Equal rank implies that the number of non-zero eigenvalues (counted with multiplicity) is the same. Moreover, we can obtain the eigenvectors of $AA^T$ from those of $A^TA$ and vice versa. The eigenvector decomposition of $AA^T$ is given by $AA^Tv_i = \lambda_i v_i$. If $A$ is not square and $AA^T$ is too large to eigendecompose efficiently (as frequently happens in covariance matrix computations), it is easier to compute the eigenvectors of $A^TA$, given by $A^TAu_i = \lambda_i u_i$. Pre-multiplying both sides of this equation by $A$ yields
$$AA^T(Au_i)=\lambda_i\,(Au_i).$$
So the originally sought eigenvectors $v_i$ of $AA^T$ can be obtained as $v_i:=Au_i$ (for $\lambda_i\neq 0$). Note that the resulting eigenvectors are not yet normalized.
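The whole procedure fits in a few lines of numpy; the shapes below (a tall, thin $A$, as in a covariance-style computation) are only an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((1000, 20))     # e.g. 20 observations in dimension 1000
big = A @ A.T                           # 1000 x 1000: expensive to eigendecompose
small = A.T @ A                         # 20 x 20: cheap

lam, U = np.linalg.eigh(small)          # A^T A u_i = lam_i u_i
V = A @ U                               # v_i := A u_i, unnormalized eigenvectors of A A^T
V = V / np.linalg.norm(V, axis=0)       # normalize each column

# check: A A^T v_i = lam_i v_i for all the (here: nonzero) eigenvalues
print(np.allclose(big @ V, V * lam))    # True
```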
One could name some further properties; for instance, if $B=AA^T$, then
$$B^T=(AA^T)^T=(A^T)^TA^T=AA^T=B,$$
so
$$\langle v,Bw\rangle=\langle Bv,w\rangle=\langle A^Tv,A^Tw\rangle.$$
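A quick numerical sanity check of these two identities (with $A$, $v$, $w$ random, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 3))
B = A @ A.T
v, w = rng.standard_normal(4), rng.standard_normal(4)

print(np.isclose(v @ (B @ w), (B @ v) @ w))            # <v, Bw> = <Bv, w>
print(np.isclose(v @ (B @ w), (A.T @ v) @ (A.T @ w)))  # ... = <A^T v, A^T w>
```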