Prescribing areas of parallelograms (or 2×2 principal minors)

If the matrix $(a^2_{ij})\_{i,j=1,\ldots,n}$ is nonsingular, the problem reduces to the condition that it has a single positive eigenvalue. In fact, we have the following for any nonzero $n\times n$ symmetric matrix $A$ with nonnegative entries and zero diagonal (I'll drop the square from $a_{ij}$, writing $A_{ij}$ in place of $a_{ij}^2$, as keeping it doesn't seem to help).

If there exist $\nu_i\in \mathbb{R}^n$ such that $$ \begin{align} A_{ij}=\lVert\nu_i\rVert^2\lVert\nu_j\rVert^2-\langle\nu_i,\nu_j\rangle^2&&{\rm(1)} \end{align} $$ then $A$ has a single positive eigenvalue (counting multiplicities).
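(For intuition: by Lagrange's identity, the right-hand side of (1) is the squared area of the parallelogram spanned by $\nu_i$ and $\nu_j$, so the claim is that any matrix of pairwise squared parallelogram areas has exactly one positive eigenvalue. Here is a quick numerical illustration of my own, not part of the argument:)

```python
import numpy as np

rng = np.random.default_rng(1)
V = rng.standard_normal((5, 3))   # rows are vectors nu_1, ..., nu_5 in R^3

# A_ij = ||nu_i||^2 ||nu_j||^2 - <nu_i, nu_j>^2  (squared parallelogram areas)
G = V @ V.T                       # Gram matrix of inner products <nu_i, nu_j>
A = np.outer(np.diag(G), np.diag(G)) - G**2

eigs = np.linalg.eigvalsh(A)
print(np.sum(eigs > 1e-9))        # -> 1 : exactly one positive eigenvalue
```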

Conversely, if $A$ is nonsingular and has a single positive eigenvalue, then there exist $\nu_i\in\mathbb{R}^n$ satisfying (1).

First, suppose that (1) holds. Then there exist nonnegative reals $\lambda_i$ and a positive semidefinite matrix $S$ such that $A_{ij}=\lambda_i\lambda_j(1-S_{ij}^2)$: simply take $\lambda_i=\lVert\nu_i\rVert^2$ and $S_{ij}=\langle\hat\nu_i,\hat\nu_j\rangle$, where $\hat\nu_i=1_{\lbrace\nu_i\not=0\rbrace}\nu_i/\lVert\nu_i\rVert$. Let $\lambda=(\lambda_i)\_{i=1,\ldots,n}\in\mathbb{R}^n$. Using the fact that the componentwise square of a positive semidefinite matrix is itself positive semidefinite (the Schur product theorem), $$ x^{\rm T} A x= \langle x,\lambda\rangle^2-\sum_{ij}(\lambda_ix_i)S_{ij}^2(\lambda_jx_j)\le\langle x,\lambda\rangle^2. $$ In particular, $x^{\rm T}A x\le0$ for all $x$ orthogonal to $\lambda$. So, the space spanned by eigenvectors with positive eigenvalues cannot contain any nonzero vector orthogonal to $\lambda$ (on that space, $x^{\rm T}Ax>0$ for $x\not=0$), and hence has dimension at most one, since any subspace of dimension two or more meets the hyperplane $\lambda^\perp$ in a nonzero vector. On the other hand, as $A$ is nonzero with zero trace, it has at least one positive eigenvalue.
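This step can be sanity-checked numerically. The following sketch (my own illustration, using numpy) generates random $\nu_i$, forms $\lambda$ and $S$ as above, and verifies both the Schur-product fact and the resulting bound on the quadratic form:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
V = rng.standard_normal((n, 4))          # rows are the vectors nu_i
norms = np.linalg.norm(V, axis=1)
lam = norms**2                           # lambda_i = ||nu_i||^2
Vhat = V / norms[:, None]                # unit vectors hat nu_i
S = Vhat @ Vhat.T                        # S_ij = <hat nu_i, hat nu_j>, PSD
A = np.outer(lam, lam) * (1 - S**2)      # the matrix in (1)

# Schur product theorem: the entrywise square S∘S is again PSD
assert np.linalg.eigvalsh(S**2).min() > -1e-9

# hence x^T A x = <x,lam>^2 - (lam∘x)^T (S∘S)(lam∘x) <= <x,lam>^2
for _ in range(100):
    x = rng.standard_normal(n)
    assert x @ A @ x <= (x @ lam)**2 + 1e-9
```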

Conversely, suppose that $A$ is nonsingular and has a single positive eigenvalue. Diagonalization gives $$ A=u u^{\rm T}-\sum_{\alpha=1}^{n-1} v_{\alpha}v_{\alpha}^{\rm T} $$ for nonzero orthogonal $u,v_\alpha\in\mathbb{R}^n$. As the diagonal of $A$ is zero, $$ u_i^2=\sum_\alpha v_{\alpha,i}^2. $$ Using Cauchy–Schwarz, $$ A_{ij}\le u_iu_j+\sqrt{\sum_\alpha v_{\alpha,i}^2\sum_\beta v_{\beta,j}^2} =u_iu_j+\vert u_iu_j\vert. $$ So the $u_i$ are all nonzero: if $u_i=0$ then row $i$ of $A$ would have no positive entries, hence would vanish (the entries being nonnegative), contradicting nonsingularity. Furthermore, $A_{ij}>0$ forces $u_iu_j>0$. Writing $P=\lbrace i=1,2,\ldots,n\colon u_i > 0\rbrace$, we therefore have $A_{ij}=0$ for $i\in P$ and $j\not\in P$. Breaking $A$ into two diagonal blocks, with row and column indices respectively in $P$ and not in $P$, reduces the problem to the case where the $u_i$ all have the same sign. W.l.o.g., take $u_i > 0$.

For $1 > \epsilon > 0$, define the matrix $$ \begin{align} S_{ij}&=\sqrt{1-\epsilon u_i^{-1}u_j^{-1} A_{ij}}\cr &=1-\frac12\epsilon u_i^{-1}u_j^{-1} A_{ij}+O(\epsilon^2)\cr &=1-\epsilon/2+\frac\epsilon2\sum_{\alpha}u_i^{-1}u_j^{-1}v_{\alpha,i}v_{\alpha,j}+O(\epsilon^2). \end{align} $$ As the vectors $u,v_{\alpha}$ are linearly independent, the same is true of the $n$ vectors $\tilde u=(1,1,\ldots,1)$ and $\tilde v_\alpha=(u_1^{-1}v_{\alpha,1},\ldots,u_n^{-1}v_{\alpha,n})$. Then, $$ x^{\rm T}Sx=(1-\epsilon)\langle\tilde u,x\rangle^2+\frac\epsilon2\left(\langle\tilde u,x\rangle^2+\sum_\alpha\langle\tilde v_\alpha,x\rangle^2\right)+O(\epsilon^2\lVert x\rVert^2), $$ and the bracketed quadratic form is positive definite (the $\tilde u,\tilde v_\alpha$ span $\mathbb{R}^n$), so it dominates the $O(\epsilon^2)$ term. Hence $S$ is positive definite for all sufficiently small $\epsilon$, and (by Gram–Schmidt, for example) there are $\hat\nu_i\in\mathbb{R}^n$ with $S_{ij}=\langle\hat\nu_i,\hat\nu_j\rangle$. Setting $\nu_i=\epsilon^{-1/4}u_i^{1/2}\hat\nu_i$ gives (1): since $S_{ii}=1$, we have $\lVert\nu_i\rVert^2=\epsilon^{-1/2}u_i$, so $$ \lVert\nu_i\rVert^2\lVert\nu_j\rVert^2-\langle\nu_i,\nu_j\rangle^2=\epsilon^{-1}u_iu_j(1-S_{ij}^2)=A_{ij}. $$
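This construction can also be carried out numerically. The sketch below is my own illustration: the test matrix is generated from random vectors merely to have a valid input, the positive part $uu^{\rm T}$ is read off the spectral decomposition, and an adaptive halving stands in for "$\epsilon$ small enough":

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

# A valid test input: a matrix built from random vectors is symmetric, has
# zero diagonal and nonnegative entries, and (generically) is nonsingular
# with a single positive eigenvalue.
V = rng.standard_normal((n, n))
G = V @ V.T
A = np.outer(np.diag(G), np.diag(G)) - G**2

# u u^T is the positive-eigenvalue part of the spectral decomposition
# (eigh returns eigenvalues in ascending order, so the last is the positive one).
w, Q = np.linalg.eigh(A)
u = np.sqrt(w[-1]) * Q[:, -1]
if u.sum() < 0:
    u = -u              # fix the overall sign; generically u is then positive
assert u.min() > 0

# Halve epsilon until S_ij = sqrt(1 - eps * A_ij / (u_i u_j)) is positive definite.
eps = 0.5               # safe start: A_ij <= 2 u_i u_j keeps the sqrt real
for _ in range(40):
    S = np.sqrt(1 - eps * A / np.outer(u, u))
    ws, Qs = np.linalg.eigh(S)
    if ws.min() > 1e-12:
        break
    eps /= 2

# Factor S = H H^T (rows of H play the role of hat nu_i), then rescale.
H = Qs * np.sqrt(ws)
nu = eps**-0.25 * np.sqrt(u)[:, None] * H   # nu_i = eps^{-1/4} u_i^{1/2} hat nu_i

# Verify (1): A_ij = ||nu_i||^2 ||nu_j||^2 - <nu_i, nu_j>^2
Gnu = nu @ nu.T
A_rec = np.outer(np.diag(Gnu), np.diag(Gnu)) - Gnu**2
assert np.allclose(A_rec, A, atol=1e-8)
```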


That concludes the case where $A$ is nonsingular. The singular case is, I think, considerably more complicated.