A matrix involving distances of $n$ points in $\mathbb{R}^3$
$\def\RR{\mathbb{R}}$Numerical experiments suggest that this matrix is negative semidefinite on the hyperplane $\sum v_i=0$. Specifically, I generated 20 sets of 10 points each, drawn uniformly from the unit cube, and this was true every time. I repeated the experiment with points in 2, 4, and 5 dimensions, and the same held.
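For concreteness, here is a minimal sketch of this kind of check (Python/NumPy; an illustration, not the exact script used). It compresses the distance matrix by the orthogonal projector onto the hyperplane $\sum v_i = 0$ and looks at the largest eigenvalue:

```python
# Illustrative check: v^T M v <= 0 for all v with sum(v) = 0
# iff P M P is negative semidefinite, where P projects onto that hyperplane.
import numpy as np

rng = np.random.default_rng(0)

def max_eig_on_hyperplane(points):
    n = len(points)
    M = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)  # M[i, j] = |x_i - x_j|
    P = np.eye(n) - np.ones((n, n)) / n         # orthogonal projector onto sum(v) = 0
    return np.linalg.eigvalsh(P @ M @ P).max()  # should never be significantly above 0

for d in (2, 3, 4, 5):
    for _ in range(20):
        pts = rng.random((10, d))               # 10 points uniform in the unit cube
        assert max_eig_on_hyperplane(pts) < 1e-9
print("all trials passed")
```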
I am reminded of this answer by Noam Elkies but can't make a precise connection.
Switching this answer to CW to write up darij's proof from the comments. We will show that:
1. If $x_i$ are any $n$ points in $\RR^d$ and $v_i$ are any scalars with $\sum v_i=0$, then $\sum v_i v_j |x_i-x_j| \leq 0$, and
2. if the $x_i$ are distinct and the $v_i$ are not all $0$, we have strict inequality.
The latter shows that the matrix $\left[ |x_i-x_j| \right]$ times any such nonzero vector $\left[ v_i \right]$ is nonzero; in other words, the matrix has trivial kernel on the hyperplane $\sum v_i = 0$. We start, however, by proving the former. We turn to the issue of when zero occurs after the horizontal line.
We start with the averaging trick: by rotational and scaling invariance, there is some positive constant $c$ (depending only on the dimension $d$) such that $$\int_{|w|=1} \left| w \cdot x \right| \, dw = c |x|.$$ So $$\sum v_i v_j |x_i-x_j| = c^{-1} \int_{|w|=1} \sum v_i v_j \left| w \cdot (x_i-x_j) \right| \, dw,$$ and thus it is sufficient to show that $\sum v_i v_j \left| w \cdot (x_i-x_j) \right|\leq 0$ for each fixed unit vector $w$. Now, $w \cdot (x_i-x_j)$ only depends on the orthogonal projections of $x_i$ and $x_j$ onto the line $\RR w$, so we may (and do) assume all the $x_i$ lie on a line. Our goal is now to show, for any $n$ values $x_i \in \RR$, that $\sum v_i v_j |x_i-x_j| \leq 0$.
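As a quick sanity check of the averaging identity (an illustration only, not part of the proof), one can estimate the average of $|w \cdot x|$ over unit directions $w$ and see that it is proportional to $|x|$, with the same ratio for every $x$ (about $1/2$ in $\RR^3$):

```python
# Monte Carlo estimate of the average of |w . x| over random unit directions w;
# the ratio (average)/|x| should be the same for every x.
import numpy as np

rng = np.random.default_rng(1)

def avg_abs_projection(x, samples=200_000):
    w = rng.standard_normal((samples, len(x)))
    w /= np.linalg.norm(w, axis=1, keepdims=True)     # uniform random directions
    return np.abs(w @ x).mean()

for x in (np.array([1.0, 0.0, 0.0]), np.array([3.0, -4.0, 12.0])):
    print(avg_abs_projection(x) / np.linalg.norm(x))  # both ratios come out near 0.5
```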
We have $|z| = \max(z,0) + \max(-z,0)$, so $\sum v_i v_j |x_i-x_j|=2 \sum v_i v_j \max(x_i-x_j,0)$ (the two terms give the same sum after swapping $i$ and $j$). We use the notation $\left[ \mbox{statement} \right]$ to mean $1$ if the statement is true and $0$ if it is false. So $$\max(x_i-x_j,0) = \int_{t \in \RR} \left[x_j < t < x_i \right] dt$$ and $$\sum_{i,j} v_i v_j \max(x_i-x_j,0) = \int_{t \in \RR} \sum_{i,j} v_i v_j \left[x_j < t < x_i \right] dt.$$

So it is enough to show that, for any $t$, we have $$\sum_{x_j < t < x_i} v_i v_j \leq 0 . $$ Let $I = \{ i : x_i > t \}$ and $J = \{ j : x_j < t \}$. (For almost all $t$, none of the $x_i$ equal $t$, so we can disregard the boundary case.) Then $$\sum_{x_j < t < x_i} v_i v_j = \sum_{i \in I,\ j \in J} v_i v_j = \left( \sum_{i \in I} v_i \right) \left( \sum_{j \in J} v_j \right) = - \left(\sum_{i \in I} v_i \right)^2 \leq 0 .$$ The final equality is where we use the hypothesis $\sum v_k=0$: for such $t$, every index lies in $I$ or $J$, so $\sum_{j \in J} v_j = -\sum_{i \in I} v_i$.
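Here is a small numerical illustration of this one-dimensional slicing argument (a sketch, with helper names of my own choosing). Integrating the last display over $t$ gives $\sum v_i v_j |x_i-x_j| = -2\int \left(\sum_{x_i > t} v_i\right)^2 dt$, and the integral is an explicit finite sum over the gaps between consecutive sorted points:

```python
# Check that the direct quadratic form equals
# -2 * sum over gaps of (gap length) * (tail sum of v)^2,
# hence is <= 0, whenever sum(v) = 0 and the x_i are real numbers.
import numpy as np

rng = np.random.default_rng(2)

def direct(x, v):
    return float(v @ np.abs(x[:, None] - x[None, :]) @ v)

def via_slices(x, v):
    order = np.argsort(x)
    x, v = x[order], v[order]
    tail = np.cumsum(v[::-1])[::-1][1:]    # sum of v_i over points lying above each gap
    gaps = np.diff(x)                      # lengths of the intervals between consecutive points
    return float(-2.0 * np.sum(gaps * tail**2))

for _ in range(5):
    x = rng.random(8)
    v = rng.standard_normal(8)
    v -= v.mean()                          # enforce sum(v) = 0
    print(direct(x, v), via_slices(x, v))  # the two values agree, and both are <= 0
```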
---

Now, let's consider what happens for distinct $x_i$. As long as the $x_i$ are distinct, for almost all $w$, the orthogonal projections of the $x_i$ onto $\RR w$ remain distinct. Since the integrand in $w$ is nonpositive, the only way the integral can be $0$ is for the integrand to be $0$ for almost all such $w$. Fix such a $w$ and relabel so that the orthogonal projections are ordered $x_1 < x_2 < \cdots < x_n$. Once again, we must get $0$ from almost every choice of $t$, so $\left( \sum_{i=1}^k v_i \right)^2 =0$ for all $1 \leq k \leq n-1$. Combined with $\sum_{i=1}^n v_i = 0$, this means that all the $v_i$ are zero.
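Numerically, the strict inequality corresponds to the compressed matrix $PMP$ from the first sketch having only strictly negative eigenvalues apart from the trivial zero coming from the all-ones direction (again just an illustration):

```python
# For distinct points, every eigenvalue of P M P except the trivial zero
# (eigenvector = all-ones vector, which P kills) should be strictly negative.
import numpy as np

rng = np.random.default_rng(3)

pts = rng.random((10, 3))                 # 10 (almost surely distinct) points in R^3
M = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
P = np.eye(10) - np.ones((10, 10)) / 10
print(np.sort(np.linalg.eigvalsh(P @ M @ P)))  # largest ~ 0, all others < 0
```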