Requirements on fields for determinants to detect dependence.
In any field, the determinant is non-zero if and only if the matrix is invertible (which is true if and only if the columns are linearly independent).
One direction is straightforward: if $A$ has determinant zero, then $\det(AB) = \det(A)\det(B)$ is zero for every matrix $B$, so no $B$ can be an inverse; we can never have $\det(AB) = \det(I) = 1$.
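This direction can be checked numerically. The sketch below (with hypothetical helper names `det2` and `matmul2`) picks a $2 \times 2$ matrix with dependent columns and an arbitrary candidate inverse, and confirms that multiplicativity of the determinant forces $\det(AB) = 0$:

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def matmul2(a, b):
    """Product of two 2x2 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [2, 4]]   # columns are dependent, so det(A) = 0
B = [[5, 7], [1, 3]]   # an arbitrary candidate "inverse"

assert det2(A) == 0
# det(AB) = det(A) * det(B) = 0, which can never equal det(I) = 1:
assert det2(matmul2(A, B)) == det2(A) * det2(B) == 0
```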
For the other direction, consider the formula $$ A \operatorname{adj}(A) = \det(A)I. $$ Because this identity holds over the integers and requires no division, it also holds over any field (indeed, over any commutative ring). If $\det(A)$ is non-zero, then it has a multiplicative inverse, so we have $$ A \frac{\operatorname{adj}(A)}{\det(A)} = I, $$ which means that $A$ has inverse $\frac{\operatorname{adj}(A)}{\det(A)}$.
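A minimal sketch of both steps for a $2 \times 2$ matrix (helper names `det2`, `adj2`, `matmul2` are illustrative): the adjugate identity is verified with integer arithmetic alone, and then dividing by the determinant, which a field allows, produces the inverse:

```python
from fractions import Fraction

def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def adj2(m):
    """Adjugate of a 2x2 matrix: swap the diagonal, negate the off-diagonal."""
    return [[m[1][1], -m[0][1]], [-m[1][0], m[0][0]]]

def matmul2(a, b):
    """Product of two 2x2 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[3, 1], [4, 2]]          # det(A) = 2, non-zero
d = det2(A)

# A * adj(A) = det(A) * I holds without any division:
assert matmul2(A, adj2(A)) == [[d, 0], [0, d]]

# In a field we may divide by det(A); adj(A)/det(A) is then the inverse:
inv = [[Fraction(x, d) for x in row] for row in adj2(A)]
assert matmul2(A, inv) == [[1, 0], [0, 1]]
```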
The determinant always detects dependence, yes. To see this, note that a linear dependence among the columns of a matrix $A$ is equivalent to a non-zero vector in the kernel of $A$.
It is now not hard to prove that a matrix has determinant zero if and only if its kernel contains a non-zero element. If the determinant is non-zero, then the matrix is invertible (via its adjugate) and hence has trivial kernel. If the determinant is zero, we use the fact that over a field the matrix can be brought into reduced row echelon form, which then also has determinant zero (elementary row operations change the determinant only by non-zero factors). Since this is a triangular matrix with a zero on the diagonal, it is easy to exhibit a non-zero element of the kernel.
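A concrete sketch of the second case over the field $\mathbb{F}_p$ (here $p = 5$ is an illustrative choice): a matrix with zero determinant is row-reduced by hand, the free variable is set to $1$, and the resulting non-zero vector is verified to lie in the kernel:

```python
p = 5
A = [[1, 2], [2, 4]]            # det = 1*4 - 2*2 = 0 (mod 5)

# One elimination step, R2 <- R2 - 2*R1, kills the second row, so the
# echelon form is [[1, 2], [0, 0]] and x2 is a free variable.
# Back-substitution with x2 = 1 gives x1 = -2 = 3 (mod 5).
x = [(-2) % p, 1]

# Verify that x is a non-zero kernel vector: A x = 0 over F_5.
assert any(xi % p != 0 for xi in x)
assert all(sum(A[i][j] * x[j] for j in range(2)) % p == 0 for i in range(2))
```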
In fact, one can show over every commutative ring that the kernel of a matrix is trivial if and only if its determinant is neither zero nor a zero-divisor, but the proof gets more complicated as soon as we lose the integral-domain property (and with it the chance to work in the quotient field).
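A hypothetical illustration over the ring $\mathbb{Z}/6\mathbb{Z}$, which is not an integral domain: the matrix below has determinant $2$, a zero-divisor since $2 \cdot 3 = 0 \pmod 6$, and accordingly its kernel is non-trivial even though the determinant is non-zero:

```python
n = 6
A = [[2, 0], [0, 1]]            # det = 2, a zero-divisor in Z/6
x = [3, 0]                      # non-zero vector: 2*3 = 6 = 0 (mod 6)

# Verify A x = 0 over Z/6, so the kernel is non-trivial:
assert all(sum(A[i][j] * x[j] for j in range(2)) % n == 0 for i in range(2))
```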
The two identities ($ A $ is a square matrix over a commutative ring $ R $)
$$ A \operatorname{adj}(A) = \det(A) I $$
$$ \det(AB) = \det(A) \det(B) $$
respectively imply that if $ \det(A) $ is a unit, then $ \operatorname{adj}(A) (\det(A))^{-1} $ is an inverse for $ A $; and if $ A $ has an inverse, then $ \det(A) $ must be a unit, because $ \det(A^{-1}) $ is then an inverse for $ \det(A) $ in $ R $. In other words, a square matrix with entries in a commutative ring is invertible iff its determinant is invertible in the ground ring.
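A minimal sketch over the ring $\mathbb{Z}$, whose only units are $\pm 1$ (helper names `det2`, `adj2`, `matmul2` are illustrative): a determinant-$1$ matrix is inverted by its adjugate using integer arithmetic alone, while a determinant-$2$ matrix has no integer inverse, even though it is invertible over $\mathbb{Q}$:

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def adj2(m):
    """Adjugate of a 2x2 matrix: swap the diagonal, negate the off-diagonal."""
    return [[m[1][1], -m[0][1]], [-m[1][0], m[0][0]]]

def matmul2(a, b):
    """Product of two 2x2 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# det(A) = 1 is a unit in Z, so adj(A) itself is an integer inverse:
A = [[2, 1], [1, 1]]
assert det2(A) == 1
assert matmul2(A, adj2(A)) == [[1, 0], [0, 1]]

# det(B) = 2 is not a unit in Z: B is invertible over Q but not over Z.
B = [[2, 0], [0, 1]]
assert det2(B) == 2
```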