How to generalize the determinant as a function
Notice that $$ D_\beta(\ldots, \underbrace{w_i+w_j}_{k\text{-th place}},\ldots, \underbrace{w_i+w_j}_{m\text{-th place}},\ldots)=0 $$ because two of the arguments are equal. On the other hand, expanding by linearity in each of those two arguments: $$ D_\beta(\ldots, {w_i+w_j},\ldots, {w_i+w_j},\ldots)= D_\beta(\ldots, w_i,\ldots, w_j,\ldots)+ D_\beta(\ldots, w_j,\ldots, w_i,\ldots), $$ since the other two terms vanish: $D_\beta(\ldots, w_i,\ldots, w_i,\ldots)=D_\beta(\ldots, w_j,\ldots, w_j,\ldots)=0$. It follows that $$ D_\beta(\ldots, w_i,\ldots, w_j,\ldots)= -D_\beta(\ldots, w_j,\ldots, w_i,\ldots), $$ that is, the function, applied to basis vectors, changes sign whenever you exchange two of its arguments. Together with the normalization $D_\beta(w_1, \ldots, w_n)=1$, this completely determines the values of $D_\beta$ when its arguments are basis vectors: $D_\beta(w_{\sigma(1)},\ldots,w_{\sigma(n)})$ equals the sign of the permutation $\sigma$, and $D_\beta$ vanishes whenever an argument repeats. By multilinearity, $D_\beta$ is then uniquely determined on all inputs.
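As a small sanity check (an illustration, not part of the argument above), here is a Python sketch for the simplest case $n=2$, where the unique bilinear alternating form with $D(e_1,e_2)=1$ is the familiar $2\times 2$ determinant; the names `D`, `e1`, `e2`, `w_i`, `w_j` are mine:

```python
# D for R^2 in coordinates: the unique bilinear, alternating
# form with D(e1, e2) = 1 -- i.e. the 2x2 determinant.
def D(u, v):
    return u[0] * v[1] - u[1] * v[0]

e1, e2 = (1, 0), (0, 1)        # standard basis vectors
w_i, w_j = (3, 5), (2, 7)      # two arbitrary vectors

assert D(e1, e2) == 1          # normalization
assert D(w_i, w_j) == -D(w_j, w_i)  # sign change on exchanging arguments
assert D(w_i, w_i) == 0        # equal arguments give 0
```

The three assertions are exactly the three properties used in the derivation: normalization, antisymmetry, and vanishing on repeated arguments.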
To show that applying $D_\beta$ to some vectors gives the same result as taking the determinant of the matrix of their coordinates, one first needs a definition of the determinant. My favourite definition of the determinant is in fact the same as the definition of $D_\beta$.
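One concrete such definition is the Leibniz formula $\sum_\sigma \operatorname{sgn}(\sigma)\prod_k a_{k,\sigma(k)}$, which is multilinear, alternating, and sends the standard basis to $1$. A hedged Python sketch of it, using only the standard library (the helper names `perm_sign` and `D_beta` are my own):

```python
from itertools import permutations
from math import prod

def perm_sign(p):
    """Parity of a permutation p of (0, ..., n-1): +1 if even, -1 if odd,
    computed by counting the swaps needed to sort p."""
    p, s = list(p), 1
    for i in range(len(p)):
        while p[i] != i:
            j = p[i]
            p[i], p[j] = p[j], p[i]
            s = -s
    return s

def D_beta(vectors):
    """Leibniz formula: sum over permutations sigma of
    sgn(sigma) * prod_k vectors[k][sigma(k)]."""
    n = len(vectors)
    return sum(perm_sign(p) * prod(v[p[k]] for k, v in enumerate(vectors))
               for p in permutations(range(n)))

# The standard basis maps to 1, and swapping two arguments flips the sign:
e = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
assert D_beta(e) == 1
u, v, w = (2, 1, 0), (4, 3, 5), (1, 1, 7)
assert D_beta([u, v, w]) == -D_beta([v, u, w])
```

The sum ranges over all $n!$ permutations, so this is only practical for small $n$, but it makes the equivalence of the two definitions easy to test on examples.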