Divergence of matrix-vector product
I agree with Tommaso Seneci that this question deserves a better answer. Yes, it is just vector calculus, but there are some non-trivial tricks worth noting.
Inspired by this note by Piaras Kelly, I can write down that $$ \nabla \cdot (\mathbf{A}\mathbf{v}) = (\nabla \cdot \mathbf{A}) \mathbf{v} + \text{tr}(\mathbf{A}\text{grad}\mathbf{v}) $$ where $$ \text{grad}\mathbf{v} = \begin{pmatrix} \frac{\partial v_1}{\partial x_1} & \frac{\partial v_1}{\partial x_2} & \frac{\partial v_1}{\partial x_3} \\ \frac{\partial v_2}{\partial x_1} & \frac{\partial v_2}{\partial x_2} & \frac{\partial v_2}{\partial x_3} \\ \frac{\partial v_3}{\partial x_1} & \frac{\partial v_3}{\partial x_2} & \frac{\partial v_3}{\partial x_3} \\ \end{pmatrix} $$ and $$ \nabla \cdot \mathbf{A} = [\frac{\partial}{\partial x_1} \quad \frac{\partial}{\partial x_2} \quad \frac{\partial}{\partial x_3}] \mathbf{A} = \begin{pmatrix} \frac{\partial A_{11}}{\partial x_1}+\frac{\partial A_{21}}{\partial x_2}+\frac{\partial A_{31}}{\partial x_3} \\ \frac{\partial A_{12}}{\partial x_1}+\frac{\partial A_{22}}{\partial x_2}+\frac{\partial A_{32}}{\partial x_3} \\ \frac{\partial A_{13}}{\partial x_1}+\frac{\partial A_{23}}{\partial x_2}+\frac{\partial A_{33}}{\partial x_3} \\ \end{pmatrix}^T . $$
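As a sanity check, the identity can be verified symbolically, e.g. with SymPy. The particular entries chosen for $\mathbf{A}$ and $\mathbf{v}$ below are arbitrary smooth example fields, not anything dictated by the problem:

```python
# Symbolic check of  div(A v) = (div A) v + tr(A grad v)
# using the conventions above: (grad v)_{ij} = dv_i/dx_j,
# (div A)_j = sum_i dA_{ij}/dx_i  (a row vector).
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
X = (x1, x2, x3)

# Arbitrary smooth matrix field A and vector field v (illustrative choices).
A = sp.Matrix(3, 3, lambda i, j: X[i]**2 * X[j] + sp.sin(X[(i + j) % 3]))
v = sp.Matrix([x1 * x2, sp.cos(x3), x2**2 + x3])

Av = A * v
lhs = sum(sp.diff(Av[i], X[i]) for i in range(3))           # div(A v)

divA = sp.Matrix(1, 3, lambda _, j: sum(sp.diff(A[i, j], X[i]) for i in range(3)))
grad_v = sp.Matrix(3, 3, lambda i, j: sp.diff(v[i], X[j]))  # (grad v)_{ij}
rhs = (divA * v)[0] + (A * grad_v).trace()

assert sp.simplify(lhs - rhs) == 0
```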
The key to this calculation is the identity $$ \nabla \cdot \mathbf{v} = \text{tr}(\text{grad}\,\mathbf{v}). $$
First compute $\text{grad}(\mathbf{A}\mathbf{v})$ by the product rule: $$ \text{grad}(\mathbf{A}\mathbf{v}) = [(\frac{\partial}{\partial x_1} \mathbf{A})\mathbf{v} \quad (\frac{\partial}{\partial x_2} \mathbf{A})\mathbf{v} \quad (\frac{\partial}{\partial x_3} \mathbf{A})\mathbf{v}] + \mathbf{A}\, \text{grad}(\mathbf{v}). $$ Then take the trace of both terms. The trace of the first term is $\sum_i \big((\frac{\partial}{\partial x_i} \mathbf{A})\mathbf{v}\big)_i = \sum_{i,j} \frac{\partial A_{ij}}{\partial x_i} v_j$, which is exactly $(\nabla \cdot \mathbf{A})\mathbf{v}$; the trace of the second term is $\text{tr}(\mathbf{A}\,\text{grad}\mathbf{v})$.
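The intermediate product rule for $\text{grad}(\mathbf{A}\mathbf{v})$ can be checked the same way, again on arbitrary smooth example fields (the entries below are just illustrative):

```python
# Symbolic check of the columnwise product rule
#   grad(A v) = [ (dA/dx1) v  (dA/dx2) v  (dA/dx3) v ] + A grad(v).
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
X = (x1, x2, x3)

A = sp.Matrix(3, 3, lambda i, j: X[i] * X[j]**2)   # example matrix field
v = sp.Matrix([sp.exp(x1), x2 * x3, x1 + x3**2])   # example vector field

Av = A * v
grad_Av = sp.Matrix(3, 3, lambda i, k: sp.diff(Av[i], X[k]))

# First term: k-th column is (dA/dx_k) v; second term is A grad(v).
cols = [sp.diff(A, X[k]) * v for k in range(3)]
grad_v = sp.Matrix(3, 3, lambda i, j: sp.diff(v[i], X[j]))
rhs = sp.Matrix.hstack(*cols) + A * grad_v

assert sp.simplify(grad_Av - rhs) == sp.zeros(3, 3)
```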
Please correct me if there is any mistake in the calculation.
Let us write the matrix-vector product ${\bf M}\cdot {\bf c}$ in index notation (Einstein convention). Using the product rule, the divergence of $({\bf M}\cdot {\bf c})_{i} = M_{ij} c_j$ satisfies $$ \nabla\cdot({\bf M}\cdot {\bf c}) = M_{ij,i} c_j + M_{ij} c_{j,i} = {\bf c}\cdot\left(\nabla\cdot({\bf M}^\top)\right) + {\bf M}^\top\! : \nabla{\bf c}\, , $$ where ${\bf A}:{\bf B} = \text{tr}({\bf A}^\top\!\cdot{\bf B}) = \text{tr}({\bf A}\cdot{\bf B}^\top)$. Similarly, one shows that the vector-matrix product $({\bf c}\cdot {\bf M})_{j} = c_i M_{ij}$ satisfies $$ \nabla\cdot ({\bf c}\cdot{\bf M}) = c_{i,j} M_{ij} + c_i M_{ij,j} = {\bf c}\cdot(\nabla\cdot {\bf M}) + {\bf M} : \nabla{\bf c} \, . $$
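Both index-notation identities can likewise be verified symbolically, using the conventions of this answer: $(\nabla\cdot{\bf M})_i = M_{ij,j}$, $(\nabla{\bf c})_{ij} = c_{i,j}$, and ${\bf A}:{\bf B} = \text{tr}({\bf A}^\top{\bf B})$. The concrete fields below are arbitrary examples:

```python
# Symbolic check of
#   div(M c)   = c . div(M^T) + M^T : grad(c)
#   div(c . M) = c . div(M)   + M   : grad(c)
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
X = (x1, x2, x3)

M = sp.Matrix(3, 3, lambda i, j: sp.sin(X[i]) * X[j])  # example matrix field
c = sp.Matrix([x1 * x3, x2**2, sp.cos(x1)])            # example vector field

def div_mat(T):
    """(div T)_i = T_{ij,j}."""
    return sp.Matrix([sum(sp.diff(T[i, j], X[j]) for j in range(3)) for i in range(3)])

grad_c = sp.Matrix(3, 3, lambda i, j: sp.diff(c[i], X[j]))  # (grad c)_{ij} = c_{i,j}
ddot = lambda A, B: (A.T * B).trace()                       # A : B = tr(A^T B)

Mc = M * c                                   # (M c)_i = M_{ij} c_j
lhs1 = sum(sp.diff(Mc[i], X[i]) for i in range(3))
rhs1 = (c.T * div_mat(M.T))[0] + ddot(M.T, grad_c)
assert sp.simplify(lhs1 - rhs1) == 0

cM = (c.T * M).T                             # (c . M)_j = c_i M_{ij}
lhs2 = sum(sp.diff(cM[j], X[j]) for j in range(3))
rhs2 = (c.T * div_mat(M))[0] + ddot(M, grad_c)
assert sp.simplify(lhs2 - rhs2) == 0
```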