Jacobian matrix of Rodrigues' formula (exponential map)
I don't think you need the general Jacobian $\frac{\partial}{\partial \mathbf p}\exp(\hat{\mathbf p})$, but only the much simpler Jacobian $\left.\frac{\partial}{\partial \mathbf p}\exp(\hat{\mathbf p})\right|_{\mathbf p=\mathbf 0}$, evaluated at $\mathbf p=\mathbf 0$, i.e. at the identity rotation.
Background
The group of 3D rotations, SO(3), is a matrix Lie group. Thus, in general, we have the matrix exponential:
$$\exp(\mathtt M) := \sum_{k\ge 0} \frac{1}{k!} \mathtt M^k$$
which maps an element of the matrix Lie algebra onto the set of matrix Lie group elements. Furthermore, we have a function $$\hat \cdot: \mathbb R^n \rightarrow \mathbb R^{m\times m}, \quad \hat{\mathbf a} = \sum_{i=1}^n a_i\mathtt G_i$$ which maps an $n$-vector onto the set of matrix Lie algebra elements (= matrices). For SO(3) the generators are: $$ \mathtt G_1 = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & -1 \\ 0 & 1 & 0 \end{bmatrix},\quad \mathtt G_2 =\begin{bmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ -1 & 0 & 0 \end{bmatrix},\quad \mathtt G_3 =\begin{bmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} $$
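For concreteness, here is a minimal numpy sketch of the generators and the hat map (all variable names are my own):

```python
import numpy as np

# Generators of so(3), as given above.
G1 = np.array([[0., 0., 0.],
               [0., 0., -1.],
               [0., 1., 0.]])
G2 = np.array([[0., 0., 1.],
               [0., 0., 0.],
               [-1., 0., 0.]])
G3 = np.array([[0., -1., 0.],
               [1., 0., 0.],
               [0., 0., 0.]])

def hat(a):
    """Map a 3-vector onto its so(3) matrix (skew-symmetric)."""
    return a[0] * G1 + a[1] * G2 + a[2] * G3

# Sanity check: hat(a) @ x equals the cross product a x x.
a, x = np.random.randn(3), np.random.randn(3)
assert np.allclose(hat(a) @ x, np.cross(a, x))
```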
Derivative at the identity
For matrix Lie groups, it can be shown that $$ \left.\frac{\partial}{\partial a_k}\exp(\hat {\mathbf a})\right|_{\mathbf a=\mathbf 0} = \mathtt G_k.$$
Thus, for SO(3), $$ \left.\frac{\partial}{\partial \mathbf p}\exp(\hat {\mathbf p})\right|_{\mathbf p=\mathbf 0} = \begin{bmatrix}\mathtt G_1 & \mathtt G_2 & \mathtt G_3\end{bmatrix},$$ i.e. we obtain a row vector of three $3\times 3$ matrices, hence a $3\times 3 \times 3$ Jacobian tensor. (Alternatively, one could stack the columns of each $\mathtt G_i$ into a 9-vector and would obtain a $9\times 3$ Jacobian matrix.)
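Continuing the numpy sketch above, one can build both forms of the Jacobian and verify them against finite differences of the matrix exponential (using `scipy.linalg.expm`):

```python
from scipy.linalg import expm

# 3x3x3 Jacobian tensor at the identity: J_tensor[:, :, k] = G_k.
J_tensor = np.stack([G1, G2, G3], axis=2)

# Equivalent 9x3 matrix: column k is vec(G_k), i.e. the columns of G_k stacked.
J_mat = np.column_stack([G.reshape(-1, order='F') for G in (G1, G2, G3)])

# Numerical check with central differences of exp(hat(p)) at p = 0.
eps = 1e-6
for k, e_k in enumerate(np.eye(3)):
    num = (expm(hat(eps * e_k)) - expm(hat(-eps * e_k))) / (2 * eps)
    assert np.allclose(num, J_tensor[:, :, k], atol=1e-8)
```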
Background: Least squares optimization
Let us assume we would like to minimize a function $F:\mathbb R^n \rightarrow \mathbb R$ with respect to a Euclidean vector $\mathbf x$. Furthermore, let us assume we have a least-squares problem, i.e. $F$ is built from a vector of residuals $f$:
$$F(\mathbf x) = f(\mathbf x)^\top f(\mathbf x)$$
Following Gauss-Newton, we repeatedly solve for an update $\delta$, $$ \left.\frac{\partial f}{\partial \mathbf x}\right|_{\mathbf x^{(m)}}^\top \left.\frac{\partial f}{\partial \mathbf x}\right|_{\mathbf x^{(m)}} \delta = -\left.\frac{\partial f}{\partial \mathbf x}\right|_{\mathbf x^{(m)}}^\top f(\mathbf x^{(m)}),$$ and update our estimate: $$ \mathbf x^{(m+1)}= \mathbf x^{(m)} + \delta$$
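A minimal sketch of this iteration, assuming a user-supplied residual function `f` and its Jacobian `jac` (both names are mine, and I omit any stopping criterion):

```python
import numpy as np

def gauss_newton(f, jac, x0, num_iters=10):
    """Plain Gauss-Newton on a Euclidean parameter vector x."""
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        J = jac(x)                                   # Jacobian of f at the current estimate
        r = f(x)                                     # residual vector
        delta = np.linalg.solve(J.T @ J, -J.T @ r)   # normal equations
        x = x + delta                                # additive update (Euclidean case)
    return x
```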
Least squares optimization on matrix Lie groups
This scheme is only valid for Euclidean vector spaces and needs to be adapted for matrix Lie groups. In particular, we calculate the derivative in the tangent space around the identity:
$$\mathtt J :=\left.\frac{\partial f(\exp(\hat{\mathbf p})\cdot\mathtt R^{(m)})}{\partial \mathbf p} \right|_{\mathbf p =\mathbf 0}$$ (This can be calculated from the generators using the chain rule; see the example below.)
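For instance, assuming the simple point residual $f(\mathtt R) = \mathtt R\mathbf x - \mathbf y$ (my own example, not from the question), the chain rule gives
$$\mathtt J = \left.\frac{\partial}{\partial \mathbf p}\left(\exp(\hat{\mathbf p})\,\mathtt R^{(m)}\mathbf x - \mathbf y\right)\right|_{\mathbf p=\mathbf 0} = \begin{bmatrix}\mathtt G_1\mathtt R^{(m)}\mathbf x & \mathtt G_2\mathtt R^{(m)}\mathbf x & \mathtt G_3\mathtt R^{(m)}\mathbf x\end{bmatrix} = -\widehat{\left(\mathtt R^{(m)}\mathbf x\right)},$$
since $\mathtt G_k\mathbf v = \mathbf e_k\times\mathbf v = -\hat{\mathbf v}\,\mathbf e_k$.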
Then we solve for $\delta$:
$$ \mathtt J^\top\mathtt J \delta = -\mathtt J^\top f(\mathtt R^{(m)})$$
Finally we adapt the update rule: $$ \mathtt R^{(m+1)}= \exp(\hat\delta)\cdot \mathtt R^{(m)}.$$
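Putting the pieces together, here is a rough sketch of the whole scheme for the point residual used above (function name, toy data, and the fixed iteration count are illustrative only; it reuses `hat` from the sketch above):

```python
import numpy as np
from scipy.linalg import expm

def so3_gauss_newton(R0, points, targets, num_iters=10):
    """Gauss-Newton on SO(3) for the residuals f(R) = R x_i - y_i (illustrative)."""
    R = R0.copy()
    for _ in range(num_iters):
        # Stack residuals and the Jacobians of f(exp(hat(p)) R) w.r.t. p at p = 0.
        r = np.concatenate([R @ x - y for x, y in zip(points, targets)])
        J = np.vstack([-hat(R @ x) for x in points])    # chain rule, see above
        delta = np.linalg.solve(J.T @ J, -J.T @ r)      # normal equations
        R = expm(hat(delta)) @ R                        # multiplicative update
    return R

# Toy usage: recover a rotation from exact point correspondences.
R_true = expm(hat(np.array([0.3, -0.2, 0.5])))
points = [np.random.randn(3) for _ in range(5)]
targets = [R_true @ x for x in points]
R_est = so3_gauss_newton(np.eye(3), points, targets)
assert np.allclose(R_est, R_true, atol=1e-6)
```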
This approach of least-squares optimization on matrix Lie groups is also explained in the technical report Minimization on the Lie Group SO(3) and Related Manifolds.
A recent paper by Guillermo Gallego and Anthony Yezzi derives a compact formula for the derivative of the rotation matrix with respect to the exponential map coordinates: http://arxiv.org/pdf/1312.0788v1.pdf
Formula (III.7) on page 5 is what you were looking for.
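If I read (III.7) correctly, it states that for $\mathbf v \neq \mathbf 0$ and $\mathtt R = \exp(\hat{\mathbf v})$ $$\frac{\partial \mathtt R}{\partial v_i} = \frac{v_i\,\hat{\mathbf v} + \widehat{\left(\mathbf v\times(\mathtt I-\mathtt R)\mathbf e_i\right)}}{\|\mathbf v\|^2}\,\mathtt R,$$ which is easy to check numerically, continuing the numpy sketch above (please double-check against the paper):

```python
import numpy as np
from scipy.linalg import expm

def dR_dv(v):
    """Derivative of R = exp(hat(v)) w.r.t. v, per my reading of Gallego & Yezzi (III.7)."""
    R = expm(hat(v))
    n2 = v @ v                                   # ||v||^2, assumed nonzero here
    cols = []
    for i, e_i in enumerate(np.eye(3)):
        dRi = ((v[i] * hat(v) + hat(np.cross(v, (np.eye(3) - R) @ e_i))) / n2) @ R
        cols.append(dRi)
    return cols                                  # list of 3x3 matrices, one per v_i

# Numerical check against central differences.
v = np.array([0.3, -0.7, 0.2])
eps = 1e-6
for i, e_i in enumerate(np.eye(3)):
    num = (expm(hat(v + eps * e_i)) - expm(hat(v - eps * e_i))) / (2 * eps)
    assert np.allclose(dR_dv(v)[i], num, atol=1e-7)
```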