Find the bearing angle between two points in 2D space
Define the bearing angle $\theta$ from a point $A(a_1,a_2)$ to a point $B(b_1,b_2)$ as the angle measured clockwise from the north line through $A$ to the line segment $AB$.
Then,
$$ (b_1,b_2) = (a_1 + r\sin\theta, a_2 + r\cos\theta), $$
where $r$ is the length of the line segment $AB$. It follows that $\theta$ satisfies the equation
$$ \tan\theta = \frac{b_1 - a_1}{b_2 - a_2}. $$
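For example, with $A = (0,0)$ and $B = (1,1)$ this gives $\tan\theta = 1$, which is satisfied by both $\theta = \pi/4$ and $\theta = 5\pi/4$; only the signs of the numerator and denominator single out the correct quadrant (here $\theta = \pi/4$, i.e. northeast), which is why a plain arctangent is not enough.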
As suggested by @rogerl, we can use the $\mathrm{atan2}$ function to compute $\theta$. Let
$$ \hat{\theta} = \mathrm{atan2}(b_1 - a_1, b_2 - a_2) \in (-\pi,\pi] $$
Then the bearing angle $\theta\in[0,2\pi)$ is given by
$$ \theta = \left\{ \begin{array}{ll} \hat{\theta}, & \hat{\theta} \geq 0\\ 2\pi + \hat{\theta}, & \hat{\theta} < 0 \end{array}\right. $$
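As a quick check, if $B$ lies due west of $A$ at distance $r$, then $\hat{\theta} = \mathrm{atan2}(-r, 0) = -\pi/2$, and the second case gives $\theta = -\pi/2 + 2\pi = 3\pi/2$, i.e. a bearing of $270^\circ$, as expected.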
Note that the equations above are given in terms of Cartesian coordinates, in which the $y$-axis points up; in screen coordinates the $y$-axis points down, so the north component changes sign. I believe the formula for $\hat{\theta}$ in terms of screen coordinates $(a_1,a_2)$ and $(b_1,b_2)$ is therefore $\hat{\theta} = \mathrm{atan2}(b_1 - a_1,\, a_2 - b_2)$.
You could code this function in C++ as follows.
#include <cmath>
#include <stdexcept>

// Computes the bearing in degrees from the point A(a1,a2) to
// the point B(b1,b2). Note that A and B are given in terms of
// screen coordinates (y-axis pointing down).
double bearing(double a1, double a2, double b1, double b2) {
    static const double TWOPI = 6.2831853071795865;
    static const double RAD2DEG = 57.2957795130823209;
    if (a1 == b1 && a2 == b2)
        throw std::domain_error("bearing is undefined when A and B coincide");
    double theta = std::atan2(b1 - a1, a2 - b2);  // in (-pi, pi]
    if (theta < 0.0)
        theta += TWOPI;                           // map to [0, 2*pi)
    return RAD2DEG * theta;
}
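As a quick sanity check, you could exercise the four cardinal directions (a minimal sketch; remember that in screen coordinates north is the negative $y$ direction):

#include <iostream>

int main() {
    std::cout << bearing(0.0, 0.0,  0.0, -1.0) << '\n';  // 0   (north)
    std::cout << bearing(0.0, 0.0,  1.0,  0.0) << '\n';  // 90  (east)
    std::cout << bearing(0.0, 0.0,  0.0,  1.0) << '\n';  // 180 (south)
    std::cout << bearing(0.0, 0.0, -1.0,  0.0) << '\n';  // 270 (west)
    return 0;
}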