Calculating a Point that lies on an Ellipse given an Angle
If the ellipse is centered at the origin, its equation is $\frac{x^2}{a^2}+\frac{y^2}{b^2}=1$, and the line through the center at angle $\theta$ is $y=x\tan\theta$. Substituting gives $\frac{x^2}{a^2}+\frac{(x\tan\theta)^2}{b^2}=1$, so $x=\pm\frac{ab}{\sqrt{b^2+a^2\tan^2\theta}}$, where the sign is $+$ if $-\pi/2<\theta<\pi/2$ and $-$ otherwise; then $y=x\tan\theta$.
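For a concrete check, with illustrative values $a=3$, $b=2$, $\theta=45^\circ$ (not taken from the question):
$$x=\frac{ab}{\sqrt{b^2+a^2\tan^2\theta}}=\frac{6}{\sqrt{4+9}}=\frac{6}{\sqrt{13}},\qquad y=x\tan\theta=\frac{6}{\sqrt{13}},$$
and indeed $\frac{x^2}{9}+\frac{y^2}{4}=\frac{36/13}{9}+\frac{36/13}{4}=\frac{4}{13}+\frac{9}{13}=1$.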
You can also use the parametric equations
$$x=a\cos t,\qquad y=b\sin t,$$
where $a$ is the semi-axis along the horizontal axis and $b$ is the semi-axis along the vertical axis. Note that the parameter $t$ is not the polar angle of the point unless $a=b$; to land on the ray at angle $\theta$ from the positive $x$-axis you need $\tan t=\frac{a}{b}\tan\theta$, as in the sketch below.
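A minimal sketch of the parametric approach, assuming Python; the function name `ellipse_point_from_angle` and the sample values are illustrative, not from the original answers:

```python
import math

def ellipse_point_from_angle(a: float, b: float, theta: float) -> tuple[float, float]:
    """Point on x^2/a^2 + y^2/b^2 = 1 whose ray from the center makes
    angle `theta` (radians) with the positive x-axis.

    Uses the parametric form (a*cos t, b*sin t) with tan t = (a/b)*tan(theta);
    atan2 keeps the parameter t in the same quadrant as theta, so the result
    is correct for every angle, including 90 and 270 degrees.
    """
    t = math.atan2(a * math.sin(theta), b * math.cos(theta))
    return a * math.cos(t), b * math.sin(t)

# Illustrative values (a=3, b=2, theta=45 degrees), not taken from the question.
x, y = ellipse_point_from_angle(3.0, 2.0, math.radians(45))
assert abs(x**2 / 3**2 + y**2 / 2**2 - 1) < 1e-9        # the point lies on the ellipse
assert abs(math.atan2(y, x) - math.radians(45)) < 1e-9  # and in the requested direction
```

Using `atan2` rather than `atan` avoids any case analysis on the quadrant of $\theta$.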
If the ellipse is centered at $(0,0)$, $2a$ wide in the $x$-direction, and $2b$ tall in the $y$-direction, and the desired angle measured from the positive $x$-axis is $\theta$, the coordinates of the point of intersection are $$\left(\frac{a b}{\sqrt{b^2+a^2\tan^2(\theta)}},\frac{a b \tan(\theta)}{\sqrt{b^2 + a^2\tan^2(\theta)}}\right) \text{ if }0\le\theta< 90°\text{ or }270°<\theta\le360°$$ or $$\left(-\frac{a b}{\sqrt{b^2+a^2\tan^2(\theta)}},-\frac{a b \tan(\theta)}{\sqrt{b^2 + a^2\tan^2(\theta)}}\right) \text{ if }90°<\theta< 270°.$$ (At $\theta=90°$ or $\theta=270°$ the tangent is undefined; there the point is simply $(0,b)$ or $(0,-b)$.)
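A minimal sketch of this piecewise formula, again assuming Python and working in degrees; the function name `ellipse_point_direct` is illustrative, and the 90°/270° branches cover the cases where the tangent is undefined:

```python
import math

def ellipse_point_direct(a: float, b: float, theta_deg: float) -> tuple[float, float]:
    """Evaluate the piecewise formula above, with theta_deg in degrees.

    The tangent is undefined at 90 and 270 degrees, where the point is
    simply (0, b) or (0, -b); otherwise both coordinates take the sign
    + for angles in [0, 90) or (270, 360] and - for angles in (90, 270).
    """
    theta = theta_deg % 360.0
    if math.isclose(theta, 90.0):
        return 0.0, b
    if math.isclose(theta, 270.0):
        return 0.0, -b
    tan_t = math.tan(math.radians(theta))
    r = a * b / math.sqrt(b**2 + a**2 * tan_t**2)
    sign = 1.0 if theta < 90.0 or theta > 270.0 else -1.0
    return sign * r, sign * r * tan_t

# Quick spot checks with illustrative semi-axes a=3, b=2.
assert ellipse_point_direct(3, 2, 0) == (3.0, 0.0)
assert ellipse_point_direct(3, 2, 90) == (0.0, 2)
```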