Calculating the angle between two vectors in Python

Here is what I ended up using. It relies entirely on NumPy, and the signed angle falls in the range -𝛑 to 𝛑 (before conversion to degrees):

import numpy as np

def get_angle(p0, p1=np.array([0, 0]), p2=None):
    ''' compute the signed angle (in degrees) of the p0-p1-p2 corner
    Inputs:
        p0, p1, p2 - points in the form of [x, y]
    If p2 is omitted, the angle is measured from the positive x-axis.
    '''
    if p2 is None:
        p2 = p1 + np.array([1, 0])
    v0 = np.array(p0) - np.array(p1)
    v1 = np.array(p2) - np.array(p1)

    # atan2(cross, dot) gives the signed angle in (-pi, pi];
    # np.math was removed in NumPy 2.0, so use np.arctan2 instead.
    angle = np.arctan2(np.linalg.det([v0, v1]), np.dot(v0, v1))
    return np.degrees(angle)
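A quick sanity check of the signed output (the function is repeated here so the snippet runs on its own):

```python
import numpy as np

def get_angle(p0, p1=np.array([0, 0]), p2=None):
    if p2 is None:
        p2 = p1 + np.array([1, 0])
    v0 = np.array(p0) - np.array(p1)
    v1 = np.array(p2) - np.array(p1)
    # Signed angle from v0 to v1, counterclockwise positive.
    angle = np.arctan2(np.linalg.det([v0, v1]), np.dot(v0, v1))
    return np.degrees(angle)

# Point on the +x axis relative to the origin: zero angle.
print(get_angle([1, 0]))                    # 0.0
# Point on the +y axis: rotating from it to the +x reference is clockwise.
print(get_angle([0, 1]))                    # -90.0
# A right-angle corner at p1 = (1, 0):
print(get_angle([1, 1], [1, 0], [0, 0]))    # 90.0
```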

Your angle formula will fail if

pt2.getX() == pt1.getX()

(that is, if pt1 and pt2 lie on a vertical line), because you cannot divide by zero (m2, the slope, would be infinite).

Also

m1 = (pt1.getY() - pt1.getY())/1

will always be zero. So at the very least, your formula could be simplified to the arctan of the slope. However, I wouldn't bother since the formula does not work for all possible points.
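To make the vertical-line failure concrete, here is a simplified sketch (using plain (x, y) tuples rather than your getX()/getY() point class): a slope-based formula divides by dx, while math.atan2 takes dy and dx separately, so the vertical case needs no division.

```python
import math

# Slope-based angle: fails on a vertical segment (dx == 0).
def angle_from_slope(pt1, pt2):
    m = (pt2[1] - pt1[1]) / (pt2[0] - pt1[0])  # ZeroDivisionError when the x's match
    return math.degrees(math.atan(m))

# atan2 takes dy and dx as separate arguments, so no division is needed.
def angle_atan2(pt1, pt2):
    return math.degrees(math.atan2(pt2[1] - pt1[1], pt2[0] - pt1[0]))

print(angle_atan2((2, 0), (2, 5)))   # 90.0 -- vertical line handled fine
try:
    angle_from_slope((2, 0), (2, 5))
except ZeroDivisionError:
    print("slope formula fails on a vertical line")
```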

Instead, a more robust method (indeed, the standard method) for calculating the angle between two vectors (directed line segments) is to use the dot product formula:

cos(theta) = <a, b> / (||a|| * ||b||),   i.e.   theta = arccos( <a, b> / (||a|| * ||b||) )

where if a = (x1, y1), b = (x2, y2), then <a,b> equals x1*x2 + y1*y2, and ||a|| is the length of vector a, i.e. sqrt(x1**2 + y1**2).


import math

def angle(vector1, vector2):
    '''Return the unsigned angle (in radians, in [0, pi]) between two 2-D vectors.'''
    x1, y1 = vector1
    x2, y2 = vector2
    inner_product = x1*x2 + y1*y2
    len1 = math.hypot(x1, y1)
    len2 = math.hypot(x2, y2)
    return math.acos(inner_product/(len1*len2))

def calculate(pt, ls):
    i = 2
    for x in ls:
        pt2 = (x, i)
        i += 1
        ang = math.degrees(angle(pt, pt2))
        ang = ang * (-1)
        print(ang)

pt = (3, 1)
ls = [1,7,0,4,9,6,150]

calculate(pt, ls)
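One caveat with the acos version: floating-point rounding can push inner_product/(len1*len2) slightly outside [-1, 1] (e.g. for nearly parallel vectors), which makes math.acos raise a ValueError. A defensive variant (a sketch of the same function with the cosine clamped; note the result is always unsigned, in [0°, 180°]):

```python
import math

def angle_clamped(vector1, vector2):
    x1, y1 = vector1
    x2, y2 = vector2
    inner_product = x1*x2 + y1*y2
    len1 = math.hypot(x1, y1)
    len2 = math.hypot(x2, y2)
    # Rounding can yield e.g. 1.0000000000000002; clamp into acos's domain.
    cosine = max(-1.0, min(1.0, inner_product / (len1 * len2)))
    return math.degrees(math.acos(cosine))

print(angle_clamped((1, 0), (0, 1)))     # 90.0
print(angle_clamped((1, 1), (1, 1)))     # ~0.0 (parallel; rounding kept in check)
print(angle_clamped((1, 1), (-2, -2)))   # ~180.0 (opposite directions)
```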
