Connection Between Bézout's Identity and Linear Algebra
Bézout's identity for polynomials is used in linear algebra when you want to decompose a vector space according to the action of a linear operator on it.
For example, we'll show a vector space is the direct sum of its generalized eigenspaces for its different eigenvalues. Let $V$ be a finite-dimensional complex vector space and $A \colon V \to V$ be linear with minimal polynomial $f(X) = \prod_{i=1}^m (X - \lambda_i)^{e_i}$, where the $\lambda_i$'s are the distinct eigenvalues of $A$. Set $V_i = \ker((A - \lambda_i)^{e_i}) = \{v \in V : (A - \lambda_i)^{e_i}(v) = 0\}$. We want to show $V = \bigoplus_{i=1}^m V_i$.
Step 1: $V = \sum_{i=1}^m V_i$.
Set $g_i(X) = f(X)/(X - \lambda_i)^{e_i}$. Since $(X - \lambda_i)^{e_i}g_i(X) = f(X)$, substituting $A$ for $X$ gives us $(A - \lambda_i)^{e_i} g_i(A) = f(A) = O$, so $g_i(A) \colon V \to V$ has image in $\ker((A - \lambda_i)^{e_i})$.
The polynomials $g_1(X), \ldots, g_m(X)$ are relatively prime as an $m$-tuple: a common irreducible factor would have to be some $X - \lambda_j$, but $X - \lambda_j$ does not divide $g_j(X)$. So $\gcd(g_1(X), \ldots, g_m(X)) = 1$, and by Bézout some $\mathbf C[X]$-linear combination of them is 1: $g_1(X)h_1(X) + \cdots + g_m(X)h_m(X) = 1$ in $\mathbf C[X]$. Therefore $g_1(A)h_1(A) + \cdots + g_m(A)h_m(A) = I$, so for each $v \in V$ we have $$ v = g_1(A)(h_1(A)v) + \cdots + g_m(A)(h_m(A)v). $$ The image of $g_i(A) \colon V \to V$ is inside $V_i$, so $V = \sum_{i=1}^m V_i$.
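Step 1 can be checked numerically on a small example. The matrix below is an illustration I chose (it is not from the text): its minimal polynomial is $f(X) = (X-2)^2(X-3)$, so $g_1(X) = X - 3$ and $g_2(X) = (X-2)^2$, and solving the Bézout identity $g_1 h_1 + g_2 h_2 = 1$ by hand gives $h_1(X) = 1 - X$ and $h_2(X) = 1$. A minimal sketch with NumPy, verifying the operator identity and that the images land in the right kernels:

```python
import numpy as np

# Illustrative matrix (an assumption, not from the text) with minimal
# polynomial f(X) = (X - 2)^2 (X - 3), so g_1(X) = X - 3, g_2(X) = (X - 2)^2,
# and hand-computed Bezout coefficients h_1(X) = 1 - X, h_2(X) = 1.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
I = np.eye(3)

P1 = (A - 3*I) @ (I - A)    # g_1(A) h_1(A)
P2 = (A - 2*I) @ (A - 2*I)  # g_2(A) h_2(A), since h_2 = 1

# The Bezout identity turned into operators: P1 + P2 = I,
# so every v splits as v = P1 v + P2 v.
assert np.allclose(P1 + P2, I)

# The image of g_i(A) lies in V_i = ker((A - lambda_i)^{e_i}):
# (A - 2I)^2 kills the image of P1, and (A - 3I) kills the image of P2.
assert np.allclose((A - 2*I) @ (A - 2*I) @ P1, np.zeros((3, 3)))
assert np.allclose((A - 3*I) @ P2, np.zeros((3, 3)))
```

Here $P_1$ and $P_2$ act as the projections of $V = \mathbf C^3$ onto the generalized eigenspaces for $\lambda = 2$ and $\lambda = 3$.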
Step 2: The sum is direct.
Suppose $v_1 + \cdots + v_m = 0$ where $v_i \in V_i$. We want to prove each $v_i$ is $0$; by symmetry, it suffices to prove $v_1 = 0$.
The case $m = 1$ is trivial ($V = V_1$), so take $m \geq 2$. Apply $\prod_{i=2}^m (A - \lambda_i)^{e_i}$ to both sides of $v_1 + \cdots + v_m = 0$ to kill off all but the first term: we get $\prod_{i=2}^m (A - \lambda_i)^{e_i}(v_1) = 0$. Thus $v_1$ is killed by $\prod_{i=2}^m(A - \lambda_i)^{e_i}$. Also $(A - \lambda_1)^{e_1}(v_1) = 0$ from the definition of $V_1$. The polynomials $\prod_{i=2}^m (X - \lambda_i)^{e_i}$ and $(X - \lambda_1)^{e_1}$ have no common root, so they are relatively prime in $\mathbf C[X]$, and by Bézout some $\mathbf C[X]$-linear combination of them is $1$: $u(X)\prod_{i=2}^m (X - \lambda_i)^{e_i} + v(X)(X - \lambda_1)^{e_1} = 1$. Replacing $X$ with $A$ and applying both sides to $v_1$, we get $$ u(A)\Big(\prod_{i=2}^m(A - \lambda_i)^{e_i}(v_1)\Big) + v(A)\big((A - \lambda_1)^{e_1}(v_1)\big) = v_1. $$ On the left side, $\prod_{i=2}^m(A - \lambda_i)^{e_i}(v_1) = 0$ and $(A - \lambda_1)^{e_1}(v_1) = 0$, so $0 = v_1$.
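The Bézout coefficients $u(X)$ and $v(X)$ in Step 2 can be produced mechanically by the extended Euclidean algorithm. A small sketch using SymPy's `gcdex`, for the illustrative case $\lambda_1 = 2$, $e_1 = 2$ with the remaining product equal to $X - 3$ (these particular polynomials are my choice, not from the text):

```python
from sympy import symbols, gcdex, expand

X = symbols('X')

# Step 2's two coprime polynomials in an illustrative case:
# (X - lambda_1)^{e_1} = (X - 2)^2 and prod_{i>=2} (X - lambda_i)^{e_i} = X - 3.
p = (X - 2)**2
q = X - 3

# Extended Euclidean algorithm: u*p + v*q = gcd(p, q).
u, v, g = gcdex(p, q)

assert g == 1                    # the polynomials are coprime
assert expand(u*p + v*q) == 1    # the Bezout identity holds
```

Replacing $X$ by $A$ in this identity and applying it to $v_1$ is exactly the final step of the proof: both terms on the left vanish on $v_1$, forcing $v_1 = 0$.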