If $A,B$ are upper triangular matrices such that $AX=XA\implies BX=XB$ for upper triangular $X$, is $B$ a polynomial in $A$?
This is false! Let $$A = \begin{bmatrix} 0&0&0&1 \\ &0&1&0 \\ &&0&0 \\ &&&0 \\ \end{bmatrix}.$$ Imposing that $XA=AX$ for upper triangular $X$ gives linear equations on the $10$ entries of $X$. Solving them, I get that this occurs precisely for $X$ of the form $$X=\begin{bmatrix} a&0&\ast&\ast \\ &b&\ast&\ast \\ &&b&0 \\ &&& a \\ \end{bmatrix}.$$ In turn, an upper triangular matrix commutes with all such $X$ if and only if it is of the form $$B=\begin{bmatrix} c&0&0&d \\ &c&e&0 \\ &&c&0 \\ &&&c \\ \end{bmatrix}.$$ But such a $B$ is only a polynomial in $A$ if $d=e$.
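A quick sanity check of this counterexample (a sympy sketch of my own, not part of the original argument): compute the upper triangular commutant of $A$, verify that $B$ commutes with all of it, and verify that $B$ is a polynomial in $A$ only when $d=e$.

```python
# Hypothetical verification script for the counterexample above (sympy).
import sympy as sp

n = 4
A = sp.zeros(n)
A[0, 3] = 1   # the (1,4) entry
A[1, 2] = 1   # the (2,3) entry

# Generic upper triangular X with 10 symbolic entries.
xs = sp.symbols('x0:10')
X = sp.zeros(n)
k = 0
for i in range(n):
    for j in range(i, n):
        X[i, j] = xs[k]
        k += 1

# Linear equations from [A, X] = 0, solved for the entries of X.
eqs = [(A*X - X*A)[i, j] for i in range(n) for j in range(n)]
sol = sp.solve(eqs, xs, dict=True)[0]
Xc = X.subs(sol)   # general element of the upper triangular commutant

# It has the claimed shape: zero (1,2) and (3,4) entries,
# equal (1,1)/(4,4) and (2,2)/(3,3) entries.
assert Xc[0, 1] == 0 and Xc[2, 3] == 0
assert Xc[0, 0] == Xc[3, 3] and Xc[1, 1] == Xc[2, 2]

# B from above, with independent d and e; it commutes with every Xc.
c, d, e = sp.symbols('c d e')
B = c*sp.eye(n)
B[0, 3] = d
B[1, 2] = e
assert (B*Xc - Xc*B).expand() == sp.zeros(n)

# A^2 = 0, so every polynomial in A is p0*I + p1*A; matching entries
# forces p1 = d and p1 = e, i.e. B is a polynomial in A only if d = e.
assert A**2 == sp.zeros(n)
p0, p1 = sp.symbols('p0 p1')
match = sp.solve([(p0*sp.eye(n) + p1*A - B)[i, j]
                  for i in range(n) for j in range(n)], [p0, p1, d])
assert match[d] == e
```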
If $A$ has pairwise distinct diagonal elements, one can prove the claim by induction on $n$ as follows:
- By block partitioning one can put $A\in T_{n}({\mathbb F})$ into the form $$ A = \begin{pmatrix} \alpha & a \left(\alpha I - A'\right)\\ 0 & A'\end{pmatrix}$$ with $\alpha \in{\mathbb F}$, $a\in {\mathbb F}^{n-1}$ a row vector, and $A'\in T_{n-1}({\mathbb F})$. Here one uses that $\alpha I - A'$ is nonsingular, since $\alpha$ differs from every diagonal entry of $A'$. For $p \in {\mathbb F}[t]$ one has $$ p(A) = \begin{pmatrix} p(\alpha) & a \left(p(\alpha) I - p(A')\right)\\ 0 & p(A')\end{pmatrix}.$$
- For $X \in T_{n}({\mathbb F})$, one has $[A,X]=0$ iff $X$ can be written in the form $$ X = \begin{pmatrix} \xi & a \left(\xi I - X'\right)\\ 0 & X'\end{pmatrix}$$ with $[A',X']=0$. (Writing the top-right block of $X$ as $x$, the top-right block of $[A,X]$ vanishes iff $x(\alpha I - A') = a(\alpha I - A')(\xi I - X')$; since $[A',X']=0$, the two factors on the right commute, and cancelling the nonsingular $\alpha I - A'$ gives $x = a(\xi I - X')$.)
- Now, if $B\in T_{n}({\mathbb F})$ satisfies $[A,X] = 0\Rightarrow [B,X]=0$, then in particular $[A,B]=0$ (take $X=A$), so by Step 2 it must be of the form $$ B = \begin{pmatrix} \beta & a \left(\beta I - B'\right)\\ 0 & B'\end{pmatrix}.$$ Moreover $[A',X'] = 0\Rightarrow [B',X']=0$, since by Step 2 every such $X'$ extends to an upper triangular $X$ commuting with $A$. By the induction hypothesis, there is a polynomial $q\in {\mathbb F}[t]$ such that $B'=q(A')$.
- Now denote the pairwise distinct diagonal elements (i.e., eigenvalues) of the upper triangular matrix $A$ by $\alpha, \lambda_1,\ldots,\lambda_{n-1}$, where $\lambda_1,\ldots,\lambda_{n-1}$ are the diagonal entries of $A'$. The polynomial $p \in {\mathbb F}[t]$ solving the interpolation problem $p(\alpha)=\beta$, $p(\lambda_j)=q(\lambda_j)$ ($j=1,\ldots,n-1$) satisfies $p(A')=q(A') = B'$ (as $A'$ is diagonalizable with eigenvalues $\lambda_j$ and $p-q$ vanishes at all of them) and hence, by Step 1, $p(A)=B$.
In particular, the polynomial $p$ constructed this way has degree at most $n-1$.
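The interpolation step can be illustrated with a small sympy sketch of my own (the matrix $A$ and the polynomial $q$ are arbitrary choices): for an upper triangular $A$ with distinct diagonal entries, $q(A)$ equals $p(A)$, where $p$ is the degree-$\le n-1$ interpolant of $q$ at the diagonal entries.

```python
# Sketch of the interpolation step (example data is an arbitrary choice).
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[1, 2, 0],
               [0, 2, 5],
               [0, 0, 4]])          # distinct diagonal entries 1, 2, 4
eigs = [A[i, i] for i in range(3)]

q = sp.Poly(t**4 + 3*t + 1, t)      # some polynomial of degree > n - 1
B = A**4 + 3*A + sp.eye(3)          # B = q(A)

# Interpolate q at the n distinct eigenvalues: degree at most n - 1.
p = sp.expand(sp.interpolate([(lam, q.eval(lam)) for lam in eigs], t))
assert sp.degree(p, t) <= 2

# Evaluate p at A by Horner's scheme and compare with B = q(A).
PA = sp.zeros(3)
for ck in sp.Poly(p, t).all_coeffs():
    PA = PA*A + ck*sp.eye(3)
assert PA == B
```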
(An observation, not an answer.) If $A$ has pairwise distinct diagonal elements, then any matrix $X$ commuting with $A$ is necessarily upper triangular. This is easy to see once you compute the lower part of $D=[A,X]$: since $A$ is upper triangular, $$D_{ij}=(A_{ii}-A_{jj})X_{ij}+\sum_{k>i} A_{ik} X_{kj}-\sum_{k<j} A_{kj} X_{ik}.$$ For example, $D_{n1}=(A_{nn}-A_{11})X_{n1}$. Thus $X_{n1}=0$, and then $X_{ij}=0$ for all $i>j$ by induction on $n-(i-j)$, which only breaks down when $i-j=0$. As $B$ then commutes with every matrix commuting with $A$, it is a polynomial in $A$ (the bicommutant of a single matrix is ${\mathbb F}[A]$).
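This observation is easy to confirm mechanically; the following sympy sketch (with an arbitrarily chosen $A$) computes the full commutant of such an $A$ without assuming triangularity, and checks that every solution is upper triangular, with the expected dimension $n$.

```python
# Sketch: the full commutant of an upper triangular A with distinct
# diagonal entries consists of upper triangular matrices only.
import sympy as sp

A = sp.Matrix([[1, 7, 7],
               [0, 2, 7],
               [0, 0, 3]])          # distinct diagonal entries

ys = sp.symbols('y0:9')
X = sp.Matrix(3, 3, ys)             # fully general X, not assumed triangular
eqs = [(A*X - X*A)[i, j] for i in range(3) for j in range(3)]
sol = sp.solve(eqs, ys, dict=True)[0]
Xc = X.subs(sol)                    # general element of the commutant

# Every matrix commuting with A is upper triangular ...
assert all(Xc[i, j] == 0 for i in range(3) for j in range(3) if i > j)
# ... and the commutant is 3-dimensional, matching span{I, A, A^2}.
assert len(Xc.free_symbols) == 3
```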
What we need to prove is $$C(C({\mathbb F}[A])\cap T_n({\mathbb F}))\cap T_n({\mathbb F})=C(C({\mathbb F}[A])),$$ where $C(R)$ denotes the centralizer of a subalgebra $R\subset M_n({\mathbb F})$ (note $C(C({\mathbb F}[A]))={\mathbb F}[A]$ by the double centralizer theorem). In the case above it is trivial because $C({\mathbb F}[A])\subset T_n({\mathbb F})$.
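Both sides of this identity can be computed mechanically for the nilpotent counterexample $A$ at the top; here is a sympy sketch (the `centralizer` helper is my own, hypothetical) confirming that the left-hand side is strictly larger there: $C(C({\mathbb F}[A]))={\mathbb F}[A]$ has dimension $2$, while the triangular double centralizer has dimension $3$.

```python
# Sketch: compare the two sides of the centralizer identity for the
# nilpotent counterexample A (the helper function is my own).
import sympy as sp

n = 4
A = sp.zeros(n)
A[0, 3] = 1
A[1, 2] = 1

def centralizer(mats, triangular=False):
    """Basis of all Y (upper triangular if requested) commuting with every M in mats."""
    ys = sp.symbols('z0:%d' % (n*n))
    Y = sp.Matrix(n, n, ys)
    if triangular:
        for i in range(n):
            for j in range(i):
                Y[i, j] = 0
    eqs = [(M*Y - Y*M)[i, j] for M in mats for i in range(n) for j in range(n)]
    unknowns = [s for s in ys if Y.has(s)]
    sol = sp.solve(eqs, unknowns, dict=True)[0]
    Ysol = Y.subs(sol)
    # One basis matrix per free parameter of the general solution.
    return [sp.diff(Ysol, s) for s in sorted(Ysol.free_symbols, key=str)]

CA  = centralizer([A])                    # C(F[A]): all X with AX = XA
CAT = centralizer([A], triangular=True)   # C(F[A]) intersected with T_n
assert len(centralizer(CA)) == 2          # C(C(F[A])) = F[A] = span{I, A}
assert len(centralizer(CAT, triangular=True)) == 3   # contains B with d != e
```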