Matrix $A$ has eigenvalue $\lambda$. Prove that the eigenvalues of the matrix $A+kI$ are $\lambda + k$.
If you want to start with $x$, then this means you want to find $x$ such that $$(A+kI)x = (\lambda + k)x.$$ Since $(A+kI)x = Ax + kx$, this reduces to finding $x$ such that $$Ax + kx = (\lambda + k)x,$$ i.e. $$Ax = \lambda x,$$ and then one will naturally think of taking $x$ to be an eigenvector of $A$ associated with $\lambda$.
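If you want a quick numerical sanity check of this, here is a minimal sketch with NumPy (the matrix $A$ and the shift $k$ below are arbitrary choices of mine, not part of the question):

```python
import numpy as np

# Arbitrary example matrix and shift, chosen only for illustration
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
k = 5.0

# Take one eigenpair (lam, x) of A
eigvals, eigvecs = np.linalg.eig(A)
lam = eigvals[0]
x = eigvecs[:, 0]

# The same x should satisfy (A + kI) x = (lam + k) x
shifted = A + k * np.eye(2)
print(np.allclose(shifted @ x, (lam + k) * x))  # True
```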
So what you don't want is an "ad hoc" proof that doesn't make it clear where it comes from, is that it?
Then you could use the characteristic polynomial of $A$. Let $P$ be that polynomial:
$$P(X) = \det(A - XI) = \det\big(A + kI - (X+k)I\big) = Q(X+k),$$ where $Q$ is the characteristic polynomial of $A+kI$.
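To see this identity on a concrete example (the matrix here is my own choice): take $A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}$ and $k = 1$. Then $P(X) = (2-X)(3-X)$, while $A + I = \begin{pmatrix} 3 & 1 \\ 0 & 4 \end{pmatrix}$ gives $Q(Y) = (3-Y)(4-Y)$, and indeed $$Q(X+1) = \big(3-(X+1)\big)\big(4-(X+1)\big) = (2-X)(3-X) = P(X).$$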
$P(X) = 0 \iff X = \lambda$, with $\lambda$ an eigenvalue of $A$.
An eigenvalue $\lambda'$ of $A+kI$ is characterized in the same way: $\lambda'$ is an eigenvalue of $A+kI \iff Q(\lambda') = 0$.
But $Q(\lambda') = P(\lambda' - k)$, so $$Q(\lambda') = 0 \iff P(\lambda' - k) = 0 \iff \lambda' - k \text{ is an eigenvalue of } A.$$
So you can say that there exists an eigenvalue $\lambda$ of $A$ such that $\lambda' - k = \lambda$.
So you get the result you were looking for, and you can even affirm that the set of eigenvalues of $A+kI$ is $\{\lambda + k : \lambda \text{ an eigenvalue of } A\}$, since the proof proceeds by equivalences all the way through.
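And as a final sanity check of that set equality, here is a small NumPy sketch (the random matrix and the value of $k$ are my own choices):

```python
import numpy as np

# Arbitrary random matrix and shift, chosen for illustration only
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
k = 2.5

# Spectrum of A and spectrum of A + kI, each sorted for comparison
spec_A = np.sort_complex(np.linalg.eigvals(A))
spec_shifted = np.sort_complex(np.linalg.eigvals(A + k * np.eye(4)))

# The shifted spectrum should be exactly {lam + k : lam in spec(A)}
print(np.allclose(spec_shifted, spec_A + k))  # True
```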