What is the most efficient way to find the inverse of a large matrix?
Let's solve $Ax = b$. The first equation in this linear system tells us that $-x_0 + a_0 x_n = b_0$, or $$ x_0 = a_0 x_n - b_0. $$ The second equation tells us that \begin{align} & x_0 - x_1 + a_1 x_n = b_1 \\ \implies & x_1 = (a_0 + a_1) x_n - b_0 - b_1. \end{align} The third equation tells us that \begin{align} &x_1 - x_2 + a_2 x_n = b_2 \\ \implies & x_2 = (a_0 + a_1 + a_2) x_n - b_0 - b_1 - b_2. \end{align} Continuing like this, the second-to-last equation tells us that $$ x_{n-1} = (a_0 + \cdots + a_{n-1}) x_n - b_0 - \cdots - b_{n-1}. $$ The final equation tells us that \begin{align} &x_{n-1} + (a_n - 1) x_n = b_n \\ \implies& (a_0 + \cdots + a_{n-1}) x_n - b_0 - \cdots - b_{n-1} + (a_n - 1) x_n = b_n \\ \implies & \left(-1 + \sum_{i=0}^n a_i\right)x_n - \sum_{i=0}^n b_i = 0 \\ \implies & x_n = \frac{\sum_{i=0}^n b_i }{-1 + \sum_{i=0}^n a_i}. \end{align} We can now substitute this value of $x_n$ back into the expressions above to recover $x_{n-1}, \ldots, x_0$.
So, provided $\sum_{i=0}^n a_i \neq 1$ (otherwise $A$ is singular), we can solve $Ax = b$ in linear time.
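For concreteness, here is a minimal Python sketch of this linear-time solve. The function name is mine, and the code assumes $\sum_i a_i \neq 1$:

```python
def solve_linear_time(a, b):
    """Solve Ax = b in O(n) following the derivation above.

    a = (a_0, ..., a_n), b = (b_0, ..., b_n); assumes sum(a) != 1,
    since otherwise A is singular.
    """
    x_n = sum(b) / (sum(a) - 1.0)   # x_n = (sum b_i) / (-1 + sum a_i)
    x, sa, sb = [], 0.0, 0.0
    for k in range(len(a) - 1):
        sa += a[k]                  # running sum a_0 + ... + a_k
        sb += b[k]                  # running sum b_0 + ... + b_k
        x.append(sa * x_n - sb)     # x_k = (a_0 + ... + a_k) x_n - (b_0 + ... + b_k)
    x.append(x_n)
    return x
```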
Gaussian elimination should be plenty fast here (for this matrix it takes only linear time), so perhaps the issue is how you are implementing it. We want to solve
$$\begin{bmatrix} -1 & 0 & 0 &\cdots & 0 & a_0\\ 1 & -1 & 0 &\cdots & 0 & a_1\\ 0 & 1 & -1 &\cdots & 0 & a_2\\ \vdots & \vdots & \vdots &\ddots & \vdots & \vdots\\ 0 & 0 & 0 &\cdots & -1 & a_{n-1}\\ 0 & 0 & 0 &\cdots & 1 & a_n - 1\\ \end{bmatrix}\begin{bmatrix}x_0 \\ x_1 \\ x_2 \\ \vdots \\ x_{n-1} \\ x_n\end{bmatrix} =\begin{bmatrix}b_0 \\ b_1 \\ b_2 \\ \vdots \\ b_{n-1} \\ b_n\end{bmatrix}$$
By applying $R_{i+1} \leftarrow R_{i+1} + R_i$ sequentially for $i = 0, 1, \ldots, n-1$ (each update uses the already-updated row $R_i$), we obtain
$$\begin{bmatrix} -1 & 0 & 0 &\cdots & 0 & \alpha_0\\ 0 & -1 & 0 &\cdots & 0 & \alpha_1\\ 0 & 0 & -1 &\cdots & 0 & \alpha_2\\ \vdots & \vdots & \vdots &\ddots & \vdots & \vdots\\ 0 & 0 & 0 &\cdots & -1 & \alpha_{n-1}\\ 0 & 0 & 0 &\cdots & 0 & \alpha_n - 1\\ \end{bmatrix}\begin{bmatrix}x_0 \\ x_1 \\ x_2 \\ \vdots \\ x_{n-1} \\ x_n\end{bmatrix} =\begin{bmatrix}\beta_0 \\ \beta_1 \\ \beta_2 \\ \vdots \\ \beta_{n-1} \\ \beta_n\end{bmatrix}$$
where $\alpha_k = \sum_{i=0}^k a_i$ and $\beta_k = \sum_{i=0}^k b_i.$ Make sure to compute the $\alpha_i, \beta_i$ as running sums, $\alpha_k = \alpha_{k-1} + a_k$ and $\beta_k = \beta_{k-1} + b_k$, so that you only use $2n$ additions (computing each sum from scratch would take about $n^2$ additions).
Now we do back substitution. First we have $x_n = \beta_n (\alpha_n - 1)^{-1}$, which costs one division. Then for $i = 0, 1, \ldots, n-1$ we have $x_i = \alpha_i x_n - \beta_i$, which takes $n$ multiplications and $n$ subtractions in total.
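In code, the elimination amounts to two cumulative sums followed by the back substitution. Below is a rough NumPy sketch of my own; `build_A` is a hypothetical helper that assembles the dense matrix, used only to check the result:

```python
import numpy as np

def build_A(a):
    """Assemble the dense matrix from the display above: (S - I) plus a in the last column."""
    m = len(a)
    A = -np.eye(m) + np.diag(np.ones(m - 1), k=-1)   # -1 on the diagonal, 1 on the subdiagonal
    A[:, -1] += np.asarray(a, dtype=float)           # last column becomes (a_0, ..., a_{n-1}, a_n - 1)
    return A

a = np.array([2.0, -1.0, 3.0, 0.5])
b = np.array([1.0, 0.0, -2.0, 4.0])

alpha, beta = np.cumsum(a), np.cumsum(b)     # the 2n additions
x = np.empty_like(a)
x[-1] = beta[-1] / (alpha[-1] - 1.0)         # one division
x[:-1] = alpha[:-1] * x[-1] - beta[:-1]      # n multiplications, n subtractions
assert np.allclose(build_A(a) @ x, b)        # agrees with the dense system
```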
Alternatively, write $\mathrm{A}$ as a rank-one update of an invertible lower bidiagonal matrix, $$\mathrm{A} = \left( \mathrm{S} - \mathrm{I}_{n+1} \right) + \mathrm{a} \mathrm{e}_{n+1}^\top$$
where $\mathrm{S}$ is the (nilpotent) lower shift matrix, $\mathrm{a} := \left( a_0, a_1, \ldots, a_n \right)^\top$, and $\mathrm{e}_{n+1}$ is the last vector of the standard basis of $\Bbb{R}^{n+1}$. Using the Sherman-Morrison formula,
$$\mathrm{A}^{-1} = \left( \mathrm{S} - \mathrm{I}_{n+1} \right)^{-1} - \frac{\left( \mathrm{S} - \mathrm{I}_{n+1} \right)^{-1} \mathrm{a} \mathrm{e}_{n+1}^\top\left( \mathrm{S} - \mathrm{I}_{n+1} \right)^{-1}}{1 + \mathrm{e}_{n+1}^\top \left( \mathrm{S} - \mathrm{I}_{n+1} \right)^{-1} \mathrm{a}}$$
where, using the nilpotence of $\mathrm{S}$ (here $\mathrm{S}^{n+1} = \mathrm{O}$, so the geometric series terminates),
$$\left( \mathrm{S} - \mathrm{I}_{n+1} \right)^{-1} = - \left( \mathrm{I}_{n+1} + \mathrm{S} + \mathrm{S}^2 + \cdots + \mathrm{S}^n \right) = - \underbrace{\begin{bmatrix} 1 & 0 & 0 & \cdots & 0 & 0\\ 1 & 1 & 0 & \cdots & 0 & 0\\ 1 & 1 & 1 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ 1 & 1 & 1 & \cdots & 1 & 0\\ 1 & 1 & 1 & \cdots & 1 & 1\\ \end{bmatrix}}_{=: \mathrm{L}}$$
and, hence, since the last row of $\mathrm{L}$ is the all-ones vector $\Bbb{1}_{n+1}^\top$,
$$1 + \mathrm{e}_{n+1}^\top \left( \mathrm{S} - \mathrm{I}_{n+1} \right)^{-1} \mathrm{a} = 1 - \mathrm{e}_{n+1}^\top \mathrm{L} \, \mathrm{a} = 1 - \Bbb{1}_{n+1}^\top \mathrm{a}$$
and, thus,
$$\mathrm{A}^{-1} = - \mathrm{L} - \frac{\mathrm{L} \, \mathrm{a} \Bbb{1}_{n+1}^\top}{1 - \Bbb{1}_{n+1}^\top \mathrm{a}} = \left(\frac{1}{\Bbb{1}_{n+1}^\top \mathrm{a} - 1}\right) \mathrm{L} \, \mathrm{a} \Bbb{1}_{n+1}^\top - \mathrm{L} = \color{blue}{\mathrm{L} \left( \left(\frac{1}{\Bbb{1}_{n+1}^\top \mathrm{a} - 1}\right) \, \mathrm{a} \Bbb{1}_{n+1}^\top - \mathrm{I}_{n+1} \right)}$$
which seems to agree with Daniel's answer.
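As a quick numerical sanity check of the blue formula, here is a short NumPy sketch (the naming is my own; it assumes $\Bbb{1}_{n+1}^\top \mathrm{a} \neq 1$ so that the denominator is nonzero):

```python
import numpy as np

def inverse_closed_form(a):
    """A^{-1} = L (a 1^T / (1^T a - 1) - I), with L the all-ones lower-triangular matrix."""
    a = np.asarray(a, dtype=float)
    m = a.size
    L = np.tril(np.ones((m, m)))
    return L @ (np.outer(a, np.ones(m)) / (a.sum() - 1.0) - np.eye(m))

# Rebuild A = (S - I) + a e^T and confirm A @ A^{-1} = I.
a = np.array([2.0, -1.0, 3.0, 0.5])
m = a.size
A = -np.eye(m) + np.diag(np.ones(m - 1), k=-1)   # S - I
A[:, -1] += a                                    # + a e_{n+1}^T (last column)
assert np.allclose(A @ inverse_closed_form(a), np.eye(m))
```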