The expected payoff of a dice game
Let's suppose we have only 1 roll. What is the expected payoff? Each outcome $1,2,3,4,5,6$ is equally likely, so the expected payoff is their average, $3.5$.
Now let's suppose we have 2 rolls. If I roll a $6$ on the first roll, I would not continue: the next throw would at best maintain my winnings of $6$ (with $1/6$ chance) and would otherwise lower them. Similarly, if I threw a $5$ or a $4$ on the first roll, I would not continue, because my expected payoff on the last throw would be only $3.5$. However, if I threw a $1$, $2$, or $3$, I would take that second roll, again because I expect it to be worth $3.5$.
So in the 2 roll game, if I roll a $4,5,6$, I keep that roll, but if I throw a $1,2,3$, I reroll. Thus I have a $1/2$ chance of keeping a $4,5,6$ and a $1/2$ chance of rerolling. Rerolling has an expected return of $3.5$. As $4$, $5$, and $6$ are equally likely, keeping one of them has expected return $5$. Thus my expected payoff on 2 rolls is $0.5(5) + 0.5(3.5) = 4.25$.
Now we go to the 3 roll game. If I roll a $5$ or $6$ on the first roll, I keep it. But now even a $4$ is undesirable, because by rerolling I'd be playing the 2 roll game, which has expected payoff $4.25$. So now the expected payoff is $\frac{1}{3}(5.5) + \frac{2}{3}(4.25) = 4.\overline{6} \approx 4.67$.
Does that make sense?
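If you want to check these numbers, or push the same reasoning to more rolls, here is a minimal Python sketch of the backward recursion (the function name `expected_value` is just for illustration): keep a roll only if it beats the expected value of the game with one fewer roll.

```python
from fractions import Fraction

def expected_value(rolls):
    """Expected payoff of the game with `rolls` rolls available,
    playing the optimal threshold strategy."""
    ev = Fraction(7, 2)            # with a single roll you must keep it: 3.5
    for _ in range(rolls - 1):
        # Keep a roll d only if it beats the value of rerolling (ev);
        # otherwise reroll and collect ev on average.
        keep = [d for d in range(1, 7) if d > ev]
        ev = (sum(keep) + (6 - len(keep)) * ev) / 6
    return ev

for n in (1, 2, 3):
    print(n, expected_value(n))    # 7/2, 17/4, 14/3
```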
This problem is solved using the theory of optimal stopping for Markov chains. I will explain some of the theory, then turn to your specific question. You can learn more about this fascinating topic in Chapter 4 of Introduction to Stochastic Processes by Gregory F. Lawler.
Think of a Markov chain with state space $\cal S$ as a game.
A payoff function $f:{\cal S}\to[0,\infty)$ assigns a monetary "payoff" to each state of the Markov chain. This is the amount you would collect if you stop playing when the chain is in that state.
In contrast, the value function $v:{\cal S}\to[0,\infty)$ is defined as the greatest expected payoff possible from each starting point; $$v(x)=\sup_T \mathbb{E}_x(f(X_T)).$$ There is an optimal strategy $T_{\rm opt}$ so that $v(x)=\mathbb{E}_x(f(X_{T_{\rm opt}})).$ It can be described as $T_{\rm opt}:=\inf\{n\geq 0: X_n\in{\cal E}\}$, where ${\cal E}=\{x\in {\cal S}\mid f(x)=v(x)\}$. That is, you should stop playing as soon as you hit the set $\cal E$.
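For a chain like the one below, where the game necessarily ends after finitely many steps, $v$ can be computed by iterating $v \leftarrow \max(f, Pv)$ starting from $v=f$. Here is a rough sketch of that idea (the names `value_function`, `payoff`, `transition`, and `stopping_set` are mine, not from Lawler's book):

```python
def value_function(states, payoff, transition, sweeps=100):
    """Approximate v(x) = sup_T E_x[f(X_T)] on a finite state space
    by iterating v <- max(f, Pv), starting from v = f.

    payoff:     dict mapping state -> f(state)
    transition: dict mapping state -> list of (next_state, probability)
    """
    v = dict(payoff)
    for _ in range(sweeps):
        v = {x: max(payoff[x], sum(p * v[y] for y, p in transition[x]))
             for x in states}
    return v

def stopping_set(states, payoff, v, tol=1e-12):
    """The set E = {x : f(x) = v(x)}; stop as soon as the chain hits it."""
    return {x for x in states if abs(v[x] - payoff[x]) < tol}
```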
Example:
You roll an ordinary die with outcomes $1,2,3,4,5,6$. You can keep the value or roll again. If you roll, you can keep the new value or roll a third time. After the third roll you must stop. You win the amount showing on the die. What is the value of this game?
Solution: The state space for the Markov chain is $${\cal S}=\{\mbox{Start}\}\cup\left\{(n,d): n=2,1,0; d=1,2,3,4,5,6\right\}.$$ The variable $n$ tells you how many rolls you have left, and this decreases by one every time you roll. Note that the states with $n=0$ are absorbing.
You can think of the state space as a tree; the chain moves forward along the tree until it reaches the end.
[Figure: the state tree, with the payoff function $f$ at each node in red and the value function $v$ in green.]
The payoff function $f$ is zero at the start, and otherwise equals $d$, the number of spots showing on the die.
To find the value function $v$, start at the right-hand side of the tree. At $n=0$ we have $v(0,d)=d$, and we calculate $v$ elsewhere by working backwards, averaging over the next roll and taking the maximum of that and the current payoff. Mathematically, we use the property $v(x)=\max(f(x),(Pv)(x))$, where $(Pv)(x)=\sum_{y\in {\cal S}} p(x,y)v(y).$
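Concretely, here is a sketch of that backward induction for this particular tree, using the state names above; it reproduces the value of every node without drawing the picture:

```python
from fractions import Fraction

def dice_value():
    """Backward induction v(x) = max(f(x), (Pv)(x)) on the 3-roll dice tree."""
    sixth = Fraction(1, 6)
    # At n = 0 you must stop, so v(0, d) = f(0, d) = d.
    v = {(0, d): Fraction(d) for d in range(1, 7)}
    # Work backwards: in state (n, d) you either stop and take d,
    # or roll again and move to (n - 1, d') with probability 1/6 each.
    for n in (1, 2):
        for d in range(1, 7):
            roll_again = sixth * sum(v[(n - 1, e)] for e in range(1, 7))
            v[(n, d)] = max(Fraction(d), roll_again)
    # At Start the payoff is 0, so you always roll: v(Start) = average of v(2, d).
    return sixth * sum(v[(2, d)] for d in range(1, 7))

print(dice_value())  # 14/3, i.e. about 4.67
```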
The value of the game at the start is $\$4\tfrac{2}{3}\approx\$4.67$. The optimal strategy is to keep playing until you reach a state where the payoff $f$ (red) equals the value $v$ (green), i.e., a state in $\cal E$.
Suppose you have only two rolls of the die. Then your best strategy is to keep the first roll if its outcome exceeds the expected value of a single roll (i.e., $3.5$), so a $4$, $5$, or $6$, and to roll again otherwise. Hence the expected payoff of the game rolling twice is: \begin{equation} \frac{1}{6}(6+5+4) + \frac{1}{2}(3.5) = 4.25. \end{equation} If we have three rolls, your optimal strategy is to keep the first roll if it is $5$ or greater; otherwise you continue, and your expected payoff is: \begin{equation} \frac{1}{6}(6+5) + \frac{2}{3}(4.25) = 4+\frac{2}{3}. \end{equation}
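As a quick sanity check by simulation, here is a sketch that plays this threshold strategy (keep a first roll of $5$ or $6$, a second roll of $4$ or better, and whatever the third roll shows); the sample mean should land near $4.67$:

```python
import random

def play_once(rng):
    """One play of the 3-roll game with the threshold strategy above."""
    first = rng.randint(1, 6)
    if first >= 5:                 # keep only a 5 or 6 with two rolls in reserve
        return first
    second = rng.randint(1, 6)
    if second >= 4:                # keep a 4, 5, or 6 with one roll in reserve
        return second
    return rng.randint(1, 6)       # must keep the final roll

rng = random.Random(0)
trials = 200_000
print(sum(play_once(rng) for _ in range(trials)) / trials)  # close to 14/3 ≈ 4.667
```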