Optimal strategy for cutting a sausage?

TLDR: $a_n=\log_2(1+1/n)$ works, and is the only smooth solution.

This problem hints at a deeper mathematical question, as follows. As has been observed by Pongrácz, there is a great deal of possible variation in solutions to this problem. I would like to find a "best" solution, where the sequence of pieces is somehow as evenly distributed as possible, given the constraints.

Let us fix the following strategy: at stage $n$ there are $n$ pieces, of lengths $a_n,\dots,a_{2n-1}$, ordered in decreasing length. You cut $a_n$ into two pieces, forming $a_{2n}$ and $a_{2n+1}$. We have the following constraints:

$$a_1=1\qquad a_n=a_{2n}+a_{2n+1}\qquad a_n\ge a_{n+1}\qquad a_n<2a_{2n-1}$$

I would like to find a nice function $f(x)$ that interpolates all these $a_n$s (and possibly generalizes the relation $a_n=a_{2n}+a_{2n+1}$ as well).

First, it is clear that the only degree of freedom is in the choice of cut: if we take any sequence $b_n\in (1/2,1)$ and define $a_{2n}=a_nb_n$ and $a_{2n+1}=a_n(1-b_n)$, this completely determines the sequence $a_n$.

Now we should expect $a_n$ to decay on the order of $1/n$, since it drops by a factor of $2$ every time $n$ doubles. Thus one regularity condition we can impose is that $na_n$ converges. If we consider the "baseline solution" where every cut is at $1/2$, producing the sequence

$$1,\frac12,\frac12,\frac14,\frac14,\frac14,\frac14,\frac18,\frac18,\frac18,\frac18,\frac18,\frac18,\frac18,\frac18,\dots$$ (which is not technically a solution because of the strict inequality, but is on the boundary of solutions), then we see that $na_n$ in fact does not tend to a limit - it varies between $1$ and $2$.
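This oscillation is easy to check numerically. A minimal sketch (the helper name `a_baseline` is mine), using the closed form $a_n=2^{-\lfloor\log_2 n\rfloor}$ for the baseline sequence:

```python
# Baseline "always cut in half" sequence: a_n = 2^(-floor(log2(n))).
def a_baseline(n):
    return 2.0 ** -(n.bit_length() - 1)  # bit_length-1 = floor(log2 n) exactly

# n * a_n sweeps upward on each dyadic block [2^k, 2^(k+1)) and snaps
# back to 1 at every power of 2, so it has no limit.
print([round(n * a_baseline(n), 3) for n in range(1, 9)])
# → [1.0, 1.0, 1.5, 1.0, 1.25, 1.5, 1.75, 1.0]
```

Using `int.bit_length` avoids the floating-point hazards of `floor(log2(n))` near powers of two.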

If we average this exponentially, by considering the function $g(x)=2^xa_{\lfloor 2^x\rfloor}$, then we get a function which gets closer and closer to being periodic with period $1$. That is, there is a function $h(x):[0,1]\to\Bbb R$ such that $g(x+n)\to h(x)$, and we need this function to be constant if we want $g(x)$ itself to have a limit.

There is a very direct relation between $h(x)$ and the $b_n$s. If we increase $b_1$ while leaving everything else the same, then $h(x)$ will be scaled up on $[0,\log_2 (3/2)]$ and scaled down on $[\log_2 (3/2),1]$. None of the other $b_i$'s control this left-right balance - they make $h(x)$ larger in some subregion of one or the other of these intervals only, but preserving $\int_0^{\log_2(3/2)}h(x)\,dx$ and $\int_{\log_2(3/2)}^1h(x)\,dx$.

Thus, to keep these balanced we should let $b_1=\log_2(3/2)$. More generally, each $b_n$ controls the balance of $h$ on the intervals $[\log_2(2n),\log_2(2n+1)]$ and $[\log_2(2n+1),\log_2(2n+2)]$ (reduced $\bmod 1$), so we must set them to $$b_n=\frac{\log_2(2n+1)-\log_2(2n)}{\log_2(2n+2)-\log_2(2n)}=\frac{\log(1+1/2n)}{\log(1+1/n)}.$$
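As a quick sanity check (the helper name `b` is mine), each of these cut fractions is a legal cut, i.e. lies strictly between $1/2$ and $1$:

```python
from math import log

def b(n):
    # Cut fraction for the n-th piece: b_n = log(1 + 1/(2n)) / log(1 + 1/n).
    return log(1 + 1 / (2 * n)) / log(1 + 1 / n)

# Every b_n lies strictly in (1/2, 1), since (1 + 1/(2n))^2 > 1 + 1/n.
assert all(0.5 < b(n) < 1 for n in range(1, 10_000))
print(round(b(1), 3))  # → 0.585, i.e. the first cut is 58:42
```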

When we do this, a miracle occurs, and $a_n=\log_2(1+1/n)$ becomes analytically solvable: \begin{align} a_1&=\log_2(1+1/1)=1\\ a_{2n}+a_{2n+1}&=\log_2\Big(1+\frac1{2n}\Big)+\log_2\Big(1+\frac1{2n+1}\Big)\\ &=\log_2\left[\Big(1+\frac1{2n}\Big)\Big(1+\frac1{2n+1}\Big)\right]\\ &=\log_2\left[1+\frac{2n+(2n+1)+1}{2n(2n+1)}\right]\\ &=\log_2\left[1+\frac1n\right]=a_n. \end{align}
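The algebra above can also be confirmed numerically; a short check of the closed form against the cut relation (the helper name `a` is mine):

```python
from math import log2

def a(n):
    # Candidate closed form: a_n = log2(1 + 1/n).
    return log2(1 + 1 / n)

# a_1 = 1 and the cut relation a_n = a_{2n} + a_{2n+1} hold, up to
# floating-point error, for every n tested.
assert a(1) == 1.0
assert all(abs(a(n) - (a(2 * n) + a(2 * n + 1))) < 1e-12
           for n in range(1, 1000))
print("recurrence verified")
```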

As a bonus, we obviously have that the $a_n$ sequence is decreasing, and if $m<2n$, then \begin{align} 2a_m&=2\log_2\Big(1+\frac1m\Big)=\log_2\Big(1+\frac1m\Big)^2=\log_2\Big(1+\frac2m+\frac1{m^2}\Big)\\ &\ge\log_2\Big(1+\frac2m\Big)>\log_2\Big(1+\frac2{2n}\Big)=a_n, \end{align}

so this is indeed a proper solution, and we have also attained our smoothness goal: $na_n$ converges, to $\frac 1{\log 2}=\log_2e$. It is also worth noting that the ratio of the largest to the smallest piece tends to exactly $2$, which validates Henning Makholm's observation that you can't do better than $2$ in the limit.

It looks like this (percentages rounded to the nearest integer, so the numbers may not add to $100$ exactly):

  • $58:42$, ratio = $1.41$
  • $42:32:26$, ratio = $1.58$
  • $32:26:22:19$, ratio = $1.67$
  • $26:22:19:17:15$, ratio = $1.73$
  • $22:19:17:15:14:13$, ratio = $1.77$
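The table above can be regenerated directly from the closed form, since at stage $k$ the pieces are $a_k,\dots,a_{2k-1}$ (the helper name `pieces` is mine):

```python
from math import log2

def pieces(stage):
    # At stage k the pieces are a_k, ..., a_{2k-1}, a_n = log2(1 + 1/n).
    return [log2(1 + 1 / n) for n in range(stage, 2 * stage)]

for k in range(2, 7):
    ps = pieces(k)
    pct = [round(100 * p) for p in ps]       # percentages, largest first
    print(pct, "ratio =", round(ps[0] / ps[-1], 2))
```

The first line printed is `[58, 42] ratio = 1.41`, matching the first bullet.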

If you are working with a sequence of points treated $\bmod 1$, where the intervals between the points are the "sausages", then this sequence of segments is generated by $p_n=\log_2(2n+1)\bmod 1$. The result is beautifully uniform but with a noticeable sweep edge:

[figure: the cut points $p_n$ on the unit interval]
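A short check (the variable names are mine) that the gaps between consecutive points, wrapping around $\bmod 1$, are exactly the pieces at the corresponding stage:

```python
from math import log2

# The first few cut points p_n = log2(2n+1) mod 1, including p_0 = 0.
points = sorted(log2(2 * n + 1) % 1 for n in range(8))

# Gaps between consecutive points (wrapping around the circle) should be
# exactly the stage-8 pieces a_8 >= a_9 >= ... >= a_15.
gaps = [(q - p) % 1 for p, q in zip(points, points[1:] + points[:1])]
expected = [log2(1 + 1 / n) for n in range(8, 16)]
assert all(abs(g - e) < 1e-9 for g, e in zip(gaps, expected))
print([round(g, 4) for g in gaps])
```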

A more concrete optimality condition that picks out this solution uniquely is the following: we require that for any fraction $0\le x\le 1$, the sausage at the $x$ position (give or take a sausage) in the list, sorted in decreasing order, is smaller than the largest piece by a factor of at most $c(x)$ at all times. This solution achieves $c(x)=x+1$ for all $0\le x\le 1$, and no solution can do better than that (in the limit) for any $x$.


YES, it is possible!

You must never cut a piece exactly in half: each of the two equal halves eventually has to be cut, and whichever is cut first leaves a part of size at most half of the other, still uncut, twin, which violates the requirement. So in fact you must never have two equal pieces. Make the first cut so that the condition is not violated, say $60:40$.

From now on, assume that the ratio of biggest over smallest is strictly less than $2$ in a given round, and no two pieces are equal. (This holds for the $60:40$ cut.) We construct a good cut that maintains this property.

So at the next turn, pick the biggest piece, and cut it into two unequal pieces in an $a:b$ ratio, but very close to equal (so $a/b\approx 1$). All you have to ensure is that

  • $a/b$ is so close to $1$ that the two new pieces are both smaller than the smallest piece in the last round.
  • $a/b$ is so close to $1$ that the smaller piece is bigger than half of the second biggest in the last round (which is going to become the biggest piece in this round).

Then the condition is preserved. For example, from $60:40$ you can move to $25:35:40$, then cut the forty to obtain $19:21:25:35$, etc.
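This greedy strategy is easy to simulate. A sketch under my own choice of cut point (the midpoint of the feasible interval; the helper name `cut_biggest` is mine) — note that its very first step reproduces the $25:35:40$ round above:

```python
def cut_biggest(pieces):
    """One round: split the biggest piece slightly off-center, so that both
    halves are smaller than the current minimum and the smaller half
    exceeds half of the runner-up."""
    pieces = sorted(pieces, reverse=True)
    big, second, smallest = pieces[0], pieces[1], pieces[-1]
    lo = max(big - smallest, second / 2)  # smaller half must exceed this
    hi = big / 2                          # ... and stay below an even split
    assert lo < hi                        # guaranteed while the invariant holds
    x = (lo + hi) / 2                     # any point of (lo, hi) works
    return pieces[1:] + [x, big - x]

pieces = [0.6, 0.4]
for _ in range(20):
    pieces = cut_biggest(pieces)
    assert max(pieces) / min(pieces) < 2  # the invariant survives every round
```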


You can't do better than a factor of $2$.

Assume to the contrary that you have a strategy such that the ratio between the largest and smallest remaining piece is always $<R$ for some $R<2$.

Then, first we can see that for every $\varepsilon>0$ there will eventually be two pieces whose ratio is at most $1+\varepsilon$. Otherwise, once there are $n$ pieces, the ratio between the largest and smallest piece would be at least $(1+\varepsilon)^{n-1}$, which eventually exceeds $R$.

Once you have two pieces of length $a$ and $b$, with $a < b < (1+\varepsilon)a$, eventually you will have to cut one of them.

If you cut $a$ first, one of the resulting pieces will be at most $\frac12 a < \frac12b$; since $b$ is still uncut, the ratio between largest and smallest piece now exceeds $2$, so your goal has immediately failed.

However, if you cut $b$ first, one of the resulting pieces will be at most $\frac12b < \frac{1+\varepsilon}2 a$, while $a$ is still whole, so the ratio exceeds $\frac2{1+\varepsilon}$. This means you've lost if we choose $\varepsilon$ small enough that $\frac{1+\varepsilon}2 < \frac1R$. And that is always possible if $R<2$.