
At $0$, the function $f$ is either continuous or has a jump discontinuity; by looking at a small enough interval around $0$, you can rule out the jump, so $f$ must be continuous there. Matching the one-sided limits at $0$ then gives $b=-1$.
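To see the kind of computation involved (the original $f$ is not quoted in the question, so the piecewise function below is only a made-up illustration), suppose $$f(x)=\begin{cases} e^x+b, & x<0,\\ \cos x-1, & x\geq 0.\end{cases}$$ Then $\lim_{x\to 0^-}f(x)=1+b$ while $\lim_{x\to 0^+}f(x)=0=f(0)$, so continuity at $0$ forces $1+b=0$, that is, $b=-1$.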

I assume that by "Lagrange's Theorem" they mean the Mean Value Theorem. What they are likely using it for ("the consequences of Lagrange's theorem") is the fact that any two antiderivatives of $f$ differ by a constant (that's what lets you find the antiderivative as you are used to, and, by adding a constant, be sure that you have all possible antiderivatives). Indeed, if a function $g$ has zero derivative on an interval $(a,b)$, then for any $x,y\in (a,b)$ the Mean Value Theorem gives some $c$ between $x$ and $y$ with $$ g(x)-g(y)=g'(c)(x-y)=0,$$so $g$ is constant.
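For a concrete instance of this: both $F(x)=\sin^2 x$ and $G(x)=-\tfrac{1}{2}\cos(2x)$ are antiderivatives of $\sin(2x)$ on all of $\mathbb{R}$, and indeed $$F(x)-G(x)=\sin^2 x+\tfrac{1}{2}\cos(2x)=\frac{1-\cos(2x)}{2}+\frac{\cos(2x)}{2}=\frac{1}{2},$$ a constant, just as the theorem predicts.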

Because the above reasoning requires open intervals, they cannot apply it across $0$; that's why there are three cases initially, each with its own constant. From there they use continuity to cut down the number of independent constants.
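Schematically (with hypothetical pieces $F_1$ and $F_2$, since the original problem isn't quoted here): if $F_1'=f$ on $(-\infty,0)$ and $F_2'=f$ on $(0,\infty)$, then every antiderivative of $f$ must have the form $$F(x)=\begin{cases} F_1(x)+C_1, & x<0,\\ F_2(x)+C_2, & x>0,\end{cases}$$ and requiring $F$ to extend continuously (in fact differentiably) across $0$ forces $$\lim_{x\to 0^-}F_1(x)+C_1=\lim_{x\to 0^+}F_2(x)+C_2,$$ which ties $C_2$ to $C_1$ and leaves a single free constant.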