Sum of exact differentials is inexact?
Even if $dX_i$ is exact, $f_i~dX_i$ might not be if $f_i$ depends on the other coordinates $X_{j\neq i}$ rather than on $X_i$ alone.
This is one of those cases where I think we need a sort of "food pyramid" for physics, with examples at the bottom, equations somewhere in the middle, and physical theories on top, or so. Here's a great example of an inexact differential made of exact ones:$$\delta s = -y ~dx + x~dy.$$Remember what this means?
If not, that's OK, we'll get there. We're talking about a state space $\mathcal S$, in this case consisting of $(x, y)$ pairs, and there exist some "state functions" which take points in $\mathcal S$ (i.e. states) as input and give us back outputs, let's say points in $\mathbb R$ (i.e. real numbers). And because we're doing physics we often like those functions to be "nice", and one of the ways they can be nice is if the response to a very small deviation in the state coordinates has a linear approximation: $$s(x + dx, y + dy) = s(x, y) + c_1 ~dx + c_2 ~ dy.$$If you chart $s$ as the $z$-component of a 3D space then $s(x,y)$ is some sort of 2D curved surface, and the above tries to approximate it with a tangent plane at some point $(x, y).$ These "partial derivatives" $c_1 = \left(\frac{\partial s}{\partial x}\right)_y$ and $c_2 = \left(\frac{\partial s}{\partial y}\right)_x$ therefore vary as functions on $\mathcal S$ as well, because you can make a tangent plane around any point and then measure the slopes of the lines which make it up.
In this case we would write $s(x + dx, y+dy) - s(x, y)$ as the exact differential $ds$.
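If you like seeing that with a computer algebra system, here's a minimal sympy sketch. The particular $s(x, y)$ in it is just something I picked for illustration, nothing physical about it:

```python
import sympy as sp

x, y = sp.symbols('x y')

# An arbitrary sample state function, just for illustration.
s = x**2 * y + sp.sin(y)

# The exact differential ds = c1 dx + c2 dy has coefficients given by
# the partial derivatives, i.e. the slopes of the tangent plane:
c1 = sp.diff(s, x)   # (ds/dx) holding y fixed -> 2*x*y
c2 = sp.diff(s, y)   # (ds/dy) holding x fixed -> x**2 + cos(y)

# A symmetry every exact differential inherits from s: the mixed
# partials agree, d(c1)/dy == d(c2)/dx, since both equal d2s/dxdy.
print(sp.simplify(sp.diff(c1, y) - sp.diff(c2, x)) == 0)   # True
```

That last check, $\frac{\partial c_1}{\partial y} = \frac{\partial c_2}{\partial x}$, is a symmetry that every exact differential inherits from its state function, and it will reappear below.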
Conversely, when I say that the above is an inexact differential I am saying that there exists no such nice state function $s(x,y)$ which leads to this choice of $c_{1,2}(x, y).$ I might also be saying that there might be no such nice state function and I am not sure, because you can always pretend an exact differential is inexact (just don't derive the $s(x, y)$ function!), but you can't pretend that an inexact one is exact (because there is no $s(x, y)$ function and pretending that one exists will lead you into trouble).
How am I so sure that the above is inexact? Well, call it $\delta s = c_1~dx + c_2 ~ dy$ and let's just integrate those components: for the first one we get $\int dx~c_1(x,y) = -xy + C_1(y)$ and for the second one we get $\int dy~c_2(x,y) = +xy + C_2(x)$, and it doesn't matter how you choose $C_1(y), C_2(x)$: you cannot come out of this rut with a consistent $s(x, y),$ because that $xy$ term has opposite signs in the two expressions and refuses to go gently.
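If you want a sanity check of that calculation, it's quick in sympy (nothing here is specific to physics, it's just the same arithmetic):

```python
import sympy as sp

x, y = sp.symbols('x y')

# Coefficients of delta_s = -y dx + x dy
c1, c2 = -y, x

# If an s(x, y) existed, the mixed partials would have to agree;
# here they are -1 and +1, so they don't:
print(sp.diff(c1, y), sp.diff(c2, x))   # -1 1

# Trying to reconstruct s by partial integration gives the same clash:
print(sp.integrate(c1, x))   # -x*y, plus an arbitrary C1(y)
print(sp.integrate(c2, y))   #  x*y, plus an arbitrary C2(x)
# No choice of C1(y) and C2(x) reconciles -x*y with +x*y.
```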
Now sometimes you can get away with simply augmenting $\mathcal S$; a great example of this is $$\delta s_2 = \frac{-y}{x^2 + y^2}~dx + \frac{x}{x^2 + y^2}~dy.$$In this case we have indeed that $\frac{\partial c_1}{\partial y} = \frac{\partial c_2}{\partial x}$ where those are defined, but they are not defined at $(0, 0).$ This $\delta s_2$ is just $d\theta$, the differential of the polar angle $\theta$, and $\theta$ is only an almost-state function: it jumps by $2\pi$ when you try to represent it as a single-valued function $\mathcal S \to \mathbb R.$ Instead the idea is to choose a curve, any curve, from the origin out to $\infty$ that does not self-intersect. And in addition to your place on $\mathcal S$ we will store an integer (in $\mathbb Z$) saying how many times you have passed around the origin: we will increment it when you pass over the curve one way, and decrement it when you pass over the curve the other way. Then $d\theta$ becomes an exact differential on this space $\mathbb Z \times \big(\mathcal S - \{(0,0)\}\big).$
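Here's a small sympy sketch of both halves of that claim; the unit circle is my own choice of loop around the origin, any loop that winds around it once would do:

```python
import sympy as sp

x, y, t = sp.symbols('x y t')

c1 = -y / (x**2 + y**2)
c2 =  x / (x**2 + y**2)

# Away from the origin the mixed-partials test passes:
print(sp.simplify(sp.diff(c1, y) - sp.diff(c2, x)))   # 0

# ...yet one trip around the unit circle, (x, y) = (cos t, sin t),
# integrates delta_s2 up to 2*pi rather than 0, which is exactly the
# winding of theta that a single-valued s: S -> R cannot absorb.
xc, yc = sp.cos(t), sp.sin(t)
integrand = (c1.subs({x: xc, y: yc}) * sp.diff(xc, t)
             + c2.subs({x: xc, y: yc}) * sp.diff(yc, t))
print(sp.integrate(sp.simplify(integrand), (t, 0, 2 * sp.pi)))   # 2*pi
```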
But very often we can't and we're stuck, especially when we don't have this $\nabla \times [c_1, c_2] = 0$ type of equation to help us.
The sum of exact differentials is exact: Given $\mathrm{d}f$ and $\mathrm{d}g$ as exact differentials, their sum is the exact differential $\mathrm{d}(f+g)$ - that is, the sum of differentials of functions is the differential of the sum of the functions. It's a basic consequence of the exterior derivative being linear.
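A throwaway sympy check of that linearity, with two sample functions I made up on the spot (nothing about them is special):

```python
import sympy as sp

x, y = sp.symbols('x y')

# Two arbitrary sample functions, just for illustration.
f = x**2 * y
g = sp.exp(x) + x * y**2

def d(h):
    """Coefficient pair (dh/dx, dh/dy) of the total differential dh."""
    return (sp.diff(h, x), sp.diff(h, y))

# d(f + g) and df + dg have identical coefficients, component by component:
lhs = d(f + g)
rhs = tuple(a + b for a, b in zip(d(f), d(g)))
print([sp.simplify(l - r) for l, r in zip(lhs, rhs)])   # [0, 0]
```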
However, you are not taking the "sum of exact differentials" there. A differential $f(x_1,\dots,x_n)\mathrm{d}x_1$ is not exact - if you assume there exists a function $F(x_1,\dots,x_n)$ such that $\mathrm{d}F = f\mathrm{d}x_1$ you quickly find a contradiction unless $F$ (and thereby $f$) does not depend on the $x_2,\dots,x_n$ at all, which the $f_i$ in your differential do. So $\delta W$ is the sum of inexact differentials and can therefore be inexact (note that this doesn't prove $\delta W$ is inexact, since the sum of inexact differentials can easily be exact - consider the trivial case $\omega + (-\omega)$ for some inexact $\omega$).
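For a concrete instance of that last remark (my own example, not anything from your $\delta W$): $y\,\mathrm{d}x$ and $x\,\mathrm{d}y$ are each inexact, yet their sum is the exact differential $\mathrm{d}(xy)$. A quick sympy check:

```python
import sympy as sp

x, y = sp.symbols('x y')

def is_closed(c1, c2):
    """Mixed-partials test: necessary for c1 dx + c2 dy to be exact
    (and on the whole plane it is also sufficient)."""
    return sp.simplify(sp.diff(c1, y) - sp.diff(c2, x)) == 0

# y dx and x dy are each of the form f(x, y) dx_i with f depending on
# the *other* variable, hence inexact:
print(is_closed(y, 0))   # False
print(is_closed(0, x))   # False

# ...but their sum y dx + x dy is exact: it is d(x*y).
print(is_closed(y, x))   # True
```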