Why do the later fusion stages in a star last for a shorter time?
In general, a star's luminosity is at its minimum whilst it is on the main sequence, "burning" hydrogen in its core. Subsequent changes are driven by changes in the core composition and opacity. Fewer electrons per unit mass for heavier species lead to lower radiative opacities and hence higher luminosities; fewer particles per unit mass mean that higher temperatures are required to generate a given pressure; whilst the greater Coulomb repulsion between heavier nuclei means that higher temperatures are required for the fusion reactions that supply these luminosities.
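To make the pressure and opacity points concrete, using the standard ideal-gas and electron-scattering expressions (with $\mu$ the mean molecular weight, $m_u$ the atomic mass unit, $k_B$ Boltzmann's constant and $X$ the hydrogen mass fraction):
$$P = \frac{\rho k_B T}{\mu m_u}\,, \qquad \kappa_{\rm es} \simeq 0.2\,(1+X)\ {\rm cm^2\,g^{-1}}\,,$$
so converting hydrogen into heavier species raises $\mu$ (demanding a higher $T$ for the same pressure) and lowers $X$ (reducing the opacity).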
The timescale of a burning phase is roughly the number of fuel nuclei available to react, multiplied by the energy released per nucleus by the fusion reactions (the change in binding energy), divided by the luminosity during that phase.
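In symbols, a rough estimate (with $N$ the number of fuel nuclei, $Q$ the energy released per nucleus consumed, and $L$ the luminosity of the star during that phase) would be
$$\tau_{\rm nuc} \sim \frac{N\,Q}{L}\,.$$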
All of these factors conspire to shorten the lifetimes of successive nuclear burning phases: there are fewer reactant nuclei of greater atomic mass in a core of roughly the same mass; each reaction releases less energy, since the change in binding energy per nucleon is largest in turning hydrogen into helium but is an order of magnitude smaller (and decreasing) for subsequent burning phases; and the luminosity steadily increases as the star becomes more evolved.
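A minimal numerical sketch of this argument, using round illustrative numbers rather than any real stellar model: roughly a tenth of a solar mass of fuel, about 6.7 MeV per nucleon for hydrogen burning versus about 0.6 MeV per nucleon for helium burning, and a helium-burning luminosity taken to be a few tens of times solar.

```python
# Order-of-magnitude sketch of tau ~ N * Q / L for two burning phases.
# All numbers are illustrative round values, not taken from a stellar model.

M_SUN = 1.99e30                 # solar mass in kg
L_SUN = 3.83e26                 # solar luminosity in W
J_PER_KG_PER_MEV = (1.0 / 1.66e-27) * 1.602e-13   # J per kg of fuel per (MeV per nucleon)
YEAR = 3.156e7                  # seconds in a year

def nuclear_timescale_yr(fuel_mass_msun, mev_per_nucleon, luminosity_lsun):
    """Time (yr) to burn the given fuel mass, releasing mev_per_nucleon MeV
    per nucleon, while radiating at the given luminosity."""
    energy_available = fuel_mass_msun * M_SUN * mev_per_nucleon * J_PER_KG_PER_MEV
    return energy_available / (luminosity_lsun * L_SUN) / YEAR

# Hydrogen burning: ~10% of a solar mass processed at ~1 L_sun,
# releasing ~6.7 MeV per nucleon (H -> He).
t_H = nuclear_timescale_yr(0.1, 6.7, 1.0)

# Helium burning: a similar fuel mass, but only ~0.6 MeV per nucleon (He -> C)
# and a luminosity several tens of times higher.
t_He = nuclear_timescale_yr(0.1, 0.6, 50.0)

print(f"H-burning timescale  ~ {t_H:.1e} yr")    # ~1e10 yr
print(f"He-burning timescale ~ {t_He:.1e} yr")   # ~2e7 yr
```

Even with these crude inputs, the helium-burning phase comes out hundreds of times shorter than the main sequence, which is the essence of the argument.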