Why do DMMs have such low update/refresh rates?
Most DMMs use an integrating-type ADC (the dual-slope converter was one of the first methods developed that was suitable for high resolution and accuracy).
An advantage of integrating is that the integration period can be designed to be an integer number of cycles of both the 50 Hz and 60 Hz line frequencies. For example, 300 ms is 18 cycles of 60 Hz and 15 cycles of 50 Hz. This has the effect of a natural notch filter at the mains frequency, so hum caused by mains noise is cancelled out and the reading does not jump around as much.
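As a quick illustration, here is a minimal simulation (the 1 MHz sample rate, DC level, and hum amplitudes are arbitrary assumptions) showing that averaging over a 300 ms window nulls both 50 Hz and 60 Hz components:

```python
# Sketch: averaging over 300 ms (15 cycles of 50 Hz, 18 cycles of 60 Hz)
# nulls mains hum. All signal values here are arbitrary examples.
import numpy as np

fs = 1_000_000                      # simulation sample rate, 1 MHz
t = np.arange(int(0.3 * fs)) / fs   # one 300 ms integration window

v_dc = 1.234                        # the "true" DC input
hum = 0.1 * np.sin(2 * np.pi * 50 * t + 0.7) \
    + 0.1 * np.sin(2 * np.pi * 60 * t + 1.3)

reading = np.mean(v_dc + hum)       # the integrator's average over the window
print(f"reading = {reading:.6f} V")  # ~1.234000; the hum integrates to ~zero
```

Shift the hum's phase by any amount and the result barely changes, which is exactly why the displayed reading stays quiet.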
Integrating converters can also be built with high resolution and pretty good (< 0.1%) linearity from cheap parts, since to a first order all of the errors cancel out except for the reference voltage. You can get 0.1% linearity and accuracy even with a 5% film capacitor, 5% resistors, and a crude RC clock, with a refresh rate that is adequate for visual purposes (a few Hz).
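Here is a sketch of why the cheap parts drop out, using made-up component values: in a dual-slope converter the input is integrated for a fixed count N1, then the reference is integrated back down to zero over a measured count N2, and R, C, and the clock period cancel in the final ratio.

```python
# Sketch of the dual-slope arithmetic: R, C, and the clock period appear
# symmetrically in both phases and cancel, so only V_REF sets the accuracy.
# Component values are illustrative; 5% parts would work just as well.
V_REF = 2.000      # reference voltage, the one part that must be accurate
N1 = 10_000        # fixed run-up count (sets the resolution)
R, C, T_CLK = 100e3, 100e-9, 1e-6   # arbitrary values; they cancel below

def dual_slope_reading(v_in: float) -> float:
    # Run-up: integrate v_in for exactly N1 clock periods.
    v_peak = v_in * N1 * T_CLK / (R * C)
    # Run-down: integrate V_REF back to zero, counting clock periods.
    n2 = round(v_peak * (R * C) / (V_REF * T_CLK))
    return V_REF * n2 / N1          # R, C, and T_CLK have cancelled out

print(dual_slope_reading(1.234))    # ~1.234 for any R, C, T_CLK values
```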
For UI reasons you really don't want to update the display too fast; 2-5 Hz is about right. If the display rate is too fast, it could jump back and forth between (say) 201 and 101 so quickly that it would look almost like 301. If it is too slow, you don't get to see the reading stabilize and how much apparent noise there is in it.
More modern high-resolution converters are often made using sigma-delta techniques, which eliminate at least one second-order effect (capacitor dielectric absorption) from the integrating converter's error budget (improving linearity*). They can be used in a DMM by low-pass filtering and decimating the result down to an appropriate display rate (or just averaging over a suitable time), as sketched below. You'll also see low-end voltmeters and ammeters that use the 10- or 12-bit successive-approximation converter built into a micro and add some averaging to get a sort-of acceptable reading, with a crummy (low) input impedance.
*Although most users can't see nonlinearity without a better instrument to compare the DMM against, they can simply flip the leads on a stable DC source and see that (say) a reading of +10.00 V becomes -9.98 V with the leads reversed. Of course, these days such an effect could just be fiddled out with a microcontroller.
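A minimal sketch of the filter-and-decimate approach mentioned above, with made-up numbers (a noisy 30 kS/s stream block-averaged down to a 3 Hz display rate):

```python
# Sketch: turning a fast converter's output stream into a slow display rate
# by block-averaging (a crude decimating low-pass filter). The sample rate,
# noise level, and decimation ratio are illustrative, not from any real meter.
import numpy as np

def decimate_for_display(samples: np.ndarray, ratio: int) -> np.ndarray:
    """Average non-overlapping blocks of `ratio` samples into one reading."""
    n = (len(samples) // ratio) * ratio
    return samples[:n].reshape(-1, ratio).mean(axis=1)

rng = np.random.default_rng(0)
raw = 1.234 + 0.01 * rng.standard_normal(30_000)  # 1 s of noisy 30 kS/s data
display = decimate_for_display(raw, 10_000)       # three readings, i.e. 3 Hz
print(display)  # noise is reduced by ~sqrt(10000) = 100x per reading
```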
Modern higher-end DMMs like my Agilent have options for reading speed, specified as NPLC: the number of power-line cycles that each reading is integrated over.
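The NPLC setting translates directly into an integration time, which in turn caps the reading rate. A quick sketch (the NPLC values below are typical of bench meters and are an assumption; check your meter's manual for its actual table):

```python
# Sketch: converting an NPLC setting into integration time and a ceiling on
# reading rate. The NPLC list is typical of bench DMMs (assumption).
LINE_HZ = 60  # or 50, depending on the local mains frequency

for nplc in (0.02, 0.2, 1, 10, 100):
    t_int = nplc / LINE_HZ                 # integration time in seconds
    print(f"NPLC {nplc:>6}: {t_int * 1e3:8.3f} ms -> "
          f"up to ~{1 / t_int:,.0f} readings/s (ignoring overhead)")
```

Note that only whole-number NPLC settings line up with complete mains cycles and get the full notch-filter rejection; the fast fractional settings trade that away for speed.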
ADCs tend to get more expensive with increasing resolution and sample rate, so for a regular DMM there's probably not much point in trying to make them faster at increased cost. After all, who can read an LCD display updating more than four times a second?
I have a multimeter that can take 1000 readings per second, but that only makes sense because it can also record readings to a USB stick, transfer them over Ethernet, etc., for later analysis. A higher sample rate can also make sense for some of the built-in math functions, but that's not the sort of thing hand-held meters at the lower end of the market are normally made for.