How to measure gigaohm resistors?
Many Fluke meters (e.g. the 87 and 287) have a nanosiemens (nS) conductance range that will measure up to 100 GΩ; it needs to be manually ranged up from the ohms range. \$\mathrm{1\,G\Omega = 1\,nS}\$, \$\mathrm{10\,G\Omega = 0.1\,nS}\$.
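Since conductance is just the reciprocal of resistance, converting a reading is direct:

\$\$ R = \frac{1}{G}, \qquad R\,[\mathrm{G\Omega}] = \frac{1}{G\,[\mathrm{nS}]} \$\$

so a reading of 0.5 nS, for example, corresponds to 2 GΩ.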
Alternatively, most DMMs have a 10 MΩ input impedance (easily checked with a second meter), so a resistor of value R in series with the meter on its millivolt range forms an (R + 10 M) : 10 M voltage divider. Applying 10 volts through a 1 GΩ resistor will therefore read around 99 mV. A close-enough approximation for high-value resistors from a 10 V supply is: resistance in gigohms ≈ 100 / reading in millivolts.
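To make that explicit, the meter's own input resistance forms the bottom leg of the divider:

\$\$ V_{meter} = V_{supply}\cdot\frac{10\,\mathrm{M\Omega}}{R + 10\,\mathrm{M\Omega}} \quad\Rightarrow\quad R = 10\,\mathrm{M\Omega}\left(\frac{V_{supply}}{V_{meter}} - 1\right) \$\$

For \$V_{supply} = 10\,\mathrm{V}\$ and \$V_{meter} = 99\,\mathrm{mV}\$ this gives \$R \approx 1\,\mathrm{G\Omega}\$, matching the rule of thumb above.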
You need an insulation tester. The ones I've seen had a 2 GΩ range. It doesn't necessarily have to be a Fluke; there are cheaper ones.
And for the future, I would try to add some protective insulation on top of such nasty things :-)
I'll assume that you're able to isolate the resistor from the rest of the circuit.
You probably need to construct a high-impedance analog buffer. It doesn't need to be fast, but it does need an extremely high input impedance. One suitable amplifier is National's LMP7721, with an input bias current of only 3 femtoamps (typical).
Once you have your buffer, get another resistor of known value, comparable to the one you want to test. Connect one side of this known resistor to ground, and the other to a probe and to your buffer's input. Then apply a voltage to one side of the unknown resistor and touch your buffered probe to the other side. Measure the voltage at the output of your buffer and solve the voltage divider to determine the unknown resistance; a sketch of that calculation follows.
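As a minimal sketch of that last step (the function name and the example values here are my own, purely illustrative):

```python
def unknown_resistance(v_supply: float, v_measured: float, r_known: float) -> float:
    """Solve the divider  v_measured = v_supply * r_known / (r_unknown + r_known)
    for the unknown top-side resistance (volts and ohms throughout)."""
    return r_known * (v_supply / v_measured - 1.0)

# Example: 10 V applied, buffer output reads 0.5 V across a 100 MOhm known resistor
r = unknown_resistance(v_supply=10.0, v_measured=0.5, r_known=100e6)
print(f"R_unknown ≈ {r / 1e9:.2f} GOhm")  # prints: R_unknown ≈ 1.90 GOhm
```

Pick the known resistor so the buffered reading lands comfortably mid-scale; if it's within a decade or so of the unknown, the result is least sensitive to measurement error.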
You may not need a buffer if your meter has extremely high input impedance when measuring voltage.