HDMI and I²C
The history of DDC in HDMI goes via DVI all the way back to VGA. It is implemented in such a way that you can simply hook up a standard I²C EEPROM chip on the monitor side, and those are dirt cheap (AT24C01 and compatibles).
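To make that concrete, here is a minimal sketch of what reading that EEPROM looks like from the host side on Linux, using the generic i2c-dev interface. The bus number (/dev/i2c-3) is an assumption and varies from system to system; 0x50 is the standard 7-bit DDC address, and the base EDID block is 128 bytes, which is exactly the capacity of an AT24C01.

```c
/* Minimal sketch: reading a monitor's 128-byte EDID block over DDC
 * (plain I2C) on Linux via the i2c-dev interface.
 * Assumption: the connector's DDC channel is exposed as /dev/i2c-3
 * (the bus number varies); the EEPROM answers at the standard
 * 7-bit DDC address 0x50.
 */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

int main(void)
{
    int fd = open("/dev/i2c-3", O_RDWR);      /* assumed bus number */
    if (fd < 0) { perror("open"); return 1; }

    if (ioctl(fd, I2C_SLAVE, 0x50) < 0) {     /* standard EDID EEPROM address */
        perror("ioctl");
        return 1;
    }

    unsigned char offset = 0x00;              /* start of the base EDID block */
    unsigned char edid[128];

    /* Write the byte offset, then read back 128 bytes -- exactly what a
     * plain AT24C01-style EEPROM expects. */
    if (write(fd, &offset, 1) != 1 || read(fd, edid, sizeof edid) != sizeof edid) {
        perror("edid read");
        return 1;
    }

    for (int i = 0; i < (int)sizeof edid; i++)
        printf("%02x%c", edid[i], (i % 16 == 15) ? '\n' : ' ');

    close(fd);
    return 0;
}
```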
The I²C signal should probably use higher-than-normal voltages to avoid picking up too much noise.
Nope. The +5 V supply tells you a different story: the DDC lines run at ordinary logic levels, pulled up to the connector's +5 V. What they might do instead is use a lower clock frequency on the bus. HDMI cables are usually well shielded, too.
So why would they choose I2C?
It was already there in DVI (with which HDMI is backward compatible), it works, and it is cheap.
I²C is very inexpensive and simple to implement, for a number of reasons. It is often used when just a few bytes need to be transferred. It is also a very structured interface, with a protocol that defines who should be talking at any given time. Because of its age, I²C is also well supported among IC manufacturers (hence why it's inexpensive and simple to implement). Due to the slow data rate, SNR is really not an issue: 3.3 V is a typical bus voltage, and the lines can be heavily low-pass filtered if necessary.
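To back up the "simple to implement" claim, a complete bit-banged I²C master write fits in a few dozen lines. This is only an illustrative sketch: the GPIO helpers (sda_set, scl_set, sda_read, delay_us) are hypothetical stand-ins for whatever the target platform provides, and the stub bodies here merely simulate a slave that always acknowledges so the file compiles and runs as a demo.

```c
/* Bit-banged I2C master write, as a sketch of how little logic the bus needs.
 * Lines are open-drain, so "high" means releasing the line, not driving it. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

static void sda_set(bool high) { (void)high; }   /* stub: drive/release SDA   */
static void scl_set(bool high) { (void)high; }   /* stub: drive/release SCL   */
static bool sda_read(void)     { return false; } /* stub: slave pulls low=ACK */
static void delay_us(int us)   { (void)us; }     /* stub: sets the clock rate */

static void i2c_start(void)
{
    sda_set(true);  scl_set(true);  delay_us(5);
    sda_set(false); delay_us(5);    /* SDA falls while SCL is high: START */
    scl_set(false);
}

static void i2c_stop(void)
{
    sda_set(false); scl_set(true);  delay_us(5);
    sda_set(true);  delay_us(5);    /* SDA rises while SCL is high: STOP  */
}

/* Clock out one byte MSB first, then sample the slave's ACK bit. */
static bool i2c_write_byte(uint8_t byte)
{
    for (int i = 7; i >= 0; i--) {
        sda_set((byte >> i) & 1);
        delay_us(5); scl_set(true); delay_us(5); scl_set(false);
    }
    sda_set(true);                  /* release SDA for the ACK slot        */
    delay_us(5); scl_set(true); delay_us(5);
    bool ack = !sda_read();         /* slave pulls SDA low to acknowledge  */
    scl_set(false);
    return ack;
}

/* Write one register byte to a 7-bit slave address. */
static bool i2c_write_reg(uint8_t addr, uint8_t reg, uint8_t value)
{
    i2c_start();
    bool ok = i2c_write_byte((uint8_t)(addr << 1))  /* address + write bit */
           && i2c_write_byte(reg)
           && i2c_write_byte(value);
    i2c_stop();
    return ok;
}

int main(void)
{
    /* Hypothetical example: write 0x01 to register 0x00 of a device at 0x50. */
    printf("write %s\n", i2c_write_reg(0x50, 0x00, 0x01) ? "acked" : "nacked");
    return 0;
}
```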
I think it's important to point out HOW the I²C would be used in a monitor. Not only does I²C allow communication with multiple monitors, but also with multiple devices (e.g. multiple ICs) within each monitor, although there is likely a separate I²C bus for each HDMI cable in most host systems. The I²C interface is used to establish the connection with the host: the host queries the monitor to find out things like its resolution, frame rate, manufacturer, name, and probably other things. I²C would not be fast enough to transfer image and sound data; that information goes through the TMDS wires, which are high-speed differential pairs designed for low noise.
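As a sketch of what that query actually yields: the fields the host cares about (manufacturer, name, preferred resolution) sit at fixed offsets in the 128-byte base EDID block it reads over the DDC I²C channel. The parser below assumes the standard EDID 1.x layout and takes an EDID dump on stdin, for example the output of the read sketch above, or the EDID that many Linux systems expose under /sys/class/drm.

```c
/* Pull a few host-relevant fields out of a 128-byte EDID base block. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint8_t e[128];
    if (fread(e, 1, sizeof e, stdin) != sizeof e) {
        fprintf(stderr, "need a 128-byte EDID block on stdin\n");
        return 1;
    }

    /* Manufacturer ID: bytes 8-9 pack three 5-bit letters ('A' = 1). */
    uint16_t m = (uint16_t)((e[8] << 8) | e[9]);
    printf("Manufacturer: %c%c%c\n",
           'A' - 1 + ((m >> 10) & 0x1f),
           'A' - 1 + ((m >> 5) & 0x1f),
           'A' - 1 + (m & 0x1f));

    /* Preferred mode: first detailed timing descriptor starts at byte 54. */
    const uint8_t *d = &e[54];
    int hactive = d[2] | ((d[4] & 0xf0) << 4);
    int vactive = d[5] | ((d[7] & 0xf0) << 4);
    printf("Preferred resolution: %dx%d\n", hactive, vactive);

    /* Monitor name: a display descriptor tagged 0xFC, ASCII ending in 0x0A. */
    for (int i = 54; i <= 108; i += 18) {
        if (e[i] == 0 && e[i + 1] == 0 && e[i + 3] == 0xfc) {
            printf("Name: ");
            for (int j = i + 5; j < i + 18 && e[j] != 0x0a; j++)
                putchar(e[j]);
            putchar('\n');
        }
    }
    return 0;
}
```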