What if our Sun were located in the middle of a globular cluster?
The night sky would certainly be spectacular from within a globular cluster! To estimate the total brightness of the stars, let's say there are 6000 stars visible to the naked eye [1].
Let's make the globular cluster a big one, like Omega Centauri, with ten million stars [2].
The roughest approximation is to say that those ten million stars have exactly the same average brightness and color distribution as the stars in our night sky, so your Omega Centauran sky would be simply 10^7/6000 ≈ 1667 times brighter than ours. [EDIT: In my initial analysis, I lumped skyglow together with direct starlight. Since a Centauran sky would presumably have the same background atmospheric glow, and the only difference is that there are more stars, the correct answer will be somewhat smaller than 1667. That doesn't affect the final answer of REALLY SPECTACULAR!, though. I haven't yet found a resource that gives the average sky brightness due only to direct starlight; the table at http://en.wikipedia.org/wiki/Sky_brightness, for instance, gives only how much of the sky's diffuse glow is due to scattered starlight.] Every factor of 100 in brightness corresponds to five magnitudes, so a little logarithm magic tells us that a factor of 1667 is about eight magnitudes.
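The "logarithm magic" above is just the standard magnitude formula, Δm = 2.5 log₁₀(brightness ratio). A quick sketch of the arithmetic (the star counts are the round numbers assumed in this answer, not measured values):

```python
import math

# Assumed round numbers from the estimate above:
n_cluster = 10_000_000   # stars in an Omega Centauri-sized cluster
n_earth_sky = 6000       # naked-eye stars visible from Earth

brightness_ratio = n_cluster / n_earth_sky  # ~1667

# Every factor of 100 in brightness is 5 magnitudes,
# so the magnitude difference is 2.5 * log10(ratio).
delta_m = 2.5 * math.log10(brightness_ratio)

print(f"ratio ~ {brightness_ratio:.0f}, i.e. {delta_m:.1f} magnitudes brighter")
```

Running this confirms the factor of ~1667 works out to just over eight magnitudes.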
A typical figure for a moonless, Earthly night sky [3] is 21 or 22 mag/arcsec². That gives us a sky brightness for Omega Centaurans of 13 or 14 mag/arcsec². Bright urban skies [4] are about 17 mag/arcsec², so we're talking much brighter than even New York City skies, by three to four magnitudes. That is roughly half the perceived difference in brightness between the brightest stars in the sky and the dimmest stars perceivable on a good, dark night.
Corrections to be made to our crude estimate:
Globular clusters are old, old, old. All the hot, bright, blue stars (except a few oddballs called blue stragglers...) are already dead, leaving only dimmer, redder stars than the ensemble Earthlings see.
Earth is in the middle of a relatively homogeneous region. Omega Centauri is much denser in the middle and sparser outside, so for an observer near the center, this correction would increase the true average brightness somewhat.
Humans can't see to the edge of the Milky Way (even in the direction perpendicular to its plane), so Earth's sky stars trail off into infinity and beyond, as far as we can tell. Omega Centaurans would have a relatively sharp cutoff at the edge of the cluster, with perhaps only a couple of non-cluster stars visible. I can't calculate in my head how that would change the overall sky brightness. It might not make a difference.
Footnotes:
[1] http://answers.google.com/answers/threadview/id/742414.html
[2] http://apod.nasa.gov/apod/ap090301.html
[3] http://www.ing.iac.es/Astronomy/observing/conditions/skybr/skybr.html
[4] http://mysite.verizon.net/vze55p46/id18.html
PS What is a mag/arcsec², you ask? It's a handy measurement of the brightness of extended objects (galaxies and nebulae, as opposed to stars) for astronomers, but really, really confusing for novices. If you cut out an average swatch of the image of some object that is 1 arcsec × 1 arcsec, then that swatch's total brightness, expressed in magnitudes, is the surface brightness of the object. But be very careful basing calculations on surface brightnesses, because they do not act like ordinary quantities of something per unit something, like velocity or current! That is a whole 'nother discussion, beyond the scope of this answer.
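One concrete illustration of why surface brightness trips people up: to get the total magnitude of a uniform patch, you subtract (not add) a term for its area, because brighter means a smaller magnitude. A minimal sketch, using the standard relation m = μ − 2.5 log₁₀(area):

```python
import math

def integrated_magnitude(mu, area_arcsec2):
    """Total magnitude of a uniform patch with surface brightness mu
    (mag/arcsec^2) covering area_arcsec2 square arcseconds.
    Magnitudes run backwards: more area -> brighter -> smaller m."""
    return mu - 2.5 * math.log10(area_arcsec2)

# A 10 x 10 arcsec patch of dark sky at 21.5 mag/arcsec^2:
m = integrated_magnitude(21.5, 100.0)
print(f"{m:.1f}")  # 21.5 - 2.5*log10(100) = 21.5 - 5 = 16.5
```

A hundred square arcseconds of "faint" 21.5 mag/arcsec² sky adds up to a respectable 16.5-magnitude total, which is exactly the kind of counterintuitive bookkeeping the warning above is about.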