This past weekend my family went to a religious convention with about 2,200 other people and sat in a fairly large auditorium as speakers discussed a variety of Bible-themed topics. Because the auditorium was quite vast, a pair of large (200-inch?) projection screens flanked the stage. From where we sat in the middle, it was easy enough to see the speaker “live,” but the screens offered a larger, more detailed image. The speaker stood in front of a relatively neutral background that, viewed “live,” was clearly some shade of gray.
After sitting there for a bit, my daughter leaned over to me and whispered, “Dad, why does the picture on the screen look blue when it is really gray?”
First, I was pretty impressed from a videophile standpoint that my daughter would (A) notice this difference; (B) be bothered enough by it to comment on it; and (C) just assume that I, the resident electronics expert in her life, would know the answer.
Even though she is only eight, I always try to answer her questions seriously and honestly, even if they are something like, “How much time would it take to eat every ice cream in the world?” So, I took a moment to ponder her question. I leaned over to answer her, but then stopped and thought about it a bit more. I finally leaned over and whispered back, “That’s actually a really complicated question with no simple answer.”
She shrugged her shoulders, unconcerned by the enormity of her video question, and went back to drawing.
But, the truth is, gray and the grayscale have always been somewhat contentious topics for videophiles. The grayscale, as you doubtless know from attending an Imaging Science Foundation or THX course or just from reading any TV review in the past dozen years, is every shade of gray from 0 to 255 in 8-bit video, or from 0 to 100 IRE. Whether it is totally black or brightest white, it’s all just shades of gray in the world of video.
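For readers who like to see the numbers, here is a small illustrative sketch (mine, not from any calibration standard document) of how 8-bit gray code values map to approximate IRE levels. It assumes full-range video, where 0 is black and 255 is white; broadcast “studio range” video instead places black at code 16 and white at code 235.

```python
def code_to_ire(code, full_range=True):
    """Map an 8-bit gray code value to an approximate IRE level.

    Full-range video: 0 = 0 IRE (black), 255 = 100 IRE (white).
    Studio range:    16 = 0 IRE (black), 235 = 100 IRE (white).
    """
    if full_range:
        return code / 255 * 100
    return (code - 16) / (235 - 16) * 100

# Black, middle gray, and peak white in full-range video:
for code in (0, 128, 255):
    print(f"code {code:3d} -> {code_to_ire(code):5.1f} IRE")
```

Every step in between is simply a lighter or darker neutral gray, which is exactly why a visible blue tint on a gray backdrop stands out.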
What I wanted to explain to Lauryn was that the image she was seeing on the video screen compared with the live image was affected by many things, and there was no one single or simple answer as to why the screen looked blue.
[Photo caption: The color differences were actually not this profound in real life, showing how the camera in my iPhone 6 further influenced the color of gray.]
For one, the room’s overhead fluorescent lighting doubtless played havoc with how we interpreted both the color of the actual backdrop and the image presented on the screen. This lighting has a fairly cool color temperature, which can certainly push things toward looking bluish. It also affected how the video camera captured the image.
Further, our eyes, in looking at the “live” image, were being color-biased by the golden drapes and paneling surrounding the stage. This is why, even though it might look luxurious or rich, you want to avoid putting bright or primary colors around your screen; they will definitely bias how you perceive the image on screen.
The video camera capturing the image can also play a large part in the disparity, with its quality, lens, and sensor all shaping what it records. Also potentially adding to the problem were the camera’s white balance settings. White balance is crucial in digital cameras and refers to “the process of removing unrealistic color casts, so that objects which appear white in person are rendered white” (Cambridge in Colour). Proper camera white balance takes into account the color temperature of light sources, like this room’s fluorescent lighting, something digital cameras can have great difficulty with. This alone could have given us a bluish gray.
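To make the white balance idea concrete, here is a hedged sketch of the classic “gray world” correction, one simple auto-white-balance technique (the article does not say what this camera actually used). It assumes the scene should average out to neutral gray and scales each channel so its mean matches the overall mean; a cool, bluish cast shows up as an elevated blue channel.

```python
def gray_world_balance(pixels):
    """Apply a gray-world white balance.

    pixels: list of (r, g, b) tuples on a 0-255 scale.
    Returns a new list with each channel scaled so its mean
    equals the overall mean (i.e., the average becomes neutral).
    """
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    target = sum(means) / 3  # the neutral gray the scene "should" average to
    gains = [target / m for m in means]
    return [tuple(min(255, ch * g) for ch, g in zip(p, gains)) for p in pixels]

# A gray backdrop captured with a bluish cast: blue reads high.
cast = [(120, 120, 150)] * 4
balanced = gray_world_balance(cast)
print(balanced[0])  # channels pulled back toward equal, i.e., neutral gray
```

When the camera’s white balance misjudges the fluorescent lighting, the gains come out wrong and a genuinely gray backdrop lands on screen as bluish gray, exactly what my daughter noticed.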
Another potential issue could have been somewhere in the video signal path from the camera to the projector. Was a cable poorly terminated? Did the matrix switch have some setting that affected image quality?
There is also the matter of the video projector. How accurately could it reproduce the grayscale? What picture setting was the projector in, and had it been calibrated? If it had been calibrated, how accurate was the calibration equipment used? Had the calibration gear itself been calibrated? And how good was the calibrator? Did they calibrate it in a fully dark room, or in the same lighting used for the presentation? And is it possible that the projector was no longer set to the calibrated settings, but rather switched to something like “Normal” or “Movie”?
Finally we get to the projection screen. What was the material? What was its gain? Further, was it introducing any color shift based on our off-center seating position?
You can see why such a seemingly simple question required me to hedge my answer.
In this instance, the grayscale accuracy and caliber of the video equipment were of little importance; the screens were there merely to allow lots of people to get a better look at what was happening on stage, and likely no one in attendance was coming to marvel over the video quality. And it isn’t often that we can immediately compare “real life” — or what the cinematographer shot or what the director intended — to what is on the screen. Of course, this is why Joel Silver has been preaching about the importance of proper calibration to D65 so you can get your home experience as close as possible to recreating the director’s vision.
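For the curious, D65 sits at roughly x = 0.3127, y = 0.3290 in CIE 1931 chromaticity coordinates, which corresponds to a correlated color temperature of about 6500 K. As an illustrative aside (not anything Silver's courses require you to compute), McCamy's well-known approximation recovers that number from the chromaticity coordinates:

```python
def mccamy_cct(x, y):
    """McCamy's approximation for correlated color temperature (kelvin)
    from CIE 1931 chromaticity coordinates (x, y)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449 * n**3 + 3525 * n**2 + 6823.3 * n + 5520.33

# D65, the standard video white point, lands near 6500 K:
print(round(mccamy_cct(0.3127, 0.3290)))
```

A display calibrated warmer or cooler than that white point will render every “gray” with a visible tint, which is the whole point of insisting on D65.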
There are certainly cases where a customer can compare real life experiences — “I know what the gold of the Fighting Irish helmets should look like,” “I’ve seen the grass at Augusta National,” “I know the color of the Blue Angels’ jets” — but unfortunately, even the best video display can be thrown off by something in the broadcast chain that comes before it.
And, as with the case of my eight-year-old, there is no telling who the harshest video critic in the room will be.
John Sciacca is principal of Custom Theater and Audio in Myrtle Beach, SC.