How is brightness calculated?
One of the key specifications customers use to determine which display is right for their application is brightness. Brightness is measured in candelas per square meter (cd/m²), also called nits. Nits are to displays what horsepower is to automobile engines: a shorthand for output. They are simply a measure of how many candles of brightness a display is equivalent to. A 300-nit desktop monitor has the light power of 300 candles lighting a square meter of space. A 1,500-nit outdoor display has the power of 1,500 candles (which seems like a lot, until you make the unfair comparison of a display to the sun, which sustains life on Earth and is significantly brighter than that).
As an aside, a nit is not to be confused with an ANSI lumen (a common specification in front-projection solutions). A lumen is a measure of luminous flux, the total light energy a source puts out, and for projectors it is measured as light reflected off a screen. In short, you use lumens to measure reflected light, while nits measure direct light, which is why projectors are specified in lumens and displays (even rear-projection cubes, RPCs) are specified in nits. Projection installations are sometimes measured in foot-lamberts, a measure of brightness equal to one lumen per square foot of screen; the brighter the room, the more foot-lamberts the screen needs to hold perceived brightness. A nit is equal to approximately 0.292 foot-lamberts. And with that you are officially inducted into our club. Okay, back to display brightness.
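As a quick sanity check on that conversion factor, here is a small Python sketch. The 3.426 constant (cd/m² per foot-lambert) is the standard one; the 14 fL cinema figure is a commonly cited target, used here only as an example input:

```python
# Convert between nits (cd/m^2) and foot-lamberts.
# 1 foot-lambert ≈ 3.426 cd/m^2, so 1 nit ≈ 0.292 fL.
NITS_PER_FOOT_LAMBERT = 3.426

def nits_to_foot_lamberts(nits):
    return nits / NITS_PER_FOOT_LAMBERT

def foot_lamberts_to_nits(fl):
    return fl * NITS_PER_FOOT_LAMBERT

# A 300-nit desktop monitor, expressed in foot-lamberts:
print(f"{nits_to_foot_lamberts(300):.1f} fL")   # ≈ 87.6 fL
# A commonly cited ~14 fL cinema screen target, in nits:
print(f"{foot_lamberts_to_nits(14):.1f} nits")  # ≈ 48.0 nits
```

Note the two units sit on opposite sides of the same measurement: the foot-lambert number describes the light a viewer sees coming off a square foot of screen, which is exactly why it shows up in projection work.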
Now, in order to measure the brightness of a display, you need something on the screen. In most cases, display brightness is measured with the screen showing a full-white image, edge to edge, top to bottom. For displays like LCDs, which require a light source behind the liquid-crystal layer, this full-white measurement tells you the maximum light output the display is capable of, even though it is unlikely a customer would ever show a full-white image, which is the brightest the display will measure.
For emissive displays (like plasma, OLED, or direct-view LED), the calculation is a little more difficult. Because each pixel is addressed directly (and turned on or off depending on the content directed to that pixel), the brightness of each pixel will vary, since the power available to drive the pixels is shared among all of the lit pixels on the display. For instance, a full-white field on an emissive display will measure lower than a small white square in the middle of an otherwise dark screen. The first is the standard brightness measurement (call it "typical") and the second is a peak brightness measurement.
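One way to picture that power sharing is a toy model: assume the panel has a fixed power budget, so the measured luminance drops as the lit fraction of the screen (the "window size") grows. The peak and full-field numbers below are invented for illustration, not any vendor's spec:

```python
# Toy model of an emissive display with a shared power budget.
# peak_nits: luminance of a tiny white window (only a few pixels lit).
# full_field_nits: luminance with a 100% white screen.
# Hypothetical numbers; real panels publish their own curves.
def window_luminance(window_fraction, peak_nits=1500, full_field_nits=400):
    if window_fraction <= 0:
        return 0.0
    # The total light budget equals the full-field output; a small
    # window can run its pixels at peak, but a large window must
    # spread that same budget across every lit pixel.
    budget_limited = full_field_nits / window_fraction
    return min(peak_nits, budget_limited)

for frac in (0.02, 0.10, 0.40, 1.0):
    print(f"{frac:4.0%} window: {window_luminance(frac):.0f} nits")
```

Run it and the peak measurement (small window) comes out well above the typical measurement (full field), which is exactly the gap described above.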
This makes the question "how bright is this display?" a bit of a trick question. It depends on how the measurement is taken, of course, but also on what kind of display technology you are evaluating and what content is being shown on the screen when the measurement is taken.