Programming: Grayscale images

On monochrome LCD displays, shades of gray can be obtained by turning pixels on and off according to some pattern; different patterns produce different gray levels.

Patterns are obtained from bitplanes and the order in which they are displayed. A bitplane is a monochrome bitmap. Using two bitplanes lets us assign two bits to each pixel: one on the first plane, one on the second. Two bits give us four shades of gray: 00 white, 01 light gray, 10 dark gray, 11 black.

We now place these planes on the time axis. In what order? Plane1, Plane2, Plane1, Plane2, and so on? This wouldn't work: each plane would stay "on" half the time, so '10' and '01' would produce the same level, and we would end up with only three levels.

What we need is to give different weights to the planes: in each period, the darker plane must be shown longer.

P2, P1, P2, P1, P2. This sounds good. If the period is T, the first plane is shown for (2/5)*T and the second plane for (3/5)*T, which makes the '10' and '01' combinations different. Notice that this is not a pattern by itself; we get a pattern by assigning a combination of bits to this scheme. If the sequence is repeated fast enough, our eyes are tricked: they perform what is called temporal integration.

Plane2 (3/5)   Plane1 (2/5)   Result
0              0              white       0
0              1              light gray  2/5
1              0              dark gray   3/5
1              1              black       5/5

It is essential that the plane sequences are interleaved uniformly over the period.

Flicker. If the sequence is not refreshed fast enough, the image is likely to appear flashing. More planes mean more bytes to display, and hence more flicker; this may not be the only cause. But be romantic: a bit of flicker adds that little something to a picture.

The code shows a 16-gray-level picture viewer.
