metalev: Generating false color images on the Nexus One using only grayscale pixels




EDIT: I want to say up front, don't go out and sell your phones, people! And don't change your mind about purchasing a Nexus One without looking at the screen yourself. The N1 display is beautiful and vivid with dark blacks and incredible photo reproduction, and much better than the iPhone screen for text reproduction. Only when compared with an extremely high-res LCD screen like on the Droid are the text fuzziness comments even justified -- and that's a high standard to hold the screen to.  My only comment was that both screens were specced at almost the same resolution so it would be nice if the N1's screen looked just as sharp.
I just published an article on Ars Technica showing how the effective display resolution on the Nexus One is not 480x800 as has been repeatedly claimed -- at least not the way screen resolutions are normally calculated, with one pixel equal to one RGB triplet.  If you are interested in the arguments about display resolution on the Nexus One, please see my other post on that topic.

In the Ars Technica article, I also showed a few nefarious test patterns, formed only of pure grayscale pixels, that leverage color fringing on the N1 display to appear as color images. The color disappears as soon as you scale the image in or out from 100% by a few percent, and the images appear gray on a standard LCD screen such as on the Motorola Droid or on your laptop.  Lots of examples are given below.  [Since there is a lot of public discussion going on about the article, I'm collecting some of the most insightful feedback here.]
IMPORTANT NOTE: The grayscale stippled images in the Ars article cannot induce color on the N1 display because Ars decided to scale them down to 75% and not provide links to the full-res versions -- you must download the full-resolution example images at the links below if you want to try this on your own Nexus One!

Left: Closeup (200% zoom) of a pure grayscale stipple image that appears colored on a Nexus One's PenTile display when displayed at 100%.  Right: Perceived color.
Since a lot of people probably came here for the download links, I'll give those up front. Read on below for more details about what is going on.
DOWNLOAD LINKS: (N.B. here's a short-link to this blog post, so you can easily type it into the Nexus One browser to get to these links:  )
  • Tap to download in N1 browser, then view in Gallery:
    Test patterns:
    | | | |
    Example false-color stippled images:
    | | | | | | | | | | | | | | | |
  • You can also download all images as a single zipfile if you want to unzip them all in a single step to your Nexus One's SD card.
  • Source code for the algorithm that generated the images is also available under the GPLv2 license; please drop me a note if you do anything interesting with it.  The algorithm is described below.

Note: the color fringing shown here is specific to the configuration of the Nexus One display, and may not work the same way or at all on other PenTile AMOLED configurations, as described in my other post.

Downloading and Viewing the Example Images on the Nexus One

Download and view the individual sample images above, or download all images using the single zipfile link, unzip them to your N1 SD card, and open the images in the Gallery application.  Then double-tap or pinch-zoom in and out of 100% to see the colors magically appear at 100% and disappear at other scale values. The grayscale stipple images must be displayed at exactly 100% (1-to-1) zoom on the Nexus One screen for the color artifacts to be observed.  In the Android browser, you may not be able to get the colors to appear: the images below are shown at 50% unless you click on them, and even when you do, the browser makes it particularly hard to view images at exactly 100%. In each of the examples shown here, the grayscale stipple image is shown on the left-hand side at about 50% zoom, and the way it appears on the N1 screen is shown on the right-hand side for reference.  (Click to view full-sized versions in a desktop browser.)

Example Image

Stipple image (left) and how it appears on the N1 screen (right).  You can force the PenTile display to show color fringes in pretty much every color of the rainbow... albeit sometimes a bit washed out. How can Nouvoyance claim this display is exactly equivalent to a standard RGB-striped LCD panel in its color resolution?
The image on the left would show as grayscale on a standard LCD panel, for example the screen on the Motorola Droid (and probably the screen you are viewing this on right now), but in full rainbow color on the Nexus One display when viewed at 100%.  The Droid's screen has almost exactly the same resolution as is claimed for the Nexus One (480x854 on the Droid vs. 480x800 on the N1).  If the N1 screen really were the claimed resolution, the image would show as grayscale on the N1 just as it does on the Droid.  Clearly the N1's screen is not capable of the same physical resolution as the Droid's, or there would be no color artifacts.

How it works

I created the following two test images where a 3x3 stipple image is stretched to 4x4, with the phase (stipple pattern offset) continuously varying with angle about the center and the intensity of the stipple pattern varying with radius:

Test images
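As a rough illustration of how such a radial test pattern can be constructed, here is a minimal sketch in Python/NumPy. The exact stripe geometry, period, and contrast mapping here are assumptions for illustration, not the original code:

```python
import numpy as np

def radial_test_pattern(width=480, height=800, period=4.0):
    # Phase of the stipple varies with angle about the center;
    # stipple contrast varies with radius.  Grayscale uint8 output.
    ys, xs = np.mgrid[0:height, 0:width].astype(float)
    cx, cy = width / 2.0, height / 2.0
    dx, dy = xs - cx, ys - cy
    angle = np.arctan2(dy, dx)                        # -pi .. pi
    radius = np.hypot(dx, dy)
    phase = (angle + np.pi) / (2.0 * np.pi) * period  # offset in pixels
    contrast = radius / radius.max()                  # 0 at center, 1 at edge
    # diagonal stripe: dark where (x + y + phase) falls in the first
    # pixel of each period (a 1-pixel-wide line on a light background)
    diag = np.mod(xs + ys + phase, period)
    stipple = (diag < 1.0).astype(float)
    gray = 255.0 * (1.0 - contrast * stipple)
    return gray.astype(np.uint8)

img = radial_test_pattern()
```

Sweeping phase with angle means every possible stipple offset (and therefore every color the fringing can produce) appears somewhere around the circle, which is what makes this useful as a calibration target.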
I then took a photo of the N1 screen (slightly blurred to remove moiré patterns), manually corrected the color curves so that the image on my desktop LCD screen matched what I saw on my N1 screen, and de-warped it back to 480x800, producing the following two reference images.

Reference images (photos of the test images viewed at 100% on the N1 screen)
Then, given an input image (e.g. the Mona Lisa), my algorithm finds the pixel in one or the other reference image that most closely matches the color of each pixel in the input image, and reads off the phase and intensity.  These phase and intensity values are used to output a single stipple pixel in the output image.  The algorithm also smooths the resulting image by performing local averaging of phase values before outputting the final stipple pattern.
The reference images therefore give us a palette to work with.  The resulting colors are a bit washed out and dull (they each cover less than 1% of the total 16-bit colorspace), but as the image of the rainbow above demonstrates, every color can be generated this way, albeit over a limited saturation and/or lightness range. The reference image step using manual color adjustment is a real hack. :-)  The color mapping could be greatly improved using real color calibration hardware.  This would enable most color gradients shown in the test images here to appear completely smooth.
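The nearest-color lookup step can be sketched as follows. Here `ref_colors` and `ref_params` are hypothetical names for the palette sampled from the reference photos, and the brute-force squared-RGB distance is an assumption (a perceptual space like Lab would likely match the eye better):

```python
import numpy as np

def nearest_palette_entry(target_rgb, ref_colors, ref_params):
    # Brute-force nearest color in RGB space; returns the (phase,
    # intensity) that produced the closest palette color.
    t = np.asarray(target_rgb, dtype=float)
    dist = np.sum((ref_colors - t) ** 2, axis=1)
    i = int(np.argmin(dist))
    phase, intensity = ref_params[i]
    return phase, intensity

# Toy palette standing in for colors sampled from the reference photos,
# each paired with the (phase, intensity) that generated it.
colors = np.array([[200.0, 40.0, 40.0],    # reddish
                   [40.0, 200.0, 40.0],    # greenish
                   [40.0, 40.0, 200.0]])   # bluish
params = np.array([[0.0, 0.5],
                   [1.0, 0.7],
                   [2.0, 0.9]])

phase, intensity = nearest_palette_entry([50, 180, 60], colors, params)
# the greenish entry is the closest match, so phase=1.0, intensity=0.7
```

In the real algorithm the palette would contain thousands of entries, one per pixel of the de-warped reference photos.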

Note: The reference images above are real photos, but all other images in this article that show how a stipple image is perceived on an N1 screen (e.g. the righthand of the two rainbow images above) are actually a recoloring of the input image to use the closest color in the reference image palette for each pixel.

Different Ways of Stippling

Once those reference images have been obtained, for every input image, my algorithm outputs not only the stipple image that best reconstructs the color display using only grayscale pixels, but also outputs an approximation for how the image will look to a human observer on the N1 screen.  For example, given the following input image,

the algorithm outputs the following grayscale stipple image and an approximation of how the image will be perceived.  (The perceived colors are less saturated than the original.)


The above stippled version of the Mona Lisa image uses a pattern of a 1-pixel-wide diagonal black line set on a 3x3 white background (the first of the two test images from which the reference images are generated), stretched to a ratio of 4/3, and then offset appropriately at each pixel position to generate the correct color at that pixel location.  The following image inverts the colors of the stipple pattern, i.e. uses a 1-pixel-wide diagonal white line set on a 3x3 black background (the second of the two test images, which generates a different palette of observable colors in its corresponding reference image).  The colors in this second rendering of the Mona Lisa really jump out on the N1 screen, more so than the right-hand image would indicate.
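A minimal sketch of how such a diagonal-line stipple could be rendered from a per-pixel phase offset follows. The 4/3 stretch formula and the pattern geometry are my guesses for illustration, not the published source:

```python
import numpy as np

def stipple_pixel(x, y, phase, period=4.0, invert=False):
    # One pixel of a 1-pixel-wide diagonal line pattern, with the 3-pixel
    # base pattern stretched to a period of 4 and shifted by `phase`.
    # The exact stretch formula here is an assumption for illustration.
    pos = (x + y * (period / 3.0) + phase) % period
    on_line = pos < 1.0
    if invert:
        return 255 if on_line else 0   # white line on black
    return 0 if on_line else 255       # black line on white

def stipple_image(phase_map, invert=False):
    # Render a full grayscale stipple image from a per-pixel phase map.
    h, w = phase_map.shape
    out = np.empty((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            out[y, x] = stipple_pixel(x, y, phase_map[y, x], invert=invert)
    return out

flat = stipple_image(np.zeros((8, 8)))  # uniform phase: plain diagonal stripes
```

Feeding in a phase map derived from the palette lookup, instead of a constant, is what shapes the stripes into a color image on the PenTile screen.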


Lots more examples

Again, the stippled image is on the left (click to view full-size), and an approximation of how the image will look on the screen is on the right.

My algorithm takes a very rough cut at finding the closest available color; it could be substantially improved, probably to the point of making the above sunset image appear almost completely smooth.  I doubt I'll actually do that sort of fine-tuning: this algorithm was developed just for demonstration purposes, and it's good enough for that as it stands.  Here's another image that shows a color gradient that could be made much smoother.
Starry Night (stippled version) looks great on the PenTile display!  The blue and yellow colors in the sky look electric on the N1 screen, and the dark greens are really rich and dark, because in this image I use both of the two different stipple patterns used above (for the two different Mona Lisa images), and switch between them when it reduces the color error for a given pixel.
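Per-pixel switching between the two stipple patterns can be sketched like this: compute the best achievable color error under each palette and keep whichever pattern reproduces the pixel better. The palette contents and names below are made up for the example:

```python
import numpy as np

def pick_pattern(target_rgb, palette_a, palette_b):
    # Compare the best achievable squared RGB error under each palette
    # and keep whichever stipple pattern reproduces this pixel better.
    t = np.asarray(target_rgb, dtype=float)
    err_a = np.min(np.sum((palette_a - t) ** 2, axis=1))
    err_b = np.min(np.sum((palette_b - t) ** 2, axis=1))
    return 'A' if err_a <= err_b else 'B'

# Made-up palettes: A = black-on-white colors, B = white-on-black colors.
pal_a = np.array([[220.0, 220.0, 180.0]])   # washed-out warm tone
pal_b = np.array([[30.0, 40.0, 120.0]])     # deep blue

choice = pick_pattern([20, 30, 110], pal_a, pal_b)  # deep blue wins: 'B'
```

Because each pattern yields a different palette of observable colors, switching between them per pixel roughly doubles the usable color gamut.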
Not only the density of the stipple pattern but the overall average intensity of pixels in the stipple pattern can be varied to give colors with a range of different luminance values.  Note that changing the average luminance of a stipple pattern can change the apparent chrominance (apparent color), so the color selection algorithm has to take this into account.
Another image showing all the colors of the rainbow in one scene by exploiting color fringe artifacts of the PenTile matrix display.
An image of Yellowstone using the same white-on-black stipple pattern used for the first Mona Lisa image
The same image using the black-on-white stipple image used for the second Mona Lisa image (the colors are actually more significantly different on the N1 screen between these two examples than they appear here)
The color gradations in the sky could be smoother here again; it would just take a bit more time polishing the mapping from stipple phase offset to perceived color.
The first of these two examples uses the "white-on-black" stipple pattern, the second example switches dynamically from white-on-black to black-on-white, as with the Starry Night image above (again the difference in color depth can only be seen on the N1 screen).
There are lots of different ways to dither the same image, all to similar effect.



I can somehow code my way out of a brown paper bag, but this is way over my head. Excellent work. Cheers.


There is the human eye perception factor. I mean, your algorithm to make the pics gray might just "hide" the colors so the human eye cannot see them, but when you transfer them to the N1, the color factor gets amplified (due to the AMOLED and other factors), so it appears to the eye as colored.
I'm not into image processing though, just my 2 cents.
Nice article nevertheless.


Arkan Hadi,
While this may be true, he wouldn't be able to photograph the phenomenon, and it would happen on more displays than just this one.


You have the determination to hack! I adore you.


What about full blue or red images? Would the resolution be half of the specification? Is it perceivable?


Excellent job, mortal! You have embraced the true meaning of techno-worship.


And one more thing: the pixel count is 480x800, that is not something you can change, it's in the hardware. However, each pixel is subdivided into 2 sub-pixels, instead of 3 sub-pixels as in traditional RGB rendering, so it's just a different way of rendering. If you have 2 sub-pixels in each pixel, that does NOT mean that you have fewer overall pixels; you have the same number of pixels, but each pixel has 2 sub-pixels, not 3.


I reckon the Samsung Galaxy S is probably going to suffer from this as well, no?


Fabio: good question, not sure; it would be interesting to view pure red or blue surfaces and see if they look "holey"...

Arkan: obviously -- that's the whole point of the post.

Mori: No idea :/ However almost all high-end Android devices in the future will have AMOLED screens, and for now almost all AMOLED screens will be PenTile, so it's likely.
