metalev

2010-03-25

Followup to the Ars article

I wanted to collect in one place a few of the most insightful comments I've received in response to the Ars article.

Lone Shepherd said:
I must say I'm continually surprised at the lower quality of comments on front page articles compared to the Ars forum. What's with all the fanboism, accusations of FUD, and people insulting the author because he dared investigate and criticize a design decision?
You guys need to take this article for what it is: a technical investigation on the display tech used on the Nexus One. If you're a tech enthusiast, you might find this sort of analysis interesting. I know I did. The article isn't about bashing the N1, or making the iPhone or Droid look better, or whatever. It's talking about tech, period. It's not about dissing your phone, or ignoring another phone, or making yet another phone look better in comparison, or any of that partisan, fanboy crap.
Get a grip, people.
lhopitalified said:
Your comment about rods and cones is not entirely correct. Density of rods and cones varies depending on angular distance from the fovea (i.e. cones are most dense at the fovea, rods are most dense about 20 degrees away), which complicates the matter of a single numeric comparison. Moreover, the whole "rods are for luminance" and "cones are for color" argument is rather simplistic. Cones are the only photoreceptors that get mapped into color channels, but that does not mean that they are not used for luminance. Unlike rods, fewer cones map into an individual retinal ganglion cell (the actual "pixels" of the eye). This convergence is what boosts the low-light sensitivity of rods.
When reading, it is clear that the cones are being used -- when you focus on one letter of text, it is very difficult to make out letters that are a short distance away unless the text is really big because the resolution of cones decreases dramatically. The opposite occurs at night when viewing dim stars -- if you view them directly, they disappear because the cones are not sensitive enough, but reappear when you shift your focus point away and let the rods do the work.
My main point is that the human visual system is a LOT more complex than most people give it credit for!
neatchee had a great comment for balance:
OHS NOES! Images crafted with the sole purpose of causing irregularities on the Nexus One's screen cause irregularities on the Nexus One's screen?! Whatever shall we do!
Seriously, this article is a whole bunch of sensationalism. Luke has a valid point in there somewhere but it's lost among the cries of "oh em gee it doesn't follow the exact specifications I expected and other screens have used!" NONE of these examples show a real world scenario. Stippled images? When the hell will I be viewing a stippled image on my N1 except in this article? Not to mention, if you change the zoom level by even 1%, the effect disappears. It's like my kid saying "it hurts when I twist my head like this, and put my arm over here, while I jump up and down."
In practice the N1 screen is vibrant, and text is about as readable as it comes. If you're specifically looking for fringing then I'm sure you can find it. But you'll have to hold the phone so it's touching your nose, and squint, and mutter something akin to "I think...I'm pretty sure I see it...yeah, I think I see it." Text is not "blurry" it's solid as compared to the Droid screen where I can actually discern individual pixels in a solid color area (it's like looking at a white wall and seeing each individual molecule). Here's a tip: when an image has a white background, I want it to look like a solid white background, not hundreds of white dots.
I should say that I asked Ars to remove some of the sensationalist language that they added in a final round of edits, and the editor rejected my changes.  I guess I'll self-publish from now on.

klassobaneiras said:
Smartphones are sold on their awesome specs, and who lives by the specsheet dies by the specsheet.
Plus, you can't blame people for wanting to feel they got what they paid for.
alexvroger said:
Why all the Nexus hate?
No matter what tech site I check (Wired, Engadget, Gizmodo) there's some bad press about the Nexus One.
I have a Nexus One and its screen is absolutely the best I've ever used. The iPhone compared to it is a joke (and I had a 3GS).
The Nexus One is easily the best smartphone on the market, so please stop all the hate

2010-03-24

More on the Resolution of the Nexus One

I just published an article on Ars Technica showing that the resolution of the Nexus One screen is not as high as claimed, and posted a number of example images on this blog showing how the weird color fringing on the Nexus One display can be leveraged to display pure grayscale images in full color.  I will expand on the resolution argument here in anticipation of discussion that will likely ensue.

All 480x800 physical pixels on the N1 screen have green subpixel elements on them, but half the physical pixels have red subpixel elements and no blue subpixel element, and the other half have blue subpixel elements but no red subpixel element.  This subpixel layout is known as PenTile, and was created by Nouvoyance (formerly Clairvoyante).

The subpixel layout on the Nexus One screen
To compare the resolution of the N1 display to a standard RGB LCD screen, in the Ars article I took a weighted sum to convert subpixel resolution in each channel to the total number of effective RGB pixels: (480*800/2)*2/3 + (480*800)*1/3 = 256,000, exactly two-thirds of the claimed total number of pixels (480*800 = 384,000).  This is equivalent to a screen with edge dimensions sqrt(256/384) ≈ 82% of the claimed lengths: (480*82%)*(800*82%) = 392*653 ≈ 256k.  The Nexus One's effective resolution is 392x653 using this method of calculation, not 480x800.  Any effective resolution argument is complicated, however, so see the Ars article for all the gory details.
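
For concreteness, here is the same weighted-sum arithmetic restated as a small Python sanity check (this is just my own restatement of the calculation above, not any vendor's methodology):

    # Weighted-sum effective resolution of the Nexus One's PenTile screen.
    W, H = 480, 800
    green = W * H            # every physical pixel has a green subpixel
    red = blue = W * H // 2  # red and blue each appear on half the pixels

    # Note: an RGB-striped LCD with the same pixel count would have 3*W*H
    # subpixels in total; the PenTile panel only has 2*W*H.
    effective = (red + green + blue) / 3  # effective RGB triplets
    scale = (effective / (W * H)) ** 0.5  # edge scaling factor
    print(effective)                      # 256000.0, two-thirds of 384000
    print(round(W * scale), round(H * scale))  # 392 653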

Nouvoyance's Claims

PenTile's creator, Nouvoyance (formerly Clairvoyante), claims that a PenTile display's resolution is exactly comparable to that of an LCD display with the same number of physical pixels.  This claim cannot possibly be correct, given that there are fewer total subpixel elements on the PenTile screen than on an RGB-striped LCD screen with the same number of physical pixels.

Nouvoyance's claims are based on grille test patterns from the VESA FPDM Standard, Section 302-2.  These are alternating black and white lines in either horizontal or vertical configurations; for a screen manufacturer to claim a certain resolution, they have to determine the maximum density of black/white lines that still exceeds 50% contrast modulation.  Unfortunately the PenTile layout "cheats", in that this methodology is insufficient to test the display's true resolution.
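
To make the grille methodology concrete, here is a sketch of the kind of test stimulus involved: alternating one-pixel black and white lines.  (This assumes the Pillow imaging library is available; the VESA standard of course specifies the full measurement procedure, not just the stimulus.)

    # Generate VESA-style 1-on/1-off grille test patterns (illustrative only).
    from PIL import Image

    def grille(width, height, vertical=True):
        """Alternating 1px black/white lines, vertical or horizontal."""
        img = Image.new("L", (width, height))
        px = img.load()
        for y in range(height):
            for x in range(width):
                coord = x if vertical else y
                px[x, y] = 255 if coord % 2 == 0 else 0
        return img

    grille(480, 800, vertical=True).save("grille_vertical.png")
    grille(480, 800, vertical=False).save("grille_horizontal.png")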

The VESA method indeed supports the claim that the PenTile display resolution is 480 pixels horizontally and 800 pixels vertically, because the display can easily produce a pattern of alternating black and white lines in one axis at a time, using an alternating pattern of RG and BG pixels at full intensity to generate white.  However, PenTile cannot handle diagonal lines at full resolution, because the distribution of RG and BG subpixels is in diagonal bands.  This means that when intensity transitions are not long horizontal or vertical edges, color fringes are generated, predominantly in shades of green, because the density of green subpixel elements is twice as high.  The following test pattern demonstrates this.

 


Test pattern (left, click for full-sized version), and photo of the test pattern being displayed on a Nexus One screen.  Many of these stipple patterns appear a shade of green.

The spatial density of green subpixel elements is twice that of blue or red, so hard intensity transitions are most likely to be compensated for in the green channel, as seen when the test pattern is displayed on the N1 screen -- therefore luminance (intensity) can in some circumstances be converted to chrominance (color).  However because every physical pixel on a PenTile display is missing one color channel, the display has to compensate by dispersing red and blue color intensities to two or four physical pixels in a neighborhood to display any given RGB color at a point.  This produces color fringes in arbitrary colors at hard intensity transitions, as seen in my blog post that exploits these artifacts to display grayscale stippled images in full color on the Nexus One screen.
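
To illustrate the dispersal effect, here is a deliberately naive toy model of rendering an RGB image onto a PenTile-like grid.  It simply splits each missing red or blue value across the two horizontal neighbors that do carry that channel; the real display driver uses much more sophisticated adaptive filtering, and all names here are my own.

    # Toy model of red/blue dispersal on a PenTile-like grid (not Nouvoyance's
    # actual algorithm). img is an H x W x 3 float array in [0, 1].
    import numpy as np

    def naive_pentile(img):
        h, w, _ = img.shape
        out = np.zeros_like(img)
        out[..., 1] = img[..., 1]  # every physical pixel has a green element
        for y in range(h):
            for x in range(w):
                has_red = (x + y) % 2 == 0  # RG/BG pixels alternate in a
                for c in (0, 2):            # checkerboard (diagonal bands)
                    if (c == 0) == has_red:
                        out[y, x, c] += img[y, x, c]  # channel present here
                    else:
                        # channel absent: push this pixel's energy onto the
                        # horizontal neighbors, which do carry the channel
                        for nx in (x - 1, x + 1):
                            if 0 <= nx < w:
                                out[y, nx, c] += img[y, x, c] / 2
        return out  # values can exceed 1 near hard edges -- exactly the kind
                    # of local error that shows up as color fringing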

UPDATE: The CEO of Nouvoyance contacted me about this blog post, and has indicated that there are numerous register values that can be tweaked to change the behavior of the PenTile display driver hardware.  By "reducing the amplitude of the modulation, such that the locally adaptive filter detector is no longer triggered", apparently these color fringes can be eliminated. Therefore claims made in the Ars article or on this blog apply specifically to the Nexus One's specific usage of PenTile technology, and statements about color fringing may not necessarily apply to other PenTile configurations.  However:

  • I have to wonder what the tradeoff is of turning off the "locally adaptive filter detector" -- presumably they implemented it for a reason
  • I find it almost impossible to believe that color fringes can be 100% eliminated, given that the display relies heavily upon such a complex set of postprocessing rules (small local convolutions, subpixel positioning, color correction curves, etc.)
  • Though the claims of color fringing may apply only to the N1's specific PenTile register settings, I still stand by claims that the effective resolution is lower than claimed -- there just aren't enough subpixel elements for the PenTile display to have exactly the same effective spatial addressing ability as an LCD of the same resolution.


Effective pixel size

One thing I don't talk about in the article is the effective size of a single pixel on the screen.  Assuming we're talking about putting one white pixel on a black screen, the display can choose to fully illuminate one RG pixel and one BG pixel (both Gs need to be illuminated so the G intensity is equal to R and B), in either a 2x1 or a 1x2 pixel block.  Conceivably the display could choose a different arrangement of subpixels, e.g. GBG on one row and just R on the next row, affecting three physical pixels, but ultimately a total of two physical pixels' worth of screen area has to be illuminated to show one white pixel.  On average, for consistency between the axes, intensities have to be dispersed in both axes equally.  So on average the two pixels that must be addressed (1x2 or 2x1 or similar) are equivalent to a single effective square pixel of edge length sqrt(2) ≈ 1.4, i.e. covering a 1.4x1.4-pixel area.

A little more esoterically, if we compare this measure of effective pixel size with the previous calculations about effective resolution, it turns out that these conceptual effective pixels overlap by about 15% of their width when laid out on the effective pixel grid, because 1.4 * 82% ≈ 1.15.  This is basically a very rough measure of the maximum amount of color bleed between effective pixels, assuming no postprocessing.  Of course this is cleaned up in some measure by the various PenTile signal processing algorithms, but it is another way of looking at perceived display fuzziness.
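
Restated as code (the same numbers as in the two paragraphs above):

    # Effective pixel size, and overlap on the effective pixel grid.
    effective_edge = 2 ** 0.5                # two physical pixels of area ->
                                             # a square of edge sqrt(2) ~ 1.4
    grid_scale = (256_000 / 384_000) ** 0.5  # the ~82% edge scaling from the
                                             # effective-resolution section
    print(effective_edge * grid_scale)       # ~1.15: neighboring effective
                                             # pixels overlap by ~15%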


Arguments about subpixel positioning

As described in the Ars Technica article, I believe that taking a weighted average of the total number of R, G and B subpixels to produce a number of effective RGB triplets is as reasonable a measure as any other method.  However I mentioned in the article that signal processing (in particular subpixel addressing) has to be ignored for this number to make sense.  Signal processing on the N1 display significantly muddies the waters when it comes to determining effective display resolution because of the complexity of the signal processing algorithms involved.

That said, I don't think ignoring the subpixel positioning on PenTile displays is totally unreasonable, or that it changes the conclusion much.  There is no subpixel positioning at all in the vertical axis of the display, and horizontally there are only two possible subpixel alignments: pixel-aligned or horizontally offset by two-thirds of a pixel.  One way to think about subpixel alignment on a PenTile display is that it allows positioning within the 2x1 or 1x2 or 2x2 grid of pixels that must be addressed, e.g. a white pixel could be represented as GBGR, BGRG, GRGB or RGBG.  Thus any benefit subpixel positioning might be claimed to have is effectively halved, because the positioning operates within a two-pixel-wide window.

Summary

This is a messy area, and it could be analyzed in a dozen different ways.  My particular analysis may not be the best one, but I think it's at least a reasonable start.


This is all conceptual, of course -- stating reasonable numbers for a display that isn't laid out like a normal RGB-striped LCD is very hard to do.  This analysis is simply an effort to back the subjective impression that the display is fuzzy with somewhat justifiable numbers.


Debate among yourselves :-)

2010-03-13

Generating false color images on the Nexus One using only grayscale pixels

EDIT: I want to say up front, don't go out and sell your phones, people! And don't change your mind about purchasing a Nexus One without looking at the screen yourself. The N1 display is beautiful and vivid with dark blacks and incredible photo reproduction, and much better than the iPhone screen for text reproduction. Only when compared with an extremely high-res LCD screen like on the Droid are the text fuzziness comments even justified -- and that's a high standard to hold the screen to.  My only comment was that both screens were specced at almost the same resolution so it would be nice if the N1's screen looked just as sharp.
I just published an article on Ars Technica showing how the effective display resolution on the Nexus One is not 480x800 as repeatedly claimed -- at least not the way screen resolutions are normally calculated, with one pixel equal to one RGB triplet.  If you are interested in the arguments about display resolution on the Nexus One, please see my other post on that topic.

In the Ars Technica article, I also showed a few nefarious test patterns, formed only of pure grayscale pixels, that leverage color fringing on the N1 display to appear as color images. The color disappears as soon as you scale the image in or out from 100% by a few percent, and the images appear gray on a standard LCD screen such as on the Motorola Droid or on your laptop.  Lots of examples are given below.  [Since there is a lot of public discussion going on about the article, I'm collecting some of the most insightful feedback here.]
IMPORTANT NOTE: The grayscale stippled images in the Ars article cannot induce color on the N1 display, because Ars decided to scale them down to 75% and not provide links to the full-res versions -- you must download the full-resolution example images at the links below if you want to try this on your own Nexus One!

 
Left: Closeup (200% zoom) of a pure grayscale stipple image that appears colored on a Nexus One's PenTile display when displayed at 100%.  Right: Perceived color.
Since a lot of people probably came here for the download links, I'll give those up front. Read on below for more details about what is going on.
DOWNLOAD LINKS: (N.B. here's a short-link to this blog post, so you can easily type it into the Nexus One browser to get to these links:  )
  • Tap to download in N1 browser, then view in Gallery:
    Test patterns:
    | | | |
    Example false-color stippled images:
    | | | | | | | | | | | | | | | |
  • You can also download all images as a single zipfile if you want to unzip them all in a single step to your Nexus One's SD card.
  • The source code for the algorithm that generated the images is also available under the GPLv2 license; please drop me a note if you do anything interesting with it.  The algorithm is described below.

Note: the color fringing shown here is specific to the configuration of the Nexus One display, and may not work the same way or at all on other PenTile AMOLED configurations, as described in my other post.


Downloading and Viewing the Example Images on the Nexus One

Download and view the individual sample images above, or download all images using the single zipfile link, unzip them to your N1 SD card, and open the images in the Gallery application.  Then double-tap or pinch-zoom to zoom in and out of 100% to see the colors magically appear at 100% and disappear at other scale values.  The grayscale stipple images must be displayed at exactly 100% (1-to-1) zoom on the Nexus One screen for the color artifacts to be observed.  In the Android browser, you may not be able to get the colors to appear: firstly because the images below are shown at 50% unless you click on them, and secondly because, even when you do, the browser makes it particularly hard to view images at exactly 100%.  In each of the examples shown here, the grayscale stipple image is shown on the left-hand side at about 50% zoom, and the way it appears on the N1 screen is shown on the right-hand side for reference.  (Click to view full-sized versions in a desktop browser.)


Example Image

 
Stipple image (left) and how it appears on the N1 screen (right).  You can force the PenTile display to show color fringes in pretty much every color of the rainbow... albeit sometimes a bit washed out. How can Nouvoyance claim this display is exactly equivalent to a standard RGB-striped LCD panel in its color resolution?
This image on the left would show as grayscale on a standard LCD panel, for example the screen on the Motorola Droid (and probably the screen you are viewing this on right now), but in full rainbow color on the Nexus One display when viewed at 100%.  The Droid's screen has almost the exact same resolution as is claimed for the Nexus One (480x854 on the Droid vs. 480x800 on the N1).  If the N1 screen really were the resolution that is claimed, the image would show as a grayscale image on the N1 just like on the Droid.  Clearly the N1's screen is not capable of the same physical resolution as the Droid, or there would be no color artifacts.

How it works

I created the following two test images, in which a 3x3 stipple pattern is stretched to 4x4, with the phase (stipple pattern offset) varying continuously with angle about the center and the intensity of the stipple pattern varying with radius:

 
Test images
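I haven't reproduced the exact pattern generator here, but the following sketch (my own reconstruction in Python using Pillow, not the original GPLv2 source) captures the idea: a diagonal stipple of period 4 whose phase shifts with angle about the center and whose contrast grows with radius.

    # Rough reconstruction of a radial phase/intensity calibration pattern.
    import math
    from PIL import Image

    W, H = 480, 800
    img = Image.new("L", (W, H), 255)  # white background
    px = img.load()
    cx, cy = W / 2, H / 2
    max_r = math.hypot(cx, cy)

    for y in range(H):
        for x in range(W):
            angle = math.atan2(y - cy, x - cx)                  # -pi .. pi
            phase = int((angle + math.pi) / (2 * math.pi) * 4)  # 0 .. 4
            r = math.hypot(x - cx, y - cy) / max_r              # 0 at center
            # Diagonal line with period 4 (a 3x3 pattern stretched to 4x4),
            # offset by the angular phase; contrast increases with radius.
            if (x + y + phase) % 4 == 0:
                px[x, y] = int(255 * (1 - r))
    img.save("radial_stipple.png")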
I then took a photo of the N1 screen (slightly blurred to remove moiré patterns), manually corrected the color curves so that the image on my desktop LCD screen matched what I saw on my N1 screen, and de-warped back to 480x800, producing the following two reference images.

  
Reference images (photos of the test images viewed at 100% on the N1 screen)
Then, given an input image (e.g. the Mona Lisa), my algorithm finds the pixel in one or the other reference image that most closely matches the color of each pixel in the input image, and reads off phase and intensity.  These phase and intensity values are used to output a single stipple pixel in the output image.  The algorithm also smooths the resulting image by performing local averaging of phase values before outputting the final stipple pattern.
The reference images therefore give us a palette to work with.  The resulting colors are a bit washed out and dull (they each cover less than 1% of the total 16-bit colorspace), but as the image of the rainbow above demonstrates, every color can be generated this way, albeit over a limited saturation and/or lightness range. The reference image step using manual color adjustment is a real hack. :-)  The color mapping could be greatly improved using real color calibration hardware.  This would enable most color gradients shown in the test images here to appear completely smooth.
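The palette-lookup step might look something like the following sketch (my paraphrase in Python with NumPy, not the released source).  Each pixel of a reference image associates an observed color with the (phase, intensity) implied by that pixel's angle and radius about the center, so finding the nearest palette color amounts to a nearest-neighbor search over reference pixels:

    # Nearest-color palette lookup against a dewarped reference photo.
    import numpy as np

    def build_palette(reference_img):
        """reference_img: H x W x 3 array. Returns a flat color list plus the
        (y, x) position of each entry; phase and intensity can be read off
        from each position's angle and radius about the image center."""
        h, w, _ = reference_img.shape
        colors = reference_img.reshape(-1, 3).astype(float)
        coords = [(y, x) for y in range(h) for x in range(w)]
        return colors, coords

    def nearest_entry(colors, target_rgb):
        """Index of the palette color closest to target_rgb in RGB space.
        (A perceptual color space would give better matches.)"""
        d = np.sum((colors - np.asarray(target_rgb, float)) ** 2, axis=1)
        return int(np.argmin(d))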

Note: The reference images above are real photos, but all other images in this article that show how a stipple image is perceived on an N1 screen (e.g. the right-hand of the two rainbow images above) are actually a recoloring of the input image to use the closest color in the reference image palette for each pixel.

Different Ways of Stippling

Once those reference images have been obtained, then for every input image my algorithm outputs not only the stipple image that best reconstructs the color display using only grayscale pixels, but also an approximation of how the image will look to a human observer on the N1 screen.  For example, given the following input image,


the algorithm outputs the following grayscale stipple image and an approximation of how the image will be perceived.  (The perceived colors are less saturated than the original.)

  

The above stippled version of the Mona Lisa image uses a pattern of a 1-pixel-wide diagonal black line set on a 3x3 white background (the first of the two test images from which the reference images are generated), stretched to a ratio of 4/3, and then offset appropriately at each pixel position to generate the correct color at that pixel location.  The following image inverts the colors of the stipple pattern, i.e. uses a 1-pixel-wide diagonal white line set on a 3x3 black background (the second of the two test images, which generates a different palette of observable colors in its corresponding reference image).  The colors in this second rendering of the Mona Lisa really jump out on the N1 screen, more so than the right-hand image would indicate.
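
Concretely, the per-pixel rule for both stipple variants might look like this (a hedged sketch; the function name and exact period are my own, not taken from the released source):

    # Diagonal stipple of period 4 (a 3x3 pattern stretched by 4/3), shifted
    # by a per-pixel phase, with black-on-white and white-on-black variants.
    def stipple_value(x, y, phase, invert=False):
        on_line = (x + y + phase) % 4 == 0
        dark = on_line if not invert else not on_line
        return 0 if dark else 255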

 

Lots more examples

Again, the stippled image is on the left (click to view full-size), and an approximation of how the image will look on the screen is on the right.

 
My algorithm takes a very rough cut at finding the closest color available; it could be substantially improved, probably to the point of making the above sunset image appear almost completely smooth.  I doubt I'll actually do that sort of fine-tuning -- this algorithm was developed just for demonstration purposes, and it's good enough for that as it stands.  Here's another image that shows a color gradient that could be made much smoother.
 
Starry Night (stippled version) looks great on the PenTile display!  The blue and yellow colors in the sky look electric on the N1 screen, and the dark greens are really rich and dark, because in this image I use both of the two different stipple patterns used above (for the two different Mona Lisa images), and switch between them when it reduces the color error for a given pixel.
 
Not only the density of the stipple pattern but also the overall average intensity of pixels in the stipple pattern can be varied, giving colors with a range of different luminance values.  Note that changing the average luminance of a stipple pattern can change the apparent chrominance (apparent color), so the color selection algorithm has to take this into account.
 
Another image showing all the colors of the rainbow in one scene by exploiting color fringe artifacts of the PenTile matrix display.
 
An image of Yellowstone using the same black-on-white stipple pattern used for the first Mona Lisa image
 
The same image using the white-on-black stipple pattern used for the second Mona Lisa image (the colors are actually more significantly different on the N1 screen between these two examples than they appear here)
 
The color gradations in the sky could again be smoother here; it would just take a bit more time polishing the mapping from stipple phase offset to perceived color.
 
 
The first of these two examples uses the "white-on-black" stipple pattern, the second example switches dynamically from white-on-black to black-on-white, as with the Starry Night image above (again the difference in color depth can only be seen on the N1 screen).
 
 
 
There are lots of different ways to dither the same image, all to similar effect.

2010-03-08

First post

Welcome to my new blog, intended for high-quality entries about hacking on Android.  This blog replaces my old one at .  I'll keep the old blog up for a while.