
Why Retina Isn’t Enough (2012)
Apple’s new MacBook Pro follows the fine tradition of the iPhone 4 and third-gen iPad in that it has a super high-resolution Retina display: a 2880 x 1800 panel with an amazing 220 pixels packed in per inch.
It’s an incredible display. In fact, it’s such an incredible display that it actually has about one million, seven hundred thousand pixels more than it needs to satisfy Apple’s definition of Retina, leading some to claim that those pixels are all going to waste.
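A quick back-of-envelope sketch shows where a figure like that could come from. The 15.4-inch, 16:10 panel dimensions are assumed here; everything else follows from the 2880 x 1800 grid and the claimed 1.7-million-pixel surplus:

```python
import math

width_px, height_px = 2880, 1800
total_px = width_px * height_px                  # 5,184,000 pixels

diagonal_in = 15.4                               # assumed 15.4-inch, 16:10 panel
aspect_w, aspect_h = 16, 10
diag_units = math.hypot(aspect_w, aspect_h)
width_in = diagonal_in * aspect_w / diag_units   # ~13.1 inches
height_in = diagonal_in * aspect_h / diag_units  # ~8.2 inches

# If about 1.7 million of those pixels are "extra", only about 3.5 million
# were needed, which works out to a density of roughly 180 pixels per inch.
needed_px = total_px - 1_700_000
needed_ppi = math.sqrt(needed_px / (width_in * height_in))

print(f"total pixels: {total_px:,}")              # total pixels: 5,184,000
print(f"implied density: ~{needed_ppi:.0f} ppi")  # implied density: ~181 ppi
```

That implied figure of roughly 180 pixels per inch lines up with the viewing-distance math we'll get to below.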
Nothing could be further from the truth.
Apple’s new MacBook Pros have absolutely great displays, but they need every single pixel they have, because the truth of the matter is that Apple’s got a long way to go before it catches its display tech up to the incredible power of human vision. And that’s a good thing, because it means we’ve got a lot to look forward to.
Editor’s Note: Throughout this article, we will be talking about two types of Retina displays: Apple’s Retina displays, and a theoretical Retina display that would have much greater pixel density. For the purposes of this article, the latter type of display will be called a True Retina display; all other mentions of Retina displays can be taken to refer to Apple’s technology.
What Apple Means By Retina

At this point, everyone knows that Steve Jobs, the master showman, fudged facts a little bit when he introduced the iPhone 4’s Retina display back in 2010.
What Steve Jobs said at the time was this:
It turns out that there is a magic number right around 300 pixels per inch that, when you hold something around 10 or 12 inches away from your eyes, is the limit of the human retina['s ability] to differentiate the pixels.
Apple called such a display a Retina display, and when the new iPad was announced in March as having one, Tim Cook clarified Apple’s definition of Retina a bit:
You may recall that with an iPhone held at a normal distance your retina can’t discern individual pixels. When the iPad is held at a normal distance [15 inches], it’s the same result.
In other words, because the average person holds a tablet further from their face than their phone, the new iPad didn’t need to have pixels that were as small as the iPhone’s, meaning the new iPad could get away with a pixel density of only 264 pixels per inch. Likewise, people sit even further back from their MacBook Pros, meaning their Retina displays only need 220 pixels per inch.
So far, so good. There’s only one problem: Steve Jobs said that the human eye, viewing a display from 12 inches away, can’t discern individual pixels if the density is over 300 pixels per inch. Except that this “magic” number is wrong. The real number is closer to nine hundred pixels per inch. Apple’s Retina displays are only about 33% of the way there.
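To see where numbers like these come from, here's a minimal sketch of the standard angular-resolution math that Jobs's figure appears to rest on. The viewing distances and the 0.32-arcminute acuity used to reproduce the roughly-900-ppi figure are illustrative assumptions, not values Apple has published:

```python
import math

ARCMIN = math.pi / (180 * 60)   # one arcminute, in radians

def required_ppi(distance_in, acuity_arcmin=1.0):
    """Pixel density at which a single pixel subtends the given visual angle
    when viewed from the given distance (in inches)."""
    return 1.0 / (distance_in * math.tan(acuity_arcmin * ARCMIN))

# Jobs's 20/20-based "magic number": one arcminute of acuity at phone distance.
print(round(required_ppi(12)))   # ~286 ppi at 12 inches (rounded up to "300")
print(round(required_ppi(15)))   # ~229 ppi at a tablet's 15 inches
print(round(required_ppi(20)))   # ~172 ppi at a laptop-like 20 inches

# A sharper acuity assumption (roughly 0.3 arcminutes, well better than 20/20)
# is what pushes the phone-distance figure toward 900 ppi.
print(round(required_ppi(12, acuity_arcmin=0.32)))   # ~895 ppi
```

The relationship is linear in both directions: sit twice as far away and you need half the density, but halve the angle your eye can resolve and the required density doubles.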
Why Apple’s “Retina” Isn’t True Retina

Apple uses Retina as a marketing term, and it’s a great one. But it also implies that there’s nowhere else to go from here when it comes to resolution, which simply isn’t true. Ten years from now, we will all own Macs, iPhones and iPads with screens so crisp that looking at the iPhone 4S or new MacBook Pro will be like looking at a 1024 x 768 CRT from 2002. And that’s something to be excited about.
To understand why there’s so much more to be done with display resolution, you need to understand how Steve Jobs came up with his initial “magic number” for Retina: in short, he based it on a person having 20/20 vision. Seems reasonable, because colloquially, 20/20 vision is synonymous with having perfect eyesight.
The only problem? 20/20 isn’t perfect eyesight at all.
When we talk about a person having 20/20 vision, what we’re actually referring to is how well they can read a standard Snellen eye chart, the kind you see hanging in optometrists’ offices all over the world. If a person has 20/20 vision, it means that, standing 20 feet from such a chart, they can read what an average person could read from the same distance. This is considered standard vision.
But while 20/20 vision might traditionally refer to “standard vision,” most research suggests that normal vision is actually much better than 20/20. In fact, people with normal vision usually won’t see their eyesight degrade to 20/20 until they are 60 or 70 years of age!
Got that? Apple’s definition of Retina is based upon the vision of seniors.
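For the sake of concreteness, here's a small sketch of how Snellen notation translates into a resolvable angle, and what that angle implies for pixel density. The 20/12 line for a healthy young adult is an illustrative assumption, not a clinical figure:

```python
import math

ARCMIN = math.pi / (180 * 60)   # one arcminute, in radians

def snellen_to_arcmin(numerator, denominator):
    """20/20 vision resolves detail subtending 1 arcminute; 20/10 resolves 0.5; etc."""
    return denominator / numerator

def required_ppi(distance_in, acuity_arcmin):
    """Pixel density at which a pixel subtends the given angle at the given distance."""
    return 1.0 / (distance_in * math.tan(acuity_arcmin * ARCMIN))

for snellen in [(20, 20), (20, 12), (20, 10)]:
    acuity = snellen_to_arcmin(*snellen)
    print(f"{snellen[0]}/{snellen[1]}: {acuity:.2f} arcmin -> "
          f"~{required_ppi(12, acuity):.0f} ppi at 12 inches")
# 20/20: 1.00 arcmin -> ~286 ppi
# 20/12: 0.60 arcmin -> ~477 ppi
# 20/10: 0.50 arcmin -> ~573 ppi
```

In other words, hand the same 12-inch viewing distance to a younger pair of eyes and the "can't discern pixels" threshold climbs well past 300 pixels per inch.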
What Is The Resolution Of The Human Eye?

So what are the limits of vision? Ho