Hacker News

Apple does the 2x thing on iOS so that the large set of handcrafted, bitmapped 1x apps maps precisely to pixels when upscaled. OS X doesn't have the same issue, since there's no existing default screen size that apps were designed for.


Apple is doing 2x on OS X as well. That's how the new Retina MacBook Pro works.

This allows Apple to give users the exact same workspace as before, just with 4x the pixels.
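To make the "same workspace, 4x the pixels" point concrete, here's a back-of-envelope sketch (the 1440x900-point workspace is the 15" Retina MacBook Pro's default; the function name is just for illustration):

```python
def backing_pixels(points_w, points_h, scale=2):
    """Backing-store pixel dimensions for a logical (point) workspace at a given scale factor."""
    return points_w * scale, points_h * scale

w, h = backing_pixels(1440, 900)      # -> (2880, 1800)
print(w * h / (1440 * 900))           # -> 4.0: four physical pixels per point at 2x
```

The workspace stays 1440x900 points either way; only the pixel density behind each point changes.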

They could break this with the iMac and Thunderbolt Displays, but 2x is how I expect all Mac laptops to be in a few years.


Makes one wonder whether we'll see 21" retina iMacs / Cinema Displays before the 27" versions.


Since Thunderbolt is also some sort of PCIe port, couldn't you have the GPU be internal to the Cinema Display, and transmit only GPU commands over Thunderbolt? Or are significant parts of the screen drawn by the CPU directly these days?


I suppose that would be possible, if the built-in GPU has local video RAM and such. You'd consume far less bandwidth than sending raw video to the display.
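A rough calculation shows why raw video is the bandwidth problem here (ignoring blanking intervals and any link overhead, and assuming 24 bpp at 60 Hz):

```python
def raw_video_gbps(width, height, bpp=24, hz=60):
    """Uncompressed video bandwidth in Gbit/s (no blanking or protocol overhead)."""
    return width * height * bpp * hz / 1e9

print(round(raw_video_gbps(2560, 1440), 1))  # 27" display today: ~5.3 Gbit/s
print(round(raw_video_gbps(5120, 2880), 1))  # hypothetical 2x version: ~21.2 Gbit/s
```

A 2x 27" panel would need roughly 21 Gbit/s uncompressed, well past a 10 Gbit/s Thunderbolt channel, whereas GPU command streams are typically a small fraction of that.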

I do think it would be a little too complicated, though; it's probably more likely there will be a faster Thunderbolt port before that happens ;-)


Latency is a much larger problem.


Is there higher latency between Thunderbolt-connected components than over "regular" PCIe?


Yes. Even with PCIe "extender" ribbon cables of a few inches, you can stumble over latency issues.



