Imagination Technologies today announced PowerVR Furian, its next-generation GPU architecture that promises significant improvements in graphics performance and power efficiency that could benefit future iPhones.
The PowerVR Furian architecture will offer up to a 70-90% improvement in real-world gaming performance by density, including 35% better shader performance and 80% higher fill rate, compared to a similarly sized and clocked Series7XT Plus GPU based on the current-generation PowerVR Rogue architecture.
Meanwhile, lower power consumption means an iPhone built on the PowerVR Furian architecture could offer longer battery life during graphics-related tasks.
Imagination Technologies confirmed to MacRumors that PowerVR Furian architecture supports 4K graphics, laying the foundation for future iPhone models to support higher-resolution gaming and other graphics-intensive tasks such as augmented or virtual reality and machine learning.
From the company's announcement: “We created Furian to address a new class of emerging applications, with a focus on efficient scalability that will extend to multiple generations of PowerVR IP cores. We’re excited to start rolling out the first 8XT IP cores based on Furian. These cores will further cement the leadership of PowerVR at the high end of mobile performance.”
Apple's graphics chip in the iPhone 7 and iPhone 7 Plus is a custom-designed version of the Series7XT Plus, and many earlier iPhone models are also based on the PowerVR Rogue architecture, so it is reasonable to assume that future iPhone models may be based at least in part on the PowerVR Furian architecture.
Imagination Technologies said the first GPUs based on PowerVR Furian, such as the Series8XT, will be announced in mid-2017, but don't expect the architecture to power the so-called "iPhone 8" this year. Imagination noted Furian-based mobile devices might not arrive until the end of 2018 at the earliest.
Apple has been a licensee and stakeholder in Imagination Technologies since at least 2008. The two companies have worked closely together over the years, with Apple being a key investor in the UK-based chip designer since it raised its stake in the company to roughly 10% in 2009.
Apple was rumored to be acquiring Imagination Technologies last year, but it was later clarified that Apple did not plan to make an offer at the time. Nevertheless, in recent months, Apple has recruited at least two dozen employees from the chip designer, including former COO John Metcalfe, possibly to build out an in-house GPU team.
Top Rated Comments
As in, why would you do that when you could just save power with a regular display?
Some might compare what I've just said to quotes such as "640K is more memory than anyone will ever need." But this is different, as it relates directly to a physical limitation of the human body. I think they will improve display quality, color reproduction, bit depth, etc., but resolution is arriving at an end point for phones, and will hit these so-called limits on larger and larger devices in the coming years. If nobody has named it, I'll call it Duke's Law. I've been talking about this for many years, and most people still don't understand; they consider resolution just another "spec" that will keep getting better, even though further gains are useless.
You see, eventually you cross a threshold of diminishing returns: the intersection of cost, GPU speed, power consumption, and human eye resolution. Why put in the effort to develop a display with pixels smaller than anyone would ever be able to see? And again, the lone exception to this right now is a wearable VR/AR headset display, which allows the display to sit closer to the human eye and remain in focus via optics. On the GPU end of things, it's nice to support 4K as it means you could output that resolution to an external display large enough to resolve the image effectively. And something like that could also be seen in a future Apple TV 5. But even then, from a typical couch distance of 8-10 ft, most people who have 4K TVs smaller than 70-80" don't actually see any benefit over 1080p. Now sometimes 4K packs the pixels in tighter, which results in an overall increase in picture quality and uniform brightness (at least that's something I've noticed). But I think the only legitimate reasons to have a 4K display are if it's huge (you can resolve the pixels from the couch), you sit close to your TV, or you use it as a computer display (i.e., you sit close to it).
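The viewing-distance claim above can be sanity-checked with back-of-the-envelope math: 20/20 vision resolves roughly one arcminute, so a pixel stops being distinguishable once it subtends less than that angle. This sketch (the helper name and the 1-arcminute assumption are mine, not from the comment) estimates the farthest distance at which a given resolution still matters for a 16:9 TV:

```python
import math

ARCMIN = math.radians(1 / 60)  # assume 20/20 vision resolves ~1 arcminute

def max_useful_distance_ft(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Farthest viewing distance (in feet) at which one pixel still
    subtends a full arcminute; beyond this, extra pixels are wasted."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width from diagonal
    pixel_pitch = width_in / horizontal_px          # inches per pixel
    return pixel_pitch / math.tan(ARCMIN) / 12      # inches -> feet

# For a 65" TV: 4K stops helping beyond ~4 ft, 1080p beyond ~8.5 ft,
# which matches the comment's point about typical 8-10 ft couch distances.
print(round(max_useful_distance_ft(65, 3840), 1))  # ~4.2
print(round(max_useful_distance_ft(65, 1920), 1))  # ~8.5
```

Under this model, a 65" 1080p panel is already near the eye's limit from a normal couch, so a 4K panel of the same size adds little unless you sit much closer or the screen is much larger.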
iOS 10 on my 6s is like what Mac OS 9 felt like on my TiBook. Great hardware... lagging software. (And I actually liked Mac OS 9, unlike iOS 10.)