
Ever since Apple bought PA Semi, it's been clear the company had aspirations of semiconductor design. Its CPU architectures have differed from other vendors by emphasizing raw single-thread performance, and it's been debuting co-processors like motion and audio chips for the last several years. It's also purchased part of Toshiba's flash memory business, thereby gaining a NAND source for itself without having to directly shoulder all of the burden of building a fab and NAND business. And Apple claims its most recent chip, the A11 Bionic, includes a neural engine for AI workloads. The company hasn't revealed how this chip works yet, so we don't know if it's a more conventional, repurposed piece of silicon specifically dedicated to AI tasks, or if Apple is using part of its custom GPU to handle these workloads.

As Nikkei reports, Apple has other significant opportunities to integrate additional functions into its own SoCs. The image below shows the various suppliers of its components. While it's slightly out-of-date (Apple is no longer using Imagination Technologies as a supplier), it still makes the point. We tend to think of a phone as comprising an SoC, some RAM, a camera, and various screen-related components, sometimes with a co-processor to handle motion processing. But there are still major opportunities for Apple to innovate and add functionality to its own SoCs.

(Diagram: iPhone component suppliers. Image by Nikkei)

By designing its own chips, Apple can better differentiate itself from others, said Mark Li, a Hong Kong-based analyst with Sanford C. Bernstein. "Further, depending too much on other chip suppliers in the age of artificial intelligence will deter its development."

(Table: Apple deals)

Apple has been rolling out these integrated capabilities over the last nine years, steadily taking over more of the phone's logic in the process. Supposedly, Apple has its sights on baseband cellular modems, which would be a huge shift from the status quo. Both Qualcomm and Intel sell Apple modems for its iPhones, and neither company is going to be excited at the idea of losing that business.

At the same time, however, building modems from scratch is hard. Building modems from scratch that won't get you sued for patent infringement by other companies is even harder. Thus, while it's tempting to say that Apple building its own modem would mean we'd see this feature in 2018 or 2019, that's highly unlikely to happen so quickly. Modem technology is a field where established semiconductor companies often struggle, though whether that's due to Qualcomm's business practices or the intrinsic and acknowledged difficulty of designing a modem that won't be burdened by patents is a different question.

Given that 4G and LTE are now mature markets, Apple will almost certainly target the 5G standard. Obviously, any device that supports 5G will have to include fall-back capabilities to 4G or LTE, just as 4G and LTE devices can fall back to 3G or even EDGE. And of course, the perennial argument is being trotted out that Apple wants to build its own chips to replace Intel processors in Mac systems. I've never said that this is impossible (it isn't), just that it's unlikely. And for all the progress Apple has made in CPU design, I think that's still the case.

Building a high core-count CPU (and I'm defining that as four or more "big" cores) is harder, in many ways, than you might think. Just slapping core designs down on a floor plan isn't sufficient. Chips need interconnects, cache allocation strategies for L2 and L3, bus topologies, and power gating. These systems are all highly complex in their own right, and building a unified L3 across 12-28 cores isn't easy. There's a reason why AMD's Zen architecture uses a CCX structure with an 8MB L3 cache allocated per quad-core complex, as opposed to a completely uniform L3 across all 16-32 Threadripper / EPYC cores.
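To make the trade-off concrete, here's a minimal toy sketch in Swift. The types, topology, and latency numbers are invented purely for illustration (they aren't measured Zen figures); the point is only the shape of the problem: an L3 hit inside the requesting core's own complex stays cheap, while anything that has to cross the inter-CCX fabric pays a penalty that a single unified L3 spanning 16-32 cores would have to manage for every core.

```swift
// Illustrative sketch only: hypothetical latencies, not measured values.
// Models why a per-CCX L3 (as in AMD's Zen) trades uniform access for a simpler interconnect.

struct CCX {
    let id: Int
    let cores: [Int]          // core IDs in this quad-core complex
    let l3SizeMB: Int         // L3 shared only within the CCX
}

// Made-up cycle counts, chosen only to show the local-vs-remote gap.
let localL3Latency = 35       // hit in the requesting core's own CCX
let remoteL3Latency = 110     // hit that must cross the inter-CCX fabric

func l3Latency(from core: Int, hitIn ccx: CCX, topology: [CCX]) -> Int {
    // A request served by the local slice is cheap; anything that traverses
    // the fabric to another CCX pays the interconnect penalty.
    let home = topology.first { $0.cores.contains(core) }!
    return home.id == ccx.id ? localL3Latency : remoteL3Latency
}

// A 16-core part modeled as four quad-core CCXes, each with its own 8MB L3.
let topology = (0..<4).map { i in
    CCX(id: i, cores: Array(i * 4 ..< (i + 1) * 4), l3SizeMB: 8)
}

print(l3Latency(from: 0, hitIn: topology[0], topology: topology))  // 35: local CCX
print(l3Latency(from: 0, hitIn: topology[3], topology: topology))  // 110: crosses the fabric
```

A fully uniform L3 would have to hide that remote penalty behind extra interconnect bandwidth and more complex coherency logic, which is exactly the kind of engineering a first-time many-core design team has to learn the hard way.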

Could Apple solve these issues? Absolutely, yes. But it'll take time to ramp up that kind of project, and Apple would have to negotiate a tricky path forward. Every person who bought a Mac laptop so they could run Windows applications on it (and while that may not be a majority of Mac users, it does constitute some of the base) would be left out in the cold unless Apple could simultaneously negotiate deals with the developers of that software to port it to macOS. Again, could it? Sure. When you've got a few hundred billion in the bank, "persuading" other companies to do things is typically just a matter of price.

Apple will likely lay these plans quietly and carefully, and take several years to do it. If I had to guess whether we'd see an Apple 5G modem or an Apple ARM-based Mac Pro first, I'd bet on the modem, and I wouldn't expect to see that before late 2019 or 2020.

Now read: How L1 and L2 CPU Caches Work, and Why They're an Essential Part of Modern Chips