During its "Let Loose" event on May 7, Apple introduced the M4, a high-performance chip for its newest iPad Pro models, touting its AI capabilities.
The M4 chip is a system on a chip (SoC) that integrates a CPU and GPU, delivering enhanced performance and power efficiency for the latest iPad Pro.
Compared with previous iPad Pro models built on the M2 chip, the M4 delivers up to 1.5 times faster CPU performance.
With a neural engine capable of handling up to 38 trillion operations per second (TOPS), Apple claims the M4 outperforms the neural processing units of any AI PC currently available.
Johny Srouji, Apple’s senior vice president of hardware technologies, emphasized how the M4's efficiency enables the iPad Pro's sleek design and innovative display while enhancing performance across CPU, GPU, Neural Engine, and memory systems.
Analyst Alexander Harrowell from Omdia acknowledged Apple's consistent hardware advancements, attributing differences between the M4 and A17 Pro chips to variations in core counts and thermal considerations.
Omdia's recent report positions Apple as a leader in AI PC technology, particularly for creative workloads.
Harrowell highlighted the importance of increased memory bandwidth in the M4 chips, suggesting it significantly impacts performance, particularly in tasks like Transformer inference.
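To see why memory bandwidth matters so much for Transformer inference, consider a common rule of thumb: during autoregressive decoding, each generated token requires streaming the model's weights from memory, so peak bandwidth caps the achievable tokens per second. The sketch below illustrates this with hypothetical figures (the bandwidth, model size, and quantization level are illustrative assumptions, not Apple specifications):

```python
# Back-of-envelope estimate: autoregressive decoding is often
# memory-bandwidth-bound, because generating each token requires
# reading all model weights from memory once.

def max_tokens_per_second(bandwidth_gb_s: float,
                          params_billions: float,
                          bytes_per_param: float) -> float:
    """Upper bound on decode speed: bandwidth divided by model size in bytes."""
    model_bytes = params_billions * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / model_bytes

# Example: a hypothetical 7B-parameter model quantized to 4 bits
# (0.5 bytes per parameter) on a chip with ~120 GB/s of bandwidth.
print(round(max_tokens_per_second(120, 7, 0.5), 1))  # roughly 34 tokens/s at best
```

Under these assumptions, decode speed scales linearly with bandwidth, which is why a bandwidth increase translates directly into faster on-device inference.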
CEO Tim Cook hinted at significant investments in generative AI during a recent earnings call, with further announcements expected at WWDC in June.
Analyst Lian Jye Su suggested that the M4 chips indicate Apple's readiness to deploy large language models on its tablets, aligning with the company's focus on optimization for its product ecosystem.
While Apple has unveiled its OpenELM model, rumors suggest it may collaborate with other technology vendors for on-device deployment of large language models, potentially sparking a competitive market for partnerships.