Apple’s New AI Features Powered by Google’s Tensor Chips: A Surprising Collaboration


Highlights

  • Apple Intelligence uses AFM-on-device and AFM-server language models
  • Google’s Tensor Processing Units (TPUs) were crucial for training Apple’s AI models
  • iPhone users will see smarter Siri, enhanced photo search, and improved auto-correct
  • AI models trained on Google’s chips but run on Apple’s hardware

Apple’s latest venture into artificial intelligence, known as Apple Intelligence, is set to make iPhones smarter than ever before.

In an unexpected twist, it turns out that Google, Apple’s longtime rival in the smartphone market, has played an indirect role in this development.

Let’s dive into this fascinating revelation and what it means for iPhone users.

Apple’s AI Development: A Peek Behind the Curtain

Apple Intelligence uses AFM-on-device and AFM-server language models

Apple recently released a detailed paper titled ‘Apple Intelligence Foundation Language Models,’ offering a rare glimpse into the company’s AI development process.

This report focuses on two key language models that form the backbone of Apple Intelligence:

  1. AFM-on-device: A 3 billion parameter model designed to run directly on iPhones and other Apple devices.
  2. AFM-server: A larger, more powerful model that operates on Apple’s servers.

These models are crucial for various AI features, including improvements to Siri, smart replies in Mail, and even coding assistance in Xcode.
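To make the split between the two models more concrete, here is a simplified, hypothetical sketch in Python of how a request might be routed: small, latency-sensitive tasks stay on the phone, while heavier ones fall back to the server model. The class, method names, and word-count threshold are invented for illustration; Apple has not published this routing logic.

```python
# Hypothetical sketch of splitting requests between a small on-device model
# and a larger server-side model. All names and the word-count threshold
# below are invented for illustration only.
class HybridModelRouter:
    def __init__(self, on_device_model, server_model, max_local_words=200):
        self.on_device_model = on_device_model  # e.g. a ~3B-parameter model on the phone
        self.server_model = server_model        # larger model running in the data center
        self.max_local_words = max_local_words

    def respond(self, prompt: str) -> str:
        # Keep short, everyday requests local for speed and privacy;
        # hand longer or heavier requests off to the server model.
        if len(prompt.split()) <= self.max_local_words:
            return self.on_device_model.generate(prompt)
        return self.server_model.generate(prompt)
```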

Tensor Chips Power Apple’s AI Training

Google’s Tensor Processing Units (TPUs) were crucial for training Apple’s AI models

In a surprising revelation, Apple disclosed that it used Google’s Tensor Processing Units (TPUs) to train these AI models:

  • The AFM-on-device model required 2,048 TPUv5p chips for training.
  • The larger AFM-server model needed a whopping 8,192 TPUv4 chips.

Apple’s decision to use Google’s chips rather than more common options such as NVIDIA’s GPUs is intriguing.

Apple, known for its preference for in-house solutions, turned to a competitor’s technology for this crucial development phase.
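For readers curious what training across thousands of TPU cores involves at a basic level, the sketch below shows a minimal data-parallel training step written with JAX, a framework widely used on Google’s TPUs. It is a toy illustration only, not Apple’s actual training code; the model, loss, and learning rate are made up.

```python
from functools import partial

import jax
import jax.numpy as jnp


def loss_fn(params, batch):
    # Toy linear model: predictions = x @ w, with a mean-squared-error loss.
    preds = batch["x"] @ params["w"]
    return jnp.mean((preds - batch["y"]) ** 2)


@partial(jax.pmap, axis_name="devices")
def train_step(params, batch):
    # Each accelerator core computes gradients on its own slice of the batch...
    grads = jax.grad(loss_fn)(params, batch)
    # ...then gradients are averaged across all cores (data parallelism).
    grads = jax.lax.pmean(grads, axis_name="devices")
    # Plain SGD update with an illustrative learning rate of 0.01.
    return jax.tree_util.tree_map(lambda p, g: p - 0.01 * g, params, grads)


# Usage sketch: replicate params across jax.local_device_count() devices and
# feed batches shaped (num_devices, per_device_batch, feature_dim).
```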

What This Means for iPhone Users

iPhone users will see smarter Siri, enhanced photo search, and improved auto-correct

While the technical details might seem complex, the outcome for iPhone users is simple and exciting: smarter devices.

The Apple Intelligence features, partially developed using Google’s chip technology, will bring a range of improvements to iPhones:

  • More natural and context-aware interactions with Siri
  • Enhanced photo search capabilities using natural language
  • Improved auto-correct and predictive text
  • Smarter email management with summaries and intelligent replies

It’s important to note that while Google’s chips were used in the training process, the final AI models will run on Apple’s own hardware in devices and servers.

This means iPhone users will enjoy these new features without any direct interaction with Google’s technology.

This indirect collaboration between tech giants Apple and Google showcases how complex and interconnected the world of AI development truly is.

FAQs

What are the key AI models used in Apple Intelligence?

Apple Intelligence uses two main models: AFM-on-device, a 3 billion parameter model running on iPhones, and AFM-server, a more powerful model operating on Apple’s servers.

How did Google contribute to Apple’s AI development?

Apple used Google’s Tensor Processing Units (TPUs) for training its AI models, employing 2,048 TPUv5p chips for the AFM-on-device model and 8,192 TPUv4 chips for the AFM-server model.

What improvements will iPhone users experience with Apple Intelligence?

iPhone users can expect smarter Siri interactions, enhanced photo search capabilities, improved auto-correct and predictive text, and smarter email management with summaries and intelligent replies.

Will iPhone users interact directly with Google’s technology?

No, while Google’s chips were used in the training process, the final AI models will run on Apple’s hardware, ensuring users enjoy the new features without direct interaction with Google’s technology.

Why did Apple choose Google’s chips for AI training?

The choice to use Google’s TPUs for training highlights the complex and interconnected nature of AI development, allowing Apple to leverage powerful chip technology to enhance its AI capabilities.

Also Read: Apple’s iOS 18 Is Likely to Be Its Most Ambitious Update Yet: Mark Gurman

Also Read: Apple Reportedly Acquires DarwinAI to Boost iOS 18 AI Features and More

Also Read: Google’s Tensor G5 Chip Could be A New Era for Pixel Phones with Exynos-Free Design

Also Read: Google Pixel 8 Pro on Geekbench with Tensor G3 Chipset & 12 GB RAM Ahead of October 4 Launch: First Device with Android 14 Out of the Box
