Highlights
- Introduction of HUGS technique for creating 3D avatars.
- Efficient language model inference to enhance Apple GPT.
- Potential integration of Apple GPT in iPhones and iPads.
- Anticipated announcement of GPT-like features at next year’s WWDC.
LLM and AI are the buzzwords of the year, and you can thank OpenAI’s ChatGPT for revolutionizing how people generally perceive AI chatbots.
Apple is not one to be left behind, and corroborating some previous reports, a new research paper has emerged highlighting Apple’s innovative approach to its internal AI chatbot – often referred to as Apple GPT.
Let’s learn a bit about what this means for the future of iPhones and Apple products in general.
Apple GPT Breakthrough: Revolutionizing 3D Avatars with HUGS
Apple has revealed two research papers pertaining to its AI development.
The first introduces the HUGS (Human Gaussian Splats) technique, a novel method for creating animated 3D avatars from simple monocular videos.
Traditionally, generating realistic 3D avatars demanded elaborate multi-camera setups.
However, Apple’s HUGS system simplifies this process, requiring only a short video clip to produce detailed avatars, complete with intricate features like clothing and hair texture.
This advancement not only enhances the capabilities of the Apple GPT on devices like the iPhone but also paves the way for applications like virtual clothes fitting, offering users a unique way to preview clothing on personalized 3D models.
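HUGS builds on 3D Gaussian splatting, which represents a scene or person as a cloud of oriented 3D Gaussians. As a rough illustration of what such a representation stores per primitive (a simplified sketch with invented names, not Apple’s actual code or data layout):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Gaussian3D:
    # Parameters a typical 3D Gaussian splat stores per primitive
    mean: np.ndarray      # (3,) center position in space
    scale: np.ndarray     # (3,) per-axis extent (combined with rotation -> covariance)
    rotation: np.ndarray  # (4,) quaternion orientation
    color: np.ndarray     # (3,) RGB (real systems often use spherical-harmonic coefficients)
    opacity: float        # blending weight used during rasterization

# An avatar is then a large collection of such Gaussians whose means are
# attached to a skeletal pose, which is what makes the model animatable.
avatar = [
    Gaussian3D(np.zeros(3), np.ones(3) * 0.01,
               np.array([1.0, 0.0, 0.0, 0.0]),
               np.array([0.8, 0.6, 0.5]), 0.9)
    for _ in range(3)
]
print(len(avatar))  # 3
```

A real HUGS model optimizes thousands of these primitives from the frames of a single monocular video; the sketch above only shows the shape of the data involved.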
Apple GPT Breakthrough: Efficient Language Model Inference
The second paper delves into efficient language model inference, a key aspect that could empower the Apple GPT iPhone to run complex AI systems smoothly.
This is particularly notable because it suggests a future where the power of AI like ChatGPT isn’t limited to high-end servers but is accessible right in the palm of your hand.
The potential of such technology in everyday devices like iPhones and iPads could be immense, promising a more integrated and interactive AI experience.
The paper itself is highly technical, but the highlight is this: Apple’s method could run LLM inference up to 25 times faster, even on devices with limited RAM.
Apple GPT Breakthrough: Impact on Vision Pro and Beyond
These technological strides are not just theoretical achievements.
They have tangible applications in products like Vision Pro, enhancing user experience with more immersive visual features.
The Apple GPT could transform how we interact with Apple devices, making AI an even more integral part of our daily digital interactions.
Of course, there is no confirmed release timeline for this year, but industry analysts have strongly hinted that next year’s WWDC, where Apple will announce iOS 18, is when we may see the company unveil its GPT-like features.
FAQs
What is Apple’s HUGS technology and how does it impact AI development?
Apple’s HUGS (Human Gaussian Splats) technology is a novel method for creating 3D avatars using monocular videos.
This advancement simplifies the process of generating realistic 3D models, traditionally requiring complex multi-camera setups.
HUGS is set to boost the functionality of Apple GPT in devices like iPhones, with potential applications like virtual clothes fitting.
How does the efficient language model inference paper contribute to Apple GPT?
The second research paper on efficient language model inference is crucial for enhancing the Apple GPT on devices like the iPhone.
It indicates a future where advanced AI systems, similar to ChatGPT, can operate smoothly on everyday devices.
What are the potential applications of Apple GPT and HUGS in Apple products?
Beyond theoretical advancements, Apple’s GPT and HUGS technologies have practical applications in products like Vision Pro, enhancing user experiences with immersive visual features.
The integration of these technologies could significantly change how users interact with Apple devices.
What are the two key techniques for storing AI models in flash memory?
In a new research paper titled “LLM in a flash: Efficient Large Language Model Inference with Limited Memory,” the authors note that flash storage is more abundant in mobile devices than the RAM traditionally used for running LLMs.
Their method cleverly bypasses the limitation using two key techniques that minimize data transfer and maximize flash memory throughput:
Windowing: Think of this as a recycling method. Instead of loading new data every time, the AI model reuses some of the data it already processed.
This reduces the need for constant memory fetching, making the process faster and smoother.
Row-Column Bundling: This technique is like reading a book in larger chunks instead of one word at a time. By grouping data more efficiently, it can be read faster from the flash memory, speeding up the AI’s ability to understand and generate language.
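The two techniques above can be sketched with a toy simulation (a minimal illustration with invented names; the actual paper operates on feed-forward weight rows selected by activation sparsity, not this simplified cache):

```python
import numpy as np

# Toy "flash storage": weight rows that are slow to fetch.
FLASH = np.random.rand(1000, 64)

class WindowedCache:
    """Windowing: keep the weights used by the last `window` tokens
    resident in fast memory, so repeated neurons need no new flash reads."""
    def __init__(self, window=2):
        self.window = window
        self.history = []   # active-neuron sets for recent tokens
        self.cache = {}     # neuron id -> weight row held in fast memory
        self.flash_reads = 0

    def fetch(self, active_neurons):
        for n in active_neurons:
            if n not in self.cache:          # only a miss touches flash
                self.cache[n] = FLASH[n]
                self.flash_reads += 1
        self.history.append(set(active_neurons))
        if len(self.history) > self.window:  # evict neurons that fell
            old = self.history.pop(0)        # out of the sliding window
            still_needed = set().union(*self.history)
            for n in old - still_needed:
                self.cache.pop(n, None)
        return np.stack([self.cache[n] for n in active_neurons])

def bundled_read(neuron_ids):
    """Row-column bundling: grab related rows in one larger contiguous
    request (simulated here by a single bulk index) instead of many
    tiny reads, which raises effective flash throughput."""
    return FLASH[np.array(neuron_ids)]

cache = WindowedCache(window=2)
cache.fetch([1, 2, 3])    # cold start: 3 flash reads
cache.fetch([2, 3, 4])    # neurons 2 and 3 are reused: only 1 new read
print(cache.flash_reads)  # 4
```

The point of the sketch is the counter: without windowing the second token would cost three reads again; with it, only the genuinely new neuron is fetched.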
The combination of these methods allows devices to run AI models up to twice the size of their available memory, according to the paper.
This translates to a 4-5 times increase in speed on standard processors (CPUs) and an impressive 20-25 times faster on graphics processors (GPUs).
“This breakthrough is particularly crucial for deploying advanced LLMs in resource-limited environments, thereby expanding their applicability and accessibility,” write the authors.
How is AI faster on iPhone?
The breakthrough in AI efficiency opens new possibilities for future iPhones, such as more advanced Siri capabilities, real-time language translation, and sophisticated AI-driven features in photography and augmented reality.
The technology also sets the stage for iPhones to run complex AI assistants and chatbots on-device, something Apple is already said to be working on.
Apple’s work on generative AI could eventually be incorporated into its Siri voice assistant.
Apple in February 2023 held an AI summit and briefed employees on its large language model work. According to Bloomberg, Apple is aiming for a smarter version of Siri that’s deeply integrated with AI.
Apple is planning to update the way that Siri interacts with the Messages app, allowing users to field complex questions and auto-complete sentences more effectively.
Beyond that, Apple is rumored to be planning to add AI to as many Apple apps as possible.
What is Apple GPT?
Apple is reportedly developing its own generative AI model called “Ajax”. Designed to rival the likes of OpenAI’s GPT-3 and GPT-4, Ajax operates on 200 billion parameters, suggesting a high level of complexity and capability in language understanding and generation.
Internally known as “Apple GPT,” Ajax aims to unify machine learning development across Apple, suggesting a broader strategy to integrate AI more deeply into Apple’s ecosystem.
What is the Future of AI on Apple Devices?
As per recent reports, Apple plans to launch generative AI features on the iPhone and iPad by late 2024, aligning with the release of iOS 18.
The company is building a robust infrastructure with AI servers to support a blend of cloud-based and on-device AI processing.