What's new in Apple Intelligence?
Asked on 06/12/2025
At WWDC, Apple introduced several new features and enhancements related to Apple Intelligence. Here are some of the key updates:
- Foundation Models Framework: Apple is opening up developer access to the on-device large language model at the core of Apple Intelligence. The framework lets developers request responses as plain text or as structured Swift data, enabling more intelligent and efficient app experiences (see the Swift sketch after this list).
- Generative AI Capabilities: Apple Intelligence now includes powerful generative models that enhance iOS, iPadOS, and macOS with capabilities for understanding and generating language and images. These models are deeply integrated into system features and apps and are built with privacy in mind.
- Integration with Siri: Siri is gaining new capabilities thanks to Apple Intelligence, including semantic search and improved natural language understanding. This allows Siri to perform more contextually relevant actions and understand user requests more naturally.
- Writing Tools and Image Playground: New system-wide Writing Tools help users rewrite, proofread, and summarize text. The Image Playground API lets apps integrate image-creation features with little code (a second sketch follows the list).
- Privacy and Offline Capabilities: Apple Intelligence is designed to run efficiently on-device, providing low latency and enhanced privacy by minimizing the need for cloud-based processing.
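As a rough illustration of the Foundation Models bullet above, here is a minimal Swift sketch of asking the on-device model for a plain-text answer and for a structured Swift value. It follows the LanguageModelSession and @Generable guided-generation API Apple presented, but the TripIdea type and the prompt are illustrative assumptions, and exact signatures may differ in the shipping SDK.

```swift
import FoundationModels

// A hypothetical type the model can fill in via guided generation.
@Generable
struct TripIdea {
    @Guide(description: "A short, catchy title for the trip")
    var title: String
    @Guide(description: "Three activities to do on the trip")
    var activities: [String]
}

func generateTripIdea() async throws {
    // A session wraps the on-device language model.
    let session = LanguageModelSession()

    // Plain-text response.
    let text = try await session.respond(to: "Suggest a weekend trip near Cupertino.")
    print(text.content)

    // Structured response decoded directly into the Swift type above.
    let idea = try await session.respond(
        to: "Suggest a weekend trip near Cupertino.",
        generating: TripIdea.self
    )
    print(idea.content.title, idea.content.activities)
}
```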
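And for the Image Playground bullet, a minimal SwiftUI sketch of presenting the system image-creation sheet, assuming the concept-string variant of the imagePlaygroundSheet modifier from the ImagePlayground framework; the view name and concept text are hypothetical examples.

```swift
import SwiftUI
import ImagePlayground

struct StickerMakerView: View {
    @State private var showingPlayground = false
    @State private var generatedImageURL: URL?

    var body: some View {
        Button("Create Image") {
            showingPlayground = true
        }
        // Presents the system Image Playground experience and hands back
        // a file URL for the image the user generates.
        .imagePlaygroundSheet(
            isPresented: $showingPlayground,
            concept: "a friendly robot watering plants"
        ) { url in
            generatedImageURL = url
        }
    }
}
```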
For more detailed information, you can refer to the Platforms State of the Union (00:01:35) session, which covers Apple Intelligence and its integration into Apple's platforms.