What's new in Apple Intelligence?

Asked on 06/13/2025

At WWDC, Apple introduced several new features and enhancements related to Apple Intelligence. Here are some of the key updates:

  1. Foundation Models Framework: Apple is opening up access for any app to tap directly into the on-device large language model at the core of Apple Intelligence. The framework lets developers prompt Apple's models and receive responses either as plain text or as structured Swift data types, enabling new intelligent experiences in apps while keeping inference on-device for privacy and efficiency.
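As a rough sketch of what calling the on-device model might look like, the snippet below assumes the `FoundationModels` framework shown at WWDC, with a `LanguageModelSession` for prompting and a `@Generable` macro for receiving structured Swift data; the `TripIdea` type and its fields are hypothetical examples, and exact API names may differ from the shipping SDK:

```swift
import FoundationModels

// Hypothetical output type: @Generable asks the model to fill
// this structure instead of returning free-form text.
@Generable
struct TripIdea {
    @Guide(description: "A short, catchy title for the trip")
    var title: String
    var activities: [String]
}

func suggestTrip() async throws {
    // A session wraps the on-device large language model.
    let session = LanguageModelSession()

    // Ask for a response generated directly as our Swift type.
    let response = try await session.respond(
        to: "Suggest a weekend trip along the coast",
        generating: TripIdea.self
    )
    print(response.content.title)
    print(response.content.activities)
}
```

Because the model runs on-device, this kind of call needs no network round trip and no API key, which is what enables the privacy and latency benefits described above.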

  2. Integration Across Devices: Apple Intelligence is being integrated across the ecosystem, enhancing experiences on iPhone, Apple Watch, Apple Vision Pro, Mac, and iPad. This includes features like live translation, visual intelligence, and powerful shortcuts.
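The "powerful shortcuts" mentioned above build on the existing App Intents framework, which exposes app actions to Siri and the Shortcuts app. A minimal sketch of such an intent is below; the intent name and its placeholder logic are hypothetical, but the `AppIntent` protocol and `@Parameter` wrapper are the standard App Intents API:

```swift
import AppIntents

// Hypothetical intent exposing a "summarize" action to Shortcuts and Siri.
struct SummarizeNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Note"

    @Parameter(title: "Note Text")
    var text: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // Placeholder: a real app would call its own summarization logic here.
        let summary = String(text.prefix(100))
        return .result(value: summary)
    }
}
```

Intents declared this way appear automatically in the Shortcuts app and can be chained with other actions, including the new intelligence-powered ones.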

  3. Writing Tools and Image Playground: New capabilities include writing tools that help users communicate more effectively by rewriting text for tone and clarity, and an Image Playground API that allows for easy integration of image creation features into apps.
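For the Image Playground side, SwiftUI exposes a sheet modifier that presents the system image-creation UI and hands back the generated image. The sketch below assumes the `ImagePlayground` framework's `imagePlaygroundSheet` modifier; the view name and concept string are illustrative:

```swift
import SwiftUI
import ImagePlayground

// Hypothetical view that lets the user generate an avatar image
// using the system Image Playground sheet.
struct AvatarMakerView: View {
    @State private var showPlayground = false
    @State private var createdImageURL: URL?

    var body: some View {
        Button("Create Avatar") { showPlayground = true }
            .imagePlaygroundSheet(
                isPresented: $showPlayground,
                concept: "a friendly robot wearing sunglasses"
            ) { url in
                // The system returns a file URL for the generated image.
                createdImageURL = url
            }
    }
}
```

Because the system provides the entire creation UI, apps get image generation with a single modifier rather than hosting a model themselves.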

  4. Siri Enhancements: Siri is becoming more natural and contextually aware, thanks to Apple Intelligence. It can now understand and act on what you're looking at on screen and has improved language understanding.

  5. Privacy and Efficiency: Apple Intelligence is built with privacy from the ground up, using on-device models to ensure user data remains private. This also allows for low latency and better user experiences.

  6. Generative AI: Apple is bringing generative AI capabilities to its platforms, allowing for new ways to understand and generate language and images, deeply integrated into system features and apps.

For more detailed information, you can refer to the Platforms State of the Union (00:01:35) session, which covers Apple Intelligence in depth.