What is the memory length of the Foundation Models framework?

Asked on 06/13/2025


The WWDC sessions on the Foundation Models framework do not state a specific "memory length" for the models. They do note that the on-device model is a large language model with 3 billion parameters, each quantized to 2 bits, and that it is optimized for tasks like summarization, extraction, and classification so it can run efficiently on Apple devices. The framework provides stateful sessions, which record every prompt and response, but there is a limit to how large a session can grow. For more details, see the session Deep dive into the Foundation Models framework (00:01:55).
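
As a minimal sketch of what that session limit means in practice: a stateful session accumulates its transcript across turns, and when the transcript no longer fits the model's context window a request throws, at which point you can fall back to a fresh session. The error case `LanguageModelSession.GenerationError.exceededContextWindowSize` and the no-argument `LanguageModelSession()` initializer follow the WWDC material as I recall it; treat the exact names and the recovery strategy as assumptions, not a definitive implementation.

```swift
import FoundationModels

// Hypothetical helper: asks the existing session, and if its accumulated
// transcript has outgrown the on-device model's context window, falls back
// to a fresh session so the conversation can continue.
func respond(in session: LanguageModelSession,
             to prompt: String) async throws -> (answer: String, session: LanguageModelSession) {
    do {
        let response = try await session.respond(to: prompt)
        return (response.content, session)
    } catch LanguageModelSession.GenerationError.exceededContextWindowSize {
        // Assumption: this is the error thrown when the session grows too large.
        // Here we simply drop the old history; a real app might first condense
        // the transcript and seed the new session with that summary.
        let freshSession = LanguageModelSession()
        let response = try await freshSession.respond(to: prompt)
        return (response.content, freshSession)
    }
}
```

The key point is that "memory" here is the session transcript bounded by the model's context window, not a separately documented memory length, so long-running conversations need some strategy (such as summarizing and restarting) once that limit is reached.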