Can I use the new ML libraries to make queries to cloud-based secure inference?

Asked on 06/11/2025


The WWDC sessions do not mention any new ML libraries for making queries to cloud-based secure inference. Their focus is on deploying and optimizing machine learning models on Apple devices using Core ML and frameworks such as MLX, which are designed to leverage Apple Silicon for efficient, low-latency on-device inference rather than cloud-based inference.
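
To illustrate the on-device path the sessions cover, here is a minimal Swift sketch that loads a compiled Core ML model and runs a single prediction locally. The model name "Regressor" and the feature names "input" and "output" are hypothetical placeholders; substitute whatever your own .mlmodel declares.

```swift
import Foundation
import CoreML

// Minimal sketch: load a compiled Core ML model from the app bundle
// and run one on-device prediction (no network or cloud involved).
func runOnDeviceInference() throws {
    // "Regressor.mlmodelc" is a hypothetical compiled model name.
    guard let modelURL = Bundle.main.url(forResource: "Regressor",
                                         withExtension: "mlmodelc") else {
        fatalError("Compiled model not found in the app bundle")
    }

    let config = MLModelConfiguration()
    config.computeUnits = .all   // allow CPU, GPU, and the Neural Engine

    let model = try MLModel(contentsOf: modelURL, configuration: config)

    // Wrap raw values in a feature provider keyed by the model's input names.
    let input = try MLDictionaryFeatureProvider(dictionary: [
        "input": MLFeatureValue(double: 0.42)
    ])

    let result = try model.prediction(from: input)
    print(result.featureValue(for: "output") ?? "no output feature")
}
```

In practice you would usually use the Swift class that Xcode auto-generates from your .mlmodel instead of the untyped MLModel API; the sketch above just keeps the example self-contained.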

If you are interested in learning more about deploying machine learning models on Apple devices, check out the session "Deploy machine learning and AI models on-device with Core ML", which covers various features and optimizations for on-device model deployment.