The race toward Confidential AI inference
For almost half a decade now, I have been working on Confidential Computing at Canonical. This position has given me a front-row seat to the evolution of Confidential Computing technologies and their applications. One of the most exciting applications is Confidential AI inference, which allows AI models to be hosted and executed in a way that keeps the user's input data confidential, even from the service provider itself. Apple has announced a partnership with Google to base its own models on Google Gemini, and while some might see this as a failure, it is worth noting that Apple Intelligence already has a meaningful legacy. ...