
From a home-grown, three-billion-parameter on-device model to its "Talaria" tooling, Apple is innovating quietly on AI. The details? Well...
Apple’s OpenAI integration drew plenty of headlines last week, and triggered plenty of misunderstanding about Cupertino’s approach to AI – including the claim in some quarters that it will be reliant on ChatGPT.
(Elon Musk posted provocatively on X that it was “patently absurd that Apple isn’t smart enough to make their own AI, yet is somehow capable of ensuring that OpenAI will protect your security and privacy”...)
Largely overlooked amid the noise around the terms of that particular engagement, and Musk’s trolling, was that Apple plans to underpin the majority of users’ requests with its own three-billion-parameter on-device language model, alongside a larger server-based language model running on Apple silicon via Private Cloud Compute.