Apple announces new AI features; Musk threatens to ban Apple devices from his companies if OpenAI is integrated


Apple revealed a new set of generative AI features, "Apple Intelligence", at the Worldwide Developers Conference on June 10, 2024. The features will roll out to users with iOS 18, iPadOS 18, and macOS Sequoia later this year.

One of the features will allow Apple's voice assistant Siri to pass user questions to ChatGPT when necessary. Users will be asked for consent before any questions, files, or photos are sent to ChatGPT, and Siri will then present the answers directly. The feature will be powered by GPT-4o, the latest version of the OpenAI model behind ChatGPT.

However, Tesla CEO Musk is deeply dissatisfied with this practice of sending users' private data to an external company. He stated that if Apple integrates OpenAI at the operating system level, Apple devices will be banned from his companies, calling it an unacceptable security violation. Musk added that visitors to Tesla, SpaceX, and the other companies he runs would have to place their Apple devices in Faraday cages upon entry.

In fact, what Musk is targeting is not OpenAI itself, but Apple handing user data over to an external company and putting users' security and privacy at risk. He wrote on X:

"Apple is 'not smart enough' to develop AI on its own, but it can somehow ensure that OpenAI protects your security and privacy. This logic is obviously absurd. Once Apple hands over your data to OpenAI, they have no idea what's going on." What's going on? They're selling you downstream."

In fact, Apple has its own AI models, and the ChatGPT integration is limited to Siri and the system writing tools. Apple also emphasizes that users' IP addresses will be obscured when ChatGPT features are invoked, and that OpenAI will not store user requests. Musk nevertheless countered that when Apple uses words like "protecting privacy" while handing user data to a third-party AI it does not understand and could not build itself, that is not protecting privacy at all.
