Apple may have delayed the Siri upgrade for fear of jailbreaks

Apple’s work on AI-enhancements for Siri has been officially delayed (it’s now slated to roll out “in the coming year”) and one developer thinks they know why – the smarter and more personalized Siri is, the more dangerous it can be if something goes wrong. Simon Willison, the developer of the data analysis tool Dataset, points the finger at prompt injections. AIs are typically restricted by their parent companies who impose certain rules on them. However, it’s possible to “jailbreak” the AI by talking it into breaking those rules. This is done with so-called “prompt injections”. As...

Apple may have delayed the Siri upgrade for fear of jailbreaks

Apple’s work on AI-enhancements for Siri has been officially delayed (it’s now slated to roll out “in the coming year”) and one developer thinks they know why – the smarter and more personalized Siri is, the more dangerous it can be if something goes wrong. Simon Willison, the developer of the data analysis tool Dataset, points the finger at prompt injections. AIs are typically restricted by their parent companies who impose certain rules on them. However, it’s possible to “jailbreak” the AI by talking it into breaking those rules. This is done with so-called “prompt injections”. As...

This article has been sourced from various publicly available news platforms around the world. All intellectual property rights remain with the original publishers and authors. Unshared News does not claim ownership of the content and provides it solely for informational and educational purposes voluntarily. If you are the rightful owner and believe this content has been used improperly, please contact us for prompt removal or correction.