
Apple’s work on AI-enhancements for Siri has been officially delayed (it’s now slated to roll out “in the coming year”) and one developer thinks they know why – the smarter and more personalized Siri is, the more dangerous it can be if something goes wrong. Simon Willison, the developer of the data analysis tool Dataset, points the finger at prompt injections. AIs are typically restricted by their parent companies who impose certain rules on them. However, it’s possible to “jailbreak” the AI by talking it into breaking those rules. This is done with so-called “prompt injections”. As...
from GSMArena.com - Latest articles https://ift.tt/oLxZi9y
No comments:
Post a Comment