Menu
Shortenlink and get Paid
 

Sponsored Content

Loading...

Apple’s work on AI-enhancements for Siri has been officially delayed (it’s now slated to roll out “in the coming year”) and one developer thinks they know why – the smarter and more personalized Siri is, the more dangerous it can be if something goes wrong. Simon Willison, the developer of the data analysis tool Dataset, points the finger at prompt injections. AIs are typically restricted by their parent companies who impose certain rules on them. However, it’s possible to “jailbreak” the AI by talking it into breaking those rules. This is done with so-called “prompt injections”. As...



from GSMArena.com - Latest articles https://ift.tt/mQcBCZ5
via IFTTT

Post a Comment

One way to contribute to the development of this website is by always dropping your comment whenever you read a post.


Don't leave without dropping yours

 
Top