incoming messages that trigger a bug in iOS, causing applications to crash…
Apple suggests Siri as a temporary fix (full instructions here). One of the options is asking Siri to “reply to the malicious message”. That’s right – if you’re being picked on by a big bad hacker sending you “messages of death”, get big sister Siri to reply with a digital tongue-lashing – sweet!
Back in 2014, there was excitement surrounding an application called GoogolPlex, which hooked a hacked version of Siri up to the Internet of Things. Suddenly you could use Siri for all sorts of applications, as seen in this video:
What’s particularly clever is the name: “GoogolPlex, turn on the lights” is actually understood by Siri as “Google, please turn on the lights”. Then, instead of running a Google search, GoogolPlex redirects the request to its own servers and uses APIs that interact with your hardware to carry out your request.
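The idea can be sketched in a few lines of Python. This is not GoogolPlex’s actual code – just an illustration, under the assumption that the proxy receives the intercepted “search” phrase as a query string and maps recognized phrases to hypothetical device commands, falling through to a real search otherwise:

```python
from urllib.parse import urlparse, parse_qs

def dispatch(phrase: str) -> str:
    """Map an intercepted 'search' phrase to a (hypothetical) device action."""
    p = phrase.lower().strip()
    if "turn on the lights" in p:
        return "lights:on"        # here a real proxy would call a smart-bulb API
    if "turn off the lights" in p:
        return "lights:off"
    return "search:" + p          # unrecognized phrase: pass it on as a search

def handle_request(url: str) -> str:
    """Extract the search query (assumed to arrive as a 'q' parameter)."""
    qs = parse_qs(urlparse(url).query)
    return dispatch(qs.get("q", [""])[0])
```

So `handle_request("https://example.com/search?q=please+turn+on+the+lights")` would trigger the lights instead of a search – the whole trick is hijacking the fallback path Siri already has.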
GoogolPlex, beam me up!
Love using Siri? She is a great listener (granted, with dubious hearing). However, if you’re also a staunch believer in privacy, you might want to reconsider what exactly you tell your beloved assistant. As reported in this post, all voice recordings are stored for six months, after which Apple keeps the recording for another 18 months but deletes the identifying number associated with it… In case you’re now thinking of switching to Microsoft’s equally friendly Cortana, that policy is very similar…
And now, a look at Siri as seen in Raj’s vivid imagination (from The Big Bang Theory):