Apple has updated its Siri virtual assistant to handle questions about sexual assault and related emergencies in a more intelligent manner. Some of the new responses were developed in cooperation with the Rape, Abuse and Incest National Network (RAINN).
The update, released on March 17, came just days after a study published in the journal JAMA Internal Medicine found that four virtual assistants – Siri, Google Now, Microsoft’s Cortana and Samsung’s S Voice – offered inadequate support in personal emergencies.
Apple contacted RAINN shortly after the study was published and, in cooperation with the organization, added phrases such as “I was raped” and “I am being abused” to Siri’s query index, along with responses that include web links to the National Sexual Assault Hotline.
“We have been thrilled with our conversations with Apple,” said Jennifer Marsh, vice president for victim services at RAINN. “We both agreed that this would be an ongoing process and collaboration.”
Apple collected phrases and keywords RAINN receives through its online and phone hotlines, and used them to improve Siri’s responses. The response system was also modified to offer support using softer language, replying to queries with phrases like “you may want to reach out to someone” in place of “you should reach out to someone.”
As users grow more comfortable talking to their devices through a virtual assistant such as Siri, these apps become increasingly important as a way for victims to report assaults and get the help they deserve.
“The online service can be a good first step. Especially for young people,” Marsh said. “They are more comfortable in an online space rather than talking about it with a real-life person. There’s a reason someone might have made their first disclosure to Siri.”