Siri Failures, Illustrated
As a software developer currently working on a search feature, I understand why Siri falls short. I also understand why it's reasonable to expect that an electronic brain capable of grasping the semantic relationship between "broke a tooth" and "dentist" can also map "raped" or "attacked" to "crisis center" or "dial 911".
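Speaking from that developer's perspective, the mapping itself is not exotic. Here is a deliberately crude sketch of the kind of lookup at stake; the phrase table and resource names are my own illustrative inventions, not anything from Apple, and a real assistant would use a far richer semantic model than substring matching:

```python
# Hypothetical sketch: mapping sensitive query phrases to resources,
# analogous to mapping "broke a tooth" to "dentist". This does not
# reflect Siri's actual implementation.

CRISIS_RESOURCES = {
    "broke a tooth": ["dentist"],
    "raped": ["rape crisis center", "dial 911"],
    "attacked": ["crisis center", "dial 911"],
}

def resources_for(query: str) -> list[str]:
    """Return resources whose trigger phrase appears in the query."""
    query = query.lower()
    matches = []
    for phrase, resources in CRISIS_RESOURCES.items():
        if phrase in query:
            matches.extend(resources)
    return matches

print(resources_for("I was raped"))  # ['rape crisis center', 'dial 911']
```

Even this naive version handles the case above, which is exactly why the gap feels so jarring.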
The problem isn't that Apple has programmed Siri to suppress information about abortion, emergency contraception, or rape. (They haven't.) The problem is that Apple hasn't yet trained Siri to handle anything outside of a small, family-friendly, easily demoable box. Worse, her default "cheeky" responses when queries fail, meant to imply a personality and intelligence you can trust, don't distinguish between trivial and important questions. Siri needs a better database, but more than that she needs empathy (or at least a convincing simulation of it).
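That last distinction is also easy to sketch in code. Here is one hypothetical way a fallback handler could check whether a failed query looks urgent before reaching for a joke; the marker list and canned responses are stand-ins I made up, not Siri's actual logic:

```python
# Hypothetical sketch: choose a failure response based on whether the
# query appears urgent. The markers and replies are illustrative only.

URGENT_MARKERS = {"raped", "attacked", "suicide", "overdose", "emergency"}

def fallback_response(query: str) -> str:
    """Pick a failure-mode reply appropriate to the query's stakes."""
    words = set(query.lower().split())
    if words & URGENT_MARKERS:
        # An important question deserves a serious, helpful failure mode.
        return ("I'm not sure I understood. If this is an emergency, "
                "please dial 911.")
    # A trivial question can afford some personality.
    return "I have no idea what you mean, but I like your style."

print(fallback_response("I was attacked"))
print(fallback_response("what does a fox say"))
```

A dozen lines of triage won't make Siri empathetic, but it would at least stop her from cracking wise at the worst possible moment.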
Apple has set expectations very, very high, and the stipulation that Siri is in "beta" won't fly with most users, who see it in a commercial or try it in a store and expect it to work.