Just when you thought that Microsoft’s wickedness only extended to anti-competitive business practices, obfuscated APIs, and Vista… oh, and Windows Me, and Windows 95, and Windows 3.1, and Visual Basic, and — um, well, just when you thought that Microsoft’s wickedness wasn’t quite universal, along comes a report from the Register (granted, a known source of practical jokes, but others have verified it) that Microsoft makes Santa Claus say dirty things to little children.
Users of the Windows Live Messenger service can add firstname.lastname@example.org to their contacts and then chat with an automated Santa. Apparently some of the developers of this Artificial Saint hid a few not-so-saintly Easter eggs in Santa’s conversational vocabulary.
A reader of the Register reported that when his nieces started a conversation about pizza and repeatedly told Santa to “Eat it”, Santa finally responded with “You want me to eat what?!? It’s fun to talk about oral sex, but I want to chat about something else…”. And when they called him a “dirty bastard”, he replied, “I think you’re the dirty bastard.”
A commenter on Gizmodo reports that he eventually got Santa to admit that he was gay (not that there’s anything wrong with that).
Microsoft seems to be responding pretty quickly to all reports of indecent conversation. I signed on and tried to get Santa to talk dirty to me (a strange experience in itself) without success. And now if you call him a “dirty bastard”, he replies with “Merry Christmas, especially to all my friends in the UK!”
I have to wonder about the wisdom of unleashing this apparently experimental AI bot on unsuspecting children who may really believe they’re talking to Saint Nick himself. You’d think certain terms would be blacklisted from the dialogue, and “oral sex” should definitely be on that list — or any kind of sex, for that matter. Otherwise, we can only imagine how eager easterbunny-AT-live.com would be to join the conversation.
UPDATE: Santa’s been sacked!