As first reported yesterday by The Register and picked up today by ComputerWorld, Microsoft has pulled the plug on its online "artificial-intelligence Santa bot," which was meant to chat with children about what they wanted for Christmas. It seems the bot, as ComputerWorld put it, "wandered off topic" when certain words, like "pizza," were used.
According to ComputerWorld, "Microsoft recently added the artificial Santa as a bot that Windows Live Messenger users could insert into their IM buddy list as firstname.lastname@example.org."
You can read about the bot in a Microsoft press release from last year titled "For a Jolly Good Time, Chat With Santa on Windows Live Messenger." One line reads: "Filling Santa in on Christmas wishes and asking all about how the reindeer are doing or what's new at the North Pole are a few of the things kids can talk to Santa about. Santa can even tell kids where they stand on his list: naughty or nice."
I guess the press release forgot to mention that Santa would also be letting the kids know whether he himself had been naughty or nice this year.
Microsoft said in a statement posted on the Register site: "Yesterday we received reports that the automated Santa Claus agent in Windows Live Messenger used inappropriate language. As soon as we were alerted, we took steps to mitigate the issue, including the removal of language from the agent's automated script."
"We were not completely satisfied with the result of these actions, and have decided to discontinue the automated Santa Claus agent. We apologise for any offence or upset caused by this disturbing incident."
I guess Microsoft tested this year's Santa bot using the same strategy it uses on most of its products: let the users find the bugs.