Short note:
So someone was talking to the Microsoft Santa IM thingie and ... I mean *insistently* talking to the program about something ... and it starts chatting away about oral sex.
Now, I admit to feeling some sympathy for the MSoft folks running this thing. Why? Because at one point back in the now-dark-ages I'd added an IM-autochat AI agent to my TriBBS Bulletin Board, and we'd sometimes get people chatting away to it for fifteen or twenty minutes at a stretch. I knew exactly how long, because I could read the logs, you see.
But on one occasion I lost a user who was *sure* I must be the one at the chat and not the AI (despite what it said on-screen), and ... it was hysterical. By the time I got to the end of the log and managed to stop laughing I was damn near falling on the floor with tears ... because our normally very quiet and reflective long-time ex-military user had gotten so angry, was *so* pushed, that they typed the word "damn" at the computer. Which soon escalated to a "hell." Next thing I knew, all kinds of blue language was flying about ... and I think the caller never *did* come back to the BBS. Forty-seven minutes and change that was, via long-distance modem. They paid for that abuse, they did. And as sorry as I was to lose my user, the log was still funny. Sigh.
But this article says:
"Sohn said Microsoft was not aware that the Santa code included the foul language, but that the company did not suspect a prank."
Golly gee, Micros**t guys, with *my* BBS I could at least go into the language file, choose among the vocabulary settings, set up a time-based system (keyed to the calling zone of the person dialing in, so that after midnight the more macho night-sysop persona could chat), and even tweak individual word choices. If I didn't want the word B***h ever to appear (except in 'repeat response' mode, echoing something the caller said!), or if I wanted to replace it with "boathead" or, worse, with "Trekkie" ... I could. You see, *I* had a plain-text vocabulary list to futz with (even if, once compiled, it looked a bit like something unspeakable). Code is code, but you gotta look at the source, MSoft. It's in there. Did you not know this?
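For anyone who never ran one of these, the mechanism was nothing exotic. Here's a rough sketch in Python of that sort of plain-text substitution list -- the file name and rule format below are made up for illustration, not the actual TriBBS language-file layout:

    import re

    # Hypothetical vocabulary file, one rule per line, e.g.:
    #   damn=darn
    #   bitch=boathead
    def load_vocab(path="vocab.txt"):
        rules = {}
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                word, _, repl = line.partition("=")
                rules[word.strip().lower()] = repl.strip()
        return rules

    def clean(reply, rules):
        # Whole words only, case-insensitive, so "damp" survives a "damn" rule.
        if not rules:
            return reply
        pattern = re.compile(r"\b(" + "|".join(map(re.escape, rules)) + r")\b",
                             re.IGNORECASE)
        return pattern.sub(lambda m: rules.get(m.group(0).lower(), m.group(0)),
                           reply)

    # clean("Well, damn!", {"damn": "darn"})  ->  "Well, darn!"

The point being: when the bot's vocabulary lives in a plain file the sysop can read and edit, there's no excuse for not knowing what's in it.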
So how did that song go? "We are not responsible...."
http://www.kirotv.com/news/14784354/detail.html
no subject
2007-12-06 17:36 (UTC)
I didn't realize how prevalent this was until a friend of mine set up a Pandora bot. (About Pandora Bots (http://www.pandorabots.com/botmaster/en/home)) This wasn't an adult chat bot, mind; it was set up to have the "personality" of a cartoon character, just for fun.
He showed me some of the logs, and there were a surprising number of chatters who persistently engaged the bot in explicit sex talk, despite some pretty good rules set up to keep the bot from responding to sexual comments.
My educated guess is that the Santa Bot fell victim to some of these losers, and I'd also guess that Microsoft had actually set up a pretty firm barrier against explicit talk, if the worst Santa came up with was "oral sex". That would suggest they covered most of the bases for protecting the children, but that a couple of things managed to sneak through.
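Neither Pandorabots' actual AIML rule sets nor Microsoft's code is public, so this is only a guess at the shape of such a barrier -- a toy Python sketch, with an invented word list and wiring:

    # Placeholder terms only -- a real blocklist would be far longer and
    # live in a data file, not in the source.
    BLOCKED = {"sex", "sexy", "naked"}

    def has_blocked(text):
        # Naive word-split matching: "sex!" with punctuation attached slips
        # right past it -- one of the ways things sneak through in practice.
        return not BLOCKED.isdisjoint(text.lower().split())

    def guarded_reply(user_message, generate_reply):
        deflect = "Ho ho ho! Let's talk about something else."
        if has_blocked(user_message):        # don't engage explicit input
            return deflect
        reply = generate_reply(user_message)
        if has_blocked(reply):               # check the output, too
            return deflect
        return reply

The revealing part is the second check: if you only screen the user's side, and a stray phrase lives somewhere in the bot's own response data, it will eventually come out.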
It's really kind of sad. My friend had several carefully-crafted chat bots that were ruined this way. With all the sex chat already available, I don't know why someone would want to mess up a bot intended to amuse children--and it's not just "perverts" doing this kind of thing, it's otherwise ordinary people who seem to think the sole purpose of the Internet is sex and that they can say anything they want at any time to anyone.
A very advanced form of this type of bot, Jabberwock, can be found here (http://www.abenteuermedien.de/jabberwock/). Try chatting for a few moments--if you didn't know he was a bot, he might fool you for a time.