Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in "conversational understanding." The more you chat with Tay, said Microsoft, the smarter it gets, learning to engage people through "casual and playful conversation." Unfortunately, the conversations didn't stay playful for long.
Pretty soon after Tay launched, people started tweeting the bot with all sorts of misogynistic, racist, and Donald Trumpist remarks.
The Guardian picked out a (now deleted) example in which Tay was having an unremarkable conversation with one user (sample tweet: "new phone who dis?").
Now, while these screenshots seem to show that Tay has assimilated the internet's worst tendencies into its personality, it's not quite as straightforward as that.