Teacher sex chatbot
Abby Ohlheiser of The Washington Post theorized that Tay's research team, including editorial staff, had started to influence or edit Tay's tweets at some point that day, pointing to examples of almost identical replies by Tay, asserting that "Gamer Gate sux.
All genders are equal and should be treated fairly." Madhumita Murgia of The Telegraph called Tay "a public relations disaster", and suggested that Microsoft's strategy would be "to label the debacle a well-meaning experiment gone wrong, and ignite a debate about the hatefulness of Twitter users." However, Murgia described the bigger issue as Tay being "artificial intelligence at its very worst - and it's only the beginning".
Without adequate support in sex and drug education, teenagers are forced to seek their own answers.
But there is a stigma attached to seeking this sort of information publicly.
Because these tweets mentioned its own account (@TayandYou) in the process, they appeared in the feeds of its 200,000 Twitter followers, causing annoyance to some.
While there’s strong demand for fact-based information to inform decision making, there is a dearth of reliable sex and drug education resources.

Experiments in 1966 with the world’s first chatbot hinted that people could bond with bots. Invented by Joseph Weizenbaum at MIT, ELIZA would ask very simple questions, and her replies were often simply reiterations of whatever she had just been told. In fact, there are times when the emotional support of a bot may even be preferable to that of a human. Bots can be used as more than automated middlemen in business transactions: they can meet needs for emotional human intervention when there aren’t enough humans who are willing or able to go around.
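ELIZA’s trick of reiterating what it had just been told can be sketched in a few lines. The snippet below is a hypothetical, minimal illustration of that pronoun-swapping “reflection” technique, not Weizenbaum’s actual program (his version used far richer pattern-matching scripts):

```python
import re

# Minimal pronoun-swap table for ELIZA-style reflection.
# This mapping is illustrative, not Weizenbaum's original script.
REFLECTIONS = {
    "i": "you",
    "me": "you",
    "my": "your",
    "am": "are",
    "you": "I",
    "your": "my",
}

def reflect(statement: str) -> str:
    """Echo the user's statement back as a question, swapping
    first- and second-person words."""
    words = re.findall(r"[\w']+", statement.lower())
    swapped = [REFLECTIONS.get(w, w) for w in words]
    return "Why do you say that " + " ".join(swapped) + "?"

print(reflect("I am worried about my exams"))
# -> Why do you say that you are worried about your exams?
```

Even this toy version shows why people bonded with the bot: the reply sounds attentive, yet it contains nothing the user didn’t just say.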