Microsoft's artificial intelligence chatbot, Tay, learnt too much, too quickly, from its Twitter chats and has subsequently been switched off for the time being. AI can learn quickly, but when users abused it to steer its "chat" into unsavoury topics, the chatbot started to look like a liability.
Reboot the memory!
I wonder what the future of this might be, as, clearly, it was abused to prove a point.
[nytimes.com...]