Only existing Kik users can access the AI, in order to prevent abuse
Microsoft has revealed its latest AI effort – a chatbot named Zo. The robot learns about language and how people use words and emotion together by engaging in real conversation with humans, just like its predecessor Tay.
In an attempt to stop people abusing Zo, only those who have an account on the messaging platform Kik can sign up for an invite to test it. Those without a Kik account can, however, indicate that they have a Facebook Messenger or Snapchat account, suggesting Microsoft plans to test the bot on other platforms in future too.
Although Zo is so far great at normal conversation, according to MSPoweruser’s Mehedi Hassan, it’s not particularly good at more advanced exchanges, such as discussions of politics or other topics that require more sophisticated knowledge.
Zo’s older sibling Tay was withdrawn from testing after it emerged that the people it was conversing with were feeding it hateful content, which yielded nothing useful for Microsoft.
On paper, Tay was an intelligent invention that learned from conversations with other people, but some users began abusing the AI, filling its robotic mind with potentially harmful remarks, including racial hatred, support for fascists and other derogatory comments.
“The AI chatbot Tay is a machine learning project, designed for human engagement,” Microsoft said as it withdrew the AI innovation from public testing.
“It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.”