Why doesn't Alexa know that homonyms aren't homophones?
As we head into an AI-dominated future, the Turing test will probably become less like a Voight-Kampff test and more like a warzone Shibboleth.
Yesterday, I asked the Alexa to set a timer.
"What do you want to name your timer?" She It asked.
"Bow," I replied.
"Bow timer set," it said.
Except… that isn't quite right. I wanted a timer for my bāo buns (包). That's pronounced /baʊ/ - as in to bow one's head.
So Alexa translated my speech to text and stored it as "bow".
When it came to read back the word, it pronounced it as /boʊ/ - as in a bow and arrow.
And there was much confusion.
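The ambiguity is easy to see in any pronunciation dictionary. Here's a quick sketch - assuming the third-party `pronouncing` Python package, which wraps the CMU Pronouncing Dictionary - showing that the written word "bow" maps to more than one pronunciation, so the text alone can't round-trip what I actually said:

```python
# Why plain text isn't enough: the written word "bow" has more than one pronunciation.
import pronouncing  # pip install pronouncing (a CMU Pronouncing Dictionary wrapper)

for phones in pronouncing.phones_for_word("bow"):
    # ARPAbet output: "B AW1" is /baʊ/ (bend forward), "B OW1" is /boʊ/ (bow and arrow)
    print(phones)
```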
I've ranted before about Alexa's complete lack of common sense and its inability to handle any form of nuance. Storing the original pronunciation of a homonym doesn't even need AI - it's barely even a database!
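To be clear about how little technology that would take: Alexa's own speech markup (SSML) already supports explicit pronunciations via `<phoneme>` tags. Here's a rough sketch of the idea - not Amazon's actual implementation; the `TimerLabel` record and the stored IPA string are my own assumptions:

```python
from dataclasses import dataclass

@dataclass
class TimerLabel:
    text: str  # what speech-to-text produced, e.g. "bow"
    ipa: str   # the pronunciation that was actually heard, e.g. "baʊ"

def readback_ssml(label: TimerLabel) -> str:
    """Build SSML so text-to-speech says the label the way the user said it."""
    return (
        f'<speak><phoneme alphabet="ipa" ph="{label.ipa}">{label.text}</phoneme>'
        ' timer set.</speak>'
    )

# One extra column in storage; no machine learning required.
print(readback_ssml(TimerLabel(text="bow", ipa="baʊ")))
# <speak><phoneme alphabet="ipa" ph="baʊ">bow</phoneme> timer set.</speak>
```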
A common trope in World War 2 movies is that enemy soldiers are unable to pronounce normal English words like "Loughborough", or "Worcestershire sauce", or "Mr. Cholmondley-Warner".
Will it be so with AI?
Perhaps we will find ourselves being quizzed by an interlocutor that demands we prove our humanity by assessing whether words rhyme?
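A crude version of that quiz is only a dictionary lookup away. A toy sketch, again assuming the `pronouncing` package; note that a word with several pronunciations, like "bow", matches rhymes for all of them:

```python
# Toy shibboleth: does word b rhyme with word a, according to CMUdict?
import pronouncing  # pip install pronouncing

def is_rhyme(a: str, b: str) -> bool:
    return b.lower() in pronouncing.rhymes(a.lower())

print(is_rhyme("bough", "bow"))      # True - both can be said /baʊ/
print(is_rhyme("rough", "through"))  # False - look alike on paper, sound different
```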
An arms race, to be sure. But one the machines show no signs of winning. Yet.