They could just ask how many r’s there are in strawberry
It was always a kind of unfair test, when you consider words are rendered down to a token before the thing ever sees them.
Can you explain?
Each word gets converted to a number before it is processed, so asking “how many r are there in strawberry” could be converted to “how many 7 are there in 13”, for example.
(Very simplified)
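To make the “very simplified” picture concrete, here’s a toy sketch in Python. The vocabulary is made up for illustration; real tokenizers map subword pieces (not whole words) to ids, but the principle is the same:

```python
# Made-up vocabulary mapping whole words to integer ids.
# (Real tokenizers learn subword pieces, but the idea is the same.)
vocab = {
    "how": 4, "many": 9, "r": 7, "are": 2,
    "there": 5, "in": 11, "strawberry": 13,
}

def tokenize(text):
    """Convert a sentence into the list of ids the model actually sees."""
    return [vocab[word] for word in text.lower().split()]

ids = tokenize("how many r are there in strawberry")
print(ids)  # [4, 9, 7, 2, 5, 11, 13]
```

The model only ever sees those ids; the individual letters inside “strawberry” aren’t directly visible to it at all.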
But then the AI could just look up the definition of 13 and the definition of 7, and should be able to answer anyway. I mean, that’s how computers work. Are you sure that’s what the other commenter was referring to?
That’s not how these AIs work, though. They’re fairly brain-like: they learn from their experiences during the training process rather than looking things up. (Which is also why they’re so hard to consistently control.)
It’s possible they can still pick up this spelling fact from somewhere in their training data, but they’re at an immense disadvantage.