Watson--IBM’s ‘intelligent’ computer--won at Jeopardy against trivia super-geniuses Ken Jennings and Brad Rutter last Thursday. I thought I'd lead off by repeating that news just in case you've been living under a rock. Our question is: will Watson-like technologies change the way search engines analyze data, and so end up reshaping our industry? SPOILER ALERT: The answer is, 'Yes, almost certainly these technologies will have an impact at some point, but probably not for some time to come.' But before we toy around with that idea, let's do some review. Who is Watson?
IBM built Watson as a follow-up to Deep Blue--the computer that beat Garry Kasparov at chess in 1997. By all accounts, it is tremendously more difficult to design a computer that can win at Jeopardy than it is to design one that can beat the all-time world-champion chess player. To humans, facing off against a chess grandmaster may seem more intimidating than playing Jeopardy--something that most of us can do with varying degrees of success. But the crucial difference between chess and Jeopardy is that the former requires the computer to work through board positions and sequences in a clear, rule-bound system (something that computers are very good at doing), while the latter requires the computer to understand language--which is a far more elusive, fluid, and altogether human 'game.'
So: are computers now able to understand language the way that human beings do? The short answer is: no, not yet. The tone of this answer has changed, though. The answer used to be: no, and they never will. Now even skeptics will say: 'not yet.'
Let's look at the ruminations of Jeopardy superstar Ken Jennings for some insight into Watson's grasp of language.
"Watson is indisputably a huge leap forward in computer 'thinking.' When I studied artificial intelligence in college just a decade ago, a question-answering computer as flexible and sophisticated as Watson would have been snorted at as science fiction - the kind of technology that only Captain Kirk, not Alex Trebek, would have access to....But is it really head and shoulders above the best human 'Jeopardy!' players, the way it looked on TV? Not by a long shot."
In an interesting blog post, Ken Jennings says that he’d wanted to be like John Connor (the soldier who defeats the Terminators in an apocalyptic future) but that he ended up performing more like John Henry (the folk-legend steel driver who died trying to outpace the steam engine). Still, Jennings qualifies Watson’s victory: he says the machine’s primary advantage was its reflexes--it pushes the buzzer at superhuman speed whenever it knows the answer.
Here is how Ken Jennings explains Watson’s win:
“The key to Watson's dominance lies in the famously tricky "Jeopardy!" buzzer, the signaling device that allows players to respond to the show's clues. Like any human player, Watson does buzz with a "thumb" of sorts (actually a magnetic coil mounted over a buzzer), but it can also rely on the millisecond-precision timing of a computer. The reflexes of even a very good human player will vary slightly, but not Watson's. If it knows the answer, it makes the perfect buzz. Every single time. And it's hard to win if you can't buzz. Imagine if John Henry had to beat the steam engine at a feat of brute strength just to be allowed to swing his hammer, or if chess grandmaster Garry Kasparov had to solve a long-division problem faster than supercomputer Deep Blue every time he moved a piece in their epic match.”
With that said: we haven’t arrived at self-evolving artificial intelligence quite yet, but even Ken Jennings will concede that Watson represents a huge leap forward in how computers are able to process language.
As a copywriter and SEO consultant, I find this fascinating, because that is what our industry is all about: we have to write so that algorithms will understand us as well as human audiences do. To some extent, the whole premise of SEO is that search engines and the human beings looking for websites need a middleman to help them fully understand one another. After the epic Jeopardy death-match between man and machine that went down last week, I am left wondering how long it will take Google to integrate some of these new capacities into the way it does search, and how that will ultimately change our field.
As Google continues to improve its algorithm--adding reading levels, n-grams, and other tricky mechanisms to the massive amounts of user data that inform its search results--the ‘intelligence’ of search results can often seem almost eerie. As a side note: for me, the addition of millions of OCR-ed texts via Amazon and Google Books has added a new level of functionality to Google’s search engine. If a stray thought from a book I read years ago runs through my head, I can type in a paraphrase of that quote and find the source.
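For readers wondering what ‘n-grams’ actually are: an n-gram is just a contiguous run of n words, and counting them is one of the simplest ways a system can model which phrases are likely. Here is a minimal toy sketch of the idea (my own illustration, not anything from Google’s actual pipeline):

```python
# Toy illustration: extract the word n-grams from a phrase.
# Language models count sequences like these across huge corpora
# to estimate how likely one phrase is relative to another.
def ngrams(text, n):
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

print(ngrams("the quick brown fox", 2))
# ['the quick', 'quick brown', 'brown fox']
```

Counting bigrams like these over billions of pages is how an engine can ‘know’ that “web design” is a common phrase without knowing anything about what it means.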
But, of course, Google’s search engine is not actually ‘intelligent’; it’s just incredibly well-informed. You can enter the term ‘web design’ into Google’s search engine and it will pull out reams of information on that subject for you in a cleverly arranged hierarchy. But Google’s search engine does not actually know what ‘web design’ is. It may know that ‘web design’ and ‘web designer’ are related terms--the second term contains the first--but it does not understand that ‘web designer’ designates a person, whereas ‘web design’ can refer to a discipline or an industry. The people at IBM, however, are beginning to design machines that do understand these distinctions. It will be interesting to watch as new applications are developed to exploit the advances that Watson’s victory represents.
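To make that distinction concrete, here is a hypothetical toy sketch of the gap between lexical relatedness and actual understanding. The hand-built ENTITY_TYPES table below is purely my own illustration, not a real API from Google or IBM:

```python
# A purely lexical system can only observe that one term contains another:
def lexically_related(a, b):
    return a in b or b in a

# A system with some grasp of meaning needs something more, e.g. a
# knowledge base (hand-built and hypothetical here) that records what
# KIND of thing each term names:
ENTITY_TYPES = {
    "web design": "discipline",
    "web designer": "person/occupation",
}

print(lexically_related("web design", "web designer"))  # True
print(ENTITY_TYPES["web design"])    # 'discipline'
print(ENTITY_TYPES["web designer"])  # 'person/occupation'
```

The first function captures everything a string-matching engine can see; the lookup table stands in (very crudely) for the kind of semantic distinction that Watson-style systems are beginning to draw.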