Mobile makes a great proving ground for testing new usability concepts. Two years ago we set out to prove one of our own: showing you OpenTable restaurants with availability around your current location, beating the traditional phone call at its own game. Since we launched OpenTable Mobile in 2008, we’ve seated more than 2 million diners via our mobile applications. Concept proven.
Over a year ago, we were approached by Siri, a team that wanted to rethink voice recognition. Their goal was to make voice search contextually relevant (more so than the “Call home” cliché) and actionable with third-party services. Siri wanted to tap into the vast network of OpenTable restaurants and prove their model through a use case everyone can understand: making a reservation. I have to admit some of us here were a little skeptical and thought we’d have to speak like a robot to make it work. We were pleasantly surprised, though, when we spoke a natural but complicated phrase, “Find a table for two at Bambino’s this Saturday at 7PM,” and Siri came back with a relevant response. From there, the reservation was confirmed in no time.
Siri took this usability concept one step further by responding to your queries on screen, rather than reading them back to you in monotone, which we all know would be an embarrassing experience in public. (By the way, if Siri makes an Auto-Tune version, I’ll go from being embarrassed to insisting it speak back to me.)
You may be wondering whether Siri can do more than search for a specific restaurant (incidentally, our forthcoming iPhone update will let you type in the name of a restaurant and reserve it right then and there). Try saying, “Show me Italian restaurants around here with tables tonight,” and feel a smile cross your face when Siri does just that. Siri is leading the field in voice recognition, and OpenTable is proud to be part of proving this incredible concept.
Have you tried Siri yet? Let us know what you think here or over on Facebook.
Josh Garnier is an OpenTable Product Manager.