Google's new flagship phone, the Galaxy Nexus, is coming on November 3rd to all U.S. carriers*.
As many know, Apple's new iOS 5 now has integrated voice recognition called Siri. While Android has had this feature for a couple of years now, Apple's version is in many ways better.
But Google has been hard at work creating two answers to the keyboard. One is a much-improved speech recognition system (on video here). The other is much more powerful: keep your phone in your pocket and just think about what you want it to do for you!
The Galaxy Nexus has an extra chip dedicated to decoding signals from an EEG-based add-on device that sits on your head. The device picks up brainwaves and relays them to the phone in your pocket via Bluetooth, so you can control the phone by thinking. The app that comes with Ice Cream Sandwich supports a wide range of commands that you train it to understand. The training period is usually a few minutes per command, since the app needs to calibrate itself based on brainwave pattern analysis.
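To make the idea concrete, here's a purely hypothetical sketch of what that per-command calibration might look like: collect a few brainwave feature vectors for each command during training, average them into templates, then match new readings to the nearest template. Nothing here reflects a real Ice Cream Sandwich API; every function, name, and number is invented for illustration.

```python
# Hypothetical sketch only: a toy "few minutes per command" calibration flow.
# Incoming brainwave readings are matched to the nearest trained template.
import math
import random

def average(samples):
    """Average several feature vectors into one command template."""
    n, dims = len(samples), len(samples[0])
    return [sum(s[i] for s in samples) / n for i in range(dims)]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train(command_samples):
    """Calibrate: build one averaged template per command name."""
    return {name: average(samples) for name, samples in command_samples.items()}

def recognize(templates, reading, threshold=1.0):
    """Return the closest command, or None if nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, template in templates.items():
        d = distance(reading, template)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Fake calibration data: a handful of 4-dimensional "EEG feature" vectors
# per command, standing in for what the training period might collect.
random.seed(0)
calibration = {
    "call_home": [[1 + random.gauss(0, 0.1) for _ in range(4)] for _ in range(20)],
    "open_maps": [[2 + random.gauss(0, 0.1) for _ in range(4)] for _ in range(20)],
}
templates = train(calibration)
print(recognize(templates, [1.05, 0.98, 1.02, 1.1]))  # -> call_home
print(recognize(templates, [5.0, 5.0, 5.0, 5.0]))     # -> None (unrecognized)
```

A real system would obviously need far more sophisticated signal processing, but the basic train-then-match loop gives a feel for why each command needs its own short calibration session.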
Are you as excited as I am? What's the next step? Not just asking your phone to make an appointment, but composing and sending a complex message by thought alone! Soon we'll have machine-assisted telepathy, at least technically!
*The Verizon version will not be a full "Nexus" because of Verizon's reluctance to give users a fully stock experience; they tend to add certain apps, replace Google search with Bing, and so on.