Will Voice Leapfrog Handwriting Recognition?



Last year, I asked whether handwriting recognition was the barrier to business adoption of Apple's iPad. At the time, I thought that without a better data input method, the iPad would only really be useful as an information-consumption device.

Since then, I have experimented extensively with the iPad as a note-taking device using my own handwriting. I'm now at the point where I carry neither newspapers and magazines nor a notebook: the iPad has effectively replaced my reading and writing material. There has also been a series of announcements about other tablet platforms and their inking interfaces, such as the Android-based ThinkPad tablet and some early apps for the BlackBerry PlayBook.

The problem with tablet-based note taking is that the notes are stored as images rather than as text (via handwriting recognition), so they can't be searched except by the metadata and document titles you type in with the virtual keyboard. Yes, there is a handwriting recognition app for the iPad that converts your handwriting to text, but it's just not the same as writing in a notebook, real or virtual: there is a delay while the text is converted, and I find myself constantly watching the output to see whether it was interpreted correctly. Everything slows down and stops feeling natural. There is also some discussion that other apps can convert (good) handwriting into searchable metadata, but I haven't had time to experiment yet, and that approach requires yet another app and more process to solve the problem.

Recent discussions about the next iOS release wonder whether voice input is coming, hinted at by the inclusion of a microphone icon on the virtual keyboard. Further reports describe a voice "assistant" that would let the user make a request such as "make a reservation for two at a good sushi restaurant nearby," presumably drawing on Apple's recent acquisition of Siri, an app that accepts voice input and is built on the Nuance voice-processing engine.

Controlling the phone and its apps with your voice is nice, but I wonder what voice could do as a more sophisticated text-entry interface. Could voice recognition reach the point where we simply turn on a phone or tablet in a meeting and watch a real-time transcription of the different voices while we annotate in parallel? Would a refined handwriting recognition capability then be unnecessary?

For now, I'm satisfied taking my image-based notes with a stylus. But I'm also keeping an eye on the emerging voice interfaces.

  • http://twitter.com/21stCentSchool Kevin Pashuk

    If MS OneNote was available on the iPad, along with a reasonable less-than-the-thickness-of-your-pinky stylus, the iPad would be much more useful.

    OneNote lets you search handwritten notes (even mine), organizes non-linear thoughts into tabs, pages, and shared notebooks (with near-real-time synchronization), and lets you drop in documents, videos, and random gatherings.  All in all, the perfect app for a student working in a collaborative space.

    Unfortunately, there's nothing out there for the iPad (or other Apple computers) that supports collaborative workspaces in the same way.  Most iPad apps so far do not factor collaboration into the mix.

    BTW, it also lets you record audio that is synchronized to your note taking (however random) and does a reasonable job of handwriting-to-text conversion should you desire.

  • http://twitter.com/jeff_dana Jeff Dana

    I definitely think that voice recognition will surpass handwriting recognition on the technical front, but the cultural front presents the biggest barrier.  During a meeting, taking notes via voice is not only disruptive, but rude.  I regularly use voice recognition with great success, but not during meetings.