Expect Labs’ MindMeld iPad App Understands Your Conversations In Real Time http://t.co/Y8Mx9PWC
First off, I need this!
“Our application analyzes and understands the last 10 minutes of your conversation to predict the information you may need in the next 10 seconds,” Tuttle told me. “We call this ‘continuous predictive modeling,’ and in some cases, it can find relevant information as you talk before you even need to ask for it.”
Then I could be insufferably smart all the time! Okay, the technology isn’t quite up to the promise yet.
What’s that old adage about learning to walk before you can run? Yeah, that. When the voice-only iPad app launches next month, it will support up to eight people, but it won’t yet be ready to predict what might come up during your conversations. For now, the app will retrieve information related to your conversation, but only when you trigger it within the app as topics of interest come up. It will also pull in Facebook information when users sign up for the service. Other services like LinkedIn will eventually be integrated as well.
While I think this would be a great app to have, providing useful information contextually while you’re having the conversation, I’m much more interested in the technology that allows the computer to understand the context. That requires great speech recognition (a still-to-be-realized holy grail), but beyond that it requires the computer to understand what the conversation is actually about. We’re seeing more and more examples of computers understanding context, so when the technology matures I see it being used to derive transcripts and keywords for post logging. (And for feeding into smart editing algorithms.)
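That transcript-to-keywords idea is easy to sketch. Here’s a minimal example in Python that assumes a speech recognition service has already returned a plain-text transcript; the transcript string, the stop-word list, and the extract_keywords helper are all hypothetical placeholders of my own, not anything MindMeld exposes. It simply counts the remaining words to surface candidate keywords for a logging sheet.

```python
from collections import Counter
import re

# Hypothetical transcript, as it might come back from any speech recognition service.
transcript = (
    "We should reshoot the interview at the harbor tomorrow, "
    "and the harbor b-roll needs color correction before the edit."
)

# A tiny, made-up stop-word list; a real logging tool would use a much fuller one.
STOP_WORDS = {"we", "should", "the", "at", "and", "needs", "before", "a", "to"}

def extract_keywords(text, top_n=5):
    """Return the most frequent non-stop-words as candidate keywords for post logging."""
    words = re.findall(r"[a-z'-]+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return [word for word, _ in counts.most_common(top_n)]

print(extract_keywords(transcript))
# e.g. ['harbor', 'reshoot', 'interview', 'tomorrow', 'b-roll']
```

In practice a logging tool would also want timecodes attached to each keyword, but the basic pipeline, speech to transcript to keywords, is the piece that depends on the recognition and context technology discussed above.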