The new app that serves as eyes for the blind

Moreover, sensors capable of recognizing emotions on these faces, work that grew out of other Carnegie Mellon research into autism, could make it possible to tell when the people passing you are smiling or frowning. Researchers are also exploring the use of computer vision to characterize the activities of people nearby, and ultrasonic technology to pinpoint locations more accurately. As Asakawa shared with me, the cognitive assistance research that went into creating the NavCog app has some parallels with the cognitive computing work being performed by IBM Watson. In both cases, there is a growing effort to augment human cognitive abilities in real time.

Read the full story here

