See Sound Live is an assistive technology under development.
At present, a profoundly deaf person is unable to develop speech: when they attempt to speak a sound or word, they cannot hear it, and therefore cannot judge what they said or how well they said it. If the person could get a sense of how clearly they spoke in some other way, they could potentially improve their performance in speaking the sound clearly. They would also need to compare what they spoke with what they need to speak in order to speak clearly.
A person with normal hearing achieves these two tasks by hearing. Hearing comprises two main steps: 1) the inner ear decoding sounds into their component frequencies, and 2) the brain processing this decoded sound into meaningful words. Once the brain has fully understood the sound, it can direct the voice box, tongue, lips, mouth and so on to recreate the same sound. It assesses the speech output by hearing this sound and makes automatic adjustments to achieve a near-perfect outcome. These automatic adjustments happen in our subconscious. That is why we ourselves don't really know how exactly we speak; it is a task our brain does for us. This is not unusual: our brain constantly runs our lungs, liver, kidneys and other organs without us really understanding how it does so.
See Sound Live converts the outcome of any speech effort (in speaking a 'small sound') into a Visual Equivalent and displays it on a smartphone screen.
It also displays, next to it, the Visual Equivalent of the same 'small sound' when spoken ideally. When a deaf person tries to speak the 'small sound', they get a sense of what they spoke by looking at the pattern of their Visual Equivalent and comparing it to the ideal one. See Sound Live performs the decoding normally done by the inner ear, and the brain receives the decoded sound through its visual pathways.
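See Sound Live's actual method is not described here, but one common way to decode sound into frequencies and map it to a 2-D image is a spectrogram. The sketch below is a minimal, hypothetical illustration of that idea, using only NumPy; the function name `visual_equivalent` and all parameter values are assumptions for the example, not the product's implementation.

```python
# Hypothetical sketch: decoding a sound into frequencies (as the inner
# ear does) and arranging them as a 2-D image, one possible form of a
# "Visual Equivalent". Not the actual See Sound Live algorithm.
import numpy as np

def visual_equivalent(samples, frame=512, hop=256):
    """Return a time-frequency magnitude matrix for a mono audio signal."""
    window = np.hanning(frame)
    columns = []
    for start in range(0, len(samples) - frame + 1, hop):
        segment = samples[start:start + frame] * window
        # Magnitude of each frequency component in this short slice of time
        columns.append(np.abs(np.fft.rfft(segment)))
    # Rows = frequency, columns = time; brightness = loudness at that
    # frequency and moment, which is what the eye would compare.
    return np.array(columns).T

# Example: a 0.5-second 440 Hz tone as a stand-in for a spoken sound
rate = 16000
t = np.arange(int(0.5 * rate)) / rate
tone = np.sin(2 * np.pi * 440 * t)
image = visual_equivalent(tone)

# The brightest row should sit near 440 Hz
peak_bin = image.mean(axis=1).argmax()
print(peak_bin * rate / 512)  # frequency (Hz) of the brightest row
```

Two such images side by side, one from the learner's attempt and one from an ideal speaker, would let the learner compare the patterns visually.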
What is really happening is that the deaf person is using their brain's visual processing skills to evaluate their own speech outcome. Our preliminary research has demonstrated that deaf people can use this visual processing ability to speak clearly.
The deaf person can now hear with their eyes, seeing sound instead of hearing it. This is the basic building block of developing spoken language.
Combined with sign language, See Sound Live can help a deaf signer learn to speak a few words as well. Depending on the level of interest and practice, spoken vocabulary can slowly grow to the point that a deaf signer can communicate easily with someone who doesn't know sign language.