An interesting paper on designing training regimes for people using prosthetic vision devices. As regards the vOICe, activities such as reading can be deferred until the user gains proficiency, but other activities, such as walking down a corridor that has large symbols painted on the walls, can be carried out right away and are directly applicable.
Sensory substitution
A pulley and a rope
A pulley and a rope used to draw water from a well. This well is situated at a restaurant just outside Karnal.
Note:
The well does have some water but is no longer used.
An image of the pulley and the rope
A soundscape of the above image
Braille Literacy: The Coin on a Mission, the Crisis & the Controversy
A very long article on the importance of Braille. The article also deals with the neuroplasticity that occurs when reading Braille.
All this relates to the US government releasing a coin with Braille markings.
Highly transient neuroplasticity following one session of learning to use a sensory substitution device: an fMRI study
A one-hour training session and subsequent usage of the vOICe leads to significant neural reorganization. This reorganization reverses itself quickly, though.
Visual-Haptic Mapping and the Origin of Cross-Modal Identity
We have been told not to mix sight with touch since sight is like another language. However, new research suggests that cross-modal integration is possible.
Read more here
Tactile Temporal Processing in the Auditory Cortex
Our perception of the outside world results from the integration of information derived simultaneously via multiple senses. This integration happens at the early stages of sensory processing. Therefore, even for a sense like touch, the auditory cortex is required to process it successfully.
This was tested by temporarily disrupting the relevant brain area with transcranial magnetic stimulation (TMS).
My incredible out-the-windscreen experience
Before I begin, I want to highlight that the latest test version of the vOICe has externalization sound rendering. This has not been available in the mobile phone version until now. It adds a tremendous amount of clarity to the soundscape, though it does slow down the sonification a little.
I had Wayfinder Access running on my mobile phone. At the same time, I had launched the latest version of the vOICe, set to double-speed sonification as well as 4D rendering. I then connected the hands-free kit to the phone and held the phone on top of the dashboard such that the camera was looking out of the windscreen.
I was able to detect a surprising number of things. For example, for the first time, I was able to make out whether there was a clear road ahead of me. On top of that, I was able to tell whether a vehicle was getting too close to the left or the right of the car. In India, we have right-hand drive, so I was sitting to the left of the driver. The soundscapes from the right side were therefore not very clear; however, if I paid attention, I was still able to make out that a vehicle was getting too close. On the left, I was able to make an approximate guess at what the vehicle was. For example, a bus gives a very distinctive piano-like sound because of its many windows.
Another thing I was now able to do was pay attention to the pitch of the soundscapes. By comparing the pitch of the road with that of the object, I was able to tell whether the object was higher than my car or not. I am not sure how accurate this is. I was able to sense traffic moving. At one point, I could also sense a vehicle moving too close to my car and then moving a little away. I was double-checking my observations with my driver.
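For readers wondering how pitch can carry height information, here is a rough, minimal sketch of a vOICe-style image-to-soundscape mapping. This is only my own illustration of the publicly described scheme (a left-to-right scan of roughly one second, higher pixels rendered as higher-pitched tones, brighter pixels as louder ones); it is not the actual vOICe code, and the function name, frequency range, and the toy "bus windows" image are all assumptions made for the example.

```python
# Minimal sketch of a vOICe-style image-to-soundscape mapping (my own
# illustration, not the actual vOICe implementation). Assumptions: a
# grayscale image with values 0..1, a one-second left-to-right scan,
# pixel row mapped to pitch (higher rows = higher pitch), and brightness
# mapped to loudness.
import numpy as np

def image_to_soundscape(img, sample_rate=44100, scan_time=1.0,
                        f_low=500.0, f_high=5000.0):
    """img: 2D numpy array, values 0..1, row 0 = top of the image."""
    rows, cols = img.shape
    samples_per_col = int(sample_rate * scan_time / cols)
    # Exponential pitch scale: top row -> f_high, bottom row -> f_low.
    freqs = f_high * (f_low / f_high) ** (np.arange(rows) / (rows - 1))
    out = []
    for c in range(cols):                      # left-to-right scan
        t = np.arange(samples_per_col) / sample_rate
        col_audio = np.zeros_like(t)
        for r in range(rows):                  # each bright pixel adds a tone
            if img[r, c] > 0.05:
                col_audio += img[r, c] * np.sin(2 * np.pi * freqs[r] * t)
        out.append(col_audio)
    audio = np.concatenate(out)
    peak = np.max(np.abs(audio))
    return audio / peak if peak > 0 else audio

# A toy bus-like shape: a row of bright "windows" produces repeating bursts
# of tones at the same pitch as the scan sweeps across them, which is the
# piano-like sound described above. Comparing the pitch of an object's tones
# with the lower pitch of the road beneath it gives a rough sense of
# relative height.
windows = np.zeros((64, 64))
for i in range(4):
    windows[20:30, 8 + i * 14: 16 + i * 14] = 1.0
soundscape = image_to_soundscape(windows)
```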
As of now, I do not see any way of checking the rear-view mirror, and I was unable to see road signs or any other structures along the road. I need to do a lot more testing but, judging by my previous similar experiences, this represents a qualitative jump. For the first time, I was actually able to analyse what I was looking at.