Virtual Braille and Image Discernment: Key Learnings from the Optacon
I am a lifelong braille reader who enjoys the freedom that braille gives me to read any book that has been transcribed or translated. Braille is the key to literacy for all of us who are blind. I can’t imagine not reading braille. It would be like depriving me of the human right to learn.
I recall in the 1970s learning to use the Optacon for reading print. As an Optacon user, I was by no means the most efficient; I found that I could read print at about 60 words a minute. The great thing was that the tactile context was similar to that of braille, which meant that reading with the Optacon became a natural activity, just like reading braille. The problem was that the low words per minute, coupled with the fatigue factor, made it very slow to read larger quantities of material. Although I loved the freedom of reading print, I never lost my love of braille, which I can read at least five times faster than I could read with the Optacon. Perhaps if I had started reading with the Optacon as a young child, I would have read faster. The brain is an amazing thing, and I am confident that the fingers can discern minute detail. A large part of the fatigue factor with the Optacon was the requirement to track the camera across the page. Not only tracking but also alignment was crucial.
In the mid-1970s, a group of students and I at the University of Minnesota, along with engineers at LBL who refined the design, succeeded in creating a computer terminal output that used the Optacon’s tactile display to present the letters. One may read about this effort in more detail by following the links on my publication page. My experience using the terminal was that, since no tracking effort was required, reading was much faster.
The electronic braille display has been around for at least 36 years, and in some ways its concept is very similar to the Optacon. It seems as if the designers of braille displays are focused on providing a multiple-cell context, i.e., enough tactile cells to simulate a line of braille. I don’t believe that this is a critical factor. I expect that if we built a device similar to the Optacon, coupled with the technology used to build the virtual terminal screen, we could accomplish a revolutionary change in braille reading. What would happen if we had a tactile display that went across all four fingers of both hands? In other words, a display for the left hand and the right hand. Then we create a virtual screen: when prompted by commands that we issue, such as small muscle movements of the hands, we would experience braille letters moving under our fingers.

Gone is the limitation of providing a fixed line of cells, for we now simulate a virtual line that can be any length. Gone is the need to simulate an entire page of braille, for we have a virtual page that can be any size. We have broken the barrier that single-line braille displays suffer from when attempting to represent tables where alignment is key. The forty-character standard line length for braille is no longer important because we are no longer trying to fit a paper model. Then we totally change the metaphor: the 25-line page length is meaningless because pages can be any length. The tactile display can be of higher resolution than is needed just to display braille, which means we can imagine images being presented in the same manner as the Optacon presents them.
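To make the virtual-screen idea a little more concrete, here is a minimal sketch in Python of how a virtual page of arbitrary width and length might be panned under a small, fixed set of tactile cells in response to reader commands. The class and method names are hypothetical, the eight-cell window is only an illustrative stand-in for "one cell per finger," and the gesture input itself is assumed to arrive from some device driver not shown here.

```python
# A minimal sketch of the "virtual screen" idea: a page of arbitrary
# width and length, panned under a small fixed set of tactile cells.
# All names here are hypothetical, not taken from any real device.

class VirtualBraillePage:
    def __init__(self, lines, cells=8):
        # `lines` is the full virtual page; `cells` is the number of
        # physical tactile cells, e.g. one per finger across both hands.
        self.lines = lines
        self.cells = cells
        self.row = 0   # which virtual line is under the fingers
        self.col = 0   # horizontal offset into that line

    def window(self):
        # The slice of the virtual page currently "under the fingers".
        line = self.lines[self.row]
        return line[self.col:self.col + self.cells].ljust(self.cells)

    def pan(self, cols):
        # Pan left or right along a line of any length -- no 40-cell limit.
        width = len(self.lines[self.row])
        self.col = max(0, min(self.col + cols, max(0, width - self.cells)))

    def move_line(self, rows):
        # Move up or down a page of any length -- no 25-line limit.
        self.row = max(0, min(self.row + rows, len(self.lines) - 1))
        self.col = 0


# Example: a reader command (perhaps a small muscle movement) pans the text.
page = VirtualBraillePage(["Braille is the key to literacy.",
                           "The Optacon taught tactile reading of print."])
print(page.window())   # the first eight characters appear under the fingers
page.pan(8)            # a pan command slides the next eight into place
print(page.window())
```

The point of the sketch is simply that the physical display size is decoupled from the content: the hardware stays small and fixed while the virtual line, page, or table can be any size the material requires.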
Let’s stretch our imagination. Can we bend the ideas of braille and the Optacon and come up with a totally new approach, one that really isn’t that new, but rather a combination of successes we have experienced in the past?