In Their Own Words: Getting more done using speech recognition

The following post is by Sue Martin, one of the contributors to the DictationBridge project.

Sue W. Martin has been around the world of assistive technology for over twenty years. She started as an assistive technology instructor in Maine in the mid-90s. From there she moved to a position as a subject matter expert with the United States Department of Veterans Affairs (VA). She currently works for the VA Office of Information & Technology as a management analyst.
Martin lives with her husband and assorted cats and dogs in the foothills of the Cumberland Plateau.

“When’s Easter this year?” I picked up my phone and asked the question. “Easter is Sunday, March 27.”
Thank you very much.
I’ve been using integrated speech input/speech output for over fifteen years. And it’s come a long way, baby. My “day job” is with the United States Department of Veterans Affairs (VA). A month after I started, my chief of service came into my office. “How long will it take you to be able to teach Dragon?”
I was the subject matter expert for the computer access training section at a VA blind rehabilitation center. So I got started.
Today we think nothing of talking to our devices and having them talk back to us. But in the early part of this century such a feat was pretty extraordinary. Because the VA standardized on the Windows operating system, the software allowing someone to use a computer without looking at it or touching it had to be JAWS for Windows (Freedom Scientific, Inc.) and Dragon NaturallySpeaking (Nuance, Inc.).
Out of the box, these two applications don’t communicate very well with each other. Hence a third piece of software was required, one that “bridged” the two programs and allowed full hands-free control of the PC.
Two years after I began teaching veterans to use a computer hands-free, I was asked to become a private beta tester for one of the bridging technologies. What fun I had! Before you get the wrong idea: beta testing is fun, yes, but it’s also hard work. And it can be hell on hardware. I’ve lost count of the reimages I’ve had because of crashes when testing!
But testing software pushes the horizons. I tried a lot of crazy things . . . because I could. My PC is in our sunroom. The sunroom is separated from the kitchen by a breakfast bar. On the recommendation of a long-time speech input user I purchased a wireless headset.
One afternoon, after my official tour of duty had ended, I was reading an interesting article on the Internet.  “Be Quiet. Say Time?” I suddenly realized my husband would be home in half an hour and I hadn’t done anything about dinner.
“What the heck,” thought I. “Read Document.” And I went into the kitchen to prepare dinner and kept right on reading.
A year after we bought our house my husband gave me a greenhouse for Christmas. The greenhouse is close enough to the house that the wireless headset works just fine when I’m out there. I’ve spent many an hour working with my plants while operating my computer that sits snugly in the sunroom.
Hands-free computing also has its humorous moments. There I was, dictating away, when my husband came in from outdoors. “Scratch That.” So he came over and scratched my back. The man can follow directions!

I’ve been out of the hands-free computing world for a while and was thrilled to learn about DictationBridge. DictationBridge is opening up the world of hands-free computing to more people than ever before.