CNIB National Braille Conference 2018 Workshop

The Integrated Braille/iPhone User Experience:

Visualization, Communication, Multi-Tasking and Participation

A Journey into Structured Touch-Screen Discovery


The advent of the accessible touch-screen has opened up a whole new world of possibility for blind users, above and beyond the range of apps we all know and love. For the first time, we also have a fully accessible and interactive two-dimensional user experience to explore. Unfortunately, a large portion of the user community is not realizing the full potential of this user experience. Why does this situation exist, and how and why should it be rectified? What will be the benefits?


Prior to the touch-screen, the computing world for blind people largely consisted of issuing keyboard commands and hearing the result via speech or braille. Actual knowledge of any screen layout was essentially theoretical with no direct way to experience it. Even now, many blind computer users get along well enough with arrow keys, tab and enter.

Similarly, many blind iPhone users get along well enough by flicking their way to a destination, then doing a double or split tap upon hearing the desired item.

This approach is not unlike the methodology behind route-based mobility, where the instruction is geared to mastering a single route. Walk a block, cross, turn left, walk two blocks, etc. The person can learn to get from point A to point B alright, but what did they learn about everything in between? And yes, it will mean another O&M lesson for the next new route.

Of course, there are some super-fast flickers out there, running VoiceOver at near top speed, who are quite proficient in their own way. But are they really as proficient as they seem? The fact is, a person who takes time to explore the screens of an app as they learn to use it ends up getting more done, with less effort, in much less time.

And thus, we have the potential for structured discovery that didn’t exist prior to the touch-screen. Suddenly, with the ability to drag a voice cursor’s focus around the screen, we can connect with all aspects of screen layout in a practical way. And as we explore, processes such as mental mapping and muscle memory become active, enhancing each other and allowing a mind’s-eye view of the screen to begin taking shape.

Commonalities Between Touch-Screen and Braille

First, as already mentioned, the touch-screen now offers accessible layouts that can be explored, just like a braille page. As we explore a braille page, we hear the text in our mind’s ear or see a graphic in our mind’s eye. Similarly, as we explore a touch-screen, we hear VoiceOver announce screen elements as we track to them. The advantage here is that knowing braille is not a requirement for tracking the screen successfully.

This is especially the case when using a tactile screen aid such as a SpeedDots screen protector. An added advantage here is that the dots can be repurposed for quickly locating screen elements across multiple apps.

And just as we use both hands for proper braille reading, so should we for efficient touch-screen navigation, especially in the beginning. The tracking hand can orient better to the phone if the holding hand has fingers placed at nine, twelve and three o’clock. This stereo bio-feedback between the hands increases the chance that the tracking finger will move in straight vertical and horizontal lines, thus facilitating orientation to the screen.

The Beginning of Structured Discovery

The home screen is a great place to practice basic orientation skills, just to get the feel of navigating the screen. The process starts by learning what is meant by VoiceOver focus and learning how to move that focus among the status bar, icon grid and dock. Notice that flicking is not mentioned as an initial option, only because the objective is to encourage development of structured discovery techniques right from the outset.

This is why the next step involves exercises to refine the movement of the tracking finger. Pick an icon more or less in the middle of the screen. What icons are above or below it, or to its left and right? Later, once gesture work has begun, add the three-finger single tap to confirm icon locations. You can also discover the special sounds that occur when moving from one area of the screen to another, using the speech on/off toggle (three-finger double-tap) to let the sounds play by themselves.

Communication (Part One)

The learning process for iPhone beginners, especially those with congenital blindness, involves conceptualizing and learning the gestures. Traditional approaches like hand over hand create the risk of too many fingers touching the screen and creating unhelpful outcomes. Learners need to experience the gestures in a way that shows them what the touch-screen is expecting.

A good way to accomplish this is to have the learner imagine the palm or back of their hand to be the screen, where you then demonstrate how flicks, taps and rotations actually feel. Then, turn on screen help with the four-finger double-tap toggle and have them practice. The good thing here is that the phone is actually reporting the learner’s gesture performance, which is particularly handy for blind instructors.

Another variation of this “experience gestures” process is to do them on the learner’s back when both their hands are operating the phone. This may seem odd, but it works particularly well to avoid the confusion of instructor, student and two iPhones talking all at the same time. It works like this:

  • Instructor simultaneously executes gesture on his/her phone and on learner.
  • Instructor’s VoiceOver speaks the result.
  • Learner executes gesture just performed.
  • If result is correct, learner’s phone echoes instructor’s phone.

With proper intervention and orientation, these methods can be particularly helpful for deaf-blind learners, especially if they are also working with a braille display. They can also make the intervenor’s job much easier.

Games like VO Starter, VO Lab and Blindfold Bop Gesture Game can also be fun ways to develop gesture skills.

While it is completely understandable that some people experience a great deal of fear around the process of embarking on the touch-screen journey, it would appear that such fear can be significantly minimized by an extended period of orientation at the beginning. New users shouldn’t hurry, nor should they go live until they’ve achieved at least some level of mastery and confidence. Even the best blind iPhone instructors were frightened, nervous and often very frustrated at the beginning. Remember that the best teachers are often the ones who had the most difficult time getting started. But it’s an adventure worth the challenge considering the level playing field that awaits at the end of the journey.

Communication (Part Two)

Communication in the broader sense involves selecting the right tools to get the job done in the most efficient manner. This starts with selecting the primary data entry mode:

  • On-screen braille keyboard (screen away or tabletop mode)
  • Braille display
  • Bluetooth qwerty keyboard
  • FlickType
  • Tap keyboard

Next comes the selection of apps to streamline data entry and collection to further increase personal productivity. This enhances the ability to interact and collaborate with others.

Acquiring and collecting information can be as easy as using visual assistance apps such as Seeing AI, En-Vision, Be My Eyes, or even Aira. Then there are audio apps such as Just Press Record, which also allow for the transcription of recordings into manipulable text.

For inputting your own data, consider an app like Drafts, which promotes the concept of “write it now, task it later”. The app opens with a blank page ready for keyboard entry or dictation. Once the text is in, press the action button and select Email, Twitter, Facebook or hundreds of other actions from the Drafts library. Separate the first line from the rest of the text by pressing enter, and it automatically becomes the subject of an email, the title of a calendar appointment, etc.


For many of us, collaboration started with the ability to share information via Dropbox. Now environments like Office365 are fully accessible, collaboration and all, while many are now using G-Suite as its accessibility improves. Collaboration apps such as Trello and Slack are also quite accessible.

Multi-Tasking (Braille/iOS multimedia)

Since the iPhone and iPad have evolved as very powerful computers, it is not really that difficult to imagine that they can equal the multi-tasking capability of the average desktop computer. This is especially the case when using a braille display:

  • Ensure that only the audio source app and the note taking app are in the app switcher. In this situation you will be able to switch between the two apps by flicking four fingers left and right.
  • Start the audio source
  • Go to the note taking app
  • Toggle speech off with a three-finger double-tap

Now your audio will be playing, VoiceOver won’t interrupt it, and you’ll be able to monitor your note taking activity on the braille display.


Of course, the structured discovery techniques mentioned here can be applied across all accessible apps. This includes social media such as Facebook, Twitter, LinkedIn and many more. It includes cloud meeting platforms, particularly Zoom, which is fully accessible and more ubiquitous all the time. So now, when you join a webinar, follow the instructions above, open up an app and take notes as you go. And if the webinar chat window isn’t too busy, you can even participate there via braille without interrupting the audio from the presenters.

The Multi-Lingual Braille/iPhone Experience

If you’ve ever studied a foreign language and thought you might like to learn its contracted braille, the iPhone/braille display combo is the way to go. If you have your braille display set to contracted, it will honour that setting when you switch to a language whose contracted braille is supported. Then you can read along on the braille display while listening to the speech, or toggle the contractions on and off with a Chord-G.

And yes, there are several translation apps that are very accessible, particularly Google Translate. And so, the language barriers that once existed are no more.

Conclusion and Follow-Up

We hope you have enjoyed and benefited from this brief 30,000-foot view of braille/iPhone integration and its potential. For more information, or to receive coaching in your personal structured touch-screen discovery process, or that of your students, visit our resources and coaching pages. We look forward to discovering together.