
A much improved way to spell check documents using VoiceOver, beginning with iOS 12.1

Posted on November 6, 2018

There was a way, reproducible though not very convenient, to spell check documents in iOS 11 using VoiceOver. At the time I thought it was cool, though somewhat difficult to remember, and I wrote a blog post about it anyway.

A big thank you to Scott Davert, who discovered that in iOS 12.1 the spell checking process was made much more efficient. He demonstrated it in a recent AppleVis podcast, which is where I learned about it. I have to admit that even though I wrote the blog post describing how to correct spelling in iOS 11, I rarely if ever used it; I just wrote things on my iPhone, as I am now, but corrected the spelling on my MacBook. I think I can honestly say I will correct spelling much more often now, probably whenever I write anything beyond a sentence or two on my iOS devices. This VoiceOver improvement truly makes any iOS device a real writing device for blind users.
In fact, I just spell checked the last paragraph in possibly less than 30 seconds on my iPhone. This will be the coolest feature for me in iOS 12.1.

Let’s figure out how to do it.

1. Set the VoiceOver rotor to “misspelled words.”

2. Swipe up or down, or press the up or down arrows on your keyboard or braille display, to find the previous or next misspelled word.

3. Move right with a finger or keyboard; each move shows the next item in a list of correctly spelled suggestions.

4. If you find the word you want, double tap it or activate it with your keyboard, and it will replace the misspelled word.

5. If the word you want is not in the list, the offending misspelling you’re on is selected, so pressing delete or backspace will erase it. Then you can enter another attempt.
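For anyone curious about the machinery underneath, iOS exposes the same spell checking to apps through UITextChecker. Here’s a minimal sketch of finding a misspelled word, asking for suggestions, and swapping one in; the sample sentence and the choice of the first guess are just for illustration, not anything VoiceOver itself does.

```swift
import UIKit

let checker = UITextChecker()
var text = "I recieved your mesage yesterday."
let nsText = text as NSString

// Find the first misspelled word in the string.
let misspelledRange = checker.rangeOfMisspelledWord(
    in: text,
    range: NSRange(location: 0, length: nsText.length),
    startingAt: 0,
    wrap: false,
    language: "en_US"
)

if misspelledRange.location != NSNotFound {
    // Ask for correctly spelled suggestions, like the rotor’s list.
    let guesses = checker.guesses(
        forWordRange: misspelledRange, in: text, language: "en_US") ?? []
    print("Misspelled:", nsText.substring(with: misspelledRange))
    print("Suggestions:", guesses)

    // Replace the misspelled word with the first suggestion, if any.
    if let first = guesses.first {
        text = nsText.replacingCharacters(in: misspelledRange, with: first)
        print("Corrected:", text)
    }
}
```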

Now maybe I can start blogging directly from iOS. If I were still in school, I think I could seriously write a paper completely on iOS. Since I almost always use a Bluetooth keyboard, I could even do it on an iPhone. When not writing blog posts or school papers, spell checking emails will also be a snap.


How to accessibly and reliably spell check documents on iOS devices with VoiceOver

Posted on October 5, 2017

Although I guess it was possible on older versions of iOS, until iOS 11 spell checking documents on iOS devices was extremely difficult with the screen reader VoiceOver. Occasionally, when browsing around a document, if VoiceOver said a word was misspelled you could maybe get suggestions if you happened to be exceptionally lucky. But now, with iOS 11, here’s a totally accessible and reproducible process. Previously, not being able to reliably spell check documents on iOS was a large frustration for me, and meant that all I could efficiently do on the run was write rough drafts, having to correct them later on my Mac back at home. Having found that spell checking is now totally doable on iOS 11, I am more than happy to share what I’ve found. I use the word activate below because there are several ways to progress workflows on iOS devices. If using only the touch screen, I mean double tap; but if a future reader is using a Bluetooth keyboard, a braille display, or the new O6, there are several more ways they could do it.

1. Open a document you want to spell check.

2. Make sure VoiceOver says “text field is editing” and “quick nav off.”

3. Rotate the VoiceOver rotor left, often only one item, to “misspelled words.”

4. Swipe up or down to move through the list of misspelled words.

5. After stopping on a misspelled word you want to correct, change the rotor to “edit.” Edit should be one rotor item to the left of “misspelled words.”

6. Swipe up or down to “select” and activate it. VoiceOver should say “word selected,” where word is the word you selected.

7. Then swipe up or down until you get to “replace,” and activate that.

8. After a short wait, probably less than a second, VoiceOver will say a word, probably similar to the misspelled word you want to change. Sometimes VoiceOver may instead say “text field”; in that case, just swipe right to the first item in the word suggestions list.

9. If that is the word you want, activate it; if not, swipe right or left through the list of suggestions until VoiceOver speaks the word you want, then activate that word.

10. The new word you chose from the list should have replaced the previously misspelled word you wanted to correct.
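On the developer side, the “misspelled words” rotor option iOS 11 added corresponds to a system rotor type. UIKit text views get it for free, but as a hedged sketch, a custom view that wanted to offer the same navigation could vend it through UIAccessibilityCustomRotor; the actual word-finding logic, which would use UITextChecker, is left as a stub here.

```swift
import UIKit

final class NotesTextView: UITextView {
    // Advertise a “misspelled words” rotor entry to VoiceOver.
    func installMisspelledWordsRotor() {
        let rotor = UIAccessibilityCustomRotor(systemType: .misspelledWord) { predicate in
            // A real implementation would walk the text with UITextChecker,
            // forward or backward per predicate.searchDirection, and return
            // a UIAccessibilityCustomRotorItemResult for the match.
            return nil
        }
        accessibilityCustomRotors = [rotor]
    }
}
```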

While looking at the list of suggested words, you can also change the rotor to “characters” and spell the words letter by letter. Yes, that works. Notifications arriving mid-process may be a different matter, however.

After a few times through the process, you will probably find it’s not as complicated as it looks. This works not only with the touch screen but also with Bluetooth keyboards. If your braille display’s keyboard can use the rotor, it should work there too.

For someone who writes a lot while on the run, adding “misspelled words” to the rotor may be one of iOS 11’s most appreciated features.

My realization that blind users of VoiceOver have had touch screen Macs since 2009

In the early 1990s, Neal Stephenson released his now well known book “Snow Crash.” Then in 1999 he wrote the even more famous book “Cryptonomicon.” He also wrote a lesser known and much smaller essay entitled “In the Beginning Was the Command Line.” In this essay Stephenson talks about interfaces, not just of computers but of every object we use, beginning with his best friend’s dad’s old car. He talks about how, from the first mainframe terminals up to Microsoft Windows and Apple’s Macintosh, the way humans first interacted with the computer was through the command line. The command line is still great, takes few resources, and even today potentially provides many more options at once than any graphical user interface, or GUI.

The GUI was invented, though, for the same reason the command line replaced punch cards: the command line was far more efficient than punch cards for everyone, and later the GUI was more convenient and easier to use than the command line, at least for sighted people. Graphical interfaces meant people didn’t have to remember tons of commands, and could become familiar with a system faster. The sighted mind is great at taking in spatially presented, intersecting pieces of information, and the GUI is great at displaying information in two or three dimensions to that mind, instead of the one dimension the command line presents. It was a great match, except for the abstractions we still have today. The arrival of Apple’s first Macintosh in 1984 blew the world away with its amazing graphics for the time. And the mouse? I’m sure many wondered why they would ever want a small furry rodent on their desk.


Along with computer mice, we also saw trackballs and trackpads, but they all still have the problem of dynamic rather than static reference.
When using a trackpad, if the mouse pointer is in the center of the screen but the user places their finger on the lower left corner of the trackpad and slides to the right, the pointer moves from the center of the screen toward the center of the right edge; and depending on the settings, the finger may have moved only half an inch, or six inches, still along the bottom of the trackpad. The mouse is even more removed by abstraction. I played with all three of these input devices during my years on Microsoft Windows, but was never productive with any of them.
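To make that contrast concrete, here is a tiny sketch of the two mappings; the function names and sizes are made up for illustration. A relative device moves the pointer by a scaled delta, so the same finger position can land anywhere; an absolute mapping, like the Trackpad Commander mode described later, ties each physical spot to exactly one screen point.

```swift
import CoreGraphics

// Absolute mapping: the same physical spot on the pad always lands on the
// same screen point (a static reference).
func absolutePoint(finger: CGPoint, pad: CGSize, screen: CGSize) -> CGPoint {
    CGPoint(x: finger.x / pad.width * screen.width,
            y: finger.y / pad.height * screen.height)
}

// Relative mapping: the pointer moves by a scaled delta from wherever it
// already is (a dynamic reference), so the finger’s position on the pad
// tells you nothing about where the pointer sits on screen.
func relativePoint(pointer: CGPoint, delta: CGVector, gain: CGFloat) -> CGPoint {
    CGPoint(x: pointer.x + delta.dx * gain,
            y: pointer.y + delta.dy * gain)
}
```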


In early January 2007 I had dinner with my friend Nathan Klapoetke, who was ecstatic about the new iPhone that had just been announced; at the time I cringed in fear, knowing that soon all cell phones would no longer have buttons, and I had no idea how a blind person would use them.

Two years later, at WWDC 2009, Apple announced that VoiceOver was coming to the iPhone 3GS, and the blind community was completely blown away; no one saw that coming. Late in June 2009 I went to the Apple store and played with VoiceOver for the first time. I’d already read the iPhone manual’s chapter on VoiceOver, so I had a bit of an idea what to do, or at least how to turn VoiceOver on. I only had an hour to play, but except for typing, reading text and getting around basic apps didn’t seem too bad; nine days later I bought one. The first text message I tried to send was a complete disaster, but I still knew my world had changed for the better.

The idea that when you touched some part of the screen you were directly interacting with that icon or word made a lot of sense to people, blind and sighted alike. Even young children, before they can read, understand tapping on icons to start games they already know how to play. In some ways, the touch screen is the command line equivalent of visual interfaces. Being able to directly touch and manipulate screen elements is efficient on such a basic level that I wouldn’t be surprised at all if using touch screen interfaces activated the same parts of the brain as making something out of Play-Doh or clay. There’s an interesting discussion currently going on about how Microsoft tried to make Windows 8 a touch-first interface and failed, and how Windows 10 now offers touch-based interfaces for those who want them but still behaves like a traditional desktop. Apple, on the other hand, never tried to bring touch screens to macOS at all until the 2016 line of MacBook Pros with the new Touch Bar, which really isn’t a screen at all, and which apps must treat as optional since many Macs still don’t have it.

And now, as Paul Harvey used to say, “the rest of the story.” As most people would tell you, and as Google searches would reply, there are no Apple computers with a touch screen. Except, that is, if you’re a totally blind person using VoiceOver. The gestures VoiceOver users learn on their iPhones have been available on their Macs as well starting with Snow Leopard; with Trackpad Commander on, VoiceOver behaves very much like it does on iOS. With Trackpad Commander on, if I touch the exact center of the trackpad, the mouse pointer is also at the exact center of the screen, and if VoiceOver announces an icon I want, I just double tap to activate it. All of the abstraction I struggled with when trying to use a mouse or trackpad without the commander mode is gone.

But here’s a rare moment where sight still gets in the way. It is so instinctive for someone who can see to visually follow where their hand is going that even if most sighted people turned VoiceOver and Trackpad Commander on, with speech off, while still looking at the screen, they would find it quite difficult to use; the screen being visually separate from the trackpad is too abstract for many of them. The trackpad is obviously much smaller than the actual screen, though since I can’t see it that doesn’t really matter anyway. Beyond that, as a VoiceOver user I’ve had a touch screen on my Mac for seven years. I, and probably most other blind users, still don’t use it as much as we probably should, or for many of us hardly at all, though I have found some ways in which it is far more efficient than the more traditional VoiceOver keyboard commands.


If I’m told that an interface control I want is in the lower left corner of the screen, using Trackpad Commander I can go there directly. If I’m using an interface with a list of items in the center and buttons around the edge, I can get to the list far faster than by navigating there with a keyboard.

Tim Sniffen has published a book entitled “Mastering the Mac with VoiceOver,” in which he for the most part ignores keyboard commands altogether and teaches with Trackpad Commander mode instead. He trains many veterans who lost their sight while deployed, and says that after they become comfortable with VoiceOver on iOS, it’s an easy transition to their Macs. We VoiceOver users should probably listen more to Tim and learn from his experiential wisdom. And for the sighted and proud: if your vision ever degrades so far that you end up having to use VoiceOver, at least you’ll have a touch screen on your Mac.

A virtually unknown iOS VoiceOver feature, automatically announcing the time every minute

As an iOS VoiceOver user, I discovered several years ago that if I touched the clock in the status bar, VoiceOver would continue to automatically announce the time every minute until interrupted by touch or by certain incoming notifications. I can’t remember exactly when this became a feature, but it was more than three years ago, and I’ve never heard anyone else mention it nor seen it documented anywhere; so I thought I’d share it, as I can imagine it being helpful to many others.

This time announcement feature is very useful to me, especially when I’m in a hurry and need to get ready for something quickly. I even use it occasionally with my Anker Soundcore Bluetooth speaker in the shower; time can really accelerate there. Time announcements are also available on macOS, in the Date & Time preference pane near the bottom of the Clock tab, though not customizable to the exact minute; 15, 30, and 60 minutes are the options. I could also see this being useful on the Apple Watch, though it’s not there yet.
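There’s no public hook into VoiceOver’s own announcement loop, but the behavior is easy to picture in code. Here’s a rough sketch, assuming a repeating one-minute timer and the system speech synthesizer; it mimics what the feature does rather than how VoiceOver implements it.

```swift
import AVFoundation
import Foundation

// Speak the current time once a minute until the process is interrupted.
let synthesizer = AVSpeechSynthesizer()
let formatter = DateFormatter()
formatter.timeStyle = .short  // e.g. “3:42 PM”

Timer.scheduledTimer(withTimeInterval: 60, repeats: true) { _ in
    synthesizer.speak(AVSpeechUtterance(string: formatter.string(from: Date())))
}
RunLoop.main.run()  // keep the timer alive in a command-line context
```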

How I think watches are way more useful than many realize

Many say that watches are useless now that we have cell phones, and just as many don’t even wear one, but I still say it’s much easier to look at your wrist, if sighted, or do VoiceOver gestures on your Apple Watch, than to take your phone out of a pocket to get the time; and a smartwatch can do so much more.

What makes smartwatches most useful are complications: apps displaying and updating bits of data right on the watch face. Many iOS developers have added watch apps to accompany their iOS offerings, and many of those include complications; combined with Apple’s own, there’s a wide range to pick from: anything from moon phase to temperature, next calendar appointment, steps counted for the day, or sports scores, and many more. Before discounting watch complications as useless, think of your daily routines and consider when getting a piece of information meaningful to your activities more conveniently might make your day more efficient. They’re somewhat like a screen of widgets, or like how people using several monitors keep updating windows open on their second or third screen. It’s the closest VoiceOver users will probably ever get to that use of multiple screens.
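For the curious, a complication gets its data from the watch app’s ClockKit data source. This is a bare sketch, assuming the older callback-style API and a made-up step count, of vending one entry for a modular face:

```swift
import ClockKit

final class ComplicationController: NSObject, CLKComplicationDataSource {
    // Hand ClockKit the entry to show on the watch face right now.
    func getCurrentTimelineEntry(
        for complication: CLKComplication,
        withHandler handler: @escaping (CLKComplicationTimelineEntry?) -> Void
    ) {
        let template = CLKComplicationTemplateModularSmallSimpleText()
        template.textProvider = CLKSimpleTextProvider(text: "4,213") // placeholder step count
        handler(CLKComplicationTimelineEntry(date: Date(),
                                             complicationTemplate: template))
    }
}
```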

Right away I found that the modular watch face was my favorite, because it had the most complications: five. I’d heard people rave about the different faces and want more, but I’m too much of a Vulcan to enjoy such frivolity as, say, Mickey Mouse. Then watchOS 3 came out, and people liked that you could switch between faces much more easily than before, with a two-finger swipe right or left, but I still didn’t care; until I figured out that I could delete all the other faces and keep only multiple modular faces with different complications. That was cool: I could have three watch faces, all modular, so 15 complications all easily reachable even with VoiceOver. The productive part of my mind was very happy.

Before, I had wished for a watch face with more complications; this solves that.
Yes, the phone can do practically anything the watch can, but the watch is far more convenient whether you’re blind or sighted, and putting dynamic bits of information on a smartwatch is very helpful when pressed for time. Time until the next bus, workout stats while at the gym, data that changes very rapidly, right on your wrist; whereas it would be much more cumbersome to dig the phone out of your pocket (oh wait, no pockets in gym shorts) or to switch between different apps on your phone once it’s out.

I think augmented reality is way more important than virtual reality, especially for blind people, and the Apple Watch can go a long way toward helping with that. Beyond the complications, tactile feedback is my second favorite feature. Getting turn directions tactilely is great when a loud truck or bus going by makes spoken GPS directions difficult to hear. Speaking of difficulty hearing, there are already cool articles about how deaf and deafblind people are using Taptic taps to communicate quickly when they need to, like in public.
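Those taps come from the watch’s haptic engine, which third-party watch apps can fire as well; a one-line sketch from WatchKit:

```swift
import WatchKit

// Play a distinct haptic pattern; .directionUp and .directionDown are the
// taps used for turn cues, and .notification is a general alert.
WKInterfaceDevice.current().play(.directionUp)
```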
Another case, though it shouldn’t exist, is that sometimes the watch app is accessible with VoiceOver while the iOS app is not; in my case it’s the app of my financial institution, but that’s a whole different story.

As time goes on, only our imagination will limit the ways our Apple Watches, and for many including me their accessibility features, can improve our dynamic lives even more.