Category Archives: Efficiency

A much-improved way to spell-check documents using VoiceOver on iOS, beginning with iOS 12.1

Posted on November 6, 2018

There was a way, reproducible though not very convenient, to spell-check documents in iOS 11 using VoiceOver. At the time I thought it was cool, though somewhat difficult to remember, and I wrote a blog post about it anyway.

A big thank you to Scott Davert, who discovered that in iOS 12.1 the spell-checking process was made much more efficient. He demonstrated it in a recent AppleVis podcast, which is where I learned about it. I have to admit that even though I wrote the blog post describing how to correct spelling in iOS 11, I rarely if ever used it; I just wrote things on my iPhone, as I am now, but then corrected the spelling on my MacBook. I think I can honestly say I will correct spelling much more often in the future, probably whenever I write anything beyond a sentence or two on my iOS devices. This VoiceOver improvement truly makes any iOS device a real writing device for blind users.
In fact, I just spell-checked the last paragraph, possibly in less than 30 seconds, on my iPhone. This will be the coolest feature for me in iOS 12.1.

Here’s how to do it.

1. Set the VoiceOver rotor to “misspelled words.”

2. Swipe up or down, or press the up or down arrow on your keyboard or braille display, to find the previous or next misspelled word.

3. Moving right with a finger or keyboard will show you the next item in a list of correctly spelled suggestions.

4. If you find the word you want, double-tap it or activate it with your keyboard, and it will replace the misspelled word.
5. If the word you want is not in the list, the offending misspelling you’re on is selected, so pressing delete or backspace will erase it. Then you can enter another attempt.
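
For developers curious where those suggestions might come from, iOS exposes a spell-checking API called UITextChecker; whether VoiceOver uses it internally I can’t say, but it demonstrates the same find-and-suggest behavior. Here is a minimal Swift sketch (the sample sentence is my own invention):

```swift
import UIKit

// Find the first misspelled word in a string and list suggested replacements,
// roughly what the "misspelled words" rotor surfaces for each word it lands on.
let checker = UITextChecker()
let text = "I rarely spel things corectly on the first try."
let wholeRange = NSRange(location: 0, length: (text as NSString).length)

let missRange = checker.rangeOfMisspelledWord(
    in: text, range: wholeRange, startingAt: 0, wrap: false, language: "en_US")

if missRange.location != NSNotFound {
    let word = (text as NSString).substring(with: missRange)
    let guesses = checker.guesses(forWordRange: missRange, in: text, language: "en_US") ?? []
    print("Misspelled: \(word); suggestions: \(guesses)")
}
```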

Now maybe I can start blogging directly from iOS. If I were still in school, I think I could seriously write a paper entirely on iOS. Since I almost always use a Bluetooth keyboard, I could even do it on an iPhone. When I’m not writing blog posts or school papers, spell-checking emails will also be a snap.


Another post I wrote with additional thoughts about Apple’s Face ID

Posted on October 1, 2018

Two weeks ago, I wrote about how I strongly dislike Apple’s Face ID; and although some in the blind community have agreed with my thoughts, there are also some who do not. They say “oh, Face ID is just fine and it’s accessible,” etc. Accessible for sure, but not efficient; and in some cases it totally breaks people’s workflows.

Because accessibility has lacked efficiency in many ways over the years, some bigger than others, blind people have often collectively accepted that they have to deal with it. There might be an app that is almost accessible except for one screen, so blind users memorize the screen and don’t complain. Then there are environments like amateur or ham radio, where the equipment isn’t really accessible at all, or the accessible versions are often considerably more expensive (though things are improving), so blind people write and share guides to help each other get around the problems. I respect the people who wrote those guides, have appreciated them many times, and even wrote a few myself, but the question needs to be asked: why are we just rolling over as a community and accepting this? Why aren’t we pushing back harder, finding polite and respectful ways to ask for more accessible and more efficient solutions, or joining in to help create them?

With that context, I now return to the Face ID situation. To date, Twitter is the most accessible and efficient social media platform for blind users, and it is there that you will find them discussing anything and everything they find important. Now that we have had a year with Apple’s Face ID, there have been tweets among the blind community about it, though I find most of them just saying things like “yeah, it’s OK, I got used to it,” or even a few saying it’s amazing and works great. Santiago’s tweets, shared here, I think encompass much of this mentality.

Santiago – @kevinrj There are blind people that say that faceID is an accessibility issue, but I don’t feel like it is. Unlocking your phone with touchID in your pocket isn’t an accessibility feature. It’s simply a convenience. A nice one at that, but not necessary.
Santiago – @kevinrj Well, that convenience certainly goes away, and I honestly wish I could have it back sometimes. Could FaceID improve? Certainly, but I think everyone experiences similar issues with it. Even sighted people.

Santiago – @kevinrj You do also have the security issue with it. When it comes to sighted people, the phone actually looks for your attention in order to unlock. It automatically turns it off if you’re using VoiceOver. I have a work-around, but again… not very convenient. 
Santiago – @kevinrj I’m all about efficiency. Heck, I sometimes upgrade my products, because they slow down after years and affect that greatly, but I, a totally blind person, have efficiently used my iPhone X for almost a year now. Is there a learning curve? Yes. But it’s accessible.

Yes, as I said earlier, it’s accessible, but that doesn’t mean efficient. Could I take a bus from New York to Los Angeles? Sure, it’s totally accessible, and would even be way cheaper, but if I had to do it every two months for my job, I would not like wasting up to a week each time that I could save by flying. For a blind person, Face ID is very much like that: even though some are making it work or even enjoying it, some people also enjoy long bus rides. I haven’t found that appeal in my own experience, but I think it has something to do with the scenery.

Sina Bahram is ABD (all but dissertation) in a PhD program in computer science and is probably the most advanced computer user in the blind community that I know of. Last week I found a thread on Twitter between him and a few other people about why Face ID is a step back for blind accessibility. These are not just opinions, but hard facts that should be taken seriously.

In this thread, screen curtain is mentioned, but mostly only called “curtain,” which I realized may be confusing to those who don’t know about it. Screen curtain is an amazing VoiceOver feature that blanks the display entirely; along with bringing added security and privacy to VoiceOver users on Apple products, it can definitely also save battery life.

Sina Bahram – Wow, I was not expecting to do this, but I’m returning the iPhone 10S. I cannot believe what an atrocious experience this is now. FaceID is nowhere near as efficient or accessible as fingerprint sensor. Not having a home button is ridiculous. No more immediacy of access. #a11y

James Teh – @SinaBahram I suspect my feeling will be the same. Some people seem to cope okay, but I just can’t see how face ID could be more efficient for us. And my understanding is you have to disable the gaze stuff, which means we reduce security due to disability, but I might be misunderstanding.

Michiel Bijl – @jcsteh @SinaBahram I’d be curious to know how that is measured. If it’s done by detecting whether your eyes are pointed at the phone with eyelids open—it might not be a problem for everyone.
Of course you can always use the passcode but that’s a major step back from Touch ID.

Michiel Bijl – @SinaBahram The interaction to go home with VoiceOver is weird. I mess that up rather regularly. Any tips?

James Teh – @MichielBijl @SinaBahram Also, the whole idea of having to actually pick up my phone and bring it to my face just to unlock it… so many kinds of bleh. The number of times I just quickly look at my phone with one hand while it sits on my desk…

Julianna Rowsell – @SinaBahram A friend of my is having similar feelings. His physical access disability doesn’t allow him to effectively use it. The angles to his face are wrong and such so the recognition  software doesn’t authenticate. – Retweeted by SinaBahram

Sina Bahram – @jcsteh @MichielBijl Exactly. This is just simply unacceptable. I really hope that some advocates inside of Apple bothered trying to speak up. It’s just not like them, sir. There are so many use cases this completely destroys.

Sina Bahram – @MichielBijl Yes, the tip is to get another phone. I’m not being sarcastic. I just boxed mine up. I am an expert in this topic and the most power user I have encountered, not bragging just facts, and this is unusable. So, I’m done. I’ll try to motivate some internal Apple convos, but no idea.
Sina Bahram – @MichielBijl @jcsteh I, plus its default now if it detects VO running, have turned off attention requirements. That’s not the FaceID issue here. The issue is that it doesn’t work in the dark with curtain on and it requires your face instead of your hand that’s already touching the device.

Sina Bahram – @jcsteh You are absolutely not misunderstanding. You are reducing security because of disability. Welcome to every X phone from original to the S and Max. Other concerns make this unusable, though.

James Teh – @SinaBahram @MichielBijl Oh… I didn’t think of that, and that’s super frustrating. So I’d have to turn off curtain to unlock my phone or go turn on a light? How utterly ridiculous.

Sina Bahram – @jcsteh @MichielBijl Yup, I can confirm that. I turn off curtain, and oh look, it’s magically like 10X more accurate, and turn it back on … pin code it is!
Tomi 🇭🇺 – @SinaBahram wait, doesn’t work in the dark with curtain on? Is this a thing? Does having screen curtain change this? I thought infra-red works with low or no light anyway since it’s using its own infra-red beams, so most people I read about using it said it works at night /in dark.

Sina Bahram – @tomi91 Everyone assumes infra-red means works in dark. This is not true. Infra-red buys you depth sensing independent of (visible)  light. That barely matters since gaze is disabled by most VO users. Face ID  still needs (visible) light in addition to depth info.

Tomi 🇭🇺 – @SinaBahram oh that’s interesting. I wonder if people re-enable attention mode if it changes. But then again some people can’t even get their eyes straight (like me) so it’d probably just fail over and over. Man I’m really glad about my 8 now, thanks for that hope. lol

Sina Bahram – @tomi91 That feature is automatically turned off for VO users, so the eyes thing is not an issue itself, though it negatively impacts everything from lack of full messages on lock screen to everything else.

I wish Apple had an iPhone Pro line, kind of like their iPad Pros. Face ID would be a great feature on those phones, but then, instead of an iPhone XR, they could offer what the iPhone 9 should have been: still an A12 processor, but maybe a slightly lower-quality camera, a smaller screen, and also still a home button.

There are still people who would like a smaller phone. There are even still some sighted people who are not obsessed with the latest advances in screen technology, or don’t care whether they have the best camera. There are even some sighted people who would still prefer Touch ID over Face ID. Even Alex Lindsay, who is one of the most visually oriented people I know of, said recently on MacBreak Weekly that he personally prefers Touch ID but thinks phones should actually have both.

 

My thoughts about how, as a VoiceOver user, Face ID, although accessible, is not very efficient

Posted on September 13, 2018
On January 9, 2007, Steve Jobs introduced the first iPhone, but I wasn’t sucked into his reality distortion field quite yet. My friend Nathan Klapoetke was ecstatic about it, though, and told me later that day, while reinstalling Windows for me, that he couldn’t wait to buy one; that was a very long six months for Nathan.

Much of the blind community, myself included, was frustrated though, and felt left out. Some of us knew that the touch screen, just a piece of glass for us then, would be the future, and that soon no companies would make phones with number pads anymore. Then, at WWDC 2009, Apple introduced iOS 3 and mentioned VoiceOver during the keynote; the blind could use modern phones again. Those two years were hard though. Many of us in the blind community had Nokia phones, and some were playing with Windows phones, but they weren’t changing our lives anywhere near as much as the iPhone would later. This is how I feel things currently stand with the iPhone X models, starting last year.

Apple is obsessed with Face ID and is making it their future, but I’m feeling a bit left out again.

Yes, Face ID is accessible; I played some with an iPhone X last fall, but as more people are recently realizing, accessible does not always mean efficient. I found it quite finicky when setting it up, and because I wear prosthetic eyes, when they’re out I appear as a different person. There is an article written specifically for a blind person setting up Face ID, but note that it tells you to put your face in the frame without actually explaining how to do that when you are unable to see the frame. Then there’s the trick of figuring out how far away to hold the phone from your face. I also have no idea how to move my face around in a circle while simultaneously staring at one spot. OK, I understand the concept, but I can’t physically do it. It took me about 15 minutes to get Face ID set up the first time. Another problem is that attention mode is disabled by default when VoiceOver is enabled. I understand that Face ID would work less well with it on for blind people who are unable to visually focus, but that’s a potential security hole. A blind person could have their phone next to them on their desk, and another person could quietly walk by, pick up the phone, pan it past the blind person’s face, and they’re in.

Beyond setting it up and all the problems of having eyes that don’t work, there are the inconveniences of workflow. My phone is often on my belt, and most people, blind or sighted, keep their phones in a pocket. If you’re blind and have headphones, why would you ever want to take your phone off your belt, or out of your pocket, to use it? Taking your phone out just to authenticate gets annoying really fast, and it may also require you to stop whatever else you were doing. I’m rarely if ever looking at my phone when I use it; often my face is pointed in a completely different direction.

I could go on a rant; oh wait, I already am. I could be cynical, flame Apple, or just give up and switch to Android, and some might; but I remember from experience that, although it took two years, Apple did bring VoiceOver to the iPhone in 2009. Here are some thoughts.

There are things I could do today to unlock my phone. I have a long, complicated passcode, and I really don’t want to enter it every time, but I could use the Bluetooth keyboard I already carry with me. I could plug in a series 4 Yubico key when I’m home, or not around other people, or in a situation where having something rigid plugged into the phone has a low probability of being bumped or damaged. These are only hacks though; I’m really hoping Apple can come up with an awesome solution again.

The iPhone can already unlock the Apple Watch, and the watch can unlock my Mac; I really wish my Apple Watch could unlock my iPhone too. Just having the phone unlock whenever my watch is on would definitely not be secure, but with the new Apple Watch Series 4 having some capacitance in the digital crown, having to touch the crown to unlock the phone could be a start. Putting Touch ID into a future model of the watch crown would be awesome.

I already wrote about how there are solutions that let me use my wired headphones even with no headphone jack, and I know there are solutions for Touch ID equivalents that don’t include Face ID too. Whether Apple implements any of them is still a question, but I really hope they will realize how visual-only options inconvenience a non-trivial segment of their market, and give us an alternative.

How, for me, bone conduction headphones by Aftershokz are one of those technologies that are not just nice to have but a huge game changer for blind people

Posted on July 17, 2018
Shortly after Christmas when I was 5, my sister Andrea introduced me to headphones, with one of those single-ear earpieces from the 1970s, and showed me how I could plug it into my radio and listen. After a few minutes of private listening, I couldn’t understand at all why she or anyone else around me couldn’t hear it. That just blew my 5-year-old mind, and it hasn’t been the same since.

As I got older, headphones became more a part of my workflow. I knew there were good speakers out there; I’m just a headphone guy at ear. Part of this came from using screen readers, first on computers and later on phones; besides using headphones for privacy, I’m sure people around me appreciated not being annoyed by the speech. I used headphones on the bus, at work, and even when I took classes in college. There was still one area where headphones couldn’t help me though: traveling alone using the white cane. I began to use GPS apps on my phone, but all the headphones I knew of still blocked some of the sounds from my environment, so I didn’t feel safe using them when walking, and when traffic was loud I couldn’t hear the GPS info on the phone. Then I learned about bone conduction.

My friend Hai Nguyen Ly told me about Aftershokz and their line of bone conduction headphones: how they rest on the face, using transducers to convey sound through the cheekbones, leaving the ears completely uncovered and blocking none of a person’s natural hearing. I couldn’t afford them then, so Hai sent me an old pair he was no longer using. Like Andrea introducing me to headphones so long ago, Hai improved my life again.

The first time I used the Aftershokz, psychologically I wasn’t quite sure they really weren’t blocking my hearing, but it didn’t take long for me to realize that they weren’t. I could hear traffic just fine, and the GPS info from my phone was always audible even when the loudest truck or bus roared by. Now, five years and two more pairs of Aftershokz headphones later, I still use them pretty much every day. I wear them all the time when I’m in public, and they work great at meetings and conferences. Even when I don’t need to hear traffic while cane traveling, they still let me use my phone without interrupting anyone around me while still being able to hear what they’re saying. Yes, trying to understand both audio streams might not work as well as I’d like, but sighted people get distracted too.

Some of you reading this might be wondering: OK, but why does this matter? In my last post I talked about how, for most of us most of the time, technology is a nice convenience, but for those with disabilities, technology can be a huge life changer; this is definitely one of those cases, especially for blind users of screen readers. Bone conduction headphones allow them to get the info they need in real time while still having full access to their environment through their primary sense. Bone conduction technology may have been initially invented for the military, but thankfully it is now also being used to help humans be more human.

Instructions on how to get your Aftershokz Trekz headset to work with Bluetooth Multipoint

Posted on November 9, 2017

About a year ago, I wrote about how useful the Apple Watch was for me, but one remaining problem was that unless I used my mess-of-cables hack, I couldn’t get VoiceOver from both my iPhone and watch simultaneously.
The Aftershokz headphone company released their Trekz Titanium model in early 2016, and many people in the blind community were excited because Aftershokz claimed the Titaniums had Bluetooth Multipoint. They did, but after an hour or two of trying to get it working, I gave up in frustration.

Then the Apple AirPods came out and I was hoping they could do the trick, but no, the user has to switch between devices; they won’t do it automatically.
Then my friend Hai Nguyen Ly, who introduced me to bone conduction headphones 4+ years ago, said in passing last month that he’d gotten them working in Multipoint with both his iPhone and Apple Watch, so I decided to reexamine the challenge, and this time I was successful within about 30 minutes.

Here are the steps to do it. This works for both the Aftershokz Trekz Titanium and Trekz Air models; hopefully they make sense.

1. First you have to reset the headset. Turn the headset off before beginning the reset.

2. Enter pairing mode by turning on the headset and holding the volume up button for 5-7 seconds.
You will hear the Audrey Says™ voice say “Welcome to Trekz Titanium,” and then shortly after, “pairing.” The LED will flash red and blue.
Audrey will say “Trekz Air” if you have that model instead.

3. Then press and hold all three buttons on the headset for 3-5 seconds. You will hear a low-pitched double beep, or feel vibrations.

4. Turn the headset back off.

5. Enter pairing mode again by pressing and holding the volume up/power button. Audrey will first say “Welcome to Trekz Titanium” and then “pairing.” The LED will flash red and blue.

6. Continue holding the volume up button while simultaneously pressing and holding the triangular multifunction button on the left. After about 3 more seconds, Audrey will say “Multipoint enabled.”

7. In your first device’s Bluetooth settings, select your Trekz model. Audrey will say “Connected.”

8. Turn the headset off.

9. Reenter pairing mode by pressing and holding the volume up/power button. Audrey will first say “Welcome to Trekz Titanium” and then “pairing.” The LED will flash red and blue.
10. In your second device’s Bluetooth settings, select your Trekz model. Audrey will say “Connected” or “Device 2 Connected.”
11. Turn the headset off.

The next time you turn your Trekz headset on, it will connect to both devices. It works pretty well, though here are some things I’ve noticed.

If I move out of range of one of the connected devices and then move back into range, the device doesn’t always reconnect. Turning the headset off and back on reconnects both again.

I said Multipoint lets you connect two devices simultaneously, but that doesn’t mean you can hear audio from both simultaneously; only one plays at a time. This means that if I’m playing a podcast on my iPhone, I won’t hear anything from my Apple Watch; that has already bitten me a few times. If I pause the podcast on the phone, audio from the watch will start playing in about 2 seconds.
Beyond that, using Multipoint is still quite useful. I can use either device in a meeting, at a concert, or at church. I can also use either device while traveling in loud situations, like around heavy traffic, and I can use the watch in situations where its built-in speaker would be too quiet to hear. Even with the limitations I’ve mentioned, I think you’ll still find using your Aftershokz with Multipoint a productivity boost.
Oh, and my mess-of-cables hack is still useful if I want to hear more than two devices; with that solution, the audio really is simultaneous.

How to accessibly and reliably spell check documents on iOS devices with VoiceOver

Posted on October 5, 2017

Although I guess it was possible on older versions of iOS, until iOS 11 spell checking documents on iOS devices was extremely difficult with the screen reader VoiceOver. Occasionally, when browsing around a document, if VoiceOver said a word was misspelled you could maybe get suggestions if you happened to be exceptionally lucky. But now, with iOS 11, here’s a totally accessible and reproducible process. Previously, not being able to reliably spell check documents on iOS was a big frustration for me, and it meant that all I could efficiently do on the run was write rough drafts, having to correct them later on my Mac back at home. Having experienced that spell checking is now totally doable in iOS 11, I am more than happy to share what I’ve found. I use the word “activate” below because there are several ways to progress workflows on iOS devices. Yes, if using only the touch screen, I mean double-tap; but if a future reader is using a Bluetooth keyboard, a braille display, or the new O6, there are several more ways they could do it.

1. Open a document you want to spell check.

2. Make sure VoiceOver says “text field is editing” and, if using a keyboard, “quick nav off.”

3. Rotate the VoiceOver rotor left, often only one item, to “misspelled words.”

4. Swipe up or down to move through the list of misspelled words.

5. After stopping on a misspelled word you want to correct, change the rotor to “edit.” Edit should be one rotor item to the left of misspelled words.

6. Swipe up or down to “select” and activate it. VoiceOver should say “[word] selected,” where [word] is the word you selected.

7. Then swipe up or down until you get to “replace,” and activate that.

8. After a short wait, probably less than a second, VoiceOver will say a word, probably similar to the misspelled word you want to change. Sometimes VoiceOver may instead say “text field”; in this case, just swipe right to the first item in the word suggestions list.

9. If that is the word you want, activate it; if not, swipe right or left to move through the list of word suggestions until VoiceOver speaks the word you want, then activate that word.

10. The new word you chose from the list should have replaced the previously misspelled word you wanted to correct.

While looking at the list of suggested words, you may also change the rotor to “characters” and spell the words letter by letter. Yes, that works. Notifications arriving mid-process may be a different matter, however.

After a few times through the process, you will probably find that it’s not as complicated as it looks. This works not only with the touch screen, but also with Bluetooth keyboards. If your braille display’s keyboard can use the rotor, it should work there too.
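
Like the sketch in the iOS 12.1 post near the top of this page, developers can approximate the whole select-and-replace workflow programmatically with UITextChecker. Here is a hedged Swift sketch of that idea; the sample text, and the choice to always accept the first suggestion, are my own, and VoiceOver’s actual internals aren’t documented:

```swift
import UIKit

// Walk a string, replacing each misspelled word with the checker's first guess,
// a programmatic mirror of the select-then-replace rotor workflow above.
func autocorrect(_ input: String, language: String = "en_US") -> String {
    let checker = UITextChecker()
    var text = input
    var offset = 0
    while true {
        let length = (text as NSString).length
        let range = checker.rangeOfMisspelledWord(
            in: text, range: NSRange(location: 0, length: length),
            startingAt: offset, wrap: false, language: language)
        if range.location == NSNotFound { break }
        let guesses = checker.guesses(forWordRange: range, in: text, language: language) ?? []
        if let first = guesses.first {
            text = (text as NSString).replacingCharacters(in: range, with: first)
            offset = range.location + (first as NSString).length
        } else {
            offset = range.location + range.length // no suggestion; skip past it
        }
    }
    return text
}

print(autocorrect("Spel cheking is finaly usable on iOS."))
```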

For someone who writes a lot while on the run, adding “misspelled words” to the rotor may be one of iOS 11’s most appreciated features.

A virtually unknown iOS VoiceOver feature: automatically announcing the time every minute

As an iOS VoiceOver user, several years ago I discovered that if I touched the clock item in the status bar, VoiceOver would continue to automatically announce the time every minute until interrupted by a touch or certain incoming notifications. I can’t remember exactly when this became a feature, but it was more than three years ago, and I’ve never heard anyone else mention it, nor have I seen it documented anywhere; so I thought I’d share it, as I can imagine it being helpful to many others.

This time announcement feature is very useful to me, especially when I’m in a hurry and need to get ready for something quickly. I even use it occasionally with my Anker Soundcore Bluetooth speaker in the shower; time can really accelerate there. Time announcements are also available on macOS in the Date & Time preference pane, near the bottom of the Clock tab, though not customizable to the exact minute; 15-, 30-, and 60-minute intervals are the options. I could also see this being useful on the Apple Watch, though it’s not there yet.
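
Apple doesn’t publish an API for VoiceOver’s own clock announcements, but developers who want a similar effect inside their own app could approximate it with a timer and a VoiceOver announcement. Here is a minimal hedged Swift sketch; the class name, the one-minute interval, and the time format are my own choices, not Apple’s implementation:

```swift
import UIKit

// Approximate the status-bar clock trick inside an app: post the current time
// as a VoiceOver announcement once a minute until the timer is invalidated.
final class MinuteAnnouncer {
    private var timer: Timer?
    private let formatter: DateFormatter = {
        let f = DateFormatter()
        f.timeStyle = .short   // e.g. "3:07 PM"
        return f
    }()

    func start() {
        timer = Timer.scheduledTimer(withTimeInterval: 60, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            UIAccessibility.post(notification: .announcement,
                                 argument: self.formatter.string(from: Date()))
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```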