Category Archives: Visual interfaces

Another post I wrote with additional thoughts about Apple’s Face ID

Posted on October 1, 2018

Two weeks ago, I wrote about how I strongly dislike Apple’s Face ID, and although some in the blind community have agreed with my thoughts, there are also some who do not. They say “oh, Face ID is just fine and it’s accessible,” etc. Accessible, for sure, but not efficient; and in some cases it totally breaks people’s workflows.

Because accessibility has lacked efficiency in many ways over the years, some bigger than others, blind people have often collectively accepted that they have to deal with it. There might be an app that is almost fully accessible except for one screen, so blind users memorize that screen and don’t complain. Then there are environments like amateur (ham) radio, where the equipment isn’t really accessible at all, or the accessible versions are often considerably more expensive (though things are improving), so blind people write and share guides to help each other get around the problems. I respect the people who wrote those guides, have appreciated them many times, and even wrote a few myself, but the question needs to be asked: why are we just rolling over as a community and accepting this? Why aren’t we pushing back harder, finding polite and respectful ways to ask for better, or joining in to help create more accessible and more efficient solutions?

With that preamble out of the way, I now return to the Face ID situation. To date, Twitter is the most accessible and efficient social media platform for blind users, and it is there that you will find them discussing anything and everything they find important. Now that we have had a year with Apple’s Face ID, there have been tweets among the blind community about it, though I find most of them just saying things like “yeah, it’s OK, I got used to it,” or even a few saying it’s amazing and works great. I think Santiago’s tweets, shared here, encompass much of this mentality.

Santiago – @kevinrj There are blind people that say that faceID is an accessibility issue, but I don’t feel like it is. Unlocking your phone with touchID in your pocket isn’t an accessibility feature. It’s simply a convenience. A nice one at that, but not necessary.
Santiago – @kevinrj Well, that convenience certainly goes away, and I honestly wish I could have it back sometimes. Could FaceID improve? Certainly, but I think everyone experiences similar issues with it. Even sighted people.

Santiago – @kevinrj You do also have the security issue with it. When it comes to sighted people, the phone actually looks for your attention in order to unlock. It automatically turns it off if you’re using VoiceOver. I have a work-around, but again… not very convenient. 
Santiago – @kevinrj I’m all about efficiency. Heck, I sometimes upgrade my products, because they slow down after years and affect that greatly, but I, a totally blind person, have efficiently used my iPhone X for almost a year now. Is there a learning curve? Yes. But it’s accessible.

Yes, as I said earlier, it’s accessible, but that doesn’t mean efficient. Could I take a bus from New York to Los Angeles? Sure, it’s totally accessible, and would even be way cheaper, but if I had to make that trip every two months for my job, I would not want to waste up to a week each time that I could save by flying. For a blind person, Face ID is very much like that: some are making it work or even enjoying it, just as some people genuinely enjoy long bus rides. I haven’t found that in my own experience, but I think it has something to do with the scenery.

Sina Bahram is ABD on a PhD in computer science and is probably the most advanced computer user in the blind community whom I know of. Last week I found a thread on Twitter with him and a few other people about why Face ID is a step back for blind accessibility. These are not just opinions, but hard facts that should be taken seriously.

In this thread, screen curtain is mentioned, but is mostly just called curtain, which I realized may be confusing to those who don’t know about it. Screen curtain is an amazing VoiceOver feature that turns the display off while VoiceOver keeps working; along with bringing added security and privacy to VoiceOver users on Apple products, it can also save battery life.

Sina Bahram – Wow, I was not expecting to do this, but I’m returning the iPhone 10S. I cannot believe what an atrocious experience this is now. FaceID is nowhere near as efficient or accessible as fingerprint sensor. Not having a home button is ridiculous. No more immediacy of access. #a11y

James Teh – @SinaBahram I suspect my feeling will be the same. Some people seem to cope okay, but I just can’t see how face ID could be more efficient for us. And my understanding is you have to disable the gaze stuff, which means we reduce security due to disability, but I might be misunderstanding.

Michiel Bijl – @jcsteh @SinaBahram I’d be curious to know how that is measured. If it’s done by detecting whether your eyes are pointed at the phone with eyelids open—it might not be a problem for everyone.
Of course you can always use the passcode but that’s a major step back from Touch ID.

Michiel Bijl – @SinaBahram The interaction to go home with VoiceOver is weird. I mess that up rather regularly. Any tips?

James Teh – @MichielBijl @SinaBahram Also, the whole idea of having to actually pick up my phone and bring it to my face just to unlock it… so many kinds of bleh. The number of times I just quickly look at my phone with one hand while it sits on my desk…

Julianna Rowsell – @SinaBahram A friend of my is having similar feelings. His physical access disability doesn’t allow him to effectively use it. The angles to his face are wrong and such so the recognition  software doesn’t authenticate. – Retweeted by SinaBahram

Sina Bahram – @jcsteh @MichielBijl Exactly. This is just simply unacceptable. I really hope that some advocates inside of Apple bothered trying to speak up. It’s just not like them, sir. There are so many use cases this completely destroys.

Sina Bahram – @MichielBijl Yes, the tip is to get another phone. I’m not being sarcastic. I just boxed mine up. I am an expert in this topic and the most power user I have encountered, not bragging just facts, and this is unusable. So, I’m done. I’ll try to motivate some internal Apple convos, but no idea.
Sina Bahram – @MichielBijl @jcsteh I, plus its default now if it detects VO running, have turned off attention requirements. That’s not the FaceID issue here. The issue is that it doesn’t work in the dark with curtain on and it requires your face instead of your hand that’s already touching the device.

Sina Bahram – @jcsteh You are absolutely not misunderstanding. You are reducing security because of disability. Welcome to every X phone from original to the S and Max. Other concerns make this unusable, though.

James Teh – @SinaBahram @MichielBijl Oh… I didn’t think of that, and that’s super frustrating. So I’d have to turn off curtain to unlock my phone or go turn on a light? How utterly ridiculous.

Sina Bahram – @jcsteh @MichielBijl Yup, I can confirm that. I turn off curtain, and oh look, it’s magically like 10X more accurate, and turn it back on … pin code it is!
Tomi 🇭🇺 – @SinaBahram wait, doesn’t work in the dark with curtain on? Is this a thing? Does having screen curtain change this? I thought infra-red works with low or no light anyway since it’s using its own infra-red beams, so most people I read about using it said it works at night /in dark.

Sina Bahram – @tomi91 Everyone assumes infra-red means works in dark. This is not true. Infra-red buys you depth sensing independent of (visible)  light. That barely matters since gaze is disabled by most VO users. Face ID  still needs (visible) light in addition to depth info.

Tomi 🇭🇺 – @SinaBahram oh that’s interesting. I wonder if people re-enable attention mode if it changes. But then again some people can’t even get their eyes straight (like me) so it’d probably just fail over and over. Man I’m really glad about my 8 now, thanks for that hope. lol

Sina Bahram – @tomi91 That feature is automatically turned off for VO users, so the eyes thing is not an issue itself, though it negatively impacts everything from lack of full messages on lock screen to everything else.

I wish Apple had an iPhone Pro line, kind of like their iPad Pros. Face ID would be a great feature on those phones; then, instead of an iPhone XR, they could offer what the iPhone 9 should have been: still an A12 processor, but maybe a slightly lower-quality camera, a smaller screen, and also still a home button.

There are still people who would like a smaller phone. There are still sighted people who are not obsessed with the latest advances in screen technology, or who don’t even care whether they have the best camera. There are even sighted people who would still prefer Touch ID over Face ID; Alex Lindsay, who is one of the most visually oriented people I know of, said on MacBreak Weekly recently that he personally prefers Touch ID but thinks phones should actually have both.


My thoughts about how, as a VoiceOver user, I find Face ID accessible but not very efficient

Posted on September 13, 2018

On January 9, 2007, Steve Jobs introduced the first iPhone, but I wasn’t sucked into his reality distortion field quite yet. My friend Nathan Klapoetke was ecstatic about it, though, and told me later that day, while reinstalling Windows for me, that he couldn’t wait to buy one; those were a very long six months for Nathan.

I, along with much of the blind community, was frustrated though, and felt left out. Some of us knew that the touch screen, just a piece of glass to us then, would be the future, and that soon no companies would make phones with number pads anymore. Then, at WWDC 2009, Apple introduced iPhone OS 3 and mentioned VoiceOver during the keynote; the blind could use modern phones again. Those two years were hard, though. Many of us in the blind community had Nokia phones, and some were playing with Windows phones, but they weren’t changing our lives anywhere near as much as the iPhone would later. This is how I feel things stand now with the iPhone X models that started last year.

Apple is obsessed with Face ID and is making it their future, but I’m feeling a bit left out again.

Yes, Face ID is accessible; I played some with an iPhone X last fall. But as more people have recently been realizing, accessible does not always mean efficient. I found it quite finicky to set up, and because I wear prosthetic eyes, when they’re out I appear as a different person. There is an article written specifically for a blind person setting up Face ID, but note that it tells you to put your face in the frame without actually explaining how to do that when you’re unable to see the frame. Then there’s the trick of figuring out how far away from your face to hold the phone. I also have no idea how to move my face around in a circle while simultaneously staring at one spot. OK, I understand the concept, but I can’t physically do it. It took me about 15 minutes to get Face ID set up the first time. Another problem is how attention mode is disabled by default when VoiceOver is enabled. I understand that Face ID would work less reliably with it on for blind people who are unable to visually focus, but that’s a potential security hole. A blind person could have their phone next to them on their desk, and another person could quietly walk by, pick up the phone, pan it past the blind person’s face, and they’re in.

Beyond setting it up and all the problems of having eyes that don’t work, there are the inconveniences of workflow. My phone is often on my belt, and most people, blind or sighted, keep their phones in a pocket. If you’re blind and have headphones, why would you ever want to take your phone off your belt, or out of your pocket, to use it? Taking your phone out just to authenticate gets annoying real fast, and it may also require you to stop whatever else you were doing. I’m rarely, if ever, looking at my phone when I use it; often my face is pointed in a completely different direction.

I could go on a rant; oh wait, I already am. I could be cynical, flame Apple, or just give up and switch to Android, and some might; but I remember that, although it took two years, Apple did bring VoiceOver to the iPhone in 2009. So here are some thoughts.

There are things I could do today to unlock my phone. I have a long, complicated passcode, and I really don’t want to enter it every time. I could use the Bluetooth keyboard I already carry with me. I could plug in a series 4 Yubico key when I’m home, not around other people, or in a situation where having something rigid plugged into the phone has a low probability of being bumped or damaged. These are only hacks, though; I’m really hoping Apple can come up with an awesome solution again.

The iPhone can already unlock the Apple Watch, and the watch can unlock my Mac; I really hope that my Apple Watch could unlock my iPhone too. Just having the phone unlock whenever my watch is on would definitely not be secure at all, but with the new Apple Watch Series 4 having some capacitance in the digital crown, having to touch the crown to unlock the phone could be a start. Putting Touch ID into a future model of the watch crown would be awesome.

I already wrote about how there are solutions that let me use my wired headphones even with no headphone jack, and I know there are solutions for Touch ID equivalents that don’t involve Face ID too. Whether Apple implements any of them is still a question, but I really hope they will realize how visual-only options inconvenience a non-trivial segment of their market, and give us an alternative.

My realization that blind VoiceOver users have had touch screen Macs since 2009

In the early 1990s, Neal Stephenson released his now well-known book “Snow Crash”. Then in 1999 he wrote the even more famous book Cryptonomicon. He also wrote a lesser-known and much smaller essay entitled “In the Beginning was the Command Line”. In this essay Neal Stephenson talks about interfaces: not just those of computers, but how every object we use has an interface, beginning with his best friend’s dad’s old car. He talks about how, from the first mainframe terminals up to Microsoft Windows and Apple’s Macintosh, the way humans first interacted with the computer was through the command line. The command line is still great: it takes few resources, and even today it can present many more options at once than any graphical interface, often called a GUI. The GUI was invented, though, for the same reason the command line replaced punch cards: the command line was way more efficient than punch cards for everyone, and later the GUI was more convenient and easier to use than the command line, at least for sighted people. Graphical interfaces meant people didn’t have to remember tons of commands, and could become familiar with a system faster. The sighted mind is great at connecting spatially presented, intersecting pieces of information, and the GUI is great at displaying information in two or three dimensions to the visually enabled mind, instead of the one dimension the command line presents. It was a great match, except for the abstractions we still have today. The arrival of Apple’s first Macintosh in 1984 blew the world away with its amazing graphics for that time. And the mouse? I’m sure many wondered why they would ever want a small furry rodent on their desk.


Along with computer mice, we also saw trackballs and trackpads, but they all still have the problem of dynamic rather than static reference.
When using a trackpad, if the mouse pointer is in the center of the screen but the user places their finger in the lower-left corner of the trackpad and slides to the right, the pointer will move from the center of the screen to the center of the right edge; and depending on the settings, the finger may have moved only half an inch, or six inches, while still on the bottom of the trackpad. The mouse is even more removed by abstraction. I played with all three of these input devices during my years on Microsoft Windows, but was never productive with any of them.
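
To make that abstraction concrete, here is a minimal sketch, assuming made-up screen dimensions and a hypothetical sensitivity value rather than any real Apple API, of the relative, delta-based mapping an ordinary trackpad uses: the pointer only ever moves by the finger’s change in position, so where the finger sits on the pad says nothing about where the pointer is on the screen.

```swift
// Illustrative only: hypothetical sizes and names, not Apple's APIs.
struct Point { var x: Double; var y: Double }

let screen = (width: 1440.0, height: 900.0) // assumed screen size in points

/// A regular trackpad is relative: the pointer moves by the finger's delta,
/// scaled by a sensitivity setting, and is clamped to the screen edges.
func movePointer(_ pointer: Point, byFingerDelta delta: Point,
                 sensitivity: Double) -> Point {
    Point(x: min(max(pointer.x + delta.x * sensitivity, 0), screen.width),
          y: min(max(pointer.y + delta.y * sensitivity, 0), screen.height))
}

// The pointer happens to be in the center of the screen; the finger starts in
// the lower-left of the trackpad and slides half an inch to the right.
var pointer = Point(x: screen.width / 2, y: screen.height / 2)
pointer = movePointer(pointer, byFingerDelta: Point(x: 0.5, y: 0), sensitivity: 1500)
print(pointer) // Point(x: 1440.0, y: 450.0): center of the right edge, as described above
```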


In early January 2007, over dinner, my friend Nathan Klapoetke was ecstatic about the new iPhone that had just been announced; at the time I cringed in fear, knowing that soon cell phones would no longer have buttons and having no idea how a blind person would use them.

Two years later, at WWDC 2009, Apple announced that VoiceOver was coming to the 3GS, and the blind community was completely blown away; no one saw that coming. Late in June 2009 I went to the Apple Store and played with VoiceOver for the first time. I’d already read the iPhone manual’s chapter on VoiceOver, so I had a bit of an idea what to do, or at least how to turn VO on. I only had an hour to play, but except for typing, reading text and getting around basic apps didn’t seem too bad; nine days later I bought one. The first text message I tried to send, though, was a complete disaster, but I still knew my world had changed for the better.

The idea that when you touched some part of the screen, you were directly interacting with that icon or word made a lot of sense to people, blind and sighted alike. Even young children, before they can read, understand tapping on icons to start games they already know how to play. In some ways, the touch screen is the command-line equivalent of visual interfaces. Being able to directly touch and manipulate screen elements is efficient on such a basic level that I wouldn’t be surprised at all if using touch screen interfaces activated the same parts of the brain as making something out of play dough or clay. There’s an interesting discussion currently going on about how Microsoft tried to make Windows 8 a touch-first interface and failed, and how Windows 10 now offers touch-based interfaces for those who want them while still behaving like a traditional desktop. On the other hand, Apple never tried to bring touch screens to macOS at all until the 2016 line of MacBooks with the new Touch Bar, which really isn’t a touch screen at all and, since many Macs still don’t have one, can only ever be an optional extra for programs to support.

And now, as Paul Harvey used to say, “the rest of the story.” As most people would tell you, and as Google searches would confirm, there are no Apple computers with a touch screen. Except, that is, if you’re a totally blind person using VoiceOver. The gestures VoiceOver users learn on their iPhones have been available to them on their Macs as well, starting with Snow Leopard; with trackpad commander on, VoiceOver behaves very much like it does on iOS. If, with trackpad commander on, I touch the exact center of the trackpad, the mouse pointer is also at the exact center of the screen, and if VoiceOver announces an icon I want, I just double-tap to activate it. All of the abstraction I struggled with when trying to use a mouse or trackpad without the commander mode is gone; but here’s a rare moment where sight still gets in the way. It is so instinctive for someone who can see to visually follow where their hand is going that even if most of them turned VoiceOver and trackpad commander on and speech off while still looking at the screen, they would find it quite difficult to use; the screen being visually separate from the trackpad is too abstract for many of them. The trackpad is obviously much smaller than the actual screen, though since I can’t see it that doesn’t really matter anyway; but beyond that, as a VoiceOver user I’ve had a touch screen on my Mac for 7 years. I, and probably most other blind users, still don’t use it as much as we probably should, or for many of us hardly at all, though I have found some ways in which it is way more efficient than using more traditional VoiceOver keyboard commands.
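
For contrast with the relative mapping sketched earlier, here is a minimal sketch of the absolute mapping that trackpad commander conceptually behaves like; the trackpad and screen dimensions and the function name are made up for illustration, not Apple’s actual implementation.

```swift
// Illustrative only: hypothetical sizes and names, not Apple's implementation.
struct Point { var x: Double; var y: Double }

let trackpad = (width: 4.0, height: 3.0)       // assumed trackpad size in inches
let screen   = (width: 1440.0, height: 900.0)  // assumed screen size in points

/// With an absolute mapping, the finger's normalized position on the trackpad
/// maps straight to a screen position: no history, no sensitivity, no drift.
func screenPosition(forFingerAt finger: Point) -> Point {
    Point(x: finger.x / trackpad.width * screen.width,
          y: finger.y / trackpad.height * screen.height)
}

// Touch the exact center of the trackpad and the cursor is at the exact center
// of the screen; touch a corner and it is in the matching corner.
print(screenPosition(forFingerAt: Point(x: 2.0, y: 1.5))) // Point(x: 720.0, y: 450.0)
print(screenPosition(forFingerAt: Point(x: 0.0, y: 0.0))) // Point(x: 0.0, y: 0.0)
```

Because the mapping is fixed, with no history or sensitivity involved, muscle memory for screen locations can transfer straight over from the iPhone.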


If I’m told that an interface control I want is in the lower-left corner of the screen, then using trackpad commander I can go there directly. If I’m using an interface with a list of items in the center and buttons around the edge, I can get to the list way faster than by navigating there with a keyboard.

Tim Sniffen has published a book entitled “Mastering the Mac with VoiceOver”, in which he for the most part ignores keyboard commands altogether and teaches with trackpad commander mode instead. He trains many veterans who lost their sight while deployed, and says that after they become comfortable with VoiceOver on iOS, it’s an easy transition for them to their Macs. We VoiceOver users should probably listen more to Tim and learn from his experiential wisdom. And for the sighted and proud: if your vision ever degrades so far that in the end you have to use VoiceOver, at least you’ll have a touch screen on your Mac.