My thoughts about what blind people see, or don’t see. Spoiler, it’s not a short answer :)

Posted on November 7, 2018

Being totally blind since birth, I get asked periodically what I see, as if sighted people can’t begin to imagine what not seeing anything would be like; this is, in fact, the case. Before I had ever been in an airplane, I used to have dreams about flying in one; they were all fantastically wrong. My brain had no accurate data to base those dreams on, so it just made things up. While we’re at it, a congenitally blind person will probably not dream visually, because they have no visual memories to draw on. A recent study, however, shows that visual dreaming is still possible in some cases: blind people who lost their sight as early as age 5 or 6 can dream visually for the rest of their lives, and many in this situation do. And now we’re getting back to the original question.

Damon Rose, a blind journalist with the BBC, has also answered this question, but his answer is very different from what I experience. He had sight until age 13 and went blind for a different reason than I did; so beyond the fact that his primary visual cortex somewhat developed, and that his optic nerves may still have some residual connections, he also has visual memories his brain can still play with.

I was born 3 months premature and became blind in an incubator from retinopathy of prematurity; it destroyed my optic nerves, my eyes never grew beyond the size of a 2-year-old’s, and my brain eventually remapped. People have respectfully said to me, “If I blindfold myself I can still see black, so is that what you see?”

Seeing blackness means you can distinguish between light and dark, which means you can perceive visually, so I thought about this for a while and came up with an analogy that I think makes sense.

Say you had a radio that picked up a bunch of stations. We could say those stations were different colors, and the white noise or static between the stations was black; except I can’t call it black noise, because black noise actually exists, and means near-absolute silence. Then one day, all of the stations disappear with no explanation. You still turn on the radio occasionally to see if they come back, but they never do; now you’re just figuratively hearing black. Then one day you learn those stations will never come back, so you unplug the radio and throw it away. Now you aren’t even hearing black; as far as the radio is concerned, you’re hearing nothing at all, which would actually equate to black noise. And the time you had been spending listening to radio stations is now free for other things.

This is basically what the visual cortex in my brain, and those of other congenitally blind people, does. Once enough time goes by with no input from the eyes through the optic nerves, the brain begins to remap; neuroplasticity happens. As babies, humans have many billions of neurons, all waiting to develop and totally impressionable; and since about 90% of a sighted person’s sensory input is visual, that’s where many of them go. When there is no visual input, they get programmed to do other things. When a totally blind person reads braille, for example, the same activity goes on in the primary visual cortex that happens when a sighted person reads print. Blind people are often thought to hear better than those who can see, but deaf people are also far more observant visually than sighted people who can hear; in part, the brain’s resources devoted to the lost sense redistribute and enhance those that are left. Enhanced sometimes means increased, but not always, as in the case of children who lose a sense after age 2 or 3; in those cases the brain is at least more focused on the senses that remain.

So in my case, what do I see? Absolutely nothing, which is not black; black would have been like the static on that radio, or the number zero. Consider for a moment that zero is far more meaningful than nothing in mathematics; it is the center of the number line, after all.


A much improved way to spell check documents using VoiceOver on iOS beginning with iOS 12.1

Posted on November 6, 2018

There was a way, reproducible though not very convenient, to spell check documents in iOS 11 using VoiceOver. At the time I thought it was cool, though somewhat difficult to remember, but wrote a blog post about it anyway.

A big thank you to Scott Davert, who discovered that in iOS 12.1 the spell-checking process was made much more efficient. He demonstrated it in a recent AppleVis podcast, which is where I learned about it. I have to admit that even though I wrote the blog post describing how to correct spelling in iOS 11, I rarely if ever used that method; I just wrote things on my iPhone, as I am now, and then corrected the spelling on my MacBook. I can honestly say I will now correct spelling much more often, probably whenever I write anything beyond a sentence or two on my iOS devices. This VoiceOver improvement truly makes any iOS device a real writing device for blind users.
In fact, I just spell checked the last paragraph in possibly less than 30 seconds on my iPhone. This will be the coolest feature for me in iOS 12.1.

Let’s figure out how to do it.

1. Set VoiceOver rotor to misspelled words.

2. Swipe up or down, or press up or down arrows on your keyboard or braille displays to find the previous or next misspelled word.

3. Flick right with a finger, or move right with the keyboard; each move shows you the next item in a list of correctly spelled suggestions.

4. If you find the word you want, double tap it, or activate it with your keyboard, and it will replace the misspelled word.

5. If the word you want is not in the list, the offending misspelling is selected, so pressing delete or backspace will erase it. Then you can enter another attempt.

Now maybe I can start blogging directly from iOS. If I were still in school, I think I could seriously write a paper completely on iOS. Since I almost always use a Bluetooth keyboard, I could even do it on an iPhone. Hey, when not writing blog posts or school papers, spell checking emails will also be a snap.

Another post I wrote with additional thoughts about Apple’s Face ID

Posted on October 1, 2018

Two weeks ago, I wrote about how I strongly dislike Apple’s Face ID; and although some in the blind community have agreed with my thoughts, there are also some who do not. They say “oh Face ID is just fine and it’s accessible,” etc. Accessible for sure, but not efficient; and in some cases totally breaks people’s workflows.

Because accessibility has lacked efficiency in many ways over the years, some bigger than others, blind people have often collectively accepted that they have to deal with it. There might be an app that is almost accessible except for one screen, so blind users memorize the screen and don’t complain. Then there are environments like amateur or ham radio, where the equipment isn’t really accessible at all, or the accessible versions are often considerably more expensive (though things are improving), so blind people write and share guides to help each other work around the problems. I respect the people who wrote those guides, have appreciated them many times, and even wrote a few myself, but the question needs to be asked: why are we just rolling over as a community and accepting this? Why aren’t we pushing back harder, finding polite and respectful ways to ask for, or joining in to help create, more accessible and even more efficient solutions?

With that context, I now return to the Face ID situation. To date, Twitter is the most accessible and efficient social medium for blind users, and it is there you will find them discussing anything and everything they find important. Now that we have had a year with Apple’s Face ID, there have been tweets among the blind community about it, though I find most of them just saying things like, yeah, it’s OK, I got used to it; or even a few saying it’s amazing and works great. Santiago’s tweets, shared here, I think encompass much of this mentality.

Santiago – @kevinrj There are blind people that say that faceID is an accessibility issue, but I don’t feel like it is. Unlocking your phone with touchID in your pocket isn’t an accessibility feature. It’s simply a convenience. A nice one at that, but not necessary.
Santiago – @kevinrj Well, that convenience certainly goes away, and I honestly wish I could have it back sometimes. Could FaceID improve? Certainly, but I think everyone experiences similar issues with it. Even sighted people.

Santiago – @kevinrj You do also have the security issue with it. When it comes to sighted people, the phone actually looks for your attention in order to unlock. It automatically turns it off if you’re using VoiceOver. I have a work-around, but again… not very convenient. 
Santiago – @kevinrj I’m all about efficiency. Heck, I sometimes upgrade my products, because they slow down after years and affect that greatly, but I, a totally blind person, have efficiently used my iPhone X for almost a year now. Is there a learning curve? Yes. But it’s accessible.

Yes, as I said earlier, it’s accessible, but that doesn’t mean efficient. Could I take a bus from New York to Los Angeles? Sure, it’s totally accessible, and would even be way cheaper; but if I had to do it every 2 months for my job, I would not like wasting up to a week each time that I could save by flying. For a blind person, Face ID is very much like that: even though some are making it work or even enjoying it, some people also enjoy long bus rides. I haven’t found that from my own personal experiences, but I think it has something to do with the scenery.

Sina Bahram is ABD in a computer science PhD and is probably the most advanced computer user in the blind community I know of. Last week I found a thread on Twitter with him and a few other people about why Face ID is a step back for blind accessibility. These are not just opinions, but hard facts that should be taken seriously.

In this thread, screen curtain is mentioned, but mostly only called curtain, which I realized may be confusing to those who don’t know about it. Screen curtain is an amazing VoiceOver feature that brings added security and privacy to VoiceOver users on Apple products and can definitely also save battery life.

Sina Bahram – Wow, I was not expecting to do this, but I’m returning the iPhone 10S. I cannot believe what an atrocious experience this is now. FaceID is nowhere near as efficient or accessible as fingerprint sensor. Not having a home button is ridiculous. No more immediacy of access. #a11y

James Teh – @SinaBahram I suspect my feeling will be the same. Some people seem to cope okay, but I just can’t see how face ID could be more efficient for us. And my understanding is you have to disable the gaze stuff, which means we reduce security due to disability, but I might be misunderstanding.

Michiel Bijl – @jcsteh @SinaBahram I’d be curious to know how that is measured. If it’s done by detecting whether your eyes are pointed at the phone with eyelids open—it might not be a problem for everyone.
Of course you can always use the passcode but that’s a major step back from Touch ID.

Michiel Bijl – @SinaBahram The interaction to go home with VoiceOver is weird. I mess that up rather regularly. Any tips?

James Teh – @MichielBijl @SinaBahram Also, the whole idea of having to actually pick up my phone and bring it to my face just to unlock it… so many kinds of bleh. The number of times I just quickly look at my phone with one hand while it sits on my desk…

Julianna Rowsell – @SinaBahram A friend of my is having similar feelings. His physical access disability doesn’t allow him to effectively use it. The angles to his face are wrong and such so the recognition  software doesn’t authenticate. – Retweeted by SinaBahram

Sina Bahram – @jcsteh @MichielBijl Exactly. This is just simply unacceptable. I really hope that some advocates inside of Apple bothered trying to speak up. It’s just not like them, sir. There are so many use cases this completely destroys.

Sina Bahram – @MichielBijl Yes, the tip is to get another phone. I’m not being sarcastic. I just boxed mine up. I am an expert in this topic and the most power user I have encountered, not bragging just facts, and this is unusable. So, I’m done. I’ll try to motivate some internal Apple convos, but no idea.
Sina Bahram – @MichielBijl @jcsteh I, plus its default now if it detects VO running, have turned off attention requirements. That’s not the FaceID issue here. The issue is that it doesn’t work in the dark with curtain on and it requires your face instead of your hand that’s already touching the device.

Sina Bahram – @jcsteh You are absolutely not misunderstanding. You are reducing security because of disability. Welcome to every X phone from original to the S and Max. Other concerns make this unusable, though.

James Teh – @SinaBahram @MichielBijl Oh… I didn’t think of that, and that’s super frustrating. So I’d have to turn off curtain to unlock my phone or go turn on a light? How utterly ridiculous.

Sina Bahram – @jcsteh @MichielBijl Yup, I can confirm that. I turn off curtain, and oh look, it’s magically like 10X more accurate, and turn it back on … pin code it is!
Tomi 🇭🇺 – @SinaBahram wait, doesn’t work in the dark with curtain on? Is this a thing? Does having screen curtain change this? I thought infra-red works with low or no light anyway since it’s using its own infra-red beams, so most people I read about using it said it works at night /in dark.

Sina Bahram – @tomi91 Everyone assumes infra-red means works in dark. This is not true. Infra-red buys you depth sensing independent of (visible)  light. That barely matters since gaze is disabled by most VO users. Face ID  still needs (visible) light in addition to depth info.

Tomi 🇭🇺 – @SinaBahram oh that’s interesting. I wonder if people re-enable attention mode if it changes. But then again some people can’t even get their eyes straight (like me) so it’d probably just fail over and over. Man I’m really glad about my 8 now, thanks for that hope. lol

Sina Bahram – @tomi91 That feature is automatically turned off for VO users, so the eyes thing is not an issue itself, though it negatively impacts everything from lack of full messages on lock screen to everything else.

I wish Apple had an iPhone Pro line, kind of like the iPad Pro. Face ID would be a great feature on those phones, and then instead of an iPhone XR they could have what the iPhone 9 should have been: still an A12 processor, but maybe a slightly lower quality camera, a smaller screen, and also still a home button.

There are still people who would like a smaller phone. There are even some sighted people who are not obsessed with the latest perfections in screen technology, or don’t care whether they have the best camera. There are even some sighted people who would still prefer Touch ID over Face ID. Even Alex Lindsay, who is one of the most visually oriented people I know of, said on MacBreak Weekly recently that he personally prefers Touch ID, but thinks phones should actually have both.


My thoughts about how as a VoiceOver user, Face ID although accessible is not very efficient

Posted on September 13, 2018
On January 9, 2007 Steve Jobs introduced the first iPhone, but I wasn’t sucked into his reality distortion field quite yet. My friend Nathan Klapoetke was ecstatic about it though and told me later that day when reinstalling Windows for me that he couldn’t wait to buy one; that was a very long 6 months for Nathan.

I, and much of the blind community, were frustrated though, and felt left out. Some of us knew that the touch screen, just a piece of glass to us then, would be the future, and that soon no companies would make phones with number pads anymore. Then, at WWDC 2009, Apple introduced iOS 3 and mentioned VoiceOver during the keynote; the blind could use modern phones again. Those 2 years were hard though. Many of us in the blind community had Nokia phones, and some were playing with Windows phones, but they weren’t changing our lives anywhere near as much as the iPhone later would. This is how I feel things currently are with the iPhone X models that started last year.

Apple is obsessed with Face ID and is making it their future, but I’m feeling a bit left out again.

Yes, Face ID is accessible; I played some with an iPhone X last fall, but as more people are realizing, accessible does not always mean efficient. I found it quite finicky to set up, and because I wear prosthetic eyes, when they’re out I appear as a different person. There is an article written specifically for a blind person setting up Face ID, but note that it tells you to put your face in the frame without actually explaining how to do that when you can’t see the frame. Then there’s the trick of figuring out how far from your face to hold the phone. I also have no idea how to move my face around in a circle while simultaneously staring at one spot; I understand the concept, but can’t physically do it. It took me about 15 minutes to get Face ID set up the first time. Another problem is how attention mode is disabled by default when VoiceOver is enabled. I understand that Face ID would work less well with it on for blind people who are unable to visually focus, but that’s a potential security hole: a blind person could have their phone next to them on their desk, and another person could quietly walk by, pick up the phone, pan it past the blind person’s face, and they’re in.

Beyond setting it up and all the problems of having eyes that don’t work, there are the inconveniences of workflow. My phone is often on my belt, and most people, blind or sighted, keep their phones in a pocket. If you’re blind and have headphones, why would you ever want to take your phone off your belt, or out of your pocket, to use it? Taking your phone out just to authenticate gets annoying real fast, and it may also require you to stop whatever else you were doing. I’m rarely if ever looking at my phone when I use it; often my face is pointed in a completely different direction.

I could go on a rant; oh wait, I already am. I could be cynical, flame Apple, or just give up and switch to Android, and some might; but I remember that, although it took 2 years, Apple did bring VoiceOver to the iPhone in 2009. So here are some thoughts.

There are things I could do today to unlock my phone. I have a long, complicated passcode, and I really don’t want to enter it every time. I could use the Bluetooth keyboard I already carry with me. I could plug in a series 4 Yubico key when I’m home, not around other people, or in a situation where having something rigid plugged into the phone has a low probability of being bumped or damaged. These are only hacks, though; I’m really hoping Apple can come up with an awesome solution again.

The iPhone can already unlock the Apple Watch, and the watch can unlock my Mac; I really wish my Apple Watch could unlock my iPhone too. Just having the phone unlock whenever my watch is on would definitely not be secure at all, but with the new Apple Watch Series 4 having some capacitance in the digital crown, having to touch the crown to unlock the phone could be a start. Putting Touch ID into a future model of the watch crown would be awesome.

I already wrote about how there are solutions that let me use my wired headphones even with no headphone jack, and I know there are solutions for Touch ID equivalents that don’t require Face ID too. Whether Apple implements any of them is still a question, but I really hope they will realize how visual-only options inconvenience a non-trivial segment of their market, and give us an alternative.

How to play one of my favorite games, Quixo, completely wooden and low tech, remotely with friends who live farther away

Since 2014 one of the highlights of my summers has been a day at the Bristol, WI Renaissance Faire. Last year I found out they had a game store, and was introduced to the game Quixo. Quixo is sometimes explained as tic-tac-toe on steroids, but that’s superficial at best. The game has 5 rows of 5 cubes, which all start face up and blank. Players, either 2 to a game or 2 teams, pick either X or O and begin. The wooden set of Quixo has the letters engraved into the cubes, so it is totally accessible literally out of the box; the travel version, however, uses plastic cubes on which the letters are not tactile. Players actually move the pieces around, shifting whole rows or columns of pieces in a move; it’s lots of fun.

I’ve enjoyed it and even gave a few sets to family for Christmas last year. The problem, though, is that Quixo seems to be rarely known, so most of my friends who play it don’t live near enough to play in person very often, and I’ve been thinking about how to play remotely. I thought about how people play chess using algebraic notation, and went from there.

It took me a while, but I finally realized that if 2 people playing a game of Quixo thought of the board as a1 being in the lower left, e1 lower right, a5 upper left, and e5 upper right, and communicated moves as something like “a1 to a5”, it would work. It would be the same as if player 1 sat at the board and made a move, then got up, and player 2 sat in the exact same chair and made a move. This would allow people to play by text or some kind of voice call.

My friend Andrew Hanson was over last weekend, and he said that when people play remotely they don’t even have to agree that a1 is the lower-left corner; no matter which corner a1 is in, if a player always refers to that square as a1 it will work, though the two boards will look different. Trying to rotate spatial frames of reference around in my mind caused a meltdown, so I’ll just prefer to always call a1 lower left and leave it at that.

I still think that both players need to refer to the squares with the same notation in a game, especially when played on the same board, or at best it will soon become very complicated.
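The coordinate scheme above is easy to make precise in code. Here is a minimal sketch in Python, assuming the convention I prefer (a1 lower left, e5 upper right); the “a1 to a5” move format and the helper names are my own, not any official Quixo notation.

```python
# A minimal sketch of the remote-Quixo notation described above.
# Columns a-e run left to right and rows 1-5 run bottom to top,
# so "a1" is the lower-left corner and "e5" the upper-right.

COLS = "abcde"
ROWS = "12345"

def parse_square(square):
    """Turn a square name like 'a1' into 0-based (column, row) indices."""
    square = square.strip().lower()
    if len(square) != 2 or square[0] not in COLS or square[1] not in ROWS:
        raise ValueError(f"not a Quixo square: {square!r}")
    return COLS.index(square[0]), ROWS.index(square[1])

def parse_move(text):
    """Parse a texted or spoken move like 'a1 to a5' into two squares.

    In Quixo a cube is taken from the edge and pushed back in along the
    same row or column, so the two squares must share one or the other.
    """
    parts = text.lower().split("to")
    if len(parts) != 2:
        raise ValueError(f"expected '<from> to <to>': {text!r}")
    src, dst = parse_square(parts[0]), parse_square(parts[1])
    if src[0] != dst[0] and src[1] != dst[1]:
        raise ValueError("squares must share a row or a column")
    return src, dst
```

For example, parse_move("a1 to a5") comes back as ((0, 0), (0, 4)), while a move whose squares share neither a row nor a column is rejected, which catches most mishearings over a voice call.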

At any rate, Quixo can now easily be played remotely. Try it out, it can be lots of fun.

How, for me, bone conduction headphones by Aftershokz are one of those technologies that are not just nice to have but a huge game changer for blind people

Posted on July 17, 2018
Shortly after Christmas when I was 5, my sister Andrea introduced me to headphones, one of those single-ear earpieces from the 1970s, and showed me how I could plug it into my radio and listen. After a few minutes of private listening I couldn’t understand at all why she, or anyone else around me, couldn’t hear it. That just blew my 5-year-old mind, and it hasn’t been the same since.

As I got older, headphones became more a part of my workflow. I know there are good speakers out there; I’m just a headphone guy at ear. Part of this came from using screen readers, first on computers and later on phones; besides the privacy headphones give me, I’m sure people around me appreciate not being annoyed by the speech. I used headphones on the bus, at work, and even when I took classes in college. There was still one area where headphones couldn’t help me though: traveling alone with the white cane. I began to use GPS apps on my phone, but all the headphones I knew of blocked some of the sounds of my environment, so I didn’t feel safe wearing them while walking; and when traffic was loud I couldn’t hear the GPS info from the phone anyway. Then I learned about bone conduction.

My friend Hai Nguyen Ly told me about Aftershokz and their line of bone conduction headphones: they rest on the face, using transducers to convey sound through the cheekbones, leaving the ears completely uncovered and blocking none of a person’s natural hearing. I couldn’t afford them at the time, so Hai sent me an old pair he was no longer using. Like Andrea introducing me to headphones so long ago, Hai improved my life again.

The first time I used the Aftershokz, psychologically I wasn’t quite sure they really weren’t blocking my hearing, but it didn’t take long to realize that they weren’t. I could hear traffic just fine, and the GPS info from my phone was always audible, even when the loudest truck or bus roared by. Now, 5 years and 2 more pairs of Aftershokz headphones later, I still use them pretty much every day. I wear them all the time when I’m in public, and they work great at meetings and conferences. Even when I don’t need to hear traffic while cane traveling, they let me use my phone without interrupting anyone around me while still hearing what they’re saying. Yes, trying to understand both audio streams might not work as well as I’d like, but sighted people get distracted too.

Some of you reading this might be wondering: OK, but why does this matter? In my last post I talked about how, for most of us most of the time, technology is a nice convenience, but for those with disabilities it can be a huge life changer; this is definitely one of those cases, especially for blind users of screen readers. Bone conduction headphones allow them to get the info they need in real time while still having full access to their environment through their primary remaining sense. Bone conduction technology may have been initially invented for the military, but now, thankfully, it is also being used to help humans be more human.

Technology for most is a nice thing to have, but for those with disabilities, how huge a game changer technology is in improving their lives cannot be overstated

Posted on July 6, 2018

As I wrote in my last post, technology in its most basic definition is an innovation that makes some task easier, and for most of us, most of the time, that is the case; but for people with disabilities, it is far more than that.
Technology helps us get places faster and more safely. Technology has made communication more convenient today than was thought possible even a short time ago. Technology is nice, cool, fun, entertaining, amazing, and many more words beyond this sentence; but for those with disabilities, technology is more than any of those accolades: technology is life changing, sometimes in unimaginable ways; sometimes, I think, in ways that can only be realized not in blog posts or videos, but in first-hand experiences.

One of my favorite podcasts is Mac Power Users, and several episodes ago David Sparks and Katie Floyd were talking about one of my least favorite forms of technology, the PDF file format. If PDFs have done anything good for me, though, it is that many more people in recent years are aware of and use optical character recognition (OCR).
During the episode David talked about how he had OCR-scanned a bunch of documents, which enabled him to find a phrase he had heard and needed to refer to during a court case. OCR is great even if you’re sighted, because it lets you very quickly search documents and even automate tasks like organizing and processing them. It wasn’t that many years ago that OCR was thought of as unnecessary, slow, not worth it, and often avoided; but for blind people, optical character recognition is one of the most enabling technologies to come out since the invention of braille itself.
Yes, braille is awesome and crucial to the education and development of a blind person’s intellect, but only about 1% of all the books in the world are ever commercially produced in braille. Audiobooks and text to speech, together with OCR, have made many more books available to blind readers, but nothing will replace braille for things like mathematics and program code. I know some blind geeks will flame me for this and say they don’t need braille at all and write program code all day; but I know from years of personal experience that no matter how good they are at hearing text to speech spell out arcane function names and all types of punctuation, using an exorbitantly expensive refreshable braille display would significantly increase their efficiency; a whole other topic for another post another day.

Beyond all of that, OCR means that if there isn’t an e-book available for a title, a blind person can probably buy a print copy, scan it in, and relatively quickly have a copy they can read in braille or with text to speech.
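To make the search idea concrete, here is a minimal sketch of the kind of lookup OCR enables. It assumes the scans have already been run through whatever OCR engine you prefer and saved as plain-text files in one folder; the folder layout and the function name are mine, just for illustration.

```python
import os

def find_phrase(folder, phrase):
    """Return (filename, line number, line) for every match of `phrase`,
    case-insensitively, across the OCR'd .txt files in `folder`."""
    hits = []
    needle = phrase.lower()
    for name in sorted(os.listdir(folder)):
        if not name.endswith(".txt"):
            continue  # only search the OCR output, not the original scans
        with open(os.path.join(folder, name), encoding="utf-8") as f:
            for lineno, line in enumerate(f, start=1):
                if needle in line.lower():
                    hits.append((name, lineno, line.strip()))
    return hits
```

Once documents are searchable like this, the same loop can drive automation too, such as routing a scan to the right folder based on a phrase it contains.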

The smartphone, along with the internet and OCR, is a close second to braille in how huge a life changer an invention can be for blind people.
When the iPhone was first announced in 2007, I was seriously frustrated, thinking that a touch screen would never be accessible while also knowing that touch screens were the future. No one outside of Apple saw VoiceOver coming to the iPhone in June 2009, but that, along with TalkBack on Android some time later, may be the largest improvement technology has brought to the blind world in the last 20 years.
Yes, I can call people on my iPhone, or text them; I can also play games or listen to music, but that is only the beginning. There are GPS apps that not only tell me how to get places, but also announce landmarks and street names as I travel, almost like a sighted person looking around and telling me in my ear what they see passing by: a true form of augmented reality, and it’s not even visual. Some blind people use their phones to read small documents or food labels on the fly, and if the app can’t read the text, maybe a barcode scan will work instead.
There are still talking devices made specifically for the blind: for example, scales for weighing, and thermometers for body temperature, cooking, or the weather outside. There are talking glucose monitors and other things not mentioned here. These devices are often significantly more expensive than their mainstream counterparts, and before smartphones they were the only options blind people had. Some still feel more comfortable with them than with trying to pair a smartphone to a more modern device, but that pairing is just another way technology is improving our world. A talking smartphone, plus an accessible app, plus a mainstream Bluetooth device, means an often very accessible and usable combination with more features than the blind-specific devices, and these are devices many people have, sighted and blind. If I buy a Bluetooth scale and don’t understand how to add my weight to the Health app, I can ask anyone who has that scale, not only the few blind people I might know who bought the blind-specific one.

Yes, there are other disabilities besides blindness, like people in wheelchairs or people with cognitive disabilities; those are just as important to recognize, and they have been just as impacted by technology. Just take a moment to remember what Stephen Hawking could do. Helen Keller, if still alive today, would probably be amazed at how much more she could have done.

Please think not just about how cool your app is or how fun it can be, but about how you could improve someone’s life, especially if they perceive information and interact with interfaces differently than you do. If you are a developer or designer reading this post, please take a moment to step outside the box that is your subset of reality, and not only imagine how you could make the world better, including the lives of those who think differently, but then actually do it.