Category Archives: Efficiency

My thoughts on how productivity is far more portable than in the past, but how, annoyingly, some non-visual features only appear on products with larger screens

Posted on December 13, 2019

Johann Sebastian Bach probably wished he’d had a better way to work on his “Musikalisches Opfer,” “A Musical Offering,” while traveling back home to Leipzig after visiting King Frederick the Great of Prussia near Berlin in 1747. OK, he probably really wished instead for something faster and more comfortable than a horse-drawn carriage.

I still remember being in high school, when I had a 30-minute ride between school and home each way, wanting a portable typewriter that ran on batteries; think of all the homework I could have done.

In 1987, Deane Blazie and his company, Blazie Engineering, came out with the Braille ’n Speak, which was awesome for its time. David Holladay told me there was nothing like it in the sighted world back then; he considered buying one for his own use, even though he is sighted.

The original version was a braille note taker with 192K of memory that could be divided into 30 separate files. It was kind of that portable typewriter I had wanted; it had actually already been out when I was a senior in high school, I just didn’t know about it until a month before I started college. The Braille ’n Speak was 8 by 4 by 1/2 inches and was way easier to carry around than the Perkins braille writer, which I still used in my dorm room, just not loudly in my classes.

People everywhere got excited when laptops became viable, affordable, and lightweight enough not to break the average back, though even today the most portable models are still definitely breaking wallets. Even the lightest laptops back then still had rather large footprints, until netbooks came out in the late two thousands. People liked netbooks because they were easy to carry, and although not very powerful, they were good for writing projects and web browsing.

Back in the blind community, Levelstar came out with both the Icon and the Braille Plus in 2007; they were kind of the Braille ’n Speaks of their time. The Icon was 3 by 5 by 1 inches, the Braille Plus slightly wider. They both ran Linux, excited many people, and probably would have gone much further if it weren’t for the industry-wide disruptive changes brought by Apple and their iOS devices. Yes, those also killed the netbook market.

When the first iPads came out, people thought they were just personal screens to consume content on; but now, ten years later, as iPads continue to get more powerful, many people are discovering, with significant success, how much work they can do on them while on the run; just ask Federico Viticci. You can even get very nice iPad cases with keyboards built in, so that in some ways they almost function like laptops, though netbooks would probably be the more accurate comparison. The problem is, most people are photon dependent, and they can’t seem to get anything done without a large flashy screen to look at.

I often hear people talk about how much they can get done with their iPads, and that’s great, but my annoyance is that the majority also seem to think the iPhone is useless as a productivity tool. In practice this means I can’t press Command-Tab to move between open apps on my iPhone, or get a list of the keyboard shortcuts offered by an iOS app, even though both features are completely available using the exact same Bluetooth keyboard on an iPad. I can’t think of any technical reason why the same keyboard-focused productivity features used on iPads every day couldn’t also work on iPhones.
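To illustrate why this seems arbitrary rather than technical: an iOS app declares its keyboard shortcuts through UIKit’s UIKeyCommand API, and nothing in that API is iPad-only. Here is a minimal sketch, assuming an iOS 13-era app; the view controller, shortcut title, and action are hypothetical names I made up for illustration:

```swift
import UIKit

class EditorViewController: UIViewController {

    // An app exposes its keyboard shortcuts by overriding keyCommands on
    // any UIResponder. This code compiles and runs on iPhone as well as
    // iPad; only system-level conveniences like the hold-Command shortcut
    // overlay and Command-Tab app switching are withheld on the phone.
    override var keyCommands: [UIKeyCommand]? {
        return [
            UIKeyCommand(title: "Save Draft",          // shown in the iPad shortcut overlay
                         action: #selector(saveDraft),
                         input: "s",
                         modifierFlags: .command)
        ]
    }

    @objc private func saveDraft() {
        // Hypothetical handler; a real app would persist the document here.
        print("Draft saved")
    }
}
```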

Since 2010 I have carried a foldable Bluetooth keyboard along with my iPhone, and since 2014 braille screen input, more often called BSI, has also been available. This means blind users have been able to be just as productive as their sighted counterparts with their larger iPad screens. If I were in college writing papers today, I could probably do 95% of the work on my iPhone, only needing my MacBook for finishing touches. If I were sighted, I’d probably want larger screens too, but hopefully I would still appreciate that small screens can be just as effective.

It seems that many products only offer their high-end features on their larger-screened devices. This can be interpreted by some as a kind of screen tax. “I, who can’t use a screen at all, am forced to pay more money for a larger screen I can’t benefit from, just to get a quad-core CPU instead of two cores, etc.” “We only make phones with huge screens and no home button, because that’s what everyone wants.” When individuals or companies come up with some awesome new product, I just wish they would not assume that everyone thinks the same way they do.

I still smile when I remember how my friend Eric Knapp described how confused people looked when they saw me typing on my keyboard but couldn’t see any screen or device; my iPhone was under the table, attached to my belt. I wear bone conduction headphones in public, so VoiceOver speaks to me through them just fine. If the iPhone got those cool keyboard shortcuts the iPad already has, as mentioned above, I would consider that a nice step forward.

There is a very cool app for both iOS and Android called Voice Dream Reader; it can read all kinds of file formats and, to some degree, turns digital text into a kind of audiobook. I use it every day. I also thought how amazing it would be to have Voice Dream Reader on my Apple Watch, the smallest e-book reader ever. Alas, the Apple Watch can only play audio through Bluetooth headphones and speakers. Yes, I totally get how bad music would sound through the watch’s very small speaker, but for reading a book, especially if I’m just lying in bed, it seems like another opportunity to think outside the box not taken. Voice Dream Reader is on my watch now, but requiring a Bluetooth audio device makes it inconvenient for me to use.

If I were complaining just to complain, I could have succeeded by babbling to myself in an echo chamber. I wrote this to show mobile productivity from a different angle, hoping a reader or two might take the next opportunity they have to think, or better yet step, outside the box and include more users, regardless of what screen preferences those users might have.

My journey to the cool Amazon “Show and Tell” feature, discovering along the way that the Echo Show 5 and 8 won’t ever be able to support it

Posted Tuesday, December 3, 2019

One of the bigger challenges for a blind person is quickly identifying things that aren’t physically unique or labeled in braille. This is one of the frustrations technology has helped with in a big way.

The first device that made a significant breakthrough in this area was the I.D. Mate, a talking barcode reader with a built-in database of products. It was very cool, very efficient, and also very expensive; it still costs $1,299 today. I considered buying one in the fall of 2007, but now I’m happy I didn’t.

I did, however, in late 2009, buy a $300 USB barcode reader and attach it to a netbook which I already owned and had been using. Still expensive, but way less than the I.D. Mate. It also meant I had to keep the netbook on top of the microwave and plugged in. It did work, and it was faster than even today’s more modern solutions, but it was also cumbersome, and I finally gave up on it.

There are several nice apps today, like Seeing AI, for iOS, Android, or both, that can identify barcodes. The problem is that the APIs assume the barcode can be visually aligned in the camera view. App developers have added beeps to help blind users do this, but it’s still not as efficient as a dedicated scanner. Smartphones are way more mobile than my old USB barcode scanner attached to my netbook, though, so it’s still somewhat of an improvement.
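For the curious, here is roughly what the smartphone side of this looks like; a minimal sketch using Apple’s Vision framework, under the assumption of a single still image rather than a live camera feed (the function name is mine, and a real app like Seeing AI would feed in camera frames and add the alignment beeps mentioned above):

```swift
import CoreGraphics
import Vision

// A minimal sketch of barcode detection with Apple's Vision framework.
// Detection only succeeds when the barcode is reasonably framed in the
// image, which is exactly the visual-alignment assumption discussed above.
func detectBarcodes(in image: CGImage) {
    let request = VNDetectBarcodesRequest { request, _ in
        guard let results = request.results as? [VNBarcodeObservation] else { return }
        for barcode in results {
            // payloadStringValue holds the decoded value, e.g. a UPC/EAN
            // number that an app would look up in a product database.
            print(barcode.payloadStringValue ?? "unreadable barcode")
        }
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```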

The only annoyance for me in using a smartphone was placing round containers on the counter and then having them roll around; even while holding one with one hand, I wished I could have both hands free to position the item I was identifying.

Enter the Echo Show from Amazon. When the first-generation Echo Show was announced, with its flashy screen so you could watch videos in the kitchen, I thought it was the most useless thing ever for a blind person; but then Amazon announced the “Show and Tell” feature in their September 2019 release. I was interested, and decided to go for it.

The Echo Show 5, their most recently announced version, seemed the best for me. It was small and cheap; too cheap, it turns out. I got it and then found it didn’t support Show and Tell. Amazon still says the feature is supported by the Echo Show, first and second generation. The Echo Show 5 doesn’t have a generation anywhere in its name, but I figured, it’s new, why wouldn’t it support Show and Tell? I then found out it can’t, because its camera is only 1 megapixel; I’m still wondering why anyone would want a camera that anemic. Pretty bad when I, totally blind, know that 1 megapixel is that bad.

The problem, though, is that it really is that bad. It means that a blind person, who doesn’t need a screen at all, can’t buy the least expensive models, grumble. This also excludes the new Echo Show 8, which still has only a 1-megapixel camera, frown. The Echo Show first generation is the best way to go. It’s around $100 for as long as it’s still available; the second generation is $230, more than twice the price, with very little benefit if you can’t see it.

It’s been set up in my kitchen for almost two months now, on top of the microwave, but still smaller than my old netbook. I find identifying food faster, and I do find having both hands free to hold items as convenient as I had imagined. Its database is somewhat limited, but still not bad; I’m guessing it will grow over time. Sometimes, if it can’t identify something exactly, it will read some of the text it can see, which can be just as successful in my opinion.

Turning on VoiceView, Amazon’s screen reader, is easy and adds some nice capabilities to the Show, and VoiceView gestures are very similar to those in VoiceOver on Apple’s iOS. The Echo Show, first generation, is definitely worth it, even if you’re totally blind and can’t see the screen. The Show and Tell feature is more convenient than scanning barcodes with a smartphone, and the device will be able to use the scan-to-cook technology if you ever decide to get the new Amazon smart oven in the future.

Yes, I later realized I could put the phone on the counter face down and then have both hands to position items to be identified, but I’m still glad I got the Echo Show. It is always ready to identify things for me, even if my phone is in another room, or busy doing something else.

How the Amazon Basics microwave oven has replaced the much more expensive talking microwaves made specifically for the blind

Posted on Friday, September 6, 2019

In the summer of 1981, between my fifth and sixth grade years, my parents and I visited my oldest sister Kathi in Colorado. She had a microwave oven, and her kids, who were in grade school, could use it to cook or reheat foods. Kathi thought it would be good for me also, as it would be safer than a stove.
Later that summer Mom and Dad bought one.

Although not the panacea it was originally thought to be, the microwave is still quite useful; I use it almost every day. The model I used from 2001 to 2018 was a mid-sized Panasonic, though I had to buy a second one after the first died in 2014; but hey, 12 years isn’t bad.
One problem for blind people with just about any microwave, though, is the flat control panel; if you’re blind, it’s hard to know where the buttons are. Many of us found ways to make braille labels and put them on the microwave panel. Braille takes up a lot of space, though, so some of us used Hi-Marks, or fabric paint, which is much cheaper, to save space; but there was only so much room. This meant that the more advanced features, and any features selected from a menu, were unusable unless memorized.
There were talking microwave ovens made specifically for blind people, but they were significantly more expensive, costing as much as $400, and often hard to find. Some of them also didn’t have as many features or were lower powered.

Then last summer Amazon announced their Amazon Basics microwave oven, which you can control with any of their Amazon Echo devices.
I read reviews when it came out, but sadly many people thought it was silly or only a novelty. Why control something with your voice when the buttons are right there? They seemed not to think about how much more convenient it might be for someone with a disability, or even for someone without any disability at all. They also mentioned that the Amazon microwave was underpowered, and it is at only 700 watts; it is also smaller, so it may not serve a family of more than 2 people very well, but it’s a start. With that in mind, I still felt it was an OK experiment.

It showed up, I plugged it in, and it configured itself. Because I had already configured an Amazon Echo device, it knew my Wi-Fi network, and once online, it also set the correct time. Awesome; now my sighted friends won’t have to tell me that the clock isn’t set anymore.
I still had a friend mark the microwave with fabric paint, though, and one day when there was an internet outage I had to press the buttons like an animal. Most of the time, though, unless I’m on a phone call, I control it with my voice.
One can say things like “Alexa, cook microwave for 3 minutes,” and it will do that; you can also add “at power x,” where x is from 1 to 10. I had also marked the popcorn, bacon, defrost, and reheat buttons on previous microwaves I’d owned, but that was about the most a blind person could do with most models.
With the Amazon model, you can say things as advanced as “Alexa, cook 8 ounces of broccoli,” “cook one cup (or bowl) of soup,” or “cook one cup of coffee.” Saying “heat” also works, and so does baking a potato. There are more commands; I’ve only mentioned a few here.
One might say that’s cool and all, but the expensive talking microwaves made specifically for the blind could tell you how much time was left while cooking. That’s true, but you can get the same from the Amazon microwave as well. You can also add more time while it’s cooking; if you say “Alexa, add 1 second to microwave,” it will tell you something like “cooking 45 seconds on power 10.”

It is true that the Amazon Basics microwave is underpowered, so it takes a bit longer to cook things, but at the end of the day, it’s really not that big of a deal. It is also quite small, so large dishes may not fit. I still call my experiment a success, with a few minor caveats.

General Electric also has a microwave model, for about $150, that can be controlled by an Amazon device. It is larger, 0.9 cubic feet, and runs at up to 900 watts, so it’s not as underpowered. It also has a feature where you can scan barcodes on frozen food packaging with your smartphone, after which your phone will look up cooking directions and send them to the microwave. In the reviews I read, this seemed not to work so well, but future models will probably get better.

Even if one bought the GE model and an Amazon Echo Dot for around $200, they could have a talking microwave for about half the price of one of those previously mentioned talking models made specifically for the blind. As I have said, mainstream device plus talking cell phone, or in this case an Amazon Echo device, equals a talking, or accessible, mainstream device. Seemingly novelty features to many are game-changing accessibility for others. Designing with inclusion from the start is always the best way to go.

A much improved way to spell check documents using VoiceOver on iOS beginning with iOS 12.1

Posted on November 6, 2018

There was a way, reproducible though not very convenient, to spell-check documents in iOS 11 using VoiceOver. At the time I thought it was cool, though somewhat difficult to remember, and I wrote a blog post about it anyway.

A big thank you to Scott Davert, who discovered that in iOS 12.1 the spell-checking process was made much more efficient. He demonstrated it in a recent AppleVis podcast, which is where I learned about it. I have to admit that even though I wrote the blog post describing how to correct spelling in iOS 11, I rarely if ever used the technique; I just wrote things on my iPhone, as I am now, but then corrected the spelling on my MacBook. I think I can honestly say I will now correct spelling much more often, probably whenever I write anything beyond a sentence or two on my iOS devices. This VoiceOver improvement truly makes any iOS device a real writing device for blind users.
In fact, I just spell-checked the last paragraph, possibly in less than 30 seconds, on my iPhone. This will be the coolest feature for me in iOS 12.1.

Let’s figure out how to do it.

1. Set the VoiceOver rotor to misspelled words.

2. Swipe up or down, or press the up or down arrow on your keyboard or braille display, to find the previous or next misspelled word.

3. Move right with a finger or keyboard; each move shows you the next item in a list of correctly spelled suggestions.

4. If you find the word you want, double-tap it or activate it with your keyboard, and it will replace the misspelled word.

5. If the word you want is not in the list, the offending misspelling is selected, so pressing delete or backspace will erase it. Then you can type another attempt.

Now maybe I can start blogging directly from iOS. If I were still in school, I think I could seriously write a paper completely on iOS. Since I almost always use a Bluetooth keyboard, I could even do it on an iPhone. Hey, when not writing blog posts or school papers, spell-checking emails will also be a snap.

Another post I wrote with additional thoughts about Apple’s Face ID

Posted on October 1, 2018

Two weeks ago, I wrote about how I strongly dislike Apple’s Face ID; and although some in the blind community have agreed with my thoughts, there are also some who do not. They say, “Oh, Face ID is just fine, and it’s accessible,” etc. Accessible for sure, but not efficient, and in some cases it totally breaks people’s workflows.

Because accessibility has lacked efficiency in many ways over the years, some bigger than others, blind people have often collectively accepted that they have to deal with it. There might be an app that is almost accessible except for one screen, so blind users memorize that screen and don’t complain. Then there are environments like amateur (ham) radio, where the equipment isn’t really accessible at all, or the accessible versions are considerably more expensive (though things are improving), so blind people write and share guides to help each other work around the problems. I respect the people who wrote those guides, have appreciated them many times, and have even written a few myself, but the question needs to be asked: why are we just rolling over as a community and accepting this? Why aren’t we pushing back harder, finding polite and respectful ways to ask for, or joining in to help create, more accessible and even more efficient solutions?

With that context, I now return to the Face ID situation. To date, Twitter is the most accessible and efficient social media platform for blind users, and it is there you will find them discussing anything and everything they find important. As we have now had a year with Apple’s Face ID, there have been tweets among the blind community about it, though I find most of them just saying things like “yeah, it’s OK, I got used to it,” or even a few saying it’s amazing and works great. Santiago’s tweets, shared here, I think encompass much of this mentality.

Santiago – @kevinrj There are blind people that say that faceID is an accessibility issue, but I don’t feel like it is. Unlocking your phone with touchID in your pocket isn’t an accessibility feature. It’s simply a convenience. A nice one at that, but not necessary.
Santiago – @kevinrj Well, that convenience certainly goes away, and I honestly wish I could have it back sometimes. Could FaceID improve? Certainly, but I think everyone experiences similar issues with it. Even sighted people.

Santiago – @kevinrj You do also have the security issue with it. When it comes to sighted people, the phone actually looks for your attention in order to unlock. It automatically turns it off if you’re using VoiceOver. I have a work-around, but again… not very convenient. 
Santiago – @kevinrj I’m all about efficiency. Heck, I sometimes upgrade my products, because they slow down after years and affect that greatly, but I, a totally blind person, have efficiently used my iPhone X for almost a year now. Is there a learning curve? Yes. But it’s accessible.

Yes, as I said earlier, it’s accessible, but that doesn’t mean efficient. Could I take a bus from New York to Los Angeles? Sure; it’s totally accessible and would even be way cheaper, but if I had to do it every 2 months for my job, I would not like wasting up to a week each time that I could save by flying. For a blind person, Face ID is very much like that. Even though some are making it work or even enjoying it, well, some people also enjoy long bus rides; I haven’t found that in my own personal experience, but I think it has something to do with the scenery.

Sina Bahram is ABD in a PhD program in computer science and is probably the most advanced computer user in the blind community whom I know of. Last week I found a thread on Twitter with him and a few other people about why Face ID is a step back for blind accessibility. These are not just opinions, but hard facts that should be taken seriously.

In this thread, screen curtain is mentioned, but mostly only called curtain, which I realized may be confusing to those who don’t know about it. Screen curtain is an amazing VoiceOver feature that, along with bringing added security and privacy to VoiceOver users on Apple products, can definitely also save battery life.

Sina Bahram – Wow, I was not expecting to do this, but I’m returning the iPhone 10S. I cannot believe what an atrocious experience this is now. FaceID is nowhere near as efficient or accessible as fingerprint sensor. Not having a home button is ridiculous. No more immediacy of access. #a11y

James Teh – @SinaBahram I suspect my feeling will be the same. Some people seem to cope okay, but I just can’t see how face ID could be more efficient for us. And my understanding is you have to disable the gaze stuff, which means we reduce security due to disability, but I might be misunderstanding.

Michiel Bijl – @jcsteh @SinaBahram I’d be curious to know how that is measured. If it’s done by detecting whether your eyes are pointed at the phone with eyelids open—it might not be a problem for everyone.
Of course you can always use the passcode but that’s a major step back from Touch ID.

Michiel Bijl – @SinaBahram The interaction to go home with VoiceOver is weird. I mess that up rather regularly. Any tips?

James Teh – @MichielBijl @SinaBahram Also, the whole idea of having to actually pick up my phone and bring it to my face just to unlock it… so many kinds of bleh. The number of times I just quickly look at my phone with one hand while it sits on my desk…

Julianna Rowsell – @SinaBahram A friend of my is having similar feelings. His physical access disability doesn’t allow him to effectively use it. The angles to his face are wrong and such so the recognition  software doesn’t authenticate. – Retweeted by SinaBahram

Sina Bahram – @jcsteh @MichielBijl Exactly. This is just simply unacceptable. I really hope that some advocates inside of Apple bothered trying to speak up. It’s just not like them, sir. There are so many use cases this completely destroys.

Sina Bahram – @MichielBijl Yes, the tip is to get another phone. I’m not being sarcastic. I just boxed mine up. I am an expert in this topic and the most power user I have encountered, not bragging just facts, and this is unusable. So, I’m done. I’ll try to motivate some internal Apple convos, but no idea.
Sina Bahram – @MichielBijl @jcsteh I, plus its default now if it detects VO running, have turned off attention requirements. That’s not the FaceID issue here. The issue is that it doesn’t work in the dark with curtain on and it requires your face instead of your hand that’s already touching the device.

Sina Bahram – @jcsteh You are absolutely not misunderstanding. You are reducing security because of disability. Welcome to every X phone from original to the S and Max. Other concerns make this unusable, though.

James Teh – @SinaBahram @MichielBijl Oh… I didn’t think of that, and that’s super frustrating. So I’d have to turn off curtain to unlock my phone or go turn on a light? How utterly ridiculous.

Sina Bahram – @jcsteh @MichielBijl Yup, I can confirm that. I turn off curtain, and oh look, it’s magically like 10X more accurate, and turn it back on … pin code it is!
Tomi 🇭🇺 – @SinaBahram wait, doesn’t work in the dark with curtain on? Is this a thing? Does having screen curtain change this? I thought infra-red works with low or no light anyway since it’s using its own infra-red beams, so most people I read about using it said it works at night /in dark.

Sina Bahram – @tomi91 Everyone assumes infra-red means works in dark. This is not true. Infra-red buys you depth sensing independent of (visible)  light. That barely matters since gaze is disabled by most VO users. Face ID  still needs (visible) light in addition to depth info.

Tomi 🇭🇺 – @SinaBahram oh that’s interesting. I wonder if people re-enable attention mode if it changes. But then again some people can’t even get their eyes straight (like me) so it’d probably just fail over and over. Man I’m really glad about my 8 now, thanks for that hope. lol

Sina Bahram – @tomi91 That feature is automatically turned off for VO users, so the eyes thing is not an issue itself, though it negatively impacts everything from lack of full messages on lock screen to everything else.

I wish Apple had an iPhone Pro line, kind of like their iPad Pros. Face ID would be a great feature on those phones; but then, instead of an iPhone XR, they could have what the iPhone 9 should have been: still an A12 processor, but maybe a slightly lower-quality camera, a smaller screen, and, yes, still a home button.

There are still people who would like a smaller phone. There are even some sighted people who are not obsessed with the latest perfections in screen technology, or who don’t even care whether they have the best camera. There are even some sighted people who would still prefer Touch ID over Face ID; even Alex Lindsay, one of the most visually oriented people I know of, said on MacBreak Weekly recently that he personally prefers Touch ID but thinks phones should actually have both.


My thoughts about how, as a VoiceOver user, Face ID, although accessible, is not very efficient

Posted on September 13, 2018

On January 9, 2007, Steve Jobs introduced the first iPhone, but I wasn’t sucked into his reality distortion field quite yet. My friend Nathan Klapoetke was ecstatic about it, though, and told me later that day, while reinstalling Windows for me, that he couldn’t wait to buy one; that was a very long 6 months for Nathan.

I, and much of the blind community, were frustrated, though, and felt left out. Some of us knew that the touch screen, just a piece of glass for us then, would be the future, and that soon no companies would make phones with number pads anymore. Then, at WWDC 2009, Apple introduced iOS 3 and mentioned VoiceOver during the keynote; the blind could use modern phones again. Those 2 years were hard, though. Many of us in the blind community had Nokia phones, and some were playing with Windows phones, but they weren’t changing our lives anywhere near as much as the iPhone later would. This is how I feel things stand with the iPhone X models that started appearing last year.

Apple is obsessed with Face ID and is making it their future, but I’m feeling a bit left out again.

Yes, Face ID is accessible; I played some with an iPhone X last fall, but as more people are lately realizing, accessible does not always mean efficient. I found it quite finicky to set up, and because I wear prosthetic eyes, when they’re out, I appear to it as a different person. There is an article written specifically for blind people setting up Face ID, but note that it tells you to put your face in the frame without actually explaining how to do that when you can’t see the frame. Then there’s the trick of figuring out how far away to hold the phone from your face. I also have no idea how to move my face around in a circle while simultaneously staring at one spot; OK, I understand the concept, but I can’t physically do it. It took me about 15 minutes to get Face ID set up the first time. Another problem is that attention mode is disabled by default when VoiceOver is enabled. I understand that Face ID would work less well with it on for blind people who are unable to visually focus, but that’s a potential security hole: a blind person could have their phone next to them on their desk, and another person could quietly walk by, pick up the phone, pan it past the blind person’s face, and they’re in.

Beyond setting it up and all the problems of having eyes that don’t work, there are the inconveniences to workflow. My phone is often on my belt, and most people, blind or sighted, keep their phones in a pocket. If you’re blind and have headphones, why would you ever want to take your phone off your belt, or out of your pocket, to use it? Taking your phone out just to authenticate gets annoying really fast, and it may also require you to stop whatever else you were doing. I’m rarely if ever looking at my phone when I use it; often my face is pointed in a completely different direction.

I could go on a rant; oh wait, I already am. I could be cynical, flame Apple, or just give up and switch to Android, and some might; but I remember from experience that, although it took 2 years, Apple did bring VoiceOver to the iPhone in 2009. So here are some thoughts.

There are things I could do today to unlock my phone. I have a long, complicated passcode, and I really don’t want to enter it every time, but I could use the Bluetooth keyboard I already carry with me. I could also plug in a YubiKey 4 when I’m home, or not around other people, or in a situation where something rigid plugged into the phone has a low probability of being bumped or damaged. These are only hacks, though; I’m really hoping Apple can come up with an awesome solution again.

The iPhone can already unlock the Apple Watch, and the watch can unlock my Mac; I really wish my Apple Watch could unlock my iPhone too. Just having the phone unlock whenever my watch is on would definitely not be secure, but with the new Apple Watch Series 4 having some capacitance in the digital crown, having to touch the crown to unlock the phone could be a start. Putting Touch ID into a future model of the watch’s crown would be awesome.

I already wrote about how there are solutions that let me use my wired headphones even with no headphone jack, and I know there are solutions for Touch ID equivalents that don’t involve Face ID too. Whether Apple implements any of them is still a question, but I really hope they will realize how visual-only options inconvenience a non-trivial segment of their market, and give us an alternative.

How, for me, bone conduction headphones by Aftershokz are one of those technologies that are not just nice to have, but a huge game changer for blind people

Posted on July 17, 2018

Shortly after Christmas when I was 5, my sister Andrea introduced me to headphones, one of those single-ear plugs from the 1970s, and showed me how I could plug it into my radio and listen. After a few minutes of private listening, I couldn’t understand at all why she or anyone else around me couldn’t hear it. That just blew my 5-year-old mind, and it hasn’t been the same since.

As I got older, headphones became more a part of my workflow. I know there are good speakers out there; I’m just a headphone guy at ear. Part of this came from using screen readers, first on computers and later on phones; besides using headphones for privacy, I’m sure people around me appreciated not being annoyed by the speech. I even used headphones on the bus, at work, and when I took classes in college. There was still one area where headphones couldn’t help me, though: traveling alone using the white cane. I began to use GPS apps on my phone, but all the headphones I knew of still blocked some of the sounds of my environment, so I didn’t feel safe using them when walking, and when traffic was loud I couldn’t hear the GPS info on the phone. Then I learned about bone conduction.

My friend Hai Nguyen Ly told me about Aftershokz and their line of bone conduction headphones: how they rest on the face, using transducers to convey sound through the cheekbones, leaving the ears completely uncovered and blocking none of a person’s natural hearing. I couldn’t afford them then, so Hai sent me an old pair he was no longer using. Like Andrea introducing me to headphones so long ago, Hai improved my life again.

The first time I used the Aftershokz, psychologically I wasn’t quite sure they really weren’t blocking my hearing, but it didn’t take long for me to realize that they weren’t. I could hear traffic just fine, and the GPS info from my phone was always audible even when the loudest truck or bus roared by. Now, 5 years later, after buying 2 more pairs of Aftershokz headphones, I still use them pretty much every day. I wear them all the time when I’m in public, and they work great at meetings and conferences. Even when I don’t need to hear traffic while cane traveling, they still let me use my phone without interrupting anyone around me, while still being able to hear what people are saying. Yes, trying to understand both audio streams at once might not work as well as I’d like, but sighted people get distracted too.

Some of you reading this might be wondering: OK, but why does this matter? In my last post I talked about how, for most of us most of the time, technology is a nice convenience, but for those with disabilities, it can be a huge life changer; this is definitely one of those cases, especially for blind users of screen readers. Bone conduction headphones allow them to get the info they need in real time while still having full access to their environment through their primary sense. Bone conduction technology may have been initially invented for the military, but thankfully it is now also being used to help humans be more human.