Category Archives: technology

My thoughts on how productivity is way more portable than in the past, but how, annoyingly, some non-visual features only appear on products with larger screens

Posted on December 13, 2019

Johann Sebastian Bach probably wished he’d had a better way to work on his “Musikalisches Opfer” (“A Musical Offering”) when he traveled back home to Leipzig from visiting King Frederick the Great of Prussia near Berlin in 1747. OK, he probably instead really wished for something faster and more comfortable than a horse-drawn carriage.

I still remember being in high school, when I had a 30-minute ride between school and home each way, wanting a portable typewriter that ran on batteries, and thinking of all the homework I could have done.

In 1987, Deane Blazie and his company, Blazie Engineering, came out with the Braille ’n Speak, which was very awesome for its time. David Holladay told me there was nothing like it in the sighted world then. He considered buying one for his own use, even though he is sighted.

The original version was a braille note taker with 192K of memory that could be divided into 30 separate files. It was kind of that portable typewriter I wanted; it had actually already been out when I was a senior in high school, I just didn’t know about it until a month before I started college. The Braille ’n Speak was 8 by 4 by 1/2 inches and was way easier to carry around than the Perkins braille writer, which I still used in my dorm room, just not loudly in my classes.

People everywhere got excited when laptops became viable, affordable, and lightweight enough not to break the average back, though the most portable models even today are still definitely breaking wallets. Even the lightest of laptops back then still had rather large footprints, until netbooks came out in the late two thousands. People liked netbooks because they were easy to carry, and although not very powerful, they were good for writing projects and web browsing.

Back in the blind community, Levelstar came out with both the Icon and the Braille Plus in 2007; they were kind of the Braille ’n Speaks of their time. The Icon was 3 by 5 by 1 inches, the Braille Plus slightly wider. They both ran Linux, excited many people, and probably would have gone much farther if it weren’t for the industry-wide disruptive changes brought by Apple and their iOS devices. Yes, those also killed the netbook market.

When the first iPads came out, people thought they were just personal screens to consume content on; but now, ten years later, as iPads continue to get more powerful, many people are seeing, with significant success, how much work they can do on them while on the run; just ask Federico Viticci. You can even get very nice iPad cases with keyboards built in, so that in some ways they almost function like laptops, though netbooks would probably be the more accurate comparison. The problem is, most people are photon dependent, and they can’t seem to get anything done without a large flashy screen to look at.

I often hear people talk about how much they can get done with their iPads, and that’s great, but my annoyance is that the majority therefore also think the iPhone is useless as a productivity tool. This means I can’t press Command-Tab to move between open apps on my iPhone, or get a list of keyboard shortcuts offered by an iOS app, even though both of these are completely available using the exact same Bluetooth keyboard on iPads. I can’t think of any technical reason why the same keyboard-focused productivity features used on iPads every day can’t also work on iPhones.

Since 2010 I have carried a foldable Bluetooth keyboard along with my iPhone, and since 2014 braille screen input, more often called BSI, has also been available. This means that blind users have been able to be just as productive as their sighted counterparts with their larger iPad screens. If I were in college writing papers today, I could probably do 95% of them on my iPhone, only needing my MacBook for finishing touches. If I were sighted, I’d probably want larger screens too, but I hopefully would still appreciate that small screens can be just as effective.

It seems that many products only offer their high-end features on their larger-screened devices. This can be interpreted by some as kind of a screen tax. “I, who can’t use a screen, am forced to pay more money for a larger screen that I can’t benefit from, just to get a quad-core CPU instead of two cores, etc.” “We only make phones with huge screens and no home button, because that’s what everyone wants.” When individuals or companies come up with some awesome new product, I just wish they would not assume that everyone thinks the same way they do.

I still smile when I remember how my friend Eric Knapp described how confused people looked when they saw me typing on my keyboard but couldn’t see any screen or device; my iPhone was under the table, attached to my belt. I wear bone conduction headphones in public, so VoiceOver speaks to me through them just fine. If the iPhone got those iPad keyboard shortcuts mentioned above, I would consider that a nice step toward improvement.

There is a very cool app for both iOS and Android called Voice Dream Reader; it can read all kinds of file formats and, to some degree, turns digital text into a kind of audiobook. I use it every day. I also thought how amazing it would be to have Voice Dream Reader on my Apple Watch, the smallest e-book reader ever. Alas, the Apple Watch can only play audio through Bluetooth headphones and speakers. Yes, I totally get how bad music would sound through the watch’s very small speaker, but for reading a book, especially if I’m just lying in bed, it seems like another opportunity to think outside the box not taken. Voice Dream Reader is on my watch now, but requiring a Bluetooth audio device makes it inconvenient for me to use.

If I were complaining just to complain, I could have succeeded by babbling to myself in an echo chamber. I wrote this to hopefully show mobile productivity from a different angle, hoping a reader or two might take the next opportunity they have to think or better yet just step outside the box and include more users, regardless of what screen preferences they might have.

My journey to the cool Amazon “Show and Tell” feature, discovering along the way that the Echo Show 5 and 8 won’t ever be able to support it

Posted Tuesday, December 3, 2019
One of the bigger challenges for a blind person is quickly identifying things that aren’t physically unique or labeled in braille. This is one of the frustrations technology has helped with in a big way.

The first device that made a significant breakthrough in this area was the I.D. Mate, a talking barcode reader with a built-in database of products. It was very cool, very efficient, and also very expensive; it still costs $1,299 today. I considered buying one in the fall of 2007, but now I’m happy I didn’t.

I did, however, in late 2009, buy a $300 USB barcode reader and attach it to a netbook which I already owned and had been using. Still expensive, but way less than the I.D. Mate. It also meant I had to keep the netbook on top of the microwave and plugged in. It did work though, and it was faster than more modern solutions are even today, but it was also cumbersome and I finally gave up on it.

There are several nice apps today, like Seeing AI, for iOS and/or Android that can identify barcodes. The problem is that the APIs assume the barcode can be visually aligned in the camera view. App developers have offered beeps to help blind users do this, but it’s still not as efficient as a dedicated scanner. Smartphones are way more mobile than my old USB barcode scanner attached to a netbook, though, so it’s still somewhat of an improvement.
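As a thought experiment (this is my own sketch, not code from Seeing AI or any real app, and every name and threshold in it is hypothetical), the beep-guidance idea boils down to comparing the barcode’s bounding box against the center of the camera frame and turning the offset into an audio cue:

```python
# Hypothetical sketch of audio guidance for aligning a barcode in a camera
# frame. The farther the barcode's bounding box center is from the frame
# center, the stronger the corrective cue an app could play.

def alignment_cue(frame_w, frame_h, box):
    """Given a frame size and a barcode bounding box (x, y, w, h),
    return (dx, dy): offsets of the box center from the frame center,
    normalized to roughly [-1, 1], plus a simple 'centered' flag."""
    x, y, w, h = box
    box_cx = x + w / 2
    box_cy = y + h / 2
    dx = (box_cx - frame_w / 2) / (frame_w / 2)
    dy = (box_cy - frame_h / 2) / (frame_h / 2)
    # Within 10% of center on both axes counts as aligned; an app might
    # play a steady tone here and directional beeps otherwise.
    centered = abs(dx) < 0.1 and abs(dy) < 0.1
    return dx, dy, centered

# A box centered in a 640x480 frame is "centered"...
print(alignment_cue(640, 480, (290, 215, 60, 50)))  # → (0.0, 0.0, True)
# ...while one in the top-left corner is far off on both axes.
print(alignment_cue(640, 480, (0, 0, 60, 50)))
```

An app would feed these offsets into beep pitch or stereo panning; the point is just that the guidance is geometry the phone already knows, so there’s no technical reason it couldn’t be made far more ergonomic for non-visual use.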

The only annoyance for me in using a smartphone was that round containers would roll around when I placed them on the counter; even holding one with one hand, I wished I could have both hands free to position the item I was identifying.

Enter the Echo Show from Amazon. When the first-generation Echo Show was announced, with its flashy screen so you could watch videos in the kitchen, I thought it was the most useless thing ever for a blind person; but then Amazon announced the “Show and Tell” feature in their September 2019 release. I was interested, and decided to go for it.

The Echo Show 5, their most recently announced version, seemed the best for me. It was small and cheap; too cheap, it turns out. I got it and then found it didn’t support Show and Tell. Amazon still says the feature is supported by the Echo Show, first and second generation. The Echo Show 5 doesn’t have “generation” anywhere in its name, but I figured, it’s new, why wouldn’t it support Show and Tell? I then found it can’t, because its camera is only 1 megapixel; I’m still wondering why anyone would want a camera that anemic. Pretty bad if I, totally blind, know 1 megapixel is that bad.

The problem, though, is that it really is that bad. It means that a blind person, who doesn’t need a screen at all, can’t buy the least expensive visual models, grumble. This also excludes the new Echo Show 8, still with only a 1 megapixel camera, frown. The Echo Show first generation is the best way to go. It’s around $100 for as long as it’s still available; the second generation is $230, more than twice the price, with very little benefit if you can’t see it.

It’s been set up in my kitchen for almost two months now, on top of the microwave, but still smaller than my old netbook. I find identifying food faster, and having both hands free to hold items is as convenient as I had imagined. Its database is somewhat limited, but still not bad; I’m guessing it will grow over time. Sometimes, if it can’t identify something exactly, it will read some of the text it can see, which can be just as successful in my opinion.

Turning on VoiceView, Amazon’s screen reader, is easy, and it adds some nice capabilities to the Show; VoiceView gestures are very similar to those in VoiceOver on Apple’s iOS. The Echo Show, first generation, is definitely worth it, even if you’re totally blind and can’t see the screen. The Show and Tell feature is more convenient than scanning barcodes with a smartphone, and it will be able to use the “scan to cook” technology if you ever decide to get the new Amazon smart oven in the future.

Yes, I later realized I could put the phone on the counter face down, and then have both hands to position items to be identified, but am still glad I got the Echo Show. It is always ready to identify things for me, even if my phone is in another room, or doing something else.

Another post I wrote with additional thoughts about Apple’s Face ID

Posted on October 1, 2018

Two weeks ago, I wrote about how I strongly dislike Apple’s Face ID; and although some in the blind community have agreed with my thoughts, there are also some who do not. They say “oh, Face ID is just fine and it’s accessible,” etc. Accessible for sure, but not efficient; and in some cases it totally breaks people’s workflows.

Because accessibility has lacked efficiency in many ways over the years, some bigger than others, blind people have often collectively accepted that they have to deal with it. There might be an app that is almost accessible except for one screen, so blind users memorize that screen and don’t complain. Then there are environments like amateur (ham) radio, where the equipment isn’t really accessible at all, or the accessible versions are often considerably more expensive (though things are improving), so blind people write and share guides to help each other get around the problems. I respect the people who wrote those guides, have appreciated them many times, and even wrote a few myself, but the question needs to be asked: why are we just rolling over as a community and accepting this? Why aren’t we pushing back harder, finding polite and respectful ways to ask for more accessible and/or more efficient solutions, or joining in to help create them?

With that context, I now return to the Face ID situation. To date, Twitter is the most accessible and efficient social media platform for blind users, and it is there you will find them discussing anything and everything they find important. Now that we have had a year with Apple’s Face ID, there have been tweets among the blind community about it, though I find most of them just saying things like “yeah, it’s OK, I got used to it,” or even a few saying it’s amazing and works great. Santiago’s tweets, shared here, I think encompass much of this mentality.

Santiago – @kevinrj There are blind people that say that faceID is an accessibility issue, but I don’t feel like it is. Unlocking your phone with touchID in your pocket isn’t an accessibility feature. It’s simply a convenience. A nice one at that, but not necessary.
Santiago – @kevinrj Well, that convenience certainly goes away, and I honestly wish I could have it back sometimes. Could FaceID improve? Certainly, but I think everyone experiences similar issues with it. Even sighted people.

Santiago – @kevinrj You do also have the security issue with it. When it comes to sighted people, the phone actually looks for your attention in order to unlock. It automatically turns it off if you’re using VoiceOver. I have a work-around, but again… not very convenient. 
Santiago – @kevinrj I’m all about efficiency. Heck, I sometimes upgrade my products, because they slow down after years and affect that greatly, but I, a totally blind person, have efficiently used my iPhone X for almost a year now. Is there a learning curve? Yes. But it’s accessible.

Yes, as I said earlier, it’s accessible, but that doesn’t mean efficient. Could I take a bus from New York to Los Angeles? Sure, it’s totally accessible, and would even be way cheaper, but if I had to do it every two months for my job, I would not like wasting up to a week each time that I could save by flying. For a blind person, Face ID is very much like that; even though some are making it work or even enjoying it, some people also enjoy long bus rides. I haven’t found that from my own personal experiences, but I think it has something to do with the scenery.

Sina Bahram is ABD in a PhD program in computer science and is probably the most advanced computer user in the blind community whom I know of. Last week I found a thread on Twitter with him and a few other people about why Face ID is a step back for blind accessibility. These are not just opinions, but hard facts that should be taken seriously.

In this thread, screen curtain is mentioned, but mostly only called curtain, which I realized may be confusing to those who don’t know about it. Screen curtain is an amazing VoiceOver feature that blanks the display entirely while VoiceOver keeps working, which, along with bringing added security and privacy to VoiceOver users on Apple products, can definitely also save battery life.

Sina Bahram – Wow, I was not expecting to do this, but I’m returning the iPhone 10S. I cannot believe what an atrocious experience this is now. FaceID is nowhere near as efficient or accessible as fingerprint sensor. Not having a home button is ridiculous. No more immediacy of access. #a11y

James Teh – @SinaBahram I suspect my feeling will be the same. Some people seem to cope okay, but I just can’t see how face ID could be more efficient for us. And my understanding is you have to disable the gaze stuff, which means we reduce security due to disability, but I might be misunderstanding.

Michiel Bijl – @jcsteh @SinaBahram I’d be curious to know how that is measured. If it’s done by detecting whether your eyes are pointed at the phone with eyelids open—it might not be a problem for everyone.
Of course you can always use the passcode but that’s a major step back from Touch ID.

Michiel Bijl – @SinaBahram The interaction to go home with VoiceOver is weird. I mess that up rather regularly. Any tips?

James Teh – @MichielBijl @SinaBahram Also, the whole idea of having to actually pick up my phone and bring it to my face just to unlock it… so many kinds of bleh. The number of times I just quickly look at my phone with one hand while it sits on my desk…

Julianna Rowsell – @SinaBahram A friend of my is having similar feelings. His physical access disability doesn’t allow him to effectively use it. The angles to his face are wrong and such so the recognition  software doesn’t authenticate. – Retweeted by SinaBahram

Sina Bahram – @jcsteh @MichielBijl Exactly. This is just simply unacceptable. I really hope that some advocates inside of Apple bothered trying to speak up. It’s just not like them, sir. There are so many use cases this completely destroys.

Sina Bahram – @MichielBijl Yes, the tip is to get another phone. I’m not being sarcastic. I just boxed mine up. I am an expert in this topic and the most power user I have encountered, not bragging just facts, and this is unusable. So, I’m done. I’ll try to motivate some internal Apple convos, but no idea.
Sina Bahram – @MichielBijl @jcsteh I, plus its default now if it detects VO running, have turned off attention requirements. That’s not the FaceID issue here. The issue is that it doesn’t work in the dark with curtain on and it requires your face instead of your hand that’s already touching the device.

Sina Bahram – @jcsteh You are absolutely not misunderstanding. You are reducing security because of disability. Welcome to every X phone from original to the S and Max. Other concerns make this unusable, though.

James Teh – @SinaBahram @MichielBijl Oh… I didn’t think of that, and that’s super frustrating. So I’d have to turn off curtain to unlock my phone or go turn on a light? How utterly ridiculous.

Sina Bahram – @jcsteh @MichielBijl Yup, I can confirm that. I turn off curtain, and oh look, it’s magically like 10X more accurate, and turn it back on … pin code it is!
Tomi 🇭🇺 – @SinaBahram wait, doesn’t work in the dark with curtain on? Is this a thing? Does having screen curtain change this? I thought infra-red works with low or no light anyway since it’s using its own infra-red beams, so most people I read about using it said it works at night /in dark.

Sina Bahram – @tomi91 Everyone assumes infra-red means works in dark. This is not true. Infra-red buys you depth sensing independent of (visible)  light. That barely matters since gaze is disabled by most VO users. Face ID  still needs (visible) light in addition to depth info.

Tomi 🇭🇺 – @SinaBahram oh that’s interesting. I wonder if people re-enable attention mode if it changes. But then again some people can’t even get their eyes straight (like me) so it’d probably just fail over and over. Man I’m really glad about my 8 now, thanks for that hope. lol

Sina Bahram – @tomi91 That feature is automatically turned off for VO users, so the eyes thing is not an issue itself, though it negatively impacts everything from lack of full messages on lock screen to everything else.

I wish Apple had an iPhone Pro line, kind of like their iPad Pros. Face ID would be a great feature on those phones, but then instead of an iPhone XR they could have what the iPhone 9 should have been: still an A12 processor, but maybe a slightly lower-quality camera, a smaller screen, and still a home button.

There are still people who would like a smaller phone. There are even some sighted people who are not obsessed with the latest perfections in screen technology, or don’t even care whether they have the best camera. There are even some sighted people who would still prefer Touch ID over Face ID. Even Alex Lindsay, who is one of the most visually oriented people I know of, said on MacBreak Weekly recently that he personally prefers Touch ID but thinks phones should actually have both.

 

How, for me, bone conduction headphones by Aftershokz are one of those technologies that are not just nice to have but a huge game changer for blind people

Posted on July 17 2018
Shortly after Christmas when I was 5, my sister Andrea introduced me to headphones, one of those single-earplug models from the 1970s, and showed me how I could plug it into my radio and listen. After a few minutes of private listening, I couldn’t understand at all why she or anyone else around me couldn’t hear it. That just blew my 5-year-old mind, and it hasn’t been the same since.

As I got older, headphones became more a part of my workflow. I know there are good speakers out there; I’m just a headphone guy at ear. Part of this came from using screen readers, on computers first and later on phones. Besides using headphones for privacy, I’m sure people around me appreciated not being annoyed by the speech. I even used headphones on the bus, at work, and when I took classes in college. There was still one area where headphones couldn’t help me, though: traveling alone using the white cane. I began to use GPS apps on my phone, but all the headphones I knew of still blocked some of the sounds from my environment, so I didn’t feel safe using them when walking, and when traffic was loud I couldn’t hear the GPS info on the phone. Then I learned about bone conduction.

My friend Hai Nguyen Ly told me about Aftershokz and their line of bone conduction headphones: how they rest on the face, using transducers to convey sound through the cheekbone, thus leaving the ears completely uncovered and blocking none of a person’s natural hearing. I couldn’t afford them then, so Hai sent me an old pair he was no longer using. Like Andrea introducing me to headphones so long ago, Hai improved my life again.

The first time I used the Aftershokz, I psychologically wasn’t quite sure they really weren’t blocking my hearing, but it didn’t take long for me to realize that they weren’t. I could hear traffic just fine, and the GPS info from my phone was always audible even when the loudest truck or bus roared by. Now, five years and two more pairs of Aftershokz headphones later, I still use them pretty much every day. I wear them all the time when I’m in public, and they work great at meetings and conferences. Even when I don’t need to hear traffic while cane traveling, they still let me use my phone without interrupting anyone around me while still being able to hear what they’re saying. Yes, trying to understand both audio streams might not work as well as I’d like, but sighted people get distracted too.

Some of you reading this might be wondering: OK, but why does this matter? In my last post I talked about how, for most of us most of the time, technology is a nice convenience, but for those with disabilities, technology can be a huge life changer; this is definitely one of those cases, especially for blind users of screen readers. Bone conduction headphones allow them to get the info they need in real time while still having full access to their environment through their primary sense. Bone conduction technology may have been initially invented for the military, but thankfully it is now also being used to help humans be more human.

My thoughts on how both technology and new workflows improved my life in 2016

People can look around and see new things they bought over the last year, but if they think a little deeper they might realize how some of their workflows also changed. Yes, I bought a few new gadgets in 2016, but some of that was to support changes in my thinking and planning for better workflows in the near future.

I’ve had two talking medical thermometers in the past, the second of which quietly died in 2015. I’m not sick often, but I decided having a way to take my temperature was a good idea; instead of finding another blind-centric device, though, I bought the Pyle in-ear thermometer. It has Bluetooth, pairs with an accessible iOS app, and will even save temperatures along with date and time to my calendar; a workflow I hope not to need for some time yet. Oh, and it takes body temperature in about 2 seconds instead of 3 minutes like old traditional thermometers; that is game-changingly awesome.

I replaced my corded hand vac with a cordless wet-dry vac, and I’m already finding the lack of a cord a nice convenience, which might actually mean I use it more.

In the kitchen I now have the Instant Pot 7-in-1 programmable pressure cooker with Bluetooth, really the only way to go for a blind person. Today I ordered the Waring PVS 1000 vacuum sealer (refurbished, because the price of unboxed models jumped $75 when I wasn’t looking), which among other things may allow me to try some sous-vide cooking, along with preserving fresh food longer. I also got the Drop digital kitchen scale to measure food by weight instead of volume; I can also now answer the question “What do you know?” with “a penny weighs 2 grams.”

Since iBooks came out with iOS 4 back in 2010, I’ve been reading almost daily on my iPhones, but when the cool automatic scrolling-reading option in VoiceOver broke in the first iOS 10 beta last June, I started using the Voice Dream Reader app, which I knew was awesome and had bought three years earlier; I just hadn’t used it much. The VoiceOver bug was fixed in beta 3, but I still read almost all long text with that app now. I wish Voice Dream Reader had an Apple Watch app; then I’d have the smallest ebook reader ever, and since I’m blind and not screen dependent, it would be awesome.

Already this year my 2009 MacBook Pro died, and thanks to one of my cool friends, I now have a maxed-out 2013 11-inch MacBook Air on loan. Another workflow change is that I installed Homebrew instead of MacPorts this time around. The newer Air also means VoiceOver is busy much less of the time, so I can finally play more with Xcode; and I’ll have a working battery when I speak at CocoaConf Chicago in late April.

What kinds of changes in your workflows did you see last year? How might new workflows in the new year help you in the future? Rather than just coasting through life, it’s way better to “live on purpose.”