Category Archives: Blind world problems

My thoughts on how, with some software, cameras could be much more useful for blind people

There are articles all over the Internet about how you can use security cameras, and which are the best ones to buy. I have also heard how people use cameras to keep track of their pets while away from home. I even have a friend who uses a webcam to make sure his 3D printing job at home hasn’t failed while he’s at work.

This is all cool, and I’m sure it helps resolve many situations for people, but all of these examples require that the person can see. I would call it a passive use of cameras; that is, the image processing is left to the human. If I were sighted, I’d probably use some of those too; but I’m not.

In 2013, at the Snow*Mobiling conference in Madison, Wisconsin, I heard Wes Bos speak about an ingenious project I never would have thought of. He put a webcam out by his mailbox and wrote a web app that would alert him if the image changed. He could then see if the mail carrier was there, or check the position of the mailbox flag.
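For the curious, the core of that kind of alert can be simple frame differencing. Here’s a minimal Python sketch of the idea, my guess at it, not Wes’s actual code; it assumes consecutive grayscale frames arrive as NumPy arrays from whatever camera library you use.

    import numpy as np

    def frame_changed(prev, curr, threshold=12.0):
        """Return True if the mean absolute pixel difference is large."""
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        return float(diff.mean()) > threshold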

I hadn’t thought of that for five years, until earlier this week when someone on my Facebook page remarked how they hated wasting food. Their post reminded me how, sadly, a few times a year I leave food out on the counter overnight that should be in the fridge or freezer, making it no longer safe to eat. (Puts face in hands and sighs.) Thus, as Allison Sheridan often likes to say, “a problem to be solved”.

Something I’ve been imagining since then is a way to have cameras, plus hopefully somewhat smart software, do things for blind people: a more active use of cameras. It would probably require a bit of machine learning.

Maybe a blind person could have a wide-angle camera constantly watching their countertop, with a program that could recognize a person, something that looks like a package or box, and some of the more common foods. If the app hadn’t recognized a person for, say, two hours, and it saw something that looked like a box, package, or common food item, like left-out pizza, it could send the person an alert: “Food has possibly been left out on the counter for 2 hours.”
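To make that concrete, here’s a rough Python sketch of the alert logic. The capture_frame, detect_labels, and send_alert pieces are placeholders I’m imagining; detect_labels would be backed by whatever object-recognition model the machine-learning side provides.

    import time

    ALERT_AFTER = 2 * 60 * 60  # two hours, in seconds
    FOOD_LIKE = {"box", "package", "pizza", "bottle", "carton"}  # guesses

    def watch_counter(capture_frame, detect_labels, send_alert):
        """Alert when food-like things sit unattended for two hours."""
        last_person_seen = time.time()
        alerted = False
        while True:
            labels = set(detect_labels(capture_frame()))
            if "person" in labels:
                last_person_seen = time.time()
                alerted = False  # a person showed up; re-arm the alert
            elif labels & FOOD_LIKE and not alerted:
                if time.time() - last_person_seen >= ALERT_AFTER:
                    send_alert("Food has possibly been left out on the "
                               "counter for 2 hours.")
                    alerted = True  # don't repeat until someone returns
            time.sleep(60)  # check roughly once a minute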

Another problem to solve: most blind people try to save energy and leave most, if not all, lights off as much as possible. Maybe the camera could have a small LED flash; just thinking as I write.

This doesn’t exist beyond thoughts and ideas right now, but it probably could, and it wouldn’t have to be too expensive. It might be possible with a camera, an LED, and a Raspberry Pi, plus the time to write a program.
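As a sketch of what the Pi side might look like, here’s some Python using the real RPi.GPIO and picamera libraries: briefly light the LED so the camera isn’t staring into a dark kitchen, grab a frame, then turn the LED back off. The pin number and the half-second warm-up are assumptions.

    import time
    import RPi.GPIO as GPIO
    from picamera import PiCamera

    LED_PIN = 18  # whichever GPIO pin the LED is actually wired to

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(LED_PIN, GPIO.OUT)
    camera = PiCamera()

    def capture_frame(path="/tmp/counter.jpg"):
        """Light the LED, grab one image, and return its file path."""
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.5)  # give the sensor a moment to adjust
        camera.capture(path)
        GPIO.output(LED_PIN, GPIO.LOW)
        return path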

I have also thought of using an old smartphone instead of a Pi. Smartphones are compact, have everything you need in one device, and, like a Pi, would be easy to power.
Or, after further thought, maybe only a Wi-Fi camera would need to be in the kitchen, and a small personal server, maybe even just a computer somewhere else in the home, could do the processing; more thoughts.
Let’s make it happen.

My thoughts on how productivity is way more portable than in the past, but how, annoyingly, some non-visual features only appear on products with larger screens

Posted on December 13, 2019

Johann Sebastian Bach probably wished he’d had a better way to work on his “Musikalisches Opfer” (“A Musical Offering”) when he traveled back home to Leipzig after visiting King Frederick the Great of Prussia near Berlin in 1747. OK, he probably really wished for something faster and more comfortable than a horse-drawn carriage.

I still remember being in high school, with a 30-minute ride between school and home each way, wanting a portable typewriter that ran on batteries; all the homework I could have done.

In 1987, Deane Blazie and his company, Blazie Engineering, came out with the Braille ’n Speak, which was very awesome for its time. David Holladay told me there was nothing like it in the sighted world back then; he considered buying one for his own use, even though he is sighted.

The original version was a braille note taker with 192K of memory that could be divided into 30 separate files. It was kind of that portable typewriter I had wanted; it had actually already been out when I was a senior in high school, I just didn’t know about it until a month before I started college. The Braille ’n Speak was 8 by 4 by 1/2 inches and was way easier to carry around than the Perkins braille writer, which I still used in my dorm room, just not loudly in my classes.

People everywhere got excited when laptops became viable, affordable, and lightweight enough not to break the average back, though the most portable models even today are still definitely breaking wallets. Even the lightest laptops back then still had rather large footprints, until netbooks came out in the late two thousands. People liked netbooks because they were easy to carry, and although not very powerful, they were good for writing projects and web browsing.

Back in the blind community, LevelStar came out with both the Icon and the Braille Plus in 2007, which were kind of the Braille ’n Speaks of their time. The Icon was 3 by 5 by 1 inches, the Braille Plus slightly wider. They both ran Linux, excited many people, and probably would have gone much farther if it weren’t for the industry-wide disruptive changes brought by Apple and their iOS devices. Yes, those also killed the netbook market.

When the first iPads came out, people thought they were just personal screens for consuming content; but now, ten years later, as iPads continue to get more powerful, many people are finding, with significant success, how much work they can do on them while on the run; just ask Federico Viticci. You can even get very nice iPad cases with keyboards built in, so that in some ways they almost function like laptops, though netbooks would probably be the more accurate comparison. The problem is, most people are photon dependent, and they can’t seem to get anything done without a large flashy screen to look at.

I often hear people talk about how much they can get done with their iPads, and that’s great, but my annoyance is that this means the majority also think the iPhone is useless as a productivity tool. This means I can’t press Command-Tab to move between open apps on my iPhone, or get a list of keyboard shortcuts offered by an iOS app, even though both of these are completely available using the exact same Bluetooth keyboard on an iPad. I can’t think of any technical reason why the same keyboard-focused productivity features used on iPads every day can’t also work on iPhones.

Since 2010 I have carried a foldable Bluetooth keyboard along with my iPhone, and since 2014, braille screen input, more often called BSI, has also been available. This means that blind users have been able to be just as productive as their sighted counterparts with their larger iPad screens. If I were in college writing papers today, I could probably do 95% of them on my iPhone, only needing my MacBook for finishing touches. If I were sighted, I’d probably want larger screens too, but I hope I would still appreciate that small screens can be just as effective.

It seems that many products only offer their high-end features on their larger-screened devices. This can be interpreted by some as a kind of screen tax. “I, who can’t use a screen at all, am forced to pay more money for a larger screen I can’t benefit from, just to get a quad-core CPU instead of two cores, etc.” “We only make phones with huge screens and no home button, because that’s what everyone wants.” When individuals or companies come up with some new awesome product, I just wish they would not assume that everyone thinks the same way they do.

I still smile when I remember my friend Eric Knapp explaining how confused people looked when they saw me typing on my keyboard but couldn’t see any screen or device; my iPhone was under the table, attached to my belt. I wear bone conduction headphones in public, so VoiceOver speaks to me through them just fine. If the iPhone got those cool keyboard shortcuts the iPad already has, as mentioned above, I would consider that a nice step toward improvement.

There is a very cool app for both iOS and Android called Voice Dream Reader; it can read all kinds of file formats and, to some degree, turns digital text into a kind of audiobook. I use it every day. I have also thought how amazing it would be to have Voice Dream Reader on my Apple Watch, the smallest e-book reader ever. Alas, the Apple Watch can only play audio through Bluetooth headphones and speakers. Yes, I totally get how bad music would sound through the watch’s very small speaker, but for reading a book, especially if I’m just lying in bed, it seems like another opportunity to think outside the box not taken. Voice Dream Reader is on my watch now, but requiring a Bluetooth audio device makes it inconvenient for me to use.

If I were complaining just to complain, I could have succeeded by babbling to myself in an echo chamber. I wrote this hoping to show mobile productivity from a different angle, and hoping a reader or two might take the next opportunity they have to think, or better yet just step, outside the box and include more users, regardless of what screen preferences they might have.

My journey to the cool Amazon “Show and Tell” feature, discovering along the way that the Echo Show 5 and 8 won’t ever be able to support it

Posted Tuesday, December 3, 2019
One of the bigger challenges for a blind person is quickly identifying things that aren’t physically unique or labeled in braille. This is one of the frustrations technology has helped with in a big way.

The first device that made a significant breakthrough in this area was the I.D. Mate, a talking barcode reader with a built-in database of products. It was very cool, very efficient, and also very expensive; it still costs $1299 today. I considered buying one in the fall of 2007, but now I’m happy I didn’t.

I did, however, buy a $300 USB barcode reader in late 2009 and attach it to a netbook which I already owned and had been using. Still expensive, but way less than the I.D. Mate. It also meant I had to keep the netbook on top of the microwave and plugged in. It did work, though, and was faster than more modern solutions still today, but it was also cumbersome and I finally gave up on it.

There are several nice apps today, like Seeing AI, for iOS, Android, or both, that can identify barcodes. The problem is that the APIs assume the barcode can be visually aligned in the camera view. App developers have offered beeps to help blind users do this, but it’s still not as efficient as a dedicated scanner. Smartphones are way more mobile than my old USB barcode scanner attached to a netbook, though, so it’s still somewhat of an improvement.
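For the curious, here’s a hedged Python sketch of that beep-guided scanning idea, using OpenCV for the camera and the pyzbar library for decoding. It’s my own illustration, not how Seeing AI actually works, and the terminal bell stands in for a proper beep.

    import sys
    import cv2
    from pyzbar import pyzbar

    def scan_with_beeps(camera_index=0):
        """Beep whenever a barcode is decodable, then return its data."""
        cap = cv2.VideoCapture(camera_index)
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    return None
                codes = pyzbar.decode(frame)
                if codes:
                    sys.stdout.write("\a")  # beep: the code is in view
                    sys.stdout.flush()
                    # hand the number off to a product-database lookup
                    return codes[0].data.decode("utf-8")
        finally:
            cap.release()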

The only annoyance for me in using a smartphone is placing round containers on the counter and then having them roll around; even holding one with one hand, I wished I could have both hands free to position the item I was identifying.

Enter the Echo Show from Amazon. When the first-generation Echo Show was announced, with its flashy screen so you could watch videos in the kitchen, I thought it was the most useless thing ever for a blind person; but then Amazon announced the “Show and Tell” feature in their September 2019 release. I was interested, and decided to go for it.

The Echo Show 5, their most recently announced version, seemed the best for me. It was small and cheap; too cheap, it turned out. I got it and then found it didn’t support Show and Tell. Amazon still says the feature is supported by the Echo Show, first and second generation. The Echo Show 5 doesn’t have “generation” anywhere in its name, but I figured it was new, so why wouldn’t it support Show and Tell? I then found it can’t, because its camera is only 1 megapixel; I’m still wondering why anyone would want a camera that anemic. Pretty bad if I, totally blind, know 1 megapixel is that bad.

The problem, though, is that it really is that bad. It means that a blind person, who doesn’t need a screen at all, can’t buy the least expensive visual models; grumble. This also excludes the new Echo Show 8, which still has only a 1-megapixel camera; frown. The Echo Show, first generation, is the best way to go. It’s around $100 for as long as it’s still available; the second generation is $230, more than twice the price, with very little benefit if you can’t see it.

It’s been set up in my kitchen for almost two months now, on top of the microwave, but still smaller than my old netbook. I find identifying food faster, and I do find having both hands free to hold items as convenient as I had imagined. Its database is somewhat limited, but still not bad; I’m guessing it will grow over time. Sometimes, if it can’t identify something exactly, it will read some of the text it can see, which can be just as successful in my opinion.

Turning on VoiceView, Amazon’s screen reader, is easy, and it adds some nice capabilities to the Show; VoiceView gestures are very similar to those in VoiceOver on Apple’s iOS. The Echo Show, first generation, is definitely worth it, even if you’re totally blind and can’t see the screen. The Show and Tell feature is more convenient than scanning barcodes with a smartphone, and it will be able to use the “scan to cook” technology if you ever decide to get the new Amazon smart oven in the future.

Yes, I later realized I could put the phone on the counter face down, and then have both hands free to position items to be identified, but I’m still glad I got the Echo Show. It is always ready to identify things for me, even if my phone is in another room, or busy doing something else.

How to set up an iDevices switch as a blind person

Posted on December 18, 2018

All over the internet you can find people writing phrases like “first world problems”. Here’s one most people probably haven’t heard before: “blind world problems”, i.e., problems sighted people will probably never experience.
Earlier this month, I set up my first iDevices switch, but until I knew what I was doing, it was quite a challenge. I had bought it some time ago, so I no longer had the box and couldn’t get the number needed to set it up in the iDevices app from there. I looked all around the device for the number using Seeing AI (except the side with the plug that goes into the wall; I didn’t think it would be there) and gave up for the day. The next day, I got sighted help and was told the number was on the side by the plug, the side that faces the wall, exactly where I hadn’t looked. So, blind readers: the 8-digit number you need for setting up an iDevices switch is on the side of the switch facing the wall, in the space closest to where you would plug in the thing you want to control.
Once I knew that, setting up the switch was very easy. Both Seeing AI and the iDevices app could see it, and the iDevices app is quite accessible.
There are two ways to get the number of your iDevices switch into the app during setup. You can paste or type it into the edit box, but I just capture it with my iPhone’s camera, which, even as a blind person, is very doable. When the app wants the number and displays the camera view:
1. Make sure you have good lighting. Even with the flash, which the iDevices app isn’t capable of using, the number seems unreadable in low light.
2. Place the iDevices switch on a flat surface with the plug that goes into the wall facing up.
3. Hold the phone directly above the switch so the camera lens is positioned next to the plug that goes into the wall, toward where you would plug in the thing to control.
4. Slowly raise the phone straight up, about 1 cm per second. You should hear that the number capture was successful within 5 seconds or so.
Beyond that, the setup was very easy and accessible.
People say a picture is worth a thousand words, so I needed almost that many to try to explain where the number is on the switch. Let me know if it made sense.
One more thing: I used the same camera technique mentioned above, for capturing the number on the iDevices switch, when I set up my Apple Watch. It works very well there also.