How the Mac app Type2Phone can make any keyboard attached to your Mac a Bluetooth keyboard

Posted on May 20, 2020

I’m typing this post on an old Logitech Ergonomic keyboard from 2003, and occasionally think about replacing it. I hope the keyboard isn’t paying attention… ok, it’s still working. I’ve read about the Ergo Pro keyboard from Matias, which will probably be its replacement, though it’s a bit expensive; the good ergonomic keyboards are. My only wish is that the Ergo Pro were Bluetooth like some of Matias’s other models.

Last week, I read on Twitter that another person lamented that there are very few, if any, Bluetooth ergonomic keyboards around. That reminded me of Type2Phone, and I realized I had a solution.

I bought Type2Phone, made by Houdah Software, on the Mac App Store nine years ago. When I’m at my desk and need to type something on my phone (I am the world’s worst touchscreen typist), I open up Type2Phone and voilà, my MacBook is now a very expensive Bluetooth keyboard. It has worked well; I have no complaints. At first I thought I had to wait for Type2Phone to connect to my iPhone, but I’ve since realized I don’t need to wait at all. I give it about a second and then just type away, and it seems to always work.

I wondered if it would work with more than one device, so I also paired it with my iPad and Apple TV, and now I can easily switch between the three of them. Apple clearly says that MacBooks may only be paired with up to seven Bluetooth devices simultaneously, though it doesn’t say how many can be remembered but not connected. Actually, the limit of seven simultaneously connected Bluetooth devices is part of the official Bluetooth spec. Beyond that though, I’m guessing you can probably pair more than seven devices, since you wouldn’t want to type to them simultaneously anyway; think of the chaos that would ensue if that happened. Four devices is the most I’ve ever seen a Bluetooth keyboard capable of pairing with anyway. I think Type2Phone can probably pair with as many devices as you want, and then just connect to the one you need at the time.

So for an extra ten dollars, you can easily make whatever keyboard you have inside or attached to your Mac work with any of your Bluetooth devices, which makes the fact that ergonomic keyboards almost always seem to be wired much less of a problem.

My thoughts on how with some software, cameras could be much more useful for blind people

There are articles all over the Internet about how you can use security cameras, and which are the best ones to buy. I have also heard how people use cameras to keep track of their pets while away from home. I even have a friend who uses a web cam to make sure his 3d printing job at home hasn’t failed while he’s at work.

This is all cool, and I’m sure it helps resolve many situations for people, but all of these examples require that the person can see. I would call it a passive use of cameras; that is, the image processing is left to the human. If I were sighted I’d probably use some of those too; but I’m not.

In 2013 at the Snow*Mobiling conference in Madison, Wisconsin, I heard Wes Bos speak about an ingenious project I never would have thought of. He put a web cam out by his mailbox and wrote a web app that would alert him when the image changed. He could then see if the mail carrier was there, or check the position of the mailbox flag.
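A change-detection loop like that can be sketched in just a few lines. This is a minimal hypothetical version in Python, not Wes’s actual code (his was a web app, and real frame capture would need a camera library); it compares two same-sized grayscale frames by mean absolute pixel difference and flags a change above a threshold.

```python
def mean_abs_diff(frame_a: bytes, frame_b: bytes) -> float:
    """Average per-pixel difference between two same-sized grayscale frames."""
    if len(frame_a) != len(frame_b):
        raise ValueError("frames must be the same size")
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)

def frame_changed(prev: bytes, curr: bytes, threshold: float = 10.0) -> bool:
    """True when the scene has changed enough to be worth an alert."""
    return mean_abs_diff(prev, curr) > threshold

# Toy 4-pixel "frames": mailbox flag down, then up.
flag_down = bytes([20, 20, 20, 20])
flag_up = bytes([20, 20, 200, 200])
print(frame_changed(flag_down, flag_down))  # False: nothing moved
print(frame_changed(flag_down, flag_up))    # True: something changed
```

The threshold is there so ordinary noise (lighting shifts, compression artifacts) doesn’t trigger constant false alerts.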

I hadn’t thought of that for five years, until earlier this week when someone on my Facebook page remarked how they hated wasting food. Their post reminded me of how, sadly, a few times a year I leave food out on the counter overnight that should be in the fridge or freezer, making it no longer safe to eat. (Puts face in hands and sighs.) Thus, as Allison Sheridan often likes to say, “a problem to be solved”.

Something I’ve been imagining since is a way to have cameras plus hopefully somewhat smart software do things for blind people; a more active use of cameras. It would probably also require a bit of machine learning.

Maybe a blind person could have a wide-angle camera constantly looking at their countertop, with a program that would recognize a person, something that looks like a package or box, and some of the more common foods. If the app hadn’t recognized a person for, say, two hours, and it saw something that looked like a box, package, or common food item, like left-out pizza, it could send the person an alert: “Food has possibly been left out on the counter for 2 hours.”
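The alerting logic itself doesn’t need machine learning; only the recognition step does. Assuming some detector returns labels like “person” or “pizza” for each frame (those label names, and the class below, are my own hypothetical sketch, not an existing product), the watcher could track when a person was last seen and decide when to raise the alert:

```python
from typing import Optional, Set

ALERT_AFTER_SECONDS = 2 * 60 * 60  # two hours

# Hypothetical labels a detector might return; real ones depend on the model.
FOOD_LABELS = {"box", "package", "pizza", "food"}

class CounterWatcher:
    """Tracks when a person was last seen and alerts about left-out food."""

    def __init__(self) -> None:
        self.last_person_seen = 0.0

    def update(self, labels: Set[str], now: float) -> Optional[str]:
        """Feed one frame's detected labels; return an alert message or None."""
        if "person" in labels:
            self.last_person_seen = now
            return None
        if labels & FOOD_LABELS and now - self.last_person_seen >= ALERT_AFTER_SECONDS:
            hours = (now - self.last_person_seen) / 3600
            return f"Food has possibly been left out on the counter for {hours:.0f} hours."
        return None

watcher = CounterWatcher()
watcher.update({"person", "pizza"}, now=0.0)  # person present: no alert
print(watcher.update({"pizza"}, now=3600.0))  # one hour later: None
print(watcher.update({"pizza"}, now=7200.0))  # two hours: the alert fires
```

The spoken alert itself could then go to a phone, a smart speaker, or whatever the person prefers.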

Another problem to solve: most blind people try to save energy and leave most if not all lights off as much as possible, so maybe the camera could have a small LED flash; just thinking as I write.

This doesn’t exist beyond thoughts and ideas right now, but it probably could, and without being too expensive. It might be possible with a camera, an LED, and a Raspberry Pi, plus the time to write a program.

I have also thought about using an old smartphone instead of a Pi. Smartphones are compact and have all you need in one device, and like a Pi would be easy to power.
Or, after further thought, maybe only a camera with wifi would need to be in the kitchen, and a small personal server, maybe even just a computer somewhere else in the home, could do the processing; more thoughts.
Let’s make it happen.

My thoughts on why we should be physical distancing, instead of social distancing

Posted on Tuesday April 14, 2020
During the Covid-19 pandemic, we have been asked to stay 6 feet or 2 meters apart, which has been misnamed social distancing. I have wished from the beginning that we called it physical distancing; we are not being told to stop communicating, nor should we. Staying farther apart to reduce the spread of the coronavirus is what we are being asked to do. Communicating sincerely is what we could be doing more than ever. Doing so will make our world even more awesome once physical distancing is no longer needed.

Make sure you check in with your family and friends. Smash down those phone phobias, and dust off your web cams; enjoy some meaningful conversations.
Talk with relatives and build more memories to cherish. Plan an eventual family get together to celebrate the other side. Catch up with friends or colleagues you haven’t spoken with for way too long and make plans to meet up in the future.

Social distancing implies being even more introverted and disconnected than our society already is. Instead of doing that, let’s continue to physically distance as we communicate even more deeply than ever before. We have the technology to do it.

With April Fools’ jokes canceled this year because of Covid-19, I wrote about how Haydn’s “Surprise” symphony would still be a tasteful laugh

Last Wednesday was April 1, aka April Fools’ Day. Because of the Covid-19 pandemic, most people canceled the chaos and mayhem that usually happens on that day, but I thought I’d remind those of us who already knew, and introduce others who didn’t yet, to one of the best practical jokes ever: the “Surprise” symphony by Franz Joseph Haydn.

F. J. Haydn grew up in Vienna in a very musical family. His younger brother, Michael, also became a composer in his own right.

As a child, Franz was a Vienna choir boy, and then as a young man he worked hard at learning how to compose. Later, he worked for the Esterházy family, living with them on their remote estate. This job lasted several decades, and being quite removed from any large city, Haydn said he was “forced to be original”. While there he mastered his craft and became internationally revered as a composer. He was best of friends with Wolfgang Amadeus Mozart, saying, “Mozart is the best composer I know”.

From when Mozart died in 1791 until Ludwig van Beethoven fully emerged as a composer in 1804 with his third symphony, the “Sinfonia Eroica”, Haydn was probably the greatest living composer in Europe. Sadly, today he, and his brother Michael even more so, fly way under the radar.

Haydn had already used the symphony as material for a practical joke when he wrote his “Farewell” symphony twenty years earlier, so when the playful master was spending time in London in 1792, his eyes lit up with another mischievous idea.

Franz had noted during his years with the Esterházys that members of the royal court usually came to concerts after a big meal and would often nod off during the second (slow) movements of symphonies. Thus, he wrote the second movement of his Symphony No. 94 in G major.

The movement is marked andante, which means a walking tempo. The melody is light, mostly staccato, seemingly innocent, and the first eight measures have the dynamic marking of piano. The melody is then repeated at pianissimo, and this time ends with a bang. The rest of the movement follows the classical form of theme and variations on the original theme, and sometimes, to my ears, sounds like a musical laugh or smirk. I’ve known the piece since early grade school, when I was introduced to it by my first piano teacher, Gretchen, and decades later it still makes me smile.

The symphony as a whole deserves to be listened to; I cordially invite you to check it out.

After his time in London, which he loved, Haydn returned to Vienna a wealthy man, something totally unheard-of for a musician at that time.

Also around this time, between his stays in England, Haydn became a teacher and mentor to the then-young Beethoven. “I took instruction from Haydn, but didn’t learn anything,” Ludwig said; but he still honored the aging composer with the dedications of his first three piano sonatas.

It is actually not certain whether Franz Joseph Haydn was born on March 31st or April 1st, but he preferred to celebrate on the 31st. My first piano teacher Gretchen loved the music of Haydn very much, and ironically she also died on March 31st. No, that was not a joke; I really didn’t make that up. The symphony is a way more tasteful joke than that could have ever been.

Haydn’s music is as fresh now as it was back when he wrote it. Hopefully people will still be enjoying Haydn’s music, including his symphony 94 for another 200 years.

My thoughts on how productivity is way more portable than in the past, but how annoyingly some non-visual features only appear on products with larger screens

Posted on December 13, 2019

Johann Sebastian Bach probably wished he’d had a better way to work on his “Musikalisches Opfer” (“A Musical Offering”) while traveling back home to Leipzig from visiting King Frederick the Great of Prussia near Berlin in 1747. Ok, he probably really wished for something faster and more comfortable than a horse-drawn carriage.

I still remember being in high school, with a 30-minute ride between school and home each way, wanting a portable typewriter that ran on batteries; all the homework I could have done.

In 1987, Deane Blazie and his company, Blazie Engineering, came out with the Braille ’n Speak, very awesome for its time. David Holladay told me there was nothing like it in the sighted world at that time. He considered buying one for his own use, even though he is sighted.

The original version was a braille note taker with 192K of memory that could be divided into 30 separate files. It was kind of that portable typewriter I had wanted; it had actually already been out when I was a senior in high school, I just didn’t know about it until a month before I started college. The Braille ’n Speak was 8 by 4 by 1/2 inches and was way easier to carry around than the Perkins braille writer, which I still used in my dorm room, just not loudly in my classes.

People everywhere got excited when laptops became viable, affordable, and lightweight enough to not break the average back, though the most portable models even today are still definitely breaking wallets. Even the lightest of laptops back then still had rather large footprints, until the netbooks came out in the late two thousands. People liked netbooks because they were easy to carry, and although not very powerful, they were good for writing projects and web browsing.

Back in the blind community, Levelstar came out with both the Icon and the Braille Plus in 2007, which were kind of the Braille ’n Speaks of their time. The Icon was 3 by 5 by 1 inches, the Braille Plus slightly wider. They both ran Linux, excited many people, and probably would have gone much farther if it weren’t for the industry-wide disruptive changes brought by Apple and their iOS devices. Yes, they also killed the netbook market.

When the first iPads came out, people thought they were just personal screens to consume content on; but now, ten years later, as iPads continue to get more powerful, many people are seeing, with significant success, how much work they can do on them while on the run; just ask Federico Viticci. You can even get very nice iPad cases with keyboards built in, so that in some ways they almost function like laptops, though a netbook would probably be a more accurate comparison. The problem is, most people are photon dependent, and they can’t seem to get anything done without a large flashy screen to look at.

I often hear people talk about how much they can get done with their iPads, and that’s great, but my annoyance is that the majority also seem to think that the iPhone is useless as a productivity tool. This means I can’t press command-tab to move between open apps on my iPhone, or get a list of the keyboard shortcuts offered by an iOS app, even though both of these are completely available using the exact same Bluetooth keyboard on iPads. I can’t think of any technical reason why the same keyboard-focused productivity features used on iPads every day can’t also work on iPhones.

Since 2010 I have carried a foldable Bluetooth keyboard along with my iPhone, and since 2014 braille screen input, more often called BSI, has also been available. This means that blind users have been able to be just as productive as their sighted counterparts with their larger iPad screens. If I were in college writing papers today, I could probably do 95% of them on my iPhone, only needing my MacBook for finishing touches. If I were sighted, I’d probably want larger screens too, but I would hopefully still appreciate that small screens can be just as effective.

It seems that many products only offer their high-end features on their larger-screened devices. This can be interpreted by some as a kind of screen tax. “I, who can’t use a screen at all, am forced to pay more money for a larger screen that I can’t benefit from, just to get a quad-core CPU instead of two cores, etc.” “We only make phones with huge screens and no home button, because that’s what everyone wants.” When individuals or companies come up with some new awesome product, I just wish they would not assume that everyone thinks the same way they do.

I still smile when I remember my friend Eric Knapp explaining how confused people looked when they saw me typing on my keyboard but couldn’t see any screen or device; my iPhone was under the table, attached to my belt. I wear bone conduction headphones in public, so VoiceOver speaks to me through them just fine. If the iPhone got those cool iPad keyboard shortcuts mentioned above, I would consider that a nice step towards improvement.

There is a very cool app for both iOS and Android called Voice Dream Reader. It can read all kinds of file formats and, to some degree, makes digital text into a kind of audio book. I use it every day. I also thought about how amazing it would be to have Voice Dream Reader on my Apple Watch, the smallest e-book reader ever. Alas, the Apple Watch can only play audio through Bluetooth headphones and speakers. Yes, I totally get how bad music would sound through the watch’s very small speaker, but for reading a book, especially if I’m just lying in bed, it seems like another opportunity to think outside the box not taken. Voice Dream Reader is on my watch now, but requiring a Bluetooth audio device makes it inconvenient for me to use.

If I were complaining just to complain, I could have succeeded by babbling to myself in an echo chamber. I wrote this to hopefully show mobile productivity from a different angle, hoping a reader or two might take the next opportunity they have to think or better yet just step outside the box and include more users, regardless of what screen preferences they might have.

My journey to the cool Amazon “Show and Tell” feature, discovering along the way that the Echo Show 5 and 8 won’t ever be able to support it.

Posted Tuesday, December 3, 2019
One of the bigger challenges for a blind person is quickly identifying things that aren’t physically unique or labeled in braille. This is one of the frustrations technology has helped with in a big way.

The first device that made a significant breakthrough in this area was the I.D. Mate, a talking barcode reader with a built-in database of products. It was very cool and very efficient, but also very expensive; it still costs $1299 today. I considered buying one in the fall of 2007, but am now happy I didn’t.

I did, however, in late 2009, buy a $300 USB barcode reader and attach it to a netbook I already owned and had been using. Still expensive, but way less than the I.D. Mate. It also meant I had to keep the netbook on top of the microwave and plugged in. It did work, and was faster than even today’s more modern solutions, but it was also cumbersome and I finally gave up on it.

There are several nice apps today, like Seeing AI, for iOS and/or Android that can identify barcodes. The problem is that the APIs assume the barcode can be visually aligned in the camera view. App developers have added beeps to help blind users do this, but it’s still not as efficient as a dedicated scanner. Smartphones are way more mobile than my old USB barcode scanner attached to my netbook though, so it’s still somewhat of an improvement.

The only annoyance for me in using a smartphone is placing round containers on the counter and having them roll around; even holding them with one hand, I wished I could have both hands free to position the item I was identifying.

Enter the Echo Show from Amazon. When the first-generation Echo Show was announced, with its flashy screen so you could watch videos in the kitchen, I thought it was the most useless thing ever for a blind person; but then Amazon announced the “Show and Tell” feature in their September 2019 release. I was interested, and decided to go for it.

The Echo Show 5, their most recently announced version, seemed the best for me. It was small and cheap; too cheap, it turns out. I got it and then found it didn’t support Show and Tell. Amazon says the feature is supported by the Echo Show, first and second generation. The Echo Show 5 doesn’t have “generation” anywhere in its name, but I figured it was new, so why wouldn’t it support Show and Tell? It turns out it can’t, because its camera is only 1 megapixel; I’m still wondering why anyone would want a camera that anemic. It’s pretty bad when I, totally blind, know that 1 megapixel is that bad.

The problem is that it really is that bad. It means that a blind person, who doesn’t need a screen at all, can’t buy the least expensive visual models; grumble. This also excludes the new Echo Show 8, which still has only a 1 megapixel camera; frown. The Echo Show first generation is the best way to go. It’s around $100 for as long as it’s still available; the second generation is $230, more than twice the price, with very little benefit if you can’t see it.

It’s been set up in my kitchen for almost two months now, on top of the microwave, but still smaller than my old netbook. I find identifying food faster, and having both hands free to hold items is as convenient as I had imagined. Its database is somewhat limited, but still not bad; I’m guessing it will grow over time. Sometimes if it can’t identify something exactly, it will read some of the text it can see, which can be just as successful in my opinion.

Turning on VoiceView, Amazon’s screen reader, is easy, and it adds some nice capabilities to the Show; VoiceView gestures are very similar to those of VoiceOver on Apple’s iOS. The Echo Show, first generation, is definitely worth it, even if you’re totally blind and can’t see the screen. The Show and Tell feature is more convenient than scanning barcodes with a smartphone, and it will be able to use the scan-to-cook technology if you ever decide to get the new Amazon smart oven in the future.

Yes, I later realized I could put the phone on the counter face down and then have both hands free to position items to be identified, but I am still glad I got the Echo Show. It is always ready to identify things for me, even if my phone is in another room or doing something else.

My thoughts on how, unlike a picture, an audio recording needs a longer amount of time to develop

Posted on October 19, 2019
Audio takes time.
Many have said “a picture is worth a thousand words”, but it would take way longer to say or write those words than it did to take that picture. A picture is the capture of a moment, less than a second, that lasts metaphorically forever. I imagine that someone could look at a picture over a period of time and notice details they hadn’t before; some of the strengths of a picture.
Why am I, a totally blind guy, writing about pictures? Because there is also an equivalent for people who can’t see them. I’m guessing many reading this post have pictures of their family and friends, or even of people they’ve just met; for me, that’s recordings of those people in my life talking. This is where the similarity ends. Audio, unlike a picture, only lasts as long as the original recording. I can’t focus on a millisecond of the audio the way someone can focus on a face in a picture, and it’s something I think people don’t realize at first.

Sometimes when I ask someone to record, they just say a few words lasting less than 10 seconds, but I find that, at least for me, it takes a minute or two before the person’s voice starts to resonate in my mind.
If you have a page of text, you can take as long to read it as you want, but if that text were spoken to you faster than your brain could understand it, you would be overwhelmed. If you only let cheese age for a day, you would probably still have only curdled milk.

Music is also like this. It may be the only art form that actually exists in time, in the moment when it is heard, no matter when it may have originally been written. Also, if a slow song were sped up, its meaning would be lost and its essence destroyed.

Just some thoughts to consider.

How the Amazon Basics microwave oven has replaced the much more expensive talking microwaves made specifically for the blind

Posted on Friday September 6, 2019

In the summer of 1981, between my fifth and sixth grade years, my parents and I visited my oldest sister Kathi in Colorado. She had a microwave oven, and her kids, who were in grade school, could use it to cook or reheat foods. Kathi thought it would be good for me also, as it would be safer than a stove.
Later that summer Mom and Dad bought one.

Although not the panacea it was originally thought to be, the microwave is still quite useful; I use it almost every day. The model I’d been using from 2001 to 2018 was the Panasonic mid-sized model, though I had to buy a second one after the first died in 2014; but hey, 12 years isn’t bad.
Still, one problem for blind people with just about any microwave is that they have flat panels, so if you’re blind it’s hard to know where the buttons are. Many of us found ways to make braille labels and put them on the microwave panels. Braille takes up a lot of space though, so some of us also used Highmarks, or fabric paint, which is much cheaper, to save space; but there was only so much room. This meant that not all of the more advanced features, or any features selected from a menu, were usable unless memorized.
There were talking microwave ovens made specifically for blind people, but they were significantly more expensive, costing as much as $400, and often hard to find. Some of them also didn’t have as many features or were lower powered.

Then last summer Amazon announced their Amazon Basics microwave oven, which you could control with any of their Amazon Echo devices.
I read reviews when it came out, but sadly many people thought it was silly, or only a novelty: why control something with your voice when the buttons are right there? They seemed not to think about how much more convenient it might be for someone with a disability, or even for someone without any disability at all. They also mentioned that the Amazon microwave was underpowered, and it is at only 700 watts; it is also smaller, so it may not serve a family of more than two people very well, but it’s a start. With that in mind I still felt it was an ok experiment.

It showed up, I plugged it in, and it configured itself. Because I had already configured an Amazon Echo device, it knew my wifi network, and once on there, it also set the correct time. Awesome; now my sighted friends won’t have to tell me that the clock isn’t set anymore.
I still had a friend mark the microwave with fabric paint though, and one day when there was an internet outage I had to press the buttons like an animal. Most of the time though, unless I’m on a phone call, I control it with my voice.
One can say things like “Alexa, cook microwave for 3 minutes” and it will do that; you can also say “at power x”, where x is from 1 to 10. I had also marked the popcorn, bacon, defrost, and reheat buttons on previous microwaves I’d owned, but that was about the most a blind person could do with most models.
With the Amazon model, you can say things as advanced as “Alexa, cook 8 ounces of broccoli”, “cook one cup (or bowl) of soup”, or “cook one cup of coffee”. Saying “heat” also works, and so does baking a potato. There are more commands; I’ve only mentioned a few here.
One might say that’s cool and all, but the expensive talking microwaves made specifically for the blind could tell you how much time was left in cooking. That’s true, but you can get the same from the Amazon microwave as well. You can also add more time while it’s cooking; if you say “Alexa, add 1 second to microwave,” it will tell you something like “cooking 45 seconds on power 10.”

It is true that the Amazon Basics microwave is underpowered, so it takes a bit longer to cook things, but at the end of the day, it’s really not that big of a deal. It is also quite small, so large dishes may not fit. I still call my experiment a success, with a few minor caveats.

General Electric also has a microwave model for about $150 that can be controlled by an Amazon device. It is larger, 0.9 cubic feet, and runs at up to 900 watts, so it’s not as underpowered. It also has a feature where you can scan barcodes on frozen food packaging with your smartphone, after which your phone will look up cooking directions and send them to the microwave. In the reviews I read, this seemed not to work so well, but hey, future models will probably get better.

Even if one bought the GE model and an Amazon Echo Dot for around $200, they could have a talking microwave for about half the price of one of those previously mentioned talking models made specifically for the blind. As I have said, mainstream device plus talking cell phone, or in this case Amazon Echo device, equals talking, or accessible, mainstream device. Seemingly novelty features to many are game-changing accessibility for others. Designing with inclusion from the start is always the best way to go.


Appended on April 11, 2020

Shortly after I wrote this article, Amazon announced their Amazon smart oven. It costs $250, so it’s still less expensive than some of those talking microwaves made for the blind, and it does way more. Yes, it can be a microwave, but it can also be a food warmer, a convection oven, and an air fryer. It is also bigger, at 1.5 cubic feet. It is very cool.

My thoughts and discoveries on how Amazon now uses proprietary AC adaptors on their Echo Dots.

Posted on January 7, 2019
A month or two ago, I read that the new Amazon Echo Dot third generation model was no longer powered over micro USB, but now required a proprietary AC adaptor. I remember the old days when every device had its own AC adaptor; those were frustrating times. I then got my own Echo Dot third gen model, and was annoyed. Then, three weeks ago, I read some tweets where others were also annoyed and questioned why Amazon made that change. I then realized: hey, I have an iDevices switch, and it can measure energy, so I decided to do some testing and find out.
When idling, the Dot 3rd gen model draws about 1.8 watts. I then played Beethoven’s 5th symphony at full volume and could only nudge the Dot up to 3 watts, even in the moments marked fortissimo. If someone could somehow double that, so the Dot drew 6 watts, or even triple it to 9 watts, that would still fit within the generally accepted ceiling of 15 watts on micro USB 3.1. Yes, even the Amazon Echo Dot third generation can run on micro USB, so it is not a technical limitation.
My theory is that people tried to power their new Dots from USB ports on their computers or old USB hubs, where they only got the traditional USB specification of 5.0 volts at 500 milliamps, which only multiplies out to 2.5 watts. I’m guessing Amazon got lots of nasty calls when people tried to play music and had problems. They decided that instead of trying to tell people that the Dot required 1 amp, or throwing in a USB power adaptor, they’d just go proprietary.
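For anyone who wants to check the arithmetic, power is just volts times amps. A tiny sketch (the 3-watt figure is my own peak measurement from above, not an official Amazon spec):

```python
def watts(volts: float, amps: float) -> float:
    """Electrical power: P = V * I."""
    return volts * amps

dot_peak = 3.0  # my measured peak draw, Beethoven at full volume

old_port = watts(5.0, 0.5)  # traditional USB port: 5 V at 500 mA
charger = watts(5.0, 1.0)   # a common 1 amp USB power adaptor

print(old_port, old_port >= dot_peak)  # 2.5 False: an old port falls short
print(charger, charger >= dot_peak)    # 5.0 True: a 1 amp adaptor is plenty
```

So an older computer port or hub genuinely can’t feed the Dot at full volume, while any ordinary 1 amp phone charger could have.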
Yes, it’s still annoying, but I also still enjoyed figuring it out.

How to set up an iDevices switch as a blind person.

Posted on December 18, 2018

All over the internet you can find people writing phrases like “first world problems”. Here’s one most people have probably never heard before: “blind world problems”, i.e. problems sighted people will probably never experience.
Earlier this month, I set up my first iDevices switch, but until I knew what I was doing, it was quite a challenge. I had bought it some time ago, so I didn’t have the box anymore and couldn’t get the number needed to set it up in the iDevices app from there. I looked all around the device for the number using Seeing AI (except for the side with the plug that goes into the wall; I didn’t think it would be there) and gave up for the day. The next day, I got sighted help and was told the number was on the side by the plug that faces the wall, where I hadn’t looked. So, blind readers: the 8-digit number you need for setting up an iDevices switch is on the side of the switch facing the wall, in the space closest to where you would plug in the thing you want to control.
Once I knew that, setting up the switch was very easy. Both Seeing AI and the iDevices app could see the number, and the iDevices app is quite accessible.
There are two ways to get the number of your iDevices switch into the app during setup. You could paste or type it into the edit box, but I just capture it with my iPhone’s camera, which, even as a blind person, is very doable. When the app wants the number and displays the camera view:
1. Make sure you have good lighting. Even with the flash, which the iDevices app can’t use anyway, the number seems unreadable in low light.
2. Place the iDevices switch on a flat surface with the plug that goes into the wall facing up.
3. Hold the phone directly above the switch so the camera lens is positioned next to the plug that goes into the wall, towards where you would plug in the thing to control.
4. Slowly raise the phone straight up, about 1 cm per second. You should hear that the number capture was successful within 5 seconds or so.
Beyond that, the setup was very easy and accessible.
People say a picture is worth a thousand words, so I needed almost that many to try to explain where the number was on the switch. Let me know if it made sense.
One more thing: I used the same camera technique mentioned above when I set up my Apple Watch. It worked very well there also.