
How I edit files on a remote server using TextEdit on my Mac through SSH

Posted on December 1, 2018

Two years ago, I wrote about how I use TextEdit instead of vi, vim, or nano when editing files in my macOS terminal. It’s still working well for me, but then I wanted it when remotely logged into other computers. TextMate can do that, but you need Ruby on the remote machine and have to move over a Ruby script, etc. Last month I got a Ubiquiti EdgeRouter X, which I have no regrets about even after taking all the Cisco classes up to CCNP, configuring Cisco routers for 8 months as an intern for the state, and running my home network off Cisco routers for 14 years. I also recently remembered a NanoPi NEO I’d been neglecting for 2 years, and between the 2 devices I needed to edit files on them, and the router doesn’t have Ruby installed; I wanted my TextEdit. (Imagine a small child throwing a tantrum because they didn’t have their security blanket.)
I downloaded Transmit from Panic, which is an awesome program, but its price was also a little too awesome for my wallet, so it was back to the drawing board.
I googled around looking for a way to edit files remotely over SSH and found out about SSHFS. FUSE is a way to create a file system in user space, or in simpler language, to create a kind of virtual file system. SSHFS uses FUSE to create a session in which a specific folder on a remote machine appears in a specific folder on the machine in front of you, and then keeps them synced. This lets me use my TextEdit cheat to edit files on my Mac as I have been doing for 2 years, except now they also get synced back to the remote machine through SSH. It’s pretty cool, but if you’re using a Mac you’ll need a few extra pieces to do it.
Hard-core *nix users may argue that a Mac doesn’t really have BSD Unix because it doesn’t ship with a package manager, but there are a few. Homebrew is probably the best now, though it had to compete with others like MacPorts and Fink for some years before emerging on top. You will need Homebrew to eventually get SSHFS, so here’s the process for getting it.

First you’ll need the Xcode command-line tools if you don’t have them already. Paste the following into the terminal:

xcode-select --install

and decide if you only want the CLI tools or the whole Xcode install. With only 128 GB on my MacBook Air, I think you can figure out which I chose. The Xcode command-line tools are very nice to have, as they give you gcc, clang, and other programming goodies.
Next it’s time for Homebrew. Paste the one-line installer from the Homebrew home page into the terminal; at the time of writing it is:

/usr/bin/ruby -e “$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)”
Next we will need 2 packages before we can get SSHFS working:

brew cask install osxfuse

brew install sshfs

Now we can actually make it happen. 
First, on your machine, make a folder where you want the files from your remote machine to momentarily exist.
Then type something like sshfs username@server-ip:/path-to-folder ~/folder-for-remote-files. Obviously typing server-ip literally won’t work, and neither will username, but it should be easy to fill in your specific information there. The first time you do it, you will have to answer some questions from the Security & Privacy preference pane. After you say yes to them, it should work.
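Put together, a whole session looks something like this; the username, address, and folder names below are hypothetical stand-ins for your own:

```shell
# Make a local mount point for the remote files (hypothetical folder name)
mkdir -p ~/router-files

# Mount the remote folder over SSH
# (hypothetical user, address, and remote path)
sshfs admin@192.168.1.1:/config ~/router-files

# Edit with TextEdit as usual; saves sync back over SSH
open -a TextEdit ~/router-files/config.boot

# When you're done, unmount the folder
umount ~/router-files
```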
It seems to stay connected until the next reboot. You can do anything you want to those files, and changes will sync between both devices. It’s very nice.

How I discovered that the audio in Live Photos can help blind people identify and organize them

Posted on May 17, 2018

When Apple announced Live Photos along with the iPhone 6s in 2015, almost everyone I know or read thought they were nothing more than a stupid gimmick, and promptly turned them off. It took 2-plus years before advantages of Live Photos started to show up, as mentioned by Allison Sheridan recently on her blog, as well as in a post last year on How-To Geek; but if you’re blind, you’ve had something cool since day one.

When everyone else thought Live Photos were stupid and just a silly way to waste space, I immediately realized that they brought accessibility to photos in an interesting way. With 3 seconds of audio, someone could easily provide an audio label for those pictures. If a blind person went on vacation and wanted a few pictures for their sighted family and friends, they and/or someone with them could add audio like “Uncle John and Grandma on the beach” or “Julie standing near the Eiffel Tower at night”. The possibilities are endless. Then, when a blind person scrolls through their library and opens one of their Live Photos, they hear the audio and can rename them for faster browsing in the future.
A sighted person could even take Live Photos on their phone, adding audio tags, and then send them to a blind friend or family member.

I could even see a future version of iOS offering to transcribe the audio in Live Photos and use that text to rename the file; that would just be cool and make my efficiency-loving side all warm and happy.

Just another reminder that when a feature seems silly or useless, maybe it helps someone else in huge unimagined ways.

My thoughts about looking at the sun last Friday with the BrainPort, preparing for the solar eclipse

Posted on August 3, 2017

North America had its last total solar eclipse of the 20th century on February 26, 1979. My 3rd grade teacher, Mrs. Love, told the class the next one would be in 2017. My 9 year-old brain almost had a meltdown trying to imagine how far in the future that would be; I remember when I was five thinking 20 was old.

Decades passed, and among many things that happened, Dr. Paul Bach-y-Rita invented the BrainPort device, made by Wicab, and over time I became one of their primary testers.

In late summer 2009, I wondered if the BrainPort could show me the moon. It could, and I saw 3 lunar eclipses over 2014-15.

I then began to think, hey, maybe I could see the upcoming solar eclipse too, so last Friday I successfully looked at the sun using the BrainPort device. I had to stand with my face pointing almost straight up, which was quite uncomfortable and made me slightly dizzy; next time maybe I could use a chaise lounge and lie almost flat.
The sun was round and quite small, seemingly smaller than I remember the moon being. The interesting thing is that to see the sun I had to turn invert mode on, which means the BrainPort would show me dark objects on a light background. When I tried looking at the sun with invert off, meaning the BrainPort would show me light objects on a dark background, the bright sunlight completely overloaded the camera sensor. Conversely, when looking at the moon, invert mode needs to be off.

I remember when looking at the lunar eclipse in September 2015, for a time there were clouds almost covering the moon. With invert off, I could see the moon; with invert on, I could see the clouds. The clouds visually seemed to be almost touching the moon, though I knew the moon was over 200,000 miles away. It was maybe one of the most transformative things I have experienced with the BrainPort.

Even though the eclipse will only be 83% here in Madison, it will still be another one of those transformative moments for me, as using the BrainPort device helps me better understand visual concepts.

Why blind people should care about social media photos, contact photos, and facial recognition

Posted on July 29, 2017

I have observed in the blind community over the years that many of us seem to care little, if at all, about pictures. In the past, I admit, they didn’t do much for us most of the time, but times are changing.
Starting with TapTapSee, KNFB Reader, and other lesser-known apps like Live Reader, and currently popular apps like Seeing AI, blind smartphone users are finding more meaningful ways to take and use photos; but I see another use not yet realized.

Sighted friends have told me that, until recently, many blind users often didn’t have any picture on their social media profiles at all, maybe unintentionally. For some it hasn’t mattered, as many of their followers are also blind, but now with apps like Seeing AI, and more in the future, the potential of facial recognition will be huge.
If, or more likely when, Seeing AI or some similar app can use social profile or contact photos for facial recognition, just imagine for a moment how awesome that could be. A blind person could wave their phone camera around and find colleagues or friends at a restaurant, at work, at a conference, or at a family reunion. It would be somewhat like how a sighted person just looks around, sees someone they haven’t talked to in years, and walks over to say hi.

At work or a conference, maybe a blind person hears someone presenting and doesn’t know who they are. With an app like Seeing AI, and a contact list with photos attached on their phone, they could quickly find out who was speaking without interrupting anyone else. Add multiple people asking questions after the talk, and the feature may be used several more times. That’s only one of the many ways I or any other blind smartphone user could more independently fit into an increasingly visual world, and yes, this is a form of augmented reality at work.

It’s not just social profile pictures, though; more importantly, it’s the pictures that Android phones and iPhones associate with the user’s contact list. This means that some of us, me included, will have to ask people we know to send honest pictures of themselves to add to our phone contact lists; even just a head shot is good. It’s not a project I could even begin to finish in a day, so let’s start now. The effort we put into building a personal database of pictures of people we know today will connect our digital tools to our human world more than we could ever imagine tomorrow.

My realization that blind users of VoiceOver have had touch-screen Macs since 2009

In the early 1990s, Neal Stephenson released his now well-known book “Snow Crash”. Then in 1999 he wrote the even more famous book “Cryptonomicon”. He also wrote a lesser-known and much smaller essay entitled “In the Beginning… Was the Command Line”. In this essay Stephenson talks about interfaces, not just of computers but of every object we use, beginning with his best friend’s dad’s old car. He talks about how, from the first mainframe terminals up to Microsoft Windows and Apple’s Macintosh, the way humans first interacted with the computer was through the command line. The command line is still great: it takes few resources, and even today it potentially provides many more simultaneous options than any graphical interface, often called a GUI.

The GUI was invented, though, for the same reason the command line replaced punch cards: the command line was way more efficient than punch cards for everyone, and then later the GUI was more convenient than the command line and easier to use, at least for sighted people. Graphical interfaces meant people didn’t have to remember tons of commands, and could become familiar with a system faster. The mind with sight available to it is great at making data points of spatially presented, intersecting pieces of information. The GUI is great at displaying information in 2 or 3 dimensions to the visually enabled mind, instead of the 1 dimension the command line presents. It was a great match, except for the abstractions we still have today. The arrival of Apple’s first Macintosh in 1984 blew the world away with its amazing graphics for the time, and the mouse? I’m sure many wondered why they would ever want a small furry rodent on their desk.


Along with computer mice, we also saw trackballs and trackpads, but they all still have the problem of dynamic rather than static reference.
When using a trackpad, if the mouse pointer is in the center of the screen but the user places their finger on the lower-left corner of the trackpad and slides to the right, the pointer will move from the center of the screen to the center of the right edge; and depending on the settings, the finger may have moved only half an inch, or 6 inches, still along the bottom of the trackpad. The mouse is even more removed by abstraction. I played with all 3 of these input devices during my years on Microsoft Windows, but was never productive with any of them.


In early January 2007, while having dinner with my friend Nathan Klapoetke, he was ecstatic about the new iPhone that had just been announced. At the time I cringed in fear, knowing that soon all cell phones would no longer have buttons, and I had no idea how a blind person would use them.

Two years later at WWDC 2009, Apple announced that VoiceOver was coming to the iPhone 3GS, and the blind community was completely blown away; no one saw that coming. Late in June 2009 I went to the Apple store and played with VoiceOver for the first time. I’d already read the iPhone manual’s chapter on VoiceOver, so I had a bit of an idea what to do, or at least how to turn VO on. I only had an hour to play, but except for typing, reading text and getting around basic apps didn’t seem too bad; 9 days later I bought one. The first text message I tried to send, though, was a complete disaster, but I still knew my world had changed for the better.

The idea that when you touched some part of the screen, you were directly interacting with that icon or word made a lot of sense to people, blind and sighted alike. Even young children, before they can read, understand tapping on icons to start games they already know how to play. In some ways, the touch screen is the command-line equivalent of visual interfaces. Being able to directly touch and manipulate screen elements is efficient on such a basic level that I wouldn’t be surprised at all if using touch-screen interfaces activated the same parts of the brain as making something out of Play-Doh or clay. There’s an interesting discussion currently going on over how Microsoft tried to make Windows 8 a touch-first interface and failed, and how Windows 10 now offers touch-based interfaces for those who want them but still behaves like a traditional desktop. On the other hand, Apple never tried to bring touch screens to macOS at all until the 2016 line of MacBooks with the new Touch Bar, which really isn’t a screen at all and is still only on some Macs.

And now, as Paul Harvey used to say, “the rest of the story.” As most people would tell you, and as Google searches would reply, there are no Apple computers with a touch screen. Except, that is, if you’re a totally blind person using VoiceOver. The gestures VoiceOver users learn on their iPhones have been available to them on their Macs as well, starting with Snow Leopard; with Trackpad Commander on, VoiceOver behaves very much like it does on iOS. With Trackpad Commander on, if I touch the exact center of the trackpad, the mouse pointer is also at the exact center of the screen, and if VoiceOver announces an icon I want, I just double-tap to activate it. All of the abstraction I struggled with trying to use a mouse or trackpad without the commander mode is gone; but here’s a rare moment where sight still gets in the way. It is so instinctive for someone who can see to visually follow where their hand is going that even if most of them turned VoiceOver and Trackpad Commander on and speech off while still looking at the screen, they would still find it quite difficult to use; the screen being separate from the trackpad is visually too abstract for many of them. The trackpad is obviously much smaller than the actual screen, though since I can’t see it that doesn’t really matter anyway. Beyond that, as a VoiceOver user I’ve had a touch screen on my Mac for 7 years. I and probably most other blind users still don’t use it as much as we probably should, or for many of us hardly at all, though I have found some ways in which it is way more efficient than using more traditional VoiceOver keyboard commands.


If I’m told that an interface control I want is in the lower left corner of the screen, using trackpad commander, I can go there directly. If I’m using an interface with a list of items in the center and buttons around the edge I can get to the list way faster than navigating there with a keyboard.

Tim Sniffen has published a book entitled “Mastering the Mac with VoiceOver”, in which he mostly ignores keyboard commands altogether and teaches with Trackpad Commander mode instead. He trains many veterans who lost their sight while deployed, and says that after they become comfortable with VoiceOver on iOS, it’s an easy transition for them to their Macs. We VoiceOver users should probably listen more to Tim and learn from his experiential wisdom. And for the sighted proud: if your vision ever degrades so far that in the end you have to use VoiceOver, at least you’ll have a touch screen on your Mac.

Thoughts on how I keep remembering Beethoven’s birthday, music and technology

It is possible that if it hadn’t been for Charles Schulz and his Peanuts cartoons, December 16th being the birthday of Ludwig van Beethoven would be as unknown to most of us today as the birthdays of the other great composers. In the Charlie Brown Christmas special, recorded in 1965, one of Charlie Brown’s best friends, Schroeder, plays the opening measures of Für Elise to celebrate the birthday of the Viennese master. Though played with a simplified left-hand part, it’s still a nice touch decades later. In one of his comic strips, Schulz has Schroeder forget Beethoven’s birthday and then be reminded of it by Lucy.


My youngest though still older sister, Andrea, introduced me to Beethoven’s music when I was in middle school, and with Schroeder still in my subconscious, I decided I wanted to celebrate Beethoven’s birthday in my 8th grade year. I copied 2 records I had to a tape, in mono no less, and listened to them on my bus ride to school and home, half an hour each way. By the end of the next year, I had taped recordings of all nine Beethoven symphonies, and hearing all of them in a row has been my celebratory tradition ever since, 32 years later. Not only have I heard the quality of the recordings improve over the decades; hearing a set of pieces spanning a composer’s lifetime is also a great way to experience firsthand how their style developed and changed over time. From tapes played on a mono player, then a Walkman, then to CDs, to an MP3 player, to streaming lossless off my Synology network server for the first time this year, not only has the audio improved amazingly but so has the convenience, e.g. I don’t have to make sure the next tape is lined up.

Beethoven’s first 2 symphonies, though still somewhat classical in style, paying homage to Haydn and Mozart, have moments here and there that totally make the listener sit up and take notice, moments that are clearly Beethoven’s. His 3rd symphony completely changed what a symphony had been to that point, and is considered the first of his mature symphonies, as well as, in the minds of some musicologists, the first piece of the Romantic period. Looking at the Beethoven symphonies beginning with the 3rd, each symphony has a story to tell, and all of them, collectively as well as individually, are strong affirmations of life and the human condition: even though Beethoven’s childhood was harsh and much of his adult life silent due to his deafness, there are still moments totally worth living for and hoping for in the future. Beethoven’s symphonies are all more than worth the time to explore and understand.

While writing this post, I listened from the finale of the 3rd symphony to the end of the scherzo, movement 3, of symphony 5.
There’s a little of Schroeder in me too. I went to Edgewood College, which puts on a Christmas concert every year, and twice during these concerts I played a Beethoven solo. The coolest of them was on 12/16/1991, when I played the 1st movement of his piano sonata op. 31 no. 2, “The Tempest”; Andrea along with our parents were in attendance.

Thoughts about my experiences with a new iPhone 7 without a headphone jack

Early this year the tech world exploded over the news that there wouldn’t be a headphone jack in the new iPhone; there are still blind people who say they won’t buy another one because of that. Although I wasn’t happy about it, I thought boycotting them altogether was a bit too far. When Belkin announced their Lightning Audio + Charge RockStar with 2 Lightning ports, and Apple announced they would include their new $9 Lightning-to-headphone adaptor, I felt the annoyance was easily resolvable. Even with this resolution, some of those earlier-mentioned extremists still won’t buy another iPhone, saying it just makes the phone $50 more expensive than the already high price. Yes, spending $40 for an extra dongle to carry around so you can use wired headphones and charge simultaneously is annoying, but in the end I still don’t find it that big of a deal, even with a low income. Even though Apple does include one of their adaptors, I have bought 2 more, and 1 sits in a drawer as a spare.

I got my iPhone 7 on October 4, and I was told the Belkin dongle wouldn’t be available until November 2nd; yes, that was annoying. A few Skype calls ended prematurely because of a dying battery, sometimes I had to use Bluetooth headphones so I could charge, and then there was the pillow speaker I use when going to bed. For the first week I used my old iPhone 6 Plus to play podcasts or audio books before sleeping, but I really wanted to use the new iPhone for everything and didn’t want to wait for the dongle.

Then I had a bit of intuition. I remembered I had 2 Anker Bluetooth adaptors that I used for my project combining audio streams into one Bluetooth headset, so I paired one of them to the pillow speaker, and it worked great; except for 1 problem. The adaptors also run on a battery, and batteries still don’t last forever for some reason. I turned the adaptor on so that it was receiving Bluetooth audio from my phone and piping it to the speaker, then I plugged it into USB to charge, and it shut off. I was momentarily frustrated, but then I just turned it back on again and it worked, even while charging on USB; it kept working for 3 days until I unplugged it, and was still all charged up. Problem solved. Finally the Belkin Lightning RockStar was available, even a week early, and it is nice to have, though I don’t need 2 of them as I originally thought I would with 1 going to the pillow speaker. The Bluetooth adaptor costs $30, so I saved $10 there too. Now that I can charge and use wired headphones simultaneously like I could before when iPhones had headphone jacks, I can sigh with relief; that is, until I forget to take the dongle with me somewhere. Still, for the most part, it puts the no-headphone-jack problem back into the teapot with all the tempests of tempestuous pasts.

How to make ping audible, and more useful for both blind and sighted users

Ping is a little network troubleshooting command available on any operating system with command-line capability. Ping sends an ICMP packet, and if the device pinged is configured to send back a reply, it tells you that a specific network connection is up. Back during my Cisco networking days I was annoyed that I couldn’t see if network connections were up while configuring them in a separate router window; my sighted classmates and coworkers would just display 2 windows and look between them. My friend Sean Randall was learning programming at the time and wrote a cool little AutoIt script for me that could ping and beep when it got replies; we called it sping. Then I moved to the Mac and missed that utility.

About a year ago a friend, Jacob White, who is fully sighted but spends time under desks plugging networks together, told me about the ping flags -a and uppercase -A, which bring that function back. The -a flag emits the system default beep when a reply is received, and the -A flag beeps if no reply comes back. Just type out the ping command and add one of those flags, e.g.

ping -a host-or-address
or -A if you’re looking for lost packets.
As far as I know this is present in the ping command on all current versions of Unix, Linux, and macOS. There is also a nice little utility for Microsoft Windows called bping that, beyond adding beeps to ping, can also scan a subnet for you. Actually, any ping command can scan a subnet for you if you ping the network broadcast address,

e.g. ping broadcast-address
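Putting the flags together, here is a sketch of how I use them; the 192.168.1.x addresses are hypothetical stand-ins for your own network:

```shell
# Beep on every reply that comes back (hypothetical router address)
ping -a 192.168.1.1

# Beep on missing replies instead, handy for spotting lost packets
ping -A 192.168.1.1

# Sweep a subnet: ping its broadcast address, and hosts that honor
# broadcast pings will answer (hypothetical /24 network)
ping 192.168.1.255
```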


Adding audible beeps to ping makes the command more usable for blind users, but it is also convenient for sighted people under their desks; another example of how some “accessibility” features are also convenient and useful for those without any disability.

My rant on how many online videos and podcasts have way too much cognitive load, like music beds

In an attempt not to just complain, I’d like to mention a significant frustration I have with many videos I find on the internet. I get that they are “videos”, but often the authors don’t find the audio aspect important at all. Many videos, like this one, only have music as the audio, so that a blind listener has no idea what’s going on, or can’t hear the object of the video. Just 5 minutes ago I was trying to find out what the Segway miniPRO personal transporter sounded like, one of those blind quirks, but all the videos I could find only had meaningless techno music as audio.

Music in the background when someone is speaking in a video is also annoying, especially when the person is trying to communicate something important the viewer and/or listener needs to know. Authors may think it’s cool and/or makes the presentation less boring, but all it really does is add cognitive load to the experience. I’ve even heard podcasts, made by blind people, with background music while the person speaks, where the whole experience is audio, no video at all. I imagine it would be like if libraries had bright strobe lights flashing whenever people tried to read anything.

With our lifestyles accelerating all the time, and the need to learn and understand more every day, I encourage everyone to consider how much cognitive load bling is adding to their presentations, and to consider that not everyone has all the senses and/or skills you might enjoy. The most important thing any human will ever do in their entire life is to communicate understanding; let’s all try harder to communicate as completely as possible so that all of us can more easily understand more deeply.

A virtually unknown iOS VoiceOver feature, automatically announcing the time every minute

As an iOS VoiceOver user, I discovered several years ago that if I touch the clock item in the status bar, VoiceOver will keep automatically announcing the time every minute until interrupted by a touch or by certain incoming notifications. I can’t remember exactly when this became a feature, but it was more than 3 years ago, and I’ve never heard anyone else mention it, nor have I seen it documented anywhere; so I thought I’d share it, as I can imagine it being helpful to many others.

This time announcement feature is very useful to me, especially when I’m in a hurry and need to get ready for something quickly. I even use it occasionally with my Anker SoundCore Bluetooth speaker in the shower; time can really accelerate there. Time announcements are also available on macOS in the Date & Time preference pane, near the bottom of the Clock tab, though not customizable to the exact minute; intervals of 15, 30, and 60 minutes are offered. I could also see this being useful on the Apple Watch, though it’s not there yet.