[BRLTTY] [OT] How to integrate BSI into a Linux cell phone?

Rich rdm at cfcl.com
Wed Aug 12 11:39:29 EDT 2020


Although this question is somewhat off-topic, I'm hoping that some folks here
will find it interesting enough to cut me a bit of slack.  Anyway, here goes...

Background

I'm a sighted, semi-retired, volunteer developer who is very interested in the
possibility of re-purposing Android cell phones as blind-accessible computing
and communication devices.  There are a number of projects working on re-using
the billions of aging cell phones that are out there.  Some, such as LineageOS,
start with an open source version of Android.  Others, such as postmarketOS and
Mobian, start with a more vanilla flavor of Linux (e.g., Alpine, Debian).

Although the first approach makes tons of Android software available, most of
this is GUI-based, so accessibility will generally be a challenge.  Basing the
system on Linux allows a wealth of CLI-based software to be used and helps to
anchor the resulting system more closely in the open source community.

Challenges

One of the biggest challenges, I suspect, will be providing an accessible form
of text input.  Most screen-based keyboards for cell phones aren't suitable,
a physical keyboard would add cost and bulk, and voice recognition (without the
support of cloud computing) still seems to be beyond the phone's capabilities.

So, I've been speculating about how to support Braille Screen Input (BSI) on a
Linux-based cell phone.  BSI is available on Android, Fire OS, and iOS (please
let me know if I'm missing any others!), but I haven't found any indication that
anyone is working on supporting it for any form of Linux.

I think I have a handle on how to architect the front end of the code, using a
set of Actors (lightweight processes) running on a foundation of Elixir, Erlang,
and OTP.  However, I'm totally out of my depth when it comes to interfacing the
code to other apps, the window system, and the rest of the OS.
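
To make that a bit more concrete, here's a very rough Elixir sketch of one
such actor: a GenServer that collects "dot" touches and, once the last finger
lifts, translates the accumulated chord into a character.  All of the module
names, messages, and the tiny chord table are placeholders of my own, not
working code:

    defmodule BSI.ChordInput do
      # Rough sketch only: collects braille "dot" touches and, once the
      # last finger lifts, translates the accumulated chord into a
      # character.  All names, messages, and the tiny chord table are
      # placeholders (a real table would cover the full braille code,
      # or defer to something like liblouis).
      use GenServer

      @chords %{
        [1]    => "a",
        [1, 2] => "b",
        [1, 4] => "c"
      }

      def start_link(opts \\ []) do
        GenServer.start_link(__MODULE__, :ok, opts)
      end

      # The touch layer would call these as fingers land on / leave the screen.
      def dot_down(pid, dot), do: GenServer.cast(pid, {:down, dot})
      def dot_up(pid, dot), do: GenServer.cast(pid, {:up, dot})

      @impl true
      def init(:ok), do: {:ok, %{held: MapSet.new(), chord: MapSet.new()}}

      @impl true
      def handle_cast({:down, dot}, state) do
        {:noreply,
         %{state | held: MapSet.put(state.held, dot),
                   chord: MapSet.put(state.chord, dot)}}
      end

      def handle_cast({:up, dot}, state) do
        held = MapSet.delete(state.held, dot)

        if MapSet.size(held) == 0 do
          # Chord complete: look it up and hand the character off.  In a
          # real system this would go to an output actor, not stdout.
          case Map.fetch(@chords, Enum.sort(MapSet.to_list(state.chord))) do
            {:ok, char} -> IO.puts("emit: #{char}")
            :error -> IO.puts("unknown chord: #{inspect(state.chord)}")
          end

          {:noreply, %{held: MapSet.new(), chord: MapSet.new()}}
        else
          {:noreply, %{state | held: held}}
        end
      end
    end

The idea is that a touch-tracking actor would feed dot_down/dot_up events in,
while a separate output actor would own the question of how the emitted
character actually reaches the focused application, which is exactly the part
I'm fuzzy on.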

Should I be looking into D-Bus, GTK, Qt, or what?  Also, are there any ways the
code could leverage BRLTTY, Emacspeak, Orca, etc.?  How would typical blind users
want a BSI subsystem to act in this context?  I'm really confused here, so I'm
asking for advice, comments, and so forth.
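
To make the question concrete, the crudest bridge I can picture (purely
hypothetical, and X11-only) would be to shell out to xdotool so that each
recognized character gets typed into whatever window has focus.  I have no
idea whether that's even the right layer to hook into:

    defmodule BSI.Output.Xdotool do
      # Purely hypothetical back end: assumes an X11 session with the
      # stock `xdotool` utility installed.  Wayland, the Linux console,
      # or an AT-SPI / D-Bus route would each need something different.
      def type(text) when is_binary(text) do
        # `xdotool type <text>` sends synthetic key events to the
        # currently focused window.
        case System.cmd("xdotool", ["type", text]) do
          {_output, 0} -> :ok
          {output, status} -> {:error, {status, output}}
        end
      end
    end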

-r

P.S.  There is at least one physical keyboard which isn't totally out of the
question for use with a cell phone.  It's a folding keyboard/touchpad combo that
supports both Bluetooth and USB.  It's about 6" x 3.8" x 0.5" when folded up and
12" x 3.8" x 0.25" when unfolded:

Foldable Bluetooth Keyboard, Jelly Comb Dual Mode Bluetooth & USB
Wired Rechargeable Portable Mini BT Wireless Keyboard with Touchpad Mouse
for Android, Windows, PC, Tablet-Black
https://www.amazon.com/gp/product/B07S9XZDGY

 

