[BRLTTY] Footsteps towards better accessibility in Linux
Jason J.G. White
jason at jasonjgw.net
Wed May 7 10:52:43 UTC 2025
While discussing future BRLTTY development ideas, I would like to note
that multiple-line Braille displays are becoming more easily available,
and this trend can be expected to continue. Some devices have tactile
graphics capability as well. I know BRLTTY already has some support for
them (e.g., Canute 360, DotPad).
Much of the necessary work is presumably in screen readers (obviously,
Orca for Linux desktops) rather than in BRLTTY. It includes user
interface design decisions - how to present the UI appropriately on a
larger, multi-line display - as well as implementation.
Currently, I am particularly interested in the Monarch display from APH
and HumanWare (a 96 by 40 pin-matrix device - 10 lines of 32 cells each
when displaying Braille). It also accepts touch input. A protocol to
support screen readers is on the development agenda for implementation,
according to public statements.
On 7/5/25 05:23, Aura Kelloniemi wrote:
> Hello,
>
> On 2025-05-07 at 00:05 +0200, Sébastien Hinderer <Sebastien.Hinderer at ens-lyon.org> wrote:
> > Aura Kelloniemi (2025/05/07 00:15 +0300):
> > > - Resource usage: if the user does not use VTs (or they are not available),
> > > having a core BRLTTY that reads unused screens, parses configuration files,
> > > manages braille tables, starts a speech driver, etc. is useless. All
> > > BRLTTY's screen reading features would be just extra bloat (and would also
> > > be additional attack surface).
>
> > It's possible to make BRLTTY rather minimal, e.g. by compiling it with
> > the no screen driver.
>
> This is why I expected that it would not require a lot of work to separate
> BrlAPI and BRLTTY from each other, because it is almost doable already just by
> building BRLTTY twice with different configure options.
>
> > FWIW a project like the one you describe did exist
> > years ago, it was called libbraille and as far as I know nobody used it.
>
> I tried it with gnopernicus, but I did not use it constantly as BRLTTY was not
> able to share the display with gnopernicus.
>
> > > - Multiple displays: if the user uses multiple displays (like I sometimes do),
> > > they cannot connect them all to the same BRLTTY instance. Running a second
> > > BRLTTY for the second display does not help, if the user wants to use
> > > anything that relies on BrlAPI.
>
> > Well, in text mode having two instances of BRLTTY works; a bunch of times
> > I hacked with blind friends and we did manage to do that. It's true that
> > it's not yet well supported at the BrlAPI client level, but it's not
> > inconceivable that clients could connect to several instances of BrlAPI at
> > the same time.
>
> Yes, but it is not easy. I don't need to use multiple displays; just one
> display is enough if I normally use it over Bluetooth and then plug it into
> a USB port for charging. Then a new BRLTTY is launched for the "new" display,
> the new instance does not have a BrlAPI server, and even if it had one, my
> graphical session would still communicate with the old server.
>
> I know that I can remove the udev rules (which I actually did). I know I can
> change the server port and set up an environment variable to connect to a
> different port. I know I can kill my graphical session, edit my .xinitrc to
> export BRLAPI_HOST, relaunch X and every application that was open before the
> change. These kinds of things are what make it more difficult to rely on the
> GUI at the moment.
>
> The point of this complaint is that BrlAPI is not a first-class citizen in
> BRLTTY world, and I think it should be. And one of the best ways to make sure
> it is would be to try and run BrlAPI and BRLTTY (the screen reader)
> separately.
>
> > The only thing that might be hard to achieve is the synchronization of
> > the different braille displays, but (1) it's probably not completely
> > impossible to implement on top of BrlAPI parameters, and (2) I guess
> > it's not inconceivable either to imagine that BRLTTY runs several
> > braille drivers at the same time. So for that I don't see how decoupling
> > BrlAPI from BRLTTY would help.
>
> Probably just by making sure that BrlAPI clients are treated the same as
> BRLTTY screens.
>
> > > - Better support for BrlAPI-enabled applications: currently it is non-trivial
> > > to write BrlAPI applications which don't utilize BRLTTY's command set in
> > > their input handling. BRLTTY's command set works well if the application is
> > > a terminal screen reader, but this becomes a limitation if the application
> > > does something different.
>
> > I think the set of commands could be enriched as much as desired. I can
> > even imagine a notion of context which would allow to bind a key to
> > different commands.
>
> I have also thought about this, and I'd like this to happen.
>
> > Also, the only thing you have to do if you do not
> > want BRLTTY commands is to ask for key codes when you enter tty mode.
>
> Yes, but BRLTTY already provides a lot of machinery to handle driver-specific
> key codes, and reimplementing this logic in every BrlAPI application that
> requires a different command set from BRLTTY is tedious, error-prone, and a
> waste of time.
>
> Currently BrlAPI's support for describing raw keycodes is incomplete; BRLTTY
> cannot, for example, tell the client that the pressed key is a routing key – it
> just sends a code.
>
> BrlAPI applications would benefit a lot from a setup where they can ask for
> some BRLTTY commands and allow the user to configure the rest of the key codes
> how they wish. I don't have a ready-made design for this. It could reuse
> BRLTTY's key table format and the parsing routines provided by BRLTTY, but how
> exactly, I don't know.
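[Editor's note: the setup described above could be sketched as follows. This is a purely illustrative client-side sketch assuming a hypothetical "bind KEY COMMAND" table format loosely modeled on BRLTTY's key tables; the names and syntax are made up, not BRLTTY's actual parser or API.]

```python
# Hypothetical sketch of client-side key binding: the client loads a
# user-editable bindings table and translates raw key names (as a
# braille daemon might deliver them) into application commands.
# The "bind KEY COMMAND" format is illustrative only.

def parse_bindings(text):
    """Parse lines of the form 'bind KEY COMMAND' into a dict."""
    bindings = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        keyword, key, command = line.split(None, 2)
        if keyword == "bind":
            bindings[key] = command
    return bindings

def dispatch(bindings, key, handlers):
    """Translate a raw key name into an application command and run it.

    Returns the command name, or None if the key is unbound."""
    command = bindings.get(key)
    if command is not None and command in handlers:
        handlers[command]()
        return command
    return None
```

The point of the sketch is only that the table lives with the user, not hard-coded in the application, so each user can adapt the bindings to their own display.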
>
> > And finally, I am skeptical about the fact that splitting BRLTTY from
> > BrlAPI would lead to progress in this direction. Well, even if it would,
> > that progress would have to outweigh the manpower necessary to do that
> > split – manpower which is scarce and may be better used on a different front.
>
> Assuming the split is quite simple, adding the new features (for command
> handling, device management, etc.) would certainly take the most manpower.
>
> At this point even BRLTTY as a BrlAPI client is not fully functional. For
> example, it cannot produce speech, control status cells (on displays which
> have them), or recover from situations such as a crash (or restart) of the
> core BRLTTY.
>
> > > For example I have written an e-book reader for BrlAPI, but BRLTTY does not
> > > have commands for navigating menus, exiting contexts, activating buttons,
> > > etc. Thus the application works well only with my display and with my custom
> > > BRLTTY key bindings.
>
> > I'm sure this can be discussed and improved without changing the current
> > architecture.
>
> Me too. It just depends on how much we value BrlAPI clients vs. BRLTTY's
> internal screen reading capabilities.
>
> > Did you publish the tool? Would be happy to give it a try, although I am
> > a super happy user of the nov.el Emacs mode for reading ePubs.
>
> I could try to clean it up. The code is almost 15 years old and all key codes
> are hard coded. For example you need to invoke the HELP command to quit the
> application, because it was a nice combination in my key table at that time.
>
> The features that this reader provides are:
>
> - Only uses braille display for input
> - Reading without scrolling the terminal: no need to press PageDown
> - Seamless support for long lines
> - Word wrapping (cannot be disabled): BRLTTY did not have word wrapping when I
> wrote this tool
> - Text editing with braille keyboard
> - Local and global fuzzy regexp search
> - Dictionary lookup (with a built-in command)
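[Editor's note: the word-wrapping feature listed above amounts to a greedy wrap onto fixed-width display lines. A minimal sketch, assuming uncontracted braille (one cell per character) and a fixed line width; this is not the reader's actual code.]

```python
def wrap_words(text, cells):
    """Greedily wrap words onto lines of at most `cells` braille cells.

    Assumes one cell per character (uncontracted braille). Words longer
    than a display line are split across lines."""
    lines, current = [], ""
    for word in text.split():
        candidate = word if not current else current + " " + word
        if len(candidate) <= cells:
            current = candidate
        else:
            if current:
                lines.append(current)
            # break words longer than one display line
            while len(word) > cells:
                lines.append(word[:cells])
                word = word[cells:]
            current = word
    if current:
        lines.append(current)
    return lines
```

For example, `wrap_words("the quick brown fox", 10)` yields two lines, `"the quick"` and `"brown fox"`, each fitting a 10-cell display line.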
>
> > > Separating BrlAPI from terminal screen reading (BRLTTY)
> > > would require rethinking the interface between the application and BrlAPI
> > > server.
>
> > Likely, and as I said, given the work that would represent and the
> > scarceness of resources, I would rather think twice if not more before
> > doing that. I think I would have to be convinced that something that is
> > very important is simply not achievable in the current set-up.
>
> I think it is achievable in the current setup, just that the end result would
> be cleaner, if different programs have clearly separate responsibilities.
>
> The point of all this is to pave the way for first-class support of BrlAPI
> clients, which in turn would allow more complex applications (like screen
> readers) to better utilize a braille display's input facilities.
>
> > I am not saying I cannot be convinced, but I am saying I am not, at this
> > stage.
>
> I admit that I am very much drawn to the idea that separating BrlAPI and
> BRLTTY would make the architecture cleaner and would put BRLTTY and
> non-BRLTTY screen readers in the same class. Everything that I proposed can
> be implemented without the separation, and if the separation itself causes
> resistance (as it seems to) or requires a lot of work, I will withdraw my
> suggestion.
>
> > Also, in the same way it may be possible to embed several braille
> > drivers in an instance of BRLTTY, it may also be possible to run several
> > screen drivers in the same instance.
>
> Sure, and why not. This design seems to imply that BRLTTY's screen reading
> capabilities will always be more important than capabilities provided by
> BrlAPI clients.
>
> Let me describe one possible idea of how braille accessibility could work in
> Linux after the rearrangement:
>
> On the system level there would be a brailled daemon communicating with all
> attached braille displays and serving clients. Brailled would send rendered
> braille text to the displays and braille input to the clients. It could
> probably provide a virtual keyboard or other hardware-related support, like
> BRLTTY does. It would also contain a minimal user interface for managing
> displays or clients, for changing global configuration, etc.
>
> Brailled (or a multiplexer client) would choose which application gets
> control of the display(s). This could be an interactive process.
>
> Applications would sit on top of these. They should tell brailled what kind
> of commands they'd like to receive, maybe by passing brailled a key bindings
> table (or by pointing brailled to such a table on disk).
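[Editor's note: the registration and routing proposed above could be outlined as below. All names here (`Brailled`, `register`, `route_key`, etc.) are hypothetical, invented for illustration; this is a sketch of the proposal's bookkeeping, not any existing API.]

```python
# Hypothetical outline of the proposed brailled daemon's bookkeeping:
# each client registers the key->command table it wants, one client
# holds the display at a time, and raw key input is routed according
# to the focused client's table.

class Brailled:
    def __init__(self):
        self.clients = {}   # client id -> {key name: command name}
        self.focused = None  # client currently controlling the display

    def register(self, client_id, bindings):
        """A client announces itself and the commands it wants."""
        self.clients[client_id] = dict(bindings)
        if self.focused is None:
            self.focused = client_id  # first client gets the display

    def set_focus(self, client_id):
        """The daemon (or a multiplexer) hands the display to a client."""
        if client_id in self.clients:
            self.focused = client_id

    def route_key(self, key):
        """Translate a raw key for the focused client.

        Unbound keys are passed through raw, so clients can still
        handle driver-specific input they did not declare."""
        if self.focused is None:
            return None
        command = self.clients[self.focused].get(key, ("raw", key))
        return self.focused, command
```

The design choice sketched here is that translation happens in the daemon, so every client gets user-configurable bindings without reimplementing BRLTTY's key-table machinery.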
>
> The application would be responsible for rendering braille itself. This is
> important, because the application needs to know how many braille cells any
> given text occupies. Braille text length can vary because of contraction or
> because of widget decorations. There would be a library for doing braille
> rendering (based on braille tables) – liblouis already does this, I suppose,
> but it could be extended to support global user-defined representations for
> widgets.
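[Editor's note: a toy illustration of the point above, that with contraction the number of cells a string occupies differs from its character count, which is why the application must render braille itself. The contraction table below is a tiny made-up fragment, not a real liblouis table.]

```python
# Toy contraction table: a few English words that contract to a single
# braille cell. Illustrative only; real tables (e.g. liblouis) are far
# richer and context-sensitive.
CONTRACTIONS = {"and": "⠯", "the": "⠮", "with": "⠾"}

def to_braille(text):
    """Render text word by word, applying the toy contractions."""
    return " ".join(CONTRACTIONS.get(word, word) for word in text.split())

def cell_count(text):
    """Number of display cells the rendered text occupies."""
    return len(to_braille(text))
```

Since `cell_count("the cat and dog")` is smaller than the character count of the source text, only the component doing the rendering can know where a line of cells ends, which is the argument for putting rendering in the application (backed by a shared library).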
>
> BRLTTY would be one of the applications. It would have different screen
> drivers for different terminal emulators. It would also have some
> multiplexing capabilities so that it can tell brailled which
> console/window/pane is focused. There could be multiple instances of BRLTTY
> running (one for each kind of terminal emulator), but there would always be
> just one brailled.
>
> Other screen readers – like Orca – would communicate with brailled just like
> BRLTTY would. There could be other types of applications as well, not just
> screen readers. An example might be a program that monitors laptop battery
> level and signals the user when action is needed.
>
> Communication between brailled and the clients would happen over sockets
> with BrlAPI. I don't believe it would incur too much latency.
>
> In case brailled crashes, the clients would keep on trying to connect to it,
> and the system service manager would try to restart brailled.
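[Editor's note: the recovery behaviour described above amounts to a retry loop on the client side while the service manager restarts the daemon. A minimal sketch with an injectable `connect` callable so the policy can be shown without a real socket; none of this is BrlAPI's actual API.]

```python
import time

def connect_with_retry(connect, attempts=5, delay=0.1):
    """Keep retrying `connect()` while the daemon is being restarted.

    `connect` is any callable that returns a connection object or
    raises OSError (as a refused socket connection would). The last
    error is re-raised if all attempts fail."""
    last_error = None
    for attempt in range(attempts):
        try:
            return connect()
        except OSError as error:
            last_error = error
            time.sleep(delay * (attempt + 1))  # linear backoff
    raise last_error
```

In practice `connect` would open the BrlAPI socket; the sketch only shows that clients survive a daemon restart by retrying rather than exiting.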
>
> I'm not saying that all this could not be achieved by running multiple
> BRLTTY instances with different purposes, and I am not against that. I just
> think it would be cleaner this way.
>
More information about the BRLTTY mailing list