[BRLTTY] BRLTTY and touch screen input for Braille?

Rich Morin rdm at cfcl.com
Mon Apr 4 08:59:45 EDT 2022


> On Apr 3, 2022, at 22:53, Aura Kelloniemi <kaura.dev at sange.fi> wrote:
> 
> ... My vision has been that this would be completely independent of any
> screen readers, but of course screen reader support might make it more capable.

Braille translation and screen readers involve a huge amount of complexity.
My take is that touch-event handling and gesture recognition are quite
separable tasks, so I'd really like to handle them without worrying about
back-end issues.  Of course, anyone who wants to try integrating things more
tightly would be welcome to do so.
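
To make that separation concrete, here's a rough sketch of how the pieces
might sit in an Elixir supervision tree.  The module names and the device
path are purely illustrative assumptions, not taken from any existing code:

    # Hypothetical layout; each stage is an isolated BEAM process.
    defmodule TouchInput.Application do
      use Application

      @impl true
      def start(_type, _args) do
        children = [
          # Reads raw events from a kernel touch device (path assumed).
          {TouchInput.TouchReader, device: "/dev/input/event0"},
          # Turns streams of touch events into discrete gestures.
          TouchInput.GestureRecognizer,
          # Any back end (uinput, BRLTTY, a screen reader bridge) could
          # subscribe to gestures without the front end knowing about it.
          TouchInput.BackendDispatcher
        ]

        Supervisor.start_link(children, strategy: :one_for_one,
                              name: TouchInput.Supervisor)
      end
    end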

> ... I assume that recognizing touch gestures has quite tight
> real-time requirements. I'm not sure if an interpreted and garbage-collected
> language is the best fit for this kind of task.

The Elixir / Erlang ecosystem runs on a bytecode interpreter (the BEAM) and
has recently added JIT compilation support, so it's a bit faster than a
simple interpreter would be.  Also, because garbage collection is done
independently for each lightweight process (aka Actor), long GC pauses are
rare.

This platform is often used for tasks with soft real-time requirements,
so I'm hoping it can work in this case.
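
For instance, each touch point (or each recognizer) could live in its own
process, so a collection in one heap never stalls the others.  Here's a toy
recognizer that distinguishes a tap from a hold; the 500 ms threshold and
the message shapes are assumptions on my part:

    defmodule GestureRecognizer do
      use GenServer

      @hold_ms 500  # assumed threshold, not a measured value

      def start_link(opts \\ []),
        do: GenServer.start_link(__MODULE__, :ok, opts)

      @impl true
      def init(:ok), do: {:ok, %{down: false, timer: nil}}

      # Finger down: start a timer; if it fires first, it's a hold.
      @impl true
      def handle_cast(:touch_down, state) do
        timer = Process.send_after(self(), :hold_timeout, @hold_ms)
        {:noreply, %{state | down: true, timer: timer}}
      end

      # Finger up before the timer fired: it was a tap.
      def handle_cast(:touch_up, %{down: true, timer: timer} = state) do
        Process.cancel_timer(timer)
        IO.puts("gesture: tap")
        {:noreply, %{state | down: false, timer: nil}}
      end

      # Finger up after the hold was already reported: nothing to emit.
      def handle_cast(:touch_up, state), do: {:noreply, state}

      @impl true
      def handle_info(:hold_timeout, %{down: true} = state) do
        IO.puts("gesture: hold")
        {:noreply, %{state | down: false, timer: nil}}
      end

      def handle_info(:hold_timeout, state), do: {:noreply, state}
    end

Even if a GC runs inside this process, it only scans that process's small
heap, so the timing of the other recognizers is unaffected.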

> I think uinput is a good way to go. It may even be possible to make it so that
> it does not need BRLTTY's braille key translation, but can input straight to
> the Linux console (if that is where you want to run this).

As Samuel pointed out, having translation done in multiple places could lead
to confusion.  That said, it might be interesting to have a simple-minded
translator that is optimized for input to the shell, etc.
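
As a strawman, such a translator could be little more than a lookup table
from dot chords to characters.  This sketch hard-codes only the letters a-j
of uncontracted literary braille (dot 1 is bit 0, ..., dot 6 is bit 5); a
real translator would defer to BRLTTY's tables or something like liblouis:

    defmodule SimpleBraille do
      # Chord bitmask -> character, letters a-j only.
      @dots_to_char %{
        0b000001 => "a", 0b000011 => "b", 0b001001 => "c",
        0b011001 => "d", 0b010001 => "e", 0b001011 => "f",
        0b011011 => "g", 0b010011 => "h", 0b001010 => "i",
        0b011010 => "j"
      }

      # Returns the character for one chord, or :error if unmapped.
      def translate(mask), do: Map.get(@dots_to_char, mask, :error)
    end

    SimpleBraille.translate(0b001001)  #=> "c" (dots 1 and 4)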

-r


