[BRLTTY] HandyTech's TOUCH_AT feature and BRLTTY's learn mode

Mario Lang mlang at blind.guru
Thu Apr 2 09:23:53 UTC 2026


Sébastien Hinderer <sebastien.hinderer at ens-lyon.org> writes:

> For those who don't know: the devices from Handy Tech can report the
> finger's position on the braille display.
>
> I am not sure how known / used this feature is, but thinking about it I
> can imagine very cool uses, e.g. fast double touch on a link would
> simply open it in a browser, or do whatever you want with that link.

You'll have to experiment.  The protocol used to be a lot
more versatile.  With initial ATC implementations from HT,
*every* cell sent a pressure value, so you could even see several
fingers / do a kind of multi-touch.  We implemented
highlighting on-screen, so a sighted co-worker could actually
see (by background colour) what text you were currently touching/reading.
I used that at work for a few years.
In later firmware revisions, they unfortunately removed that capability
and replaced it with a single "finger position", whatever that means.
I was particularly unhappy as a customer/developer because
HT changed my firmware while doing device service, so the new
reduced feature-set was kind-of forced down my throat without my consent.
Current ATC is still useful, but the biggest use case seems to boil down to
TOUCH_NAV.  BRLTTY implements that, which you should be able to turn
on with Chord+Dot1+Dot4 (Capital-A for ATC).  The idea is, if
you have finished reading a line, BRLTTY will automatically advance
the display to the next reading position.  Especially useful
for reading books and other longer texts.
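The auto-advance behaviour can be pictured as a small predicate over the
reported finger position.  Here is a rough sketch in C under a simplified
model (the report collapses to one cell index, with a sentinel when the
finger is lifted); all names are made up for illustration, this is not
BRLTTY's actual implementation:

```c
#include <assert.h>

/* Simplified model of an ATC report: one cell index per event,
 * NO_TOUCH when the finger has been lifted.  Names and model are
 * made up for this sketch. */
#define NO_TOUCH (-1)

/* The TOUCH_NAV gist: advance the window once the reading finger
 * has reached the last occupied cell and is then lifted. */
static int
shouldAdvance(int lastTouchedCell, int currentCell, int usedCells)
{
  return (currentCell == NO_TOUCH) && (lastTouchedCell == usedCells - 1);
}
```

The real core of course has to debounce and deal with timing, but the
gist is just that: end of line reached, finger lifted, pan forward.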

> For the time being: I notice that, when BRLTTY's learn mode is run
> interactively, the TOUCHAT events are not shown. I find this convenient
> because, of course, to read you have to touch, so if the description of
> whatever key you press were immediately replaced by a TOUCHAT event, you
> wouldn't see it and the whole purpose of learn mode would be defeated.
> Fortunately, this does not happen in BRLTTY's interactive learn mode.
> However, it does happen when running apitest in learn mode. And to some
> extent it makes sense. But should we have an option for apitest to
> ignore some keycodes / blocks?

Since brlapi already provides keycode/command filters, I am surprised
apitest doesn't already make use of them.

> Also, one question to the owner of Handy Tech devices: I thought it was
> possible to completely disable the finger detection feature but I just
> checked the option menu (firmware 2.0 of an Active Star 40) and couldn't
> find any relevant option, apart from something that I think makes the
> finger detection more or less sensitive, but even with the lowest level
> I was still getting the TOUCHAT events in apitest.

I believe we currently unconditionally enable ATC during driver
initialisation.  What we *could* possibly do is ignore the touch events
in the core if TOUCH_NAV is off, but that feels a bit hacky, since
theoretically, other future features could also rely on that event.
So if apitest doesn't filter already, I would think adding
some filtering capability to apitest would be the better way to approach
this.
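Such a filter boils down to masking out the command-block bits of each
key code before printing the event.  A rough sketch in C, with made-up
mask values (the real layout lives in brlapi's headers, not here):

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical 64-bit key-code layout for this sketch: the command
 * block lives in bits 16..31.  The real masks are defined in brlapi's
 * headers; these values are placeholders, not the real ones. */
#define KEY_CMD_BLK_MASK  UINT64_C(0x00000000FFFF0000)
#define KEY_CMD_BLK_TOUCH UINT64_C(0x0000000000130000) /* made-up block id */

/* An apitest-style filter: show the event only if its command block
 * is not the one the user asked to ignore. */
static int
shouldShowKey(uint64_t code, uint64_t ignoredBlock)
{
  return (code & KEY_CMD_BLK_MASK) != ignoredBlock;
}
```

apitest could then expose this behind some hypothetical command-line
option, applying the predicate before each event is displayed, so that
TOUCHAT stays out of the way during interactive learning.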

-- 
CYa,
  ⡍⠁⠗⠊⠕

