[BRLTTY] HandyTech's TOUCH_AT feature and BRLTTY's learn mode

Sébastien Hinderer sebastien.hinderer at ens-lyon.org
Sun Apr 12 15:54:49 UTC 2026


Hello Mario, thanks a lot for your feedback on this.

Mario Lang (2026/04/02 11:23 +0200):
> Sébastien Hinderer <sebastien.hinderer at ens-lyon.org> writes:
>
> > For those who don't know: the devices from Handy Tech can report the
> > finger's position on the braille display.
> >
> > I am not sure how known / used this feature is, but thinking about it I
> > can imagine very cool uses, e.g. fast double touch on a link would
> > simply open it in a browser, or do whatever you want with that link.
>
> You'll have to experiment.  The protocol used to be a lot
> more versatile.  With initial ATC implementations from HT,
> *every* cell sent a pressure value, so you could even see several
> fingers / do a kind of multi-touch.  We implemented
> highlighting on-screen, so a sighted co-worker could actually
> see (by background colour) what text you were currently touching/reading.
> I used that at work for a few years.
> In later firmware revisions, they unfortunately removed that capability
> and replaced it with a single "finger position", whatever that means.
> I was particularly unhappy as a customer/developer because
> HT changed my firmware while doing device service, so the new
> reduced feature set was kind of forced down my throat without my
> consent.

I don't know at what firmware version the change happened. I still use
their version 2.0 today, even if that means downgrading the firmware
after my devices come back from repair, when my requests not to upgrade
them have been ignored.

I do not know what I am missing from more recent firmwares: I know
neither which features, improvements, or bugfixes they bring, nor which
of those I would actually care about. The few times I tried a newer
firmware, it felt as though more processing of the PC keyboard's
keycodes was happening in the firmware itself, meaning that the scan
codes were transmitted less faithfully. I even wonder whether it is
still possible to get the raw scan codes with later firmware. In any
case I didn't like the experience, so I downgraded and am thus a happy
2.0 firmware user.

> Current ATC is still useful, but the biggest use case seems to boil down to
> TOUCH_NAV.

I must say, I don't know what ATC means. Active Touch Cells?

> BRLTTY implements that, which you should be able to turn
> on with Chord+Dot1+Dot4 (Capital-A for ATC).

Is Chord a key in itself, or do you simply mean pressing dots 1 and 4
together?

> The idea is, if
> you have finished reading a line, BRLTTY will automatically advance
> the display to the next reading position.  Especially useful
> for reading books and other longer texts.

Okay, I get the idea. I believe I prefer to keep control over the
scrolling myself; it would stress me if the display advanced before I
agreed to it.
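To make sure I understand the mechanism, here is a toy sketch of that
auto-advance behaviour (all names and the trigger condition are made up
by me; this is not how BRLTTY's core actually structures it):

```python
# Toy model of TOUCH_NAV-style auto-advance.  Names and logic are
# invented for illustration, not BRLTTY's real internals.

class Display:
    def __init__(self, text, width):
        self.text = text
        self.width = width
        self.offset = 0  # start of the currently shown window

    def visible(self):
        return self.text[self.offset:self.offset + self.width]

    def on_touch(self, cell):
        """cell: finger position within the window, 0-based."""
        # When the finger reaches the last non-blank cell of the
        # window, assume the line has been read and advance.
        last = len(self.visible().rstrip()) - 1
        if cell >= last and self.offset + self.width < len(self.text):
            self.offset += self.width

d = Display("a" * 100, width=40)
d.on_touch(10)          # reading mid-line: no movement
assert d.offset == 0
d.on_touch(39)          # finger at the end: window advances
assert d.offset == 40
```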

> > For the time being: I notice that, when BRLTTY's learn mode is run
> > interactively, the TOUCH_AT events are not shown. I find this convenient
> > because, of course, to read you have to touch, so whatever key you press,
> > if its description is immediately replaced by a TOUCH_AT event, then you
> > won't see it and the whole purpose of learn mode is defeated. But this
> > does not happen in BRLTTY's interactive learn mode, fortunately.
> > However, it does happen when running apitest in learn mode. And to some
> > extent it makes sense. But should we have an option for apitest to
> > ignore some keycodes / blocks?
>
> Since brlapi already provides keycode/command filters, I am surprised
> apitest doesn't already do so?

I assume apitest was either implemented before this feature appeared, or
tested only with devices that do not implement it, so that the problem
was never actually encountered.
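If apitest grew such an option, the filtering itself would be simple; a
sketch in Python (the bit layout and the block number below are
illustrative stand-ins, not BrlAPI's actual key encoding):

```python
# Sketch of block-based key filtering, as apitest might do it before
# displaying an event.  The bit layout and block number are made up
# for illustration; BrlAPI's real encoding differs.

BLK_SHIFT = 16
BLK_TOUCH = 0x25          # hypothetical block number for TOUCH_AT

def block_of(code):
    """Extract the command-block field from a key code."""
    return (code >> BLK_SHIFT) & 0xFF

def filtered(codes, ignored_blocks):
    """Drop every code whose block is in the user's ignore list."""
    return [c for c in codes if block_of(c) not in ignored_blocks]

events = [0x250007, 0x010041, 0x25001F]   # two touch events, one key
assert filtered(events, {BLK_TOUCH}) == [0x010041]
```

If I remember correctly, BrlAPI also offers server-side filtering (the
ignoreKeys/acceptKeys family), so apitest could perhaps ask the server
to drop the events before they even arrive.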

> > Also, one question to the owners of Handy Tech devices: I thought it
> > was possible to completely disable the finger detection feature, but
> > I just checked the options menu (firmware 2.0 of an Active Star 40)
> > and couldn't find any relevant option, apart from something that I
> > think makes the finger detection more or less sensitive; but even
> > with the lowest level I was still getting the TOUCH_AT events in
> > apitest.
>
> I believe we currently unconditionally enable ATC during driver
> initialisation.  What we *could* possibly do is ignore the touch events
> in the core if TOUCH_NAV is off, but that feels a bit hacky, since
> theoretically, other future features could also rely on that event.

If there is indeed a way to choose at initialization time whether the
feature is enabled, wouldn't it be nicer to have a driver parameter to
control this? I wouldn't mind the parameter being enabled by default.
But I think that if one is not interested in the feature at all, it is
better to disable it completely rather than letting the events flow
from the device to the PC only to be ignored afterwards.
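For illustration, such a knob could look like this in brltty.conf
(note: "atc" is a hypothetical parameter name I am inventing here, not
one the ht driver currently accepts):

```
# brltty.conf -- hypothetical example; "atc" is NOT an existing
# parameter of the ht driver, just what such a switch could look like
braille-driver ht
braille-parameters ht:atc=off
```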

Seb.

