[BRLTTY] BrlAPI Raw key code mode?

Mario Lang mlang at blind.guru
Sun Aug 27 09:24:18 EDT 2017


Shérab <Sebastien.Hinderer at ens-lyon.org> writes:

> Hi,
>
> Mario Lang (2017/08/24 16:54 +0200):
>> The client would still need to do their own key code
>> handling, which seems fine to me if what is desired is more control.
>
> To make this easier and to guarantee as much coherence as possible with
> the core's way of handling the bindings, would it make sense to build
> the code in charge of this as a library that the clients could link to
> and use?

I don't see why adding link-time complexity would improve things.
The client already links to the BrlAPI client library.
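
For illustration, a minimal sketch of what a client doing its own key
code handling might look like with the existing C API (the driver name
passed to brlapi_enterTtyMode() is just a placeholder; passing a
non-empty driver name is what requests raw driver key codes instead of
translated BRLTTY commands):

  #include <stdio.h>
  #include <brlapi.h>

  int main(void)
  {
    brlapi_keyCode_t code;

    /* Connect to BRLTTY with default settings. */
    if (brlapi_openConnection(NULL, NULL) < 0) {
      brlapi_perror("brlapi_openConnection");
      return 1;
    }

    /* A non-empty driver name (a placeholder here) asks for raw
       driver key codes rather than translated BRLTTY commands. */
    if (brlapi_enterTtyMode(BRLAPI_TTY_DEFAULT, "driver-name") < 0) {
      brlapi_perror("brlapi_enterTtyMode");
      brlapi_closeConnection();
      return 1;
    }

    /* From here on, interpreting the codes is entirely up to the
       client. */
    while (brlapi_readKey(1, &code) > 0)
      printf("raw key code: 0x%llx\n", (unsigned long long)code);

    brlapi_leaveTtyMode();
    brlapi_closeConnection();
    return 0;
  }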

>> In any case, if we proceed with an attempt to generalize for graphical
>> screen readers, we really should talk to Orca dev what they actually
>> would want to have.  They might have some insights that we haven't
>> thought about yet.
>
> I am fully convinced it will be beneficial to talk to Orca developers.
> In particular, I have always had the impression that Orca, like Jaws,
> Window-Eyes and VoiceOver, is a speech-centric screen review system
> while brltty is, in my opinion, braille-centric. By this, I mean that
> the features offered by these screen review systems, brltty excepted,
> seem much richer to me when it comes to speech than when it comes to
> braille. Also, I believe that these systems have been developed with
> speech in mind and that braille support has been incorporated into
> them only afterwards, or at least that is my impression.

While I see what you mean, the generalisation doesn't quite fit the
history of Orca.  Marc Mulcahy, the original author of Orca, was
actually using a Braille display himself, and parts of the very
early design show an intent to care for Braille as much as for Speech.
Braille support basically suffers because, overall, there
are more blind people using speech only.  This is simple to explain:
software speech synthesis is much more affordable than actual Braille
display hardware.

> As an example, I am pretty sure these tools can be used quite easily
> with speech only. But I believe one could not use them with only
> braille turned on, or at least it would not be as comfortable as
> using them with speech alone.

To come back to the topic of this thread a little, I suspect at least
part of that is actually due to the limitations our API currently has.

-- 
CYa,
  ⡍⠁⠗⠊⠕

