Promoting idea of getting Ubuntu to adapt to users' accessibility preferences...

Eric S. Johansson esj at harvee.org
Thu Jul 30 22:53:47 UTC 2009


Brian Cameron wrote:

> This does seem like an interesting idea.  To expand upon it, I think
> GNOME also needs a solution that works more generally.
> 
> There has been talk of enhancing gnome-settings-daemon so that it is
> possible for users to hit particular keybindings or other sorts of
> gestures (e.g. mouse gestures) to launch AT programs.  This would
> allow a user to launch the on-screen-keyboard, text-to-speech, or
> magnifier by completing the appropriate gesture (e.g. keypress or
> mouse gesture).
> 
> I would think that using a specific smart card or USB stick is another
> form of "gesture" that would also be good for launching AT programs.
> However, wouldn't it be better to come up with a solution that would
> support all of these sorts of "gestures" in one place?
> 
> Providing a solution that can recognize different sorts of gestures
> (perhaps configurable so users can define their own sorts of gestures -
> perhaps with other unique hardware based solutions - like pressing a
> button on their braille display) seems a way to go about implementing
> your idea and also supporting other mechanisms that could be used to
> launch AT programs as needed.

As I added as a counter proposal:

"""	
It is unrealistic to expect all machines a user uses to have accessibility
software. There may be multiple reasons for this ranging from administrative
overhead to licensing issues to interference with normal operation. By adopting
the perspective that the user interface moves with the user and not the machine
opens up new possibilities for widely available accessibility. By associating
the user interface software (speech recognition, text-to-speech, various dog and
pony tricks, etc.), the impact on the general machine is lessened, and
administrative costs are lowered, licensing issues are reduced or eliminated,
and the user has increased control over the software they need to function.

This can be implemented today using virtual machine technology and relatively
minimal bridge software that makes the accessibility software's interface
visible on the host and enables interaction between the application and the
accessibility software."""

The Web-for-all model doesn't address something I consider a fundamental flaw
of accessibility technology: I should be able to use any machine I have access
to. I shouldn't have to wait for an administrator or buy a new license just
because I'm using a new machine, whether it be for a lifetime or just a few
minutes. I should be able to plug in, click a few icons, and start working.
After all, that's what a keyboard and mouse allow TABs (the temporarily
able-bodied) to do. Why put any further barriers in front of disabled people?

I believe the future of accessibility will start with putting accessibility
tools on a netbook and connecting that netbook to other systems on demand. I
believe this because if you give me an accessibility interface, you control how
I use the computer. If you give me an API and a remote accessibility toolkit, I
can control how I use any computer.

Yes, I'm a wee bit cranky about this because I've spent the past 15 years
watching speech-driven user interfaces get almost no support, and I am now
seeing speech recognition on Linux (NaturallySpeaking on Wine) sit at the cusp
of being usable by disabled people while getting no traction with the developer
community.
