Some people have physiological conditions that severely limit their ability to communicate with their loved ones. We seek to build Augmentative and Alternative Communication (AAC) systems that empower them to make the most of their modest output.
We draw a distinction between user symbols, the evidence states that we estimate directly, and task symbols, the task-relevant instructions that we seek to infer. For example, in T9 phone entry a user selects from among 26 letters (task symbols) using only 9 phone buttons (user symbols). We fix the user symbol source and examine how to efficiently encode a task symbol as a sequence of user symbols. We offer a recursive Bayesian decision framework that incorporates prior distributions over task symbols, is robust to user symbol classification errors, and accounts for user symbol confusion. Strong inference systems can change their beliefs quickly: we use Maximum Mutual Information (MMI) coding, which queries the user so that our belief about the target task symbol changes as quickly as possible.
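The update-and-query loop described above can be sketched as follows. This is a minimal illustration, not the thesis implementation: the four-symbol alphabet, two-button channel, and confusion matrix are hypothetical, and the brute-force search over assignments stands in for whatever optimization a deployed system would use.

```python
import numpy as np
from itertools import product

def bayes_update(prior, channel, assign, observed):
    """Recursive Bayesian update of the belief over task symbols.

    prior    : P(target = k) for each of K task symbols
    channel  : M x M confusion matrix, channel[i, j] = P(classified j | produced i)
    assign   : length-K array mapping each task symbol to a user symbol
    observed : index of the user symbol the classifier reported
    """
    likelihood = channel[assign, observed]      # P(observed | target = k)
    post = prior * likelihood
    return post / post.sum()

def mutual_information(prior, channel, assign):
    """I(classifier output; target) for a single query under `assign`."""
    p_obs = prior @ channel[assign]             # marginal over classifier outputs
    mi = 0.0
    for k, pk in enumerate(prior):
        for j, pj in enumerate(p_obs):
            p_joint = pk * channel[assign[k], j]
            if p_joint > 0:
                mi += p_joint * np.log2(p_joint / (pk * pj))
    return mi

def mmi_query(prior, channel, n_user_symbols):
    """MMI coding by brute force: pick the task-to-user-symbol assignment
    that maximizes the mutual information of the next query."""
    best_assign, best_mi = None, -1.0
    for assign in product(range(n_user_symbols), repeat=len(prior)):
        assign = np.array(assign)
        mi = mutual_information(prior, channel, assign)
        if mi > best_mi:
            best_assign, best_mi = assign, mi
    return best_assign, best_mi
```

With four equiprobable task symbols and two noisy buttons, the MMI query splits the alphabet across both buttons; observing a press then shifts belief toward the symbols assigned to the reported button, and the loop repeats with the updated belief as the new prior.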
We present two instantiations of this framework. The first, ‘Shuffle Speller’, is a letter-by-letter keyboard that offers a robust communication channel accessible by eye gaze or Steady State Visually Evoked Potential (SSVEP) responses. We extend this framework into ‘Web Speller’, which offers multi-character querying, similar to word completion. In a simulated typing task, multi-character querying improves the communication rate, measured in bits, by 30% over character-by-character querying. Furthermore, this scheme is shown to approach the information-theoretic capacity of the discrete memoryless human input model. Web Speller accepts any 2D cursor movement, such as eye gaze or imagined movement via brain-implanted electrodes, to allow severely paralyzed people to express themselves quickly and reliably.
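The capacity benchmark for a discrete memoryless input model can be computed with the standard Blahut-Arimoto algorithm, sketched below. The two-button confusion matrix in the usage note is a made-up example, not one measured from users.

```python
import numpy as np

def channel_capacity(channel, iters=500):
    """Blahut-Arimoto estimate of the capacity (bits per use) of a
    discrete memoryless channel with row-stochastic matrix `channel`."""
    channel = np.asarray(channel, dtype=float)
    p = np.full(channel.shape[0], 1.0 / channel.shape[0])  # input distribution
    for _ in range(iters):
        q = np.maximum(p @ channel, 1e-300)                # output marginal
        ratio = np.where(channel > 0, channel / q, 1.0)
        d = np.sum(channel * np.log2(ratio), axis=1)       # D(W_k || q) per input
        p = p * np.exp2(d)                                 # reweight inputs
        p /= p.sum()
    q = np.maximum(p @ channel, 1e-300)
    ratio = np.where(channel > 0, channel / q, 1.0)
    return float(p @ np.sum(channel * np.log2(ratio), axis=1))
```

For instance, a binary symmetric channel with a 10% classification error rate, `[[0.9, 0.1], [0.1, 0.9]]`, has capacity 1 − H(0.1) ≈ 0.531 bits per press, which a querying scheme can at best approach but never exceed.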
It is our hope that this work plays some small role in helping someone.
Prof. Deniz Erdogmus (Advisor)
Prof. Steven Bedrick
Prof. Edmund Yeh