Let me tell you about my Honours Project. I’ve been working on it for six months now, and a paper I wrote about it has just been accepted for the ISAAC 2012 conference in Pittsburgh, so it seems silly that I haven’t written a blog post about it yet.
In my final year of university, I find myself working on an honours project that is rewarding academically, but also personally, through the enormous sense of satisfaction of helping people with Severe Speech and Physical Impairment tell stories where doing so has been frustratingly difficult in the past.
My project is an AAC project. That stands for Augmentative and Alternative Communication. AAC devices help those with Severe Speech and Physical Impairment (SSPI) communicate. People with SSPI vary greatly: one user may have permanently or temporarily non-functional speech alone, while others have far more severe physical impairments. They may be confined to a wheelchair for much of their lives, or simply have difficulties with motor control that make it hard to operate a mouse or keyboard efficiently.
A recent example of someone who found themselves without functional speech would be the British singer Adele who, after surgery, was ordered by doctors to rest her voice; she downloaded an app on her phone to speak for her. However, most regular AAC users are more permanently affected by non-functional speech and often possess severe disabilities that hinder their use of a computer.
Adele apparently tried several text-to-speech apps for her phone and finally settled on one that would allow her to swear. This is actually a belief we share at Dundee University: AAC devices are a user’s voice, and under no circumstances should we censor them, even if they are designed for children. You wouldn’t teach a child not to swear by cutting into their brain and removing the ability to do so. Perhaps in some kind of cross between Orwell and Asimov, this might make an interesting story but unfortunately, many AAC devices do indeed censor their users.
Probably the most famous user of AAC equipment is Professor Stephen Hawking. Hawking is a perfect example of someone with Severe Speech and Physical Impairment. With very little motor function, Hawking uses a computer attached to his wheelchair, which he operates with a single switch that he presses using his cheek.
As so many AAC users fall into this category of Severe Speech and Physical Impairment (SSPI), interface design becomes a huge factor: users with SSPI often require interfaces designed specifically for how they use a computer. For instance, users who, like Hawking, use a switch to select items on a screen will usually require what we call a scanning interface. Here the system highlights each selectable element of the screen in turn for a moment before moving on to the next; after waiting for the desired item to become highlighted, the user hits their switch to select it. As you can imagine, this can be slow and tedious; however, it is of course more accessible than not being able to use the interface at all. As such, systems need to be designed to respect how the user will be using their device. Other users may not need a switch interface and, while they cannot necessarily operate several small buttons, can in fact use a touch screen with large, easily understood buttons to choose from.
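For the curious, the core of that scanning behaviour can be sketched in a few lines of Python. This is a minimal illustration of the general technique, not code from Chronicles or any real AAC product; the names (`Scanner`, `advance`, `select`) and the timer-driven loop are my own assumptions.

```python
# Minimal sketch of single-switch scanning: a timer calls advance() at a
# fixed interval to move the highlight, and the switch press calls select().

class Scanner:
    def __init__(self, items):
        self.items = items   # the selectable elements on screen
        self.index = 0       # index of the currently highlighted element

    def advance(self):
        """Timer tick: move the highlight to the next element, wrapping around."""
        self.index = (self.index + 1) % len(self.items)

    def current(self):
        """The element currently highlighted on screen."""
        return self.items[self.index]

    def select(self):
        """Switch press: select whatever is highlighted right now."""
        return self.items[self.index]

scanner = Scanner(["Yes", "No", "Tell a story", "Help"])
scanner.advance()
scanner.advance()
print(scanner.select())  # → "Tell a story"
```

In a real system the scan interval would be tuned per user, since pressing the switch at the right moment is itself a motor challenge.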
Another factor to remember is that disabilities are in most cases combined with other disabilities. For instance, someone with SSPI is very likely to also have learning difficulties such as dyslexia. One issue that arises regularly in the AAC user community is illiteracy. As such, basic text-to-speech is often not the best method for an AAC device, because if the user cannot read or write, they cannot type text to be spoken.
Chronicles – My Honours Project
For my honours project, I’ve joined the AAC Research Team at Dundee University, where I’m helping to develop a new project called Chronicles, a narrative telling system for adults with SSPI.
While there has been lots of research and development (R&D) in AAC systems, most of it has been aimed at transactional communication. When we say transactional, we are referring to communication that expresses needs; this could be as simple as “I want coffee” or “I need the toilet.” There are far fewer narrative systems. “Narrative” is an AAC term that really just means stories. While this may seem like a simple matter, storytelling is actually a major part of communication, and you may not realise just how many stories you tell every day.
When you speak to your friends, you may tell them about what you did last night. You may tell your mother about a hard day at work, or you may tell people about a vacation you went on. With close friends and family, people with SSPI are usually more comfortable telling stories, possibly because those who are close to them have less difficulty understanding gestures or the broken speech that some people with SSPI are capable of. However, when meeting a new person, these stories are more difficult to tell, and so users come to rely on AAC devices much more.
However, the other main part of communication is conversation. When you tell a story, you do not just tell the whole thing at once and take questions at the end; you say a sentence and then stop while the person you are telling it to asks a question.
I had a nasty experience at the shops the other day.
Oh yes? What happened?
There was a robbery at the store!
Really?! Tell me more!
When we tell stories, interruptions and questions are a major part of the narrative, so a storytelling system has to allow the story to be told in individual utterances, leaving room for questions to be asked. This also allows the user to skip parts or tell the story in a different order to how it was written, so the story can be told differently each time to different people.
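The idea of splitting a story into separately speakable utterances can be sketched as a simple list. This is a hypothetical model of my own for illustration, not the actual Chronicles data structure; the utterances are borrowed from the example conversation above, with one invented continuation.

```python
# A story modelled as individual utterances rather than one block of text:
# the teller can speak them in any order, skip some, and pause for
# questions between each one.

story = [
    "I had a nasty experience at the shops the other day.",
    "There was a robbery at the store!",
    "Luckily nobody was hurt.",  # illustrative extra utterance
]

def tell(story, order):
    """Return the chosen utterances in the chosen order, one per turn."""
    return [story[i] for i in order]

# One telling might skip the middle; another might use every utterance.
print(tell(story, [0, 2]))
print(tell(story, [0, 1, 2]))
```

Contrast this with a single block of text: there, the only options are to play the whole story or none of it.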
With most AAC systems, the user can only enter a narrative as one long block of text that can’t be interrupted or have its sequence changed to let the user respond to questions. As such, users with SSPI often don’t tell stories with AAC devices at all, choosing instead to have a communication partner tell the story for them. This may be a close friend, family member, or support worker. However, when the user wishes to tell a story and their communication partner is not present, or the partner with them does not know the story, many simply choose not to tell it at all.
Chronicles aims to change this by letting the user generate their stories and store them in a system that allows them to be told naturally as part of a conversation, and kept with them on their own AAC device wherever they go. This extends work already done at Dundee University by AAC Research Team members Rolf Black and Annalu Waller on a system called “How Was School Today…?”, which generated stories for children based on what they had been doing in school, allowing them to tell those stories to their parents and friends.
We hope to bring similar functionality found in “How Was School Today…?” to Chronicles, such as Natural Language Generation (turning data into text) and text-to-speech. However, we plan to extend these proven applications of narrative telling to a device that can document a user’s whole life. And thus we arrive at the part I’m working on: how do you design an interface for those with Severe Speech and Physical Impairment that allows the user to easily find one story from what could be hundreds?
Retrieving One Narrative From Hundreds
The work that I’m doing on Chronicles is to investigate how best to store the narratives we’re collecting and generating in a database on the user’s system and, most importantly, how the user interface should be designed so that a person with severe physical impairment, learning difficulties, and possibly illiteracy can easily retrieve a single narrative from what could feasibly be a very large number.
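One way to think about that retrieval problem: since typed text search is out of reach for a user who cannot read or write, stories can instead be tagged with metadata (when it happened, who was involved, where) and narrowed down by tapping large symbol buttons. The sketch below is my own illustration of that idea; the field names and filter function are assumptions, not the Chronicles schema.

```python
# Narrowing hundreds of narratives without typing: each narrative carries
# metadata, and each tap on a large symbol button applies one more filter.

from datetime import date

narratives = [
    {"title": "Robbery at the shops", "when": date(2012, 3, 2),
     "people": {"shopkeeper"}, "place": "shops"},
    {"title": "Trip to the beach", "when": date(2012, 2, 14),
     "people": {"Mum"}, "place": "beach"},
]

def filter_by(narratives, place=None, person=None):
    """Apply whichever filters the user has selected so far."""
    results = narratives
    if place:
        results = [n for n in results if n["place"] == place]
    if person:
        results = [n for n in results if person in n["people"]]
    return results

print([n["title"] for n in filter_by(narratives, place="shops")])
# → ['Robbery at the shops']
```

Each filter shrinks the candidate list, so even a large story collection can be reduced to a handful of choices in a few selections.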
This has been my challenge for the last six months and, through evaluating my initial prototype designs, it is the challenge that will continue for the final few months of my time at university.
One of the most strongly held principles at Dundee’s School of Computing is User Centred Design. For many, this simply means designing a system with the user in mind; for us, it means that the user group you are designing for is involved in the actual design process. For the AAC Research Group at Dundee University, that means real AAC users with SSPI coming into the lab on a regular basis to give insight into the systems we are designing. This may seem entirely obvious to you as a reader, but the sad truth (tragic, even) is that a great deal of AAC development is done by researchers who have never spoken to a real AAC user in their lives!
At the start of this project, my supervisor and I had several assumptions about what kind of system Chronicles would be and, more to the point, how its interface would look and be used. Through speaking to real AAC users, however, we saw just how invaluable interaction with our user group is. It is such an important factor that my supervisor, Dr. Suzanne Prior, wrote her PhD on the subject of user involvement in the development process of AAC systems.
We believed that a category system would be much more useful for finding stories than a timeline interface that showed stories in a line. In fact, we were so sure of this belief that such a timeline system would likely never have been developed at all. However, it turned out that when discussing their progress, members of our user group are often asked to plot events in their lives on a timeline drawn as a long winding road. This visual aid made it much easier for them to picture events from their past, and so Chronicles’ Timemap was born.
Notice that I haven’t just come up with an interface based on what we believed would be a good idea. If I had, the system would look extremely different to what it does, and after receiving poor results from evaluation, a prototype would likely have needed to be scrapped and a new one made. By involving users in the requirements gathering process right from the start, time is not wasted building systems that just aren’t suitable.
Evaluation of the initial prototypes, designed with the expert knowledge of real AAC users, has shown that the system is being received well. There are elements that, through watching members of our user group interact with them, I now know need changes: elements I would not have thought of if I hadn’t involved users in the testing process.
I’m now making adjustments based on some really useful feedback, and in a few weeks’ time I will be giving each member of the AAC user group I’ve been working with a copy of the software to take home and test in a longitudinal study. With the search engine I’m currently implementing in the system, I’m looking forward to hearing how my efforts have been received. Even negative feedback is positive to the AAC Research Group, as it teaches us more and more about how to design systems for users with Severe Speech and Physical Impairment.