Thoughts on symbol support and picture support

People with certain kinds of disabilities often need more than words in order to be able to communicate. One thing that can be helpful is the use of symbols or pictures.

Using symbols can expand and support someone’s expressive vocabulary. (For instance, picture symbols on a communication device can enable someone to use words they couldn’t use by typing or speaking.)

Symbols can also expand and support someone’s receptive vocabulary. For instance, symbols can be used to illustrate materials, or to explain something to someone. They can also be used in various ways in things like PowerPoint presentations.

Symbol support can do a lot of other things that make communication more possible for people with a wide range of disabilities. It’s not just about literacy; literacy-related things are just the easiest to explain.

Something I’ve been realizing matters is that everyone who uses symbols to communicate is a symbol support user. Even people who normally communicate in words; even people who only use symbols to communicate when they are talking to people with disabilities or listening to people with disabilities.

It’s important to remember that communication in symbols is happening on both sides of the interaction.

If someone is communicating with you by showing you symbols, then you are using symbols for receptive communication.

If you are using symbols to explain something to someone, then you are using symbols for expressive communication.

It’s important to keep this in mind.

If you’re using symbols, the symbols are part of the communication. Even if every symbol is attached to one word and only one word. The symbols don’t just tell people what the words are. They also have content, and it’s important to pay attention to what you’re saying with the symbols. They might not mean the same thing to the person you’re talking to that they mean to you. Particularly if they understand picture-concepts more readily than they understand word-concepts.

For example:

Sometimes people might select symbols on communication devices based on what the symbols mean rather than what the words associated with them mean:

  • If someone is putting together phrases that don’t make obvious sense to you, they might mean something by it
  • It might *not* be stimming, random exploration, or that kind of thing
  • It might be intentional communication based on what the pictures mean to them
  • I think it is important to take that possibility seriously (even for someone who also speaks, or also uses words)
  • And *especially* important to take seriously if they’re indicating with body language that they want you to look at the screen
  • (This is also true if someone is using PECS symbols in a way that doesn’t appear to make literal sense. It might be because the pictures mean something different to them than they mean to you)


  • If you’re using symbols to explain something to someone who needs symbols, the symbols matter
  • It’s not always enough to just pick words, then pick symbols that go with those words one-by-one
  • The content of the symbols can matter beyond literal word-by-word meaning
  • The way the symbols combine can also matter. (ie: the fact that a sentence makes sense in words and each symbol corresponds well with a word does *not* necessarily mean that the symbol-sentence makes sense)
  • The symbols also might not mean the same thing to the person you’re communicating with that they mean to you
  • If someone finds symbols easier to understand than words, they may derive more meaning from the symbols and your tone of voice and body language than they do from the words themselves
  • It’s important to pay attention to what you’re communicating with the symbols you choose as well as the words that you choose

Some considerations for symbol use:

  • Consistency between symbols matters. Symbols combine in ways that make more sense when there’s an underlying logic to the symbol system.
  • Symbols should not be childish or cutesy, even for young children.
  • Because nobody, not even young children, wants to be forced to communicate in cute ways.
  • And some really important topics (eg: abuse, boundaries, sexuality) are decidedly un-cute. People with disabilities need and deserve respectful communication about things that aren’t cute or shiny-happy.
  • Symbols should be comprehensible at a variety of sizes. (Eg: overly complex symbols don’t work well for small buttons on a communication device).
  • Symbols should be respectful, especially when they are symbols of people doing or thinking or being things (eg: protestors should look powerful rather than cute; adults should look like adults; symbols for “choice” should either be abstract or be age-neutral)
  • Symbols should be accurate. (eg: the symbol for anger should not be a smiling person; the symbol for diabetes should not be the same as the symbol for “no sugar”; wheelchair users should have the kind of wheelchairs that individuals own rather than hospital wheelchairs; the symbol for intellectual disability should not be the same as the symbol for the Special Olympics)
  • In all of these ways and other ways I’m not sure how to explain yet, I think that SymbolStix is the best existing symbol set.

tl;dr Symbols can be really helpful for supporting communication and comprehension. If you’re using symbols to help someone else communicate or understand, it’s important to keep in mind that the symbols and the words both matter. Pay attention to what you’re communicating in symbols and what they’re communicating in symbols. Sometimes there are things going on beyond the literal meanings of the words that someone decided to associate with the symbols.

AAC does not replace nonverbal communication

This is a continuation of a series on why I think it’s a mistake to ignore nonverbal communication in an attempt to force someone to use AAC. (The short version: it’s disrespectful, it undermines someone’s ability to communicate, and it prevents people from developing a valuable skill.)

One reason nonverbal communication is important for AAC users is that you always have your body with you. That is not necessarily the case for AAC devices.

AAC best practices say that an AAC user should have their device available constantly. In practice, people don’t. This happens for several reasons. One is that it’s not practical to take a device to some places (for instance, most people are not willing to take a high-tech device to the beach, and low-tech devices are a lot more limiting). Another reason is that sometimes people forget, or vastly underestimate how close a device needs to be in order to be immediately available. Or any number of other reasons, some innocent and some horrifying, and many a mixture of both.

Also, people take devices away from AAC users. They shouldn’t, but they do. Sometimes it’s accidental; sometimes it’s on purpose. It’s never ok, but people do it a lot. If you’re teaching a nonverbal child to communicate, you need to keep this in mind when you’re considering what to teach them. You can’t assume that people will always treat them appropriately, and you can’t assume that they will always have their device. If they are capable of communicating with their body, it is an important skill for them.

Whatever else happens, someone always has their body with them. People can do a lot more if they can use their body to communicate. Communicating in body language can make it possible to communicate in a swimming pool. It can make it possible to communicate with dirty hands. It can make it possible for someone to indicate that their device isn’t within reach and that they need it. It can make it possible to communicate about pain in medical situations. It can make it possible to communicate when someone else doesn’t want you to, and has taken your device away. It can make friendship possible that otherwise wouldn’t be. And any number of other things, all of which are important.

And in order to be able to communicate with body language, people need opportunities to practice and develop this skill. If you ignore someone’s nonverbal communication to encourage AAC use, you’re making it harder for them to develop comprehensible body language. That’s not a good idea, because comprehensible body language is important. People won’t always have access to their device. They will always have their body.

tl;dr Nonverbal communication is important for nonverbal people, but parents are often encouraged to pretend not to understand it in order to encourage AAC use. This makes it harder for people to develop body language that others can understand. One reason this is a problem is that people don’t always have access to their devices, but people *do* always have access to their bodies. Nonverbal people should have support in developing nonverbal communication, because it is an important skill.

AAC is not a cure

This is a continuation of a series on why I think it’s important to listen to the nonverbal communication of nonverbal people. Often, parents are encouraged to not listen or to pretend not to understand, so that kids will be forced to learn AAC and use words. I think this is a mistake, for any number of reasons. The first post focused on the general importance of listening.

Another problem with this advice is that ignoring nonverbal communication discourages people from developing their nonverbal communication skills. That’s a bad idea, because nonverbal communication is a very useful skill for nonverbal people. It should be encouraged, not discouraged.

It’s valuable for several different reasons (and I assume, for many reasons I don’t know about.)

One is that AAC is not a cure, and it doesn’t make nonspeaking people just like people who can talk. Nonverbal people who have communication devices are still nonverbal. Currently existing AAC devices can’t do everything that speech can do. For instance:

  • AAC devices mostly can’t do tone. Voices usually can.
  • AAC devices can’t go everywhere. Voices usually can.
  • AAC devices can be taken away much, much more easily than voices can.
  • AAC is usually slow. That makes interrupting hard-to-impossible. Voices can usually be used to interrupt.
  • AAC is usually fairly quiet. Voices can usually yell.
  • Symbol-based devices generally don’t have anywhere close to sufficient vocabulary for emotional or physical intimacy. Voices do.
  • Many AAC devices give others a lot of control over what someone can say. Voices are usually more flexible.

For a lot of these things, body language and movement can be a more effective way of communicating than using a speech device. For instance, putting up a hand to say “stop!” is a lot more likely to be understood quickly than using an AAC device to say the same thing.

Similarly, most symbol sets that touch on sexuality at all assume the main reason people need sexual vocabulary is to be able to report abuse. Most of them don’t have robust symbols for discussing sexuality and sexual desire — and most of them don’t have any symbols for emotional intimacy at all. Body language can communicate things that a system designed this way can’t.

Another reason AAC is not like speech is that people who are nonspeaking are nonspeaking for reasons. And AAC does not make those reasons go away.

Some people are nonspeaking because words are unnatural, painful, and cognitively draining. People like that deserve to be able to communicate in ways that are natural and comfortable. And it’s important for people close to them to listen to their natural communication. Ignoring someone’s most natural communication is a rejection of their personhood. It’s important not to do that to people.

It’s also dangerous, because someone who finds AAC cognitively difficult and draining is likely not going to be able to use it all the time. For some people, this can be especially true when it’s particularly important to communicate, or when they’re sick. If you’re responsible for someone and you only know how to listen when they use AAC, that’s dangerous. If there’s another way they communicate, it’s important to develop your ability to understand it. (Or, if you can’t, to find someone who can.)

Similarly, if someone has apraxia or other difficulties controlling their body well enough to point, their physical ability to use AAC is likely to vary. And it’s still important to listen to them when they aren’t able to use it in the ways they sometimes can.

tl;dr Access to AAC is important. It’s not the only thing that’s important, and it’s not a cure. Nonverbal people who use AAC are still nonverbal. Body language and using one’s body to communicate are also important skills. (Not everyone can learn to do this. For people who can, it’s valuable.) It is not a good idea to discourage AAC users from using body language to communicate.

In defense of nonverbal communication

Lately, I’ve been seeing a lot of posts giving parents of nonverbal kids the advice “pretend not to understand your child so that they will be forced to use AAC and communicate in words”.

I think this is a mistake.

I think that if you want to teach someone to communicate, it has to be built on a foundation of listening to them. And that means listening to all of their communication, not just communication that happens in words.

I also think that all of someone’s communication methods are important, and that they all need to be respected. There isn’t one true method of communication. They all matter.

Communicating through body language is useful for all people. People who can talk are allowed to communicate through body language, and actively encouraged to develop the skill of doing so. It’s expected that when I smile, point to things, frown, or whatever, people will listen to what I’m communicating. Nonspeaking people deserve the same respect.

People say “communication shouldn’t wait for speech”. I agree with that. And I think it shouldn’t wait for words either, because words may never come. If you wait until someone can reliably use words before listening to them, you may end up never listening to them. And everyone deserves to be heard.

And even if they will eventually use words and sentences, the things they’re saying *now* still matter. And listening to them is still important.

Presuming competence shouldn’t mean assuming that with the right support, people will eventually base most of their communication on words. Presuming competence should mean assuming that, with the right support, people will choose the means of communication that work best for them. Which may be speech. Or a voice output communication device. Or sign. Or body language. Or pointing to a letter board. Or any number of other things. Or any number of combinations of things.

tl;dr Everyone deserves to be listened to. If you want to support someone in learning to communicate, it has to be built on a foundation of listening to them — in whatever form their communication takes. Ignoring one form of communication to force them to learn a different form is not respectful, and probably won’t help.

youneedacat:

This is Proloquo4Text, an AAC program for iOS devices. It does only text, unlike Proloquo2Go, and stays on a single page. It has word prediction, sentence prediction, and a side panel of pre-written words and sentences.

It’s cheaper than Proloquo2Go. For the moment, it’s US $64.99, €59.99, AU $69.99, and £44.99. Starting December 31st, it will be twice that amount. This is similar to the discount offered to Proloquo2Go users in the beginning (which is when I got my copy of Proloquo2Go).

You might be interested in this if you need AAC with good word prediction and high-quality voices, but don’t want to pay for picture symbols you’re never going to use.

The colors and fonts are customizable. It doesn’t have to have a brown background or Avenir fonts; that’s what I did to it.

I’m using it for my word-based communication and Proloquo2Go for my picture-based communication, because it has always been hard for me to switch between picture and word modes on Proloquo2Go. (A cognitive problem, not a fault of the program.)



Social skills for autonomous people: Acknowledging power


When you have power over someone, it’s important to acknowledge it. If you don’t acknowledge that you have power, it’s hard to examine your use of it. If you’re not paying attention to how you’re using your power, you will come to abuse it, and you won’t notice.

Sometimes, when people are…

This post made me think, I call my client my boss sometimes (I’m a PA to a profoundly disabled woman). It’s not totally a joke because I do think she should be in charge and her parents encourage me to do what she seems to like. But I know that most people wouldn’t call her my boss so it’s kind of cutesy for me to call her that and also…she can’t directly fire me or even tell me what to do a lot of the time because she can’t talk, write, or use AAC consistently.

I don’t like terms like “caregiver” because that doesn’t include the idea that I should be helping her do what she wants (not just “taking care of her” like you would say about a baby). But this post made me think that calling her my boss is a little much and maybe a little insulting. Not just because it implies things she can’t do but also because it sounds like a joke and makes a joke of the idea that her preferences are important. (It’s better to just say I am her PA/aide/assistant which is a more normal term, but also implies what I want to imply.)

That’s interesting. I get the sense that there’s a lot more to be said and thought about there, but I don’t know enough to say it.

If anyone who does wants to weigh in, that would be most appreciated.

Proloquo2go now has a bunch of new features.


I don’t know if the new version will be crashier or less crashy than the last version. But it’s got some really cool new features.

Particularly, certain voices have had the voice actors record a number of new words, some of which include swearing. When you type the words (or sounds, like #aargh3 or something, or certain words with exclamation points after them), they either make certain sounds, or else say the word with feeling. The best voice actor I’ve heard so far is the Lisa voice, which is female adult Australian English. So I’m using that one, because it has all the swear words available. And yes, it will now say things like Fuck off! And, Piss off! And, Arsehole! With actual feeling. Which I love. Some of the voice actors are better than others. Some of them will say angry things while simply sounding vaguely, generally emotional.

You can get a list of all the voice effects by creating a button, then at the side of the first line, there’s a little speech bubble you click on. Then you can listen to samples of all of them, or add them in. There are also macros, so that you can get it to say things like the current date.

I’m really going to enjoy this. Time to reprogram my swear words page, for instance, to include a lot of the more emotional renderings of the words. I’m glad they finally recognized that AAC users swear, although there’s a lot of variation between the voices in what is available. Some will only say damn while others will say fuck, arsehole, bloody hell, piss off, etc. And I love this acknowledgement that AAC users actually need to be able to cuss, when so many people are hell-bent on making sure we can be nothing but polite and passive.

Example of assuming we're not listening: TalkingTiles

There is an AAC app called TalkingTiles that takes an interesting approach to managing communication pages. It offers cloud-based subscriptions and a library of available tiles, and can sync across multiple devices. It’s also possible to edit communication pages on a computer rather than on the iPad or whatever other mobile device it’s running on.

That’s good. What’s bad is that they think of the users of this app as the parents and therapists working with people who need communication software. They don’t seem to consider the people who actually use the app for expressive communication to be users; the app is something that is used on them rather than by them.

They describe it thusly:

TalkingTILES has helped those inflicted with a communication disability brought on by a neurological disorder (such as Autism, Cerebral Palsey, ALS or Parkinson’s) as with those who have suffered a trauma (such stroke or brain injury). We’ve taken a collaborative approach to AAC bringing together professionals (therapists, educators and support workers) and their clients,caregivers and families, enabling remote programming and remote content sharing across each other’s devices making AAC therapy more productive and efficient for both client and professional. 

Do you see what’s missing here? The end user, the person who will actually be using the device to communicate, is not addressed directly.

And it’s much more explicit in the user accounts. You have to have a cloud-based subscription to make the software do much, and there are two kinds of accounts:

TalkingTILES for Professionals & Caregivers:

  • Professional AAC Support Teams - therapists, educators & support workers - your Mozzaz Cloud Subscription is FREE. We want you to learn, try and trust TalkingTILES as an effective AAC solution for your clients.
  • Caregivers & Families - we understand the challenges and commitments that come with supporting and helping your loved one with a communication disorder. TalkingTILES offers the most flexible and adaptable AAC app that can be tailored to your personal needs at a very affordable price. We make AAC therapy productive and efficient through our collaborative model with professionals on your team. A Mozzaz Cloud Subscription plan will save you time, money and be more efficient with your AAC goals and programming.

There isn’t an option to register an account for yourself, as the person who will actually be using the app to communicate. There should be. People who use software to communicate have agency, and sometimes look for software options for themselves. Product descriptions, marketing, and design ought to take this into account.

When you’re designing adaptive equipment or software for people with disabilities, please remember that people with disabilities use it, and address them when you describe your product.