I’m delighted to publish this article by Michael Crossland, an optometrist who has a flair for making eye-related matters not only comprehensible to the layperson but hugely fascinating and enjoyable. I first met Michael on a creative writing course, although from his very first story, which had the rest of us transfixed, I wondered what on earth he was doing there. In this article he discusses our use of vision-related expressions.
The English language is full of visual metaphors. We say "I see" when we mean "I understand," companies use the word "vision" instead of "aims", and Tony Blair used to answer interviewers' questions with the word "look," when what he actually meant was "listen (to me)". Where does all of this metaphorical seeing leave blind people?
I asked Kelly Carver, a friend of mine who has severe vision impairment, whether he takes offence at my clumsy use of these phrases — I am forever suggesting books for him to read and films for him to watch, although I know perfectly well that for him this involves listening to an audiobook and using audio description to follow the action on a cinema screen.
Kelly tells me that he doesn't mind me using these terms but that as his vision has got worse he has modulated his own language — he's now more likely to say "It's good to be with you" than "It's good to see you". As well as sparing his companion any second-guessing ("Hmmm, but can he actually see me?"), it's a much more meaningful phrase, and one I might use more in my own daily life. After all, when I have lunch with Terry, the editor of this blog, I take far more enjoyment from our conversation than I do from looking at his face (sorry, Terry).
Kelly accesses most text with his ears rather than his eyes. He uses VoiceOver on his Mac to listen to his emails, and dictates his replies. Other than one memorable occasion when it took me a while to realise that part of his message was directed at his dog rather than me ("Hi Michael, Great to get down, I'll take you out in a few minutes, hear from you as always") I don't really think about our email correspondence being based on speaking and listening rather than seeing and typing.
Kelly doesn’t only dictate short items. He is also using speech to write his first book – an extremely funny and moving memoir about buying a condemned fraternity house in Minneapolis and converting it into a guest house, despite being legally blind and, more significantly, initially not telling his wife what he was doing. Unlike James Joyce, Frederick Delius and William Blake, Kelly doesn’t need to employ a human amanuensis to transcribe his thoughts — his computer does all of this for him.
Like most people with vision impairment, Kelly has never learnt to read Braille. Kelly has an inherited retinal disease which didn’t affect his vision until he was at university, so he had good eyesight as a child. Braille is notoriously difficult to learn in later life, although in my work in an eye clinic I have met several older adults who have managed to pick up some Braille skills.
Braille books are expensive to produce and very cumbersome (the Braille version of the New York Times looks a bit like an old London telephone directory), but modern Braille displays have solved many of these problems. A Braille display is about the size of a box of breadsticks and connects by Bluetooth to a phone or computer. It has an array of small pins which rise and fall to create Braille characters. The user slides their fingers across the display, then taps a key to bring up the next line of text.
It is quite entrancing to watch the fingers of someone reading Braille as they glide across a line of text. Braille users sometimes hear uninvited comments about how amazing it is to see someone reading in this way. "I don’t understand what people find so remarkable about watching me read," one Braille reader told me recently. "Surely it’s just as incredible that you can move your eyes across some ink marks on a piece of paper and decode this into a voice which you hear in your head?"
It is fantastic that almost every book in print is available in an electronic format and that most adults in high-income countries carry a device in their pocket with the computing power to offer high quality text-to-speech output, but there are still some important barriers to electronic reading. Not everyone has the technology skills to set up their devices to do this (although many sight loss charities provide technology lessons and support for people with vision impairment); not everyone can afford a smartphone with enough data to download books; and not everyone enjoys reading in this way. Computerised speech synthesisers have got much better recently, so it no longer sounds as though a Dalek is reading your bedtime story, but many people prefer choosing books from the more limited list of titles read by a human actor.
I don't think I write differently when I am working on something that I will send to Kelly for him (or rather, his computer) to read. I am sure that I still use visual metaphors as much as anyone else in our vision-centric society. But the next time I see Kelly (yes, I know), I will make an effort to say "it's nice to be with you."
Michael Crossland is an optometrist, writer and expert in vision impairment. He works at University College London and Moorfields Eye Hospital. He is currently working on a nonfiction book called ‘Tales of Blindness’ for UCL Press. He has recently switched from Twitter to Mastodon, where he can be found @mikecrossland@urbanists.social
Great essay, Michael. Do you think terms like "the blind leading the blind" are problematic, and if so, what can we do about them?
Such a fascinating post - thank you, Terry and Michael! I don't think I'd ever considered the minefield surrounding sight-related language - I'm grateful that you've got me thinking about it. Great post.