Yesterday I was signposted to news about Google’s latest developments for people who are D/deaf or have hearing loss. The most interesting one is Live Transcribe – an app which uses your phone’s microphone to transcribe speech in real time. It’s meant to facilitate clearer two-way communication. Curious to know how well it worked, I trialled it in a variety of situations; here’s what I found.
To compare the text properly I used the same two paragraphs from the news report I was originally signposted to; it looks like this:
I also made sure the language was set to English (Great Britain).
Situation 1 – a quiet office, no music, minimal background noise
My office is a fairly quiet space; the walls are thin, as they’re plasterboard partitions, but my office neighbours aren’t a particularly noisy bunch. In Live Transcribe there is a blue dot in the left-hand corner; if you tap it, it tells you the current loudness and background noise. In this first screen recording you can see the dot is pretty small.
Here’s what Live Transcribe recorded in a quiet situation:
Hmm…Maybe situation 2 would be better?
Situation 2 – quiet office, with music in the background played through a laptop at low-ish volume
I often play music whilst I work; it’s not very loud and acts as a bit of background noise, so I thought I’d try Live Transcribe in another normal, everyday situation. In this recording you can see the background noise indicator is larger – it’s clearly picking up the music.
Oh dear…let’s try outside.
Situation 3 – Outside the office which is next to a busy main road
The office sits on a main road into and out of a market town; it’s always busy – cars, lorries and buses are regularly trundling up and down it. To record this one I stood at the entrance to our car park, about 10ft from the road. It was lunchtime on a weekday – not rush hour, but the traffic was continual. Already you can see the blue dot is even bigger – the background noise is significant. Here’s what it recorded…
You may have noticed the last line of this is new. At this point, I was cold and confused by how weirdly wrong it had gone.
Is Live Transcribe worthwhile for those with hearing loss?
Although it’s worth mentioning that the app is in early access and only available on certain devices, it seems the fundamental issues with all apps like this still exist. Background noise is a persistent and omnipresent challenge for people with hearing loss, and it’s such a huge part of us being involved in conversations down the pub, in town, in a restaurant – anywhere there are other people or things. As this short experiment shows, transcription apps like Live Transcribe are susceptible to even small amounts of background noise, but that’s not the day-to-day reality of someone with hearing loss; we don’t sit in quiet, still surroundings waiting on a conversation.
The other issue is accents.
This is a tricky one: mapping every accent in the world is not viable (or is it? tech/app dev people, please advise?), which means the app will work on the generic language selected. The thing about selecting English (Great Britain) as the language option is that the default is never a Scottish, Welsh or Northern Irish accent. It’s always English. I don’t speak English the way the app expects, so the transcription is very hit and miss. It’s the same when I use the transcription service on my Mac. The result is that if I’m using this to converse with someone and they have a regional accent, it’s likely to go awry, and from there I can lose the sense of the conversation and what’s being said. Whilst I try to get back on track, the conversation has progressed and I’m playing catch-up again.
As someone with limited hearing who already struggles in social situations with hearing aids, I really wanted this to work, but it has a long way to go before it’s viable in what I’d class as normal situations.