An article in The Verge recently described the introduction of LG’s new neckbuds, which will have Google Translate and Google Assistant built right in. The idea behind this is that it will remove any need to say “Hey Google” to summon Assistant or Translate services, which feels like a step in the right direction.
Or it would, anyway, were it not for the need to wear hideous-looking neckbuds to take advantage of this slightly more streamlined process. Sam Byford of The Verge seems to agree, writing that "awkward neckbuds are probably a necessary step along the way" to something like the babel fish from The Hitchhiker's Guide to the Galaxy.
As the link above suggests, the babel fish is, at its core, a type of brain-computer interface. Though its functionality may differ from that of EMPATHY, I thought it might be interesting to explore how EMPATHY might attempt to provide translation services of its own given that it’s installed right in one’s mind.
Imminent Dawn, the first book in the EMPATHY series, features little mention of translation services. The only instance in which translation figures into EMPATHY canon is during one of the earliest EMPATHY evaluations, when the scientist leading the study, Wyatt Halman, asks a study participant to provide the German word for “badger.”
The patient in question is a native English speaker who also speaks French and some Hindi, but has no background in German. To find the German word for "badger," then, she must actively search for it through the EMPATHY interface's link to the internet, or what the series calls the cerenet.
In this way, the version of EMPATHY seen in Imminent Dawn provides translation services on par with typing something like "German word for badger" into one's search bar. It's not especially efficient, and certainly not managed through a proprietary software add-on of any kind à la Google Translate.
This might feel underwhelming. The EMPATHY series is sci-fi, right? Where’s the innovation?
Well, just as in real life, innovation in the EMPATHY universe comes to fruition as a matter of degree; the translation mechanism in Imminent Dawn is clunky, yes, but certainly less clunky than wearing neckbuds and having to push a button to invoke translation services every time they’re required.
In the later books in the series, particularly by the time we reach books three through five, the action on the page will increase the need for translation services. With EMPATHY (in theory) having had time to be developed more fully, those services might evolve to receive linguistic input and translate it on the fly into one's native language whenever that feature is engaged. This on its own is significantly more innovative than anything currently on the market in our world, especially when one considers that all of it happens directly in one's mind.
The ultimate in EMPATHY-based translation services, however, would be to have the nanochip so wired to the brain that it allows for the manipulation of one’s speech production mechanisms themselves. In this way, a single-language English speaker could go through the motions of producing speech in English, but EMPATHY could filter linguistic intent such that the phonetic output would be in the target language.
In other words (literally), someone who knows absolutely nothing about Vietnamese could then speak the language with a fluency that might otherwise take years of study and immersion to achieve.
Now that’s what I call innovative.
Whether EMPATHY will ultimately achieve this degree of innovation over the course of the five-book series remains to be seen, but readers worldwide will have the chance to see where the innovation begins with the release of EMPATHY: Imminent Dawn in January 2019.