ITL #576 The appreciation test: artificial communication in the light of evolution
The age of AI in organizational communication has already begun, but there is a test it may never pass. By Jens Seiffert-Brockmann.
As a science fiction fan, I have always been most impressed by artificial life forms: R2-D2 and C-3PO in Star Wars, David and Bishop in the Alien series and, most notably, Data, the android in Star Trek who above all things wants to become more human. The story of Data is particularly enlightening because of his many difficult encounters with human traits that evade his computing power: the appreciation of art, humor and, most of all, intimate and deep relationships. For all his processing prowess, the human mind remains a mystery to Data.
At the end of his seminal work “On the Origin of Species”, Darwin predicted a bright future for the study of the human mind. Over the past few decades, evolutionary psychologists have concluded that the mind must be a “set of information-processing machines that were designed by natural selection to solve adaptive problems faced by our hunter-gatherer ancestors”, as the evolutionary psychologist Leda Cosmides and her partner, the anthropologist John Tooby, wrote in 1997. If the mind evolved like every other part of us, that must include our ability to communicate as well. Hence, it seems plausible to ground the study of human communication in evolutionary thinking too.
Evolution provides us with insights into the workings of human communication. When evolutionary psychologists say that we possess stone age brains, they mean that the wiring on which we base our ability to communicate goes back far further than our university education, our kindergarten, or our parents and friends interacting with us. Evolution has ingrained in our minds communicative programs that proved useful to our ancestors in the past.
At the beginning of the 1990s, the British anthropologist Robin Dunbar and his team had an intriguing idea. Dunbar proposed that a major factor in the evolution of our mind was our social life and the need to navigate the social fabric of our communities. He predicted, based on the size of our neocortex, that the average human individual can establish and maintain about 150 meaningful social relationships – a number that numerous studies have since supported empirically, for example by counting the number of Christmas cards people write or the average number of Facebook friends they have.
Surely, some of us can entertain more relationships than that, while others manage fewer. The key message for strategic communication and public relations is straightforward: the trustful relationships that communicators can establish on behalf of an organization are limited. Even the best professionals can talk and engage only so much.
Upending the game
As is so often the case these days, artificial intelligence is in town to upend the game. With all the seemingly endless possibilities of AI, one does not need to be Nostradamus to predict that the technology will have a deep impact on the communication professions – it already has.
In 1950, the brilliant British mathematician Alan Turing devised a test he called the imitation game, but which has entered collective memory as the Turing test. Passing the test would mean that a machine is capable of passing as a human agent in a conversation with another human individual. But for a very long time, the age of human-like machines seemed far out of reach.
Artificial intelligence has racked up stunning achievements over the past few decades. The IBM computer Deep Blue beat the then chess world champion Garry Kasparov in a match in New York City in 1997. Google’s AI AlphaGo famously beat the Go master Fan Hui in 2015, and in 2019 the poker bot Pluribus proved that even the best human poker players were no longer a match for AI in no-limit Texas Hold’em. The list of achievements, growing first slowly, then ever more quickly, became too long to recount.
But we as humans could take solace in the belief that we would surely be able to detect a machine’s attempt to talk to us. As communicators, we might have acknowledged the calculating prowess of the machine, but most of us scoffed, and still scoff, at the notion that AI might soon do the communicating for organizations as well. And then, inevitably, a new generation of AI arrived on the scene: AI built upon large language models.
It didn’t take long for OpenAI’s ChatGPT or Google’s LaMDA to ace the Turing test – tearing down that last line of defense in 2022. With all the literature and knowledge in the world being fed into the AI – the poems of Omar Khayyam, the wit of Mark Twain, or the stories of Chimamanda Ngozi Adichie, to name just a few – why wouldn’t AI become as powerful a storytelling agent as a human one?
Conversations of strategic significance
Imagine an organizational AI that can have conversations of strategic significance not just with one stakeholder group, but with all of them. Are we entering a new age of communication, in which machines can do the relationship building for us? It is no longer a big leap to imagine a powerful AI executing the communication strategy of its human overlords and producing all of the content by itself.
Maybe the fully functioning society proposed by Robert Heath, brought about by AI, is closer than we imagine. But maybe it is not. AI has another test to pass, a test I would call the appreciation test.
Just as our minds are not equipped to comprehend the modern, digital world driven by AI, the same is true in reverse for AI. Even with all the human knowledge in the world, the machine almost certainly lacks an internal model of what lies at the core of human relationships: mutual appreciation and reciprocity. The machine doesn’t want anything from us; it takes what is offered and gives when it is asked. Passing as a human being in a fleeting conversation is one thing. Entering an interdependent relationship built on trust is something else entirely.
At the end of the day, our human nature might still get the better of us and make us yearn for some good old appreciation: the feeling that somebody is listening to what we have to say and acting on it. The age of AI in organizational communication is already upon us and there is no turning back. Big companies especially will be able to employ powerful AI to handle their day-to-day content creation, listening, and informing. But what the human element will look like is still up for debate. It might very well be that, after all those conversations with a ChatGPT-like AI, stakeholders will demand to speak to a “real” person, somebody who can not just express appreciation verbally but live it.
But then again, the virtual AI characters populating the metaverse might satisfy this need. Or we will have android-like creatures such as Data from Star Trek, for whom the human mind won’t be such an intricate riddle, but just another normal thing you encounter at work in the communications department.
The Author
Dr. Jens Seiffert-Brockmann
Univ.-Prof. Dr. Jens Seiffert-Brockmann, Head of the Department of Business Communication, Program Director of the Master in Business Communication, Associate Editor of the International Journal of Strategic Communication, WU - Vienna University of Economics and Business.