The humanities in a tech-rich world
I’ve been running a small experiment in recent weeks amongst friends and colleagues, simply by asking them what they think of a controversial article in the Washington Post by Elizabeth Dwoskin. The article describes how poets and playwrights are increasingly working in Silicon Valley’s tech firms to help make bots more ‘human’. The idea is to give personal assistants like Microsoft’s ‘Cortana’ and Apple’s ‘Siri’ back stories and teach them to speak natural language, so that their interaction with humans is more, well, ‘human’.
There has been an interesting difference of opinion amongst the very unscientific sample I’ve consulted – even, dare I say, a gendered divide. Some are excited about the future opportunities this might bring; others are concerned that if we teach bots to be human, we might no longer need humans. According to other sources, such as Fortune magazine, it is not only personal assistants who might soon be out of a job, but also pilots, teachers, lawyers, surgeons, reporters and financial analysts. I’ll let you guess how the reactions broke down by gender.
So is this a good time to be training for a white collar job in the ‘knowledge’ economy? Is it a good idea to go for a social science, humanities or ‘liberal arts’ degree, when so much knowledge can now be accessed automatically? You might think not. We were treated to a careers talk this week by Neville Crawley, an alumnus who now runs tech firm ‘Quid’ in Silicon Valley. Quid is a pretty amazing company, but also an illustration of the kind of initiative that might make human analytical capacity redundant.
In a nutshell, Quid uses automated and very fast searching of digital content to analyse and visualize the ‘world’s collective intelligence’. Through searching vast quantities of material in the blink of an eye, and building up a visual picture of similarities between narratives that emerge in different places, it can provide insight into the emergence of news stories, security threats, or market trends that would take a well-trained journalist, security analyst or marketing professional days or weeks to develop manually – and it can do this much more accurately.
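To give a flavour of the underlying idea – and this is a toy sketch of narrative similarity in general, not Quid’s actual method, with the headlines and the 0.3 threshold invented purely for illustration – one can measure how much two texts overlap and group those that tell a similar story:

```python
import math
from collections import Counter


def cosine_similarity(a: str, b: str) -> float:
    """Similarity between two texts using simple bag-of-words counts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    # Dot product over the words the two texts share
    dot = sum(va[w] * vb[w] for w in va.keys() & vb.keys())
    norm = math.sqrt(sum(c * c for c in va.values())) * math.sqrt(
        sum(c * c for c in vb.values())
    )
    return dot / norm if norm else 0.0


# Hypothetical headlines: pair up those whose wording overlaps
# beyond an (arbitrary) threshold.
headlines = [
    "central bank raises interest rates amid inflation fears",
    "interest rates rise again as inflation fears grow",
    "new griot album celebrates malian musical tradition",
]
pairs = [
    (i, j)
    for i in range(len(headlines))
    for j in range(i + 1, len(headlines))
    if cosine_similarity(headlines[i], headlines[j]) > 0.3
]
print(pairs)  # the two interest-rate stories cluster together: [(0, 1)]
```

Real systems work on vastly larger corpora with far more sophisticated representations of language, but the principle – quantifying the similarity between narratives and clustering them – is the same.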
But here’s the thing. Neville says that whilst Quid began as a company of software engineers, tackling the computing challenge of how to process natural language really quickly, more recently it has been looking for a much more diverse range of graduates as it brings its product to market. The computational power still has to be trained; more importantly, it is the use of the technology that Quid needs to understand to gain a return on its investment.
I’m tempted to go further. Consider how the vast computational power of internet search engines far less sophisticated than Quid has given most people the power to ‘check’ what they see and what they are told for accuracy, authenticity or reliability. Gone are the days when a lecturer could just lecture, expecting students to sit, listen and believe. Now, whether you like it or not, if you inadvertently get something wrong in a lecture, there will be at least a dozen smart students spotting the mistake on their smartphones and emailing you about it later. Best to encourage them to check – as a means of engaging with what you are saying and playing an active role in their own learning.
If that is true in a lecture, how much more so for the ‘real world’. Take the entertainment industry. However compelling a story, however great the acting, if a film or mini-series contains historical inaccuracies, scenes shot out of place, or factual errors, the endless review pages of the internet and social media can shoot your creation out of the sky before the professional reviewers have put pen to paper.
But there is an opportunity here too, as two recent SOAS successes show. One is Amazon Prime’s recent mini-series The Man in the High Castle, loosely based on the 1962 science fiction novel by Philip K. Dick. The book and the series imagine a post-war world in which Germany and Japan have won WWII and are in charge in North America. This might not seem fruitful ground for ‘historical accuracy’, and yet it is precisely advice on how to make the series more historically and culturally believable that was offered by SOAS Japan expert Griseldis Kirsch. Just because a story is untrue in a literal sense does not mean it can get away with being unbelievable. And our capacity to know whether something is believable has never been greater.
Another great series launches in the US this Memorial Day weekend – the long-awaited remake of the famous US mini-series Roots. Produced by Mark Wolper, the son of the original producer, one of the starting points for the remake was that the original mini-series was quite poor on historical accuracy – a fact that is increasingly evident to today’s savvy TV-watching public. The SOAS contribution to the Roots remake is truly inspiring. Lucy Duran advised on West African culture and music, composed songs for specific scenes, and brought in three top griot musicians from Mali to play in pre-colonial styles such as might have been heard at the time of Kunta Kinte. She also taught the cast to speak and sing in Mandinka. Meanwhile Kadialy Kouyate played the kora (and a couple of other roles) in the series, and translated much of the material. The result is not only visually and aurally stunning, it is also true, in a surprising way for a piece of fiction.
What can we take from this? Is artificial intelligence soon going to make us all redundant? I don’t think so. On the contrary, the substantial improvements in computational power of recent years are – counter-intuitively – making the arts, humanities and social sciences more, not less, relevant. For now, that may be less so in London than in Silicon Valley, perhaps because the London finance sector is more averse to risk than its counterparts in the Bay Area. But the trend is clear.
Yet for synergies rather than conflict to emerge, social science and humanities students and scholars need to be curious about, not afraid of, these new technologies. Whether it is history graduates working for tech start-ups, or computing engineers working professionally with musicians or political scientists, the opportunities for productive collaboration are substantial indeed. It starts with simply talking to each other. Better still, there is no need for the mathematically challenged to learn how to write code or create an algorithm – just a need to have confidence in the specialist value of our own fields.