International Migrants Day

Today (December 18th) is International Migrants Day – celebrated in honour of the adoption of the International Convention on the Protection of the Rights of All Migrant Workers and Members of Their Families in 1990. It’s an odd day, and an odd convention. But we should make more of it, if we can, because the issue of international migration has never been more salient, or public understanding of it more lacking. So what is the problem, and what can we do about it?

First, the day itself. International Migrants Day always strikes me as a bit like migrants themselves – rather hidden from view. As if falling in the week before Christmas, when half of the world’s population is distracted by the rampant consumerism of the festive season, were not enough, this year it falls on a Sunday, the day when half of the world tries not to think about work at all. The last UN day of the calendar year, it also comes too late to be celebrated on most university campuses.

And then there is the Convention it marks. Championed by the Philippines and signed in 1990, it aims to foster respect for migrants’ human rights. Yet it took thirteen years to gather the minimum 20 ratifications needed to enter into force, and it still has only 49 states parties, most in Latin America, West Africa and the Middle East, and virtually none of them host to significant numbers of international migrants. And it seems to have done little to help migrants – on the contrary, attitudes to migrants appear to have hardened as populations across Europe and North America have turned to populist politicians and rejected the globalisation of which migration is a central part.

Yet we do need to think about migration, and never more so than now. For example, globalisation may be slowing – 2016 marked a turnaround both in attitudes to globalisation, and in its strongest marker, international trade, which has taken a step back after what seemed like an inexorable rise. Global governance is also under threat, with the surprising sight of a successful US Presidential candidate questioning the value of the NATO alliance. Yet migration seems not to have slowed, with the UK apparently seeing historically high levels of migration in the months prior to the Brexit vote.

More important, attitudes to migration are not just polarised, they are highly complex. On the right, there is significant and growing anti-immigrant feeling, yet advocates of the free market have perhaps done more than any to stimulate migration. East Europeans have not simply ‘appeared’ in East Anglia and Lincolnshire, two of the heartlands of Brexit voting – they have been brought there by the companies and gangmasters working for the very farmers that have been the most strident supporters of an anti-EU position.

And on the left, there is a similarly schizophrenic approach: migrants are seen as deserving of celebration, the manifestation of global class struggle; yet at the same time there is a powerful discourse that if only it were not for poverty, global inequality, environmental degradation, climate change, or proxy wars (choose your factor, or combination of factors), people would not need to move and migration would be much lower. Perhaps this is the definition of a dialectic, but nonetheless there is confusion – migrants are good, but migration is bad, because its causes are the things that make our world divided. But would our world really be better off without migration? And isn’t that what the xenophobes say?

Universities – their researchers and students – are relevant here. First, there is the issue of whether international students themselves are migrants. Common sense says they are not, and university leaders in the UK have rightly campaigned hard that they should be removed from the migration targets set by our current government. Yet many demographers would say they are migrants, on the grounds that – like other migrants – they spend more than one year in another country.

For me, international students seem to encapsulate our problem in understanding migration – whether they are migrants or not. Rather than challenging the absurd policy of putting a cap on migrant numbers (a policy placed in the Tory manifesto of 2010 at the last minute with little working through of its logical consequence, and rashly accepted by the Lib Dems in their legitimate desire to be part of a coalition government), we too easily accept the premise that migration is bad, and try to argue that international students are not migrants.

Yet looked at another way, international students place a different light on migration itself – they can be seen as one of a number of examples showing that migration is not simply an expression of inequality and poverty, but part of a legitimate process in which young people seek to get a better life, and where there can be benefits for themselves, their families and their host communities when they do so. This is true for many international students; it is also true for the many families of migrant workers I have met in my own research in poorer areas of Africa, Asia and Eastern Europe. Here, migration is usually understood as an opportunity, not a life sentence.

That should also shape our approach to research on migration – of which there has probably been more in the last few years than at any previous point in history. That this research is happening is important. But we should also step back and ask difficult questions – has this research opened up a space for honest debate about migration, or simply entrenched well-established positions? And what starting premises have we accepted in order to carry out the research in the first place? Are these defensible?

This International Migrants Day is probably not one in which such debates will figure large. But on the Day’s 20th anniversary next year, they should. And perhaps we could get them going by moving the Day forward a couple of months, so that everyone is paying attention.

What questions are worth asking?

Universities are all about asking (and answering!) the world’s most burning questions – it is core to what our staff, our students, and our academic programmes should be doing. For many, the most important questions for universities to ask are seemingly obvious. They should have value for society: for example, asking how we can reduce suffering, cure diseases, invent new materials, or build machines that are lighter, faster, smarter. As a London university recently put it (and I paraphrase), ‘The world asks, we answer’. The assumption is that what the world is asking is not in doubt.

The SOAS equivalent to these questions can also seem obvious. In a striking campaign video launched today as part of our Centenary celebrations, we focus on important questions that are clearly worth asking, and to which we believe we can help find answers – answers that are not obvious. For example, our researchers are helping to identify ways of dealing with migration and displacement that place such movement in its regional and historical context. Asking about context should deliver solutions that are more likely to work because they are grounded in local and political realities. We also ask about ‘what happens after war?’, not just as an abstract notion, but because, despite the horror of war and conflict, there are some actions that can help reduce suffering and promote a sense of justice and reconciliation rather than perpetuate a cycle of violence and retribution.

But in focusing our campaign on ‘Questions Worth Asking’, we are going much further than this. Indeed, ‘Questions Worth Asking’ is an idea that goes to the heart of what a modern university could and should be. Because not all of the questions that are worth asking are obvious, let alone the answers. Many are apparently obscure, esoteric, unsettling or potentially dangerous. Yet they are crucial for a functioning democracy, a liberal education, and a thriving and dynamic society.

So how do we know which questions are worth asking? If non-obvious questions are still worth asking, does this mean that all questions are of equivalent worth? And what does this have to do with the meaning of a university, of higher education, or with the state of our democracy?

Over the summer vacation I had the pleasure of reading Michael Roth’s recent book Beyond the University, given to me by a SOAS alum who, like me, is concerned about the threats currently mounting against ‘liberal education’. Liberal education has a much more specific meaning in the US context, and Roth, who is President of Wesleyan University, traces its origins back to Thomas Jefferson and his founding of the University of Virginia in 1819, in the decades after American independence.

Jefferson was concerned that the universities of the time – what are now known as the ‘Ivy League’, all created as colonial institutions – were focused narrowly on orthodoxies, whether religious, philosophical or commercial. Students had to master the ‘canon’ of accumulated knowledge. Yet the result was deeply conservative – students trained to accept the status quo.

In contrast, Jefferson wanted a university in which students were free to explore questions for themselves, so that they could be active citizens of the new republic. In time, and subsequently informed by the German model of research universities, an idea of a liberal education emerged in which students are free to explore, and to take themselves to the frontiers of knowledge and innovation. This has become the core of a US system that is in many respects world leading, and deeply influential in the UK too. For example, at SOAS we encourage our students to challenge, to think outside the box and to look at alternative perspectives. We want our students to think global, to be global citizens.

Yet the challenges to this idea of education are multiple. From outside the academy, there are many who criticise the idea of pursuing knowledge for knowledge’s sake. Why should we teach or research ‘dead’ languages, for example? If public money is to be spent on higher education, surely there should be a demonstrable benefit for society? Surely we should train students to do useful jobs, not just to sit and think? Roth shows how long such a critique has been around, but also how rife it is today. And such a critique is prevalent not only in the US, but in the UK too – it has long been a dominant narrative in the Labour party, and is likely increasingly to inform opinion in the Conservative cabinet of Theresa May. How do we place a value on what our young people learn?

From inside the academy there is plenty of critique of the liberal model too. Many in the professions and disciplines maintain that there is a corpus of literature that must be mastered in order for a degree or a research project to count. Can you have an English degree without Shakespeare? (The answer is maybe not, although at SOAS the Bard is only one part of a global approach to literature that starts in year 1.) Or can you have a research project that does not take the existing literature as its starting point and seek to critique it?

At SOAS we teach and research at least seven ancient and classical languages – not Latin or Greek, but Sanskrit and Prakrit, Avestan, Babylonian and Assyrian, as well as classical Arabic and Chinese. How are the questions in this field ‘worth asking’, when these languages are not spoken any more? And if there are questions about Babylonian that are worth asking, who decides what they are?

Let me take an example of SOAS research on ancient Babylonian to illustrate the point. SOAS researcher Andrew George has been deciphering cuneiform tablets in the Schøyen Collection in Norway, which houses probably the finest unpublished collection of Old Babylonian letters in the world. In engaging with these letters, he asks what life was actually like around 1,800 years before Christ. Is this a question worth asking? It seems to me it is – as the answers allow us to populate history with real personalities, not idealised stereotypes. Surely such understanding is a worthy goal, with resonance for moving beyond stereotypes today?

Another challenging example is Drew Gerstle’s work on the Shunga art of Japan, recently shortlisted by the THE for Research Project of the Year. Shunga has traditionally been seen in Japan as a form of low-grade pornography, definitely not worthy of academic study. But by asking questions about what is art, and what is obscenity, the project has led to a re-appreciation of this work, and to unprecedented public discussion of art and sexuality in contemporary Japan.

Or take new work being developed by Jieyu Liu on Chinese families. China’s ‘One Child’ policy is much studied, including its demographic, social, political and economic ramifications. But what do it, and the many other profound changes in Chinese society, mean for intimacy within Chinese families? This project is still at an early stage, but it is already opening up a new world of understanding of how people live in this fast-changing region of the world, with fewer children. It should go on to shed further light on global questions around family, development and modernity.

One view is that what makes questions like these worth asking is academic freedom. We must be free, in universities, to set whatever questions we want, as only by doing so can we be free to find the unexpected, reveal the concealed, or challenge the orthodox. Indeed, I’m a staunch defender of academic freedom – in particular the right for all views on a given subject to be heard on campus (and there are plenty of threats to this, for example in the implementation of the UK government’s PREVENT agenda). Yet this position is perhaps at its core rather arrogant; or at best relativist. Are academics really more qualified than politicians or indeed the general public to have a view on what is worth studying?

An alternative view is that the test of a worthwhile question is whether it advances knowledge. Clearly nobody – government or philanthropic donor – wants to pay for work that tells us what we already know. But we need to do more than this. Surely the knowledge we generate through asking questions needs to open up further areas of enquiry, diversify the tools we have to hand in thinking about and understanding the world, and allow us to challenge assumptions and prejudice? These are the things that are key to a healthy democracy.

As it turns out, that is what SOAS has been doing throughout its first 100 years. As Ian Brown’s new book on the School’s history tells it, though set up as a place to train colonial officials, right from the start we could not resist the expansion of learning as a loftier goal. Long may that continue.

Innovation in higher education: the next 10 years

It was an honour this week to be asked to speak at the 12th World Islamic Economic Forum, held in Jakarta, Indonesia. The theme of the forum was ‘Decentralising Growth, Empowering Future Business’, and certainly empowering future leaders has been the business of higher education for many years. I was asked in particular to focus on the topic of innovation in the higher education sector, and there could not be a more important topic for the Islamic world, or more broadly for the sector. Indeed, the last few decades have witnessed a phenomenal transformation in HE worldwide, one that has involved genuine decentralisation and democratisation of knowledge. All the indications are that these trends will continue and accelerate in the future.

When I went to university in the 1980s, I was one of just 15% of UK 18-year-olds who benefitted from higher education, and although some nations had much higher rates – notably the US – many had lower rates of participation. By contrast, today well over 50% of UK 18-year-olds go into higher education, and there are new and expanded universities all around the world, especially in Asia and across the Islamic world. Much of that transformation has happened in the last 10 years, and it is changing the face of global society.

Alongside this growth in student numbers, there has also been a revolution in learning. When I studied, I went to lectures, wrote essays, sat eight final examinations in a week to determine my degree result, and because I was at a very traditional university, I also had the privilege of small group tutorials. But by the time I became a senior lecturer in the mid-1990s, I was already teaching in ‘flipped classrooms’, where students took responsibility for delivery of a part of the lesson, and there was continuous assessment and problem-based learning, ensuring that students were taught how to think, not necessarily simply how to do.

The advent of digital technology over the past 10 years has transformed the classroom still further. From the mid-1990s to the mid-2000s, we had signs saying ‘turn off your mobile phone’ as they were viewed as a distraction to students in a lecture. Now, I expect my students to have their smart phone switched on in a lecture. If there is something they don’t understand, they can google it to see if they can find a different perspective. If they have a question, they can send a message to my ‘text wall’, a site where their questions are collected. I can refer to these at the end of the lecture, picking out and answering the most popular questions.
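
(For the technically curious, the mechanics behind such a wall are simple. Here is a minimal sketch, in Python, of how a text wall might collect and rank questions – the class and its methods are my own hypothetical illustration, not the actual system we use.)

```python
from collections import Counter

class TextWall:
    """Minimal sketch of a lecture 'text wall': students submit
    questions during the lecture, and the lecturer pulls out the
    most popular ones at the end."""

    def __init__(self):
        self.votes = Counter()

    def submit(self, question: str) -> None:
        # Identical questions (ignoring case and stray whitespace)
        # accumulate, so popularity emerges from repetition alone.
        self.votes[question.strip().lower()] += 1

    def top(self, n: int = 3):
        # The n most-asked questions, ready for the end of the lecture.
        return self.votes.most_common(n)

wall = TextWall()
wall.submit("What does 'flipped classroom' mean?")
wall.submit("what does 'flipped classroom' mean? ")
wall.submit("Will the slides be online?")
print(wall.top(2))  # the two most popular questions, with counts
```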

And if they still don’t get it, or want to go back over the experience, we have ‘lecture capture’ technology that records the lecture and plays it back on their e-learning site. They’ll also be able to sample other lectures by staff at MIT, Stanford, or other schools that have made lectures freely available online. They can look at TEDx talks online. They can discuss their findings with other students through a chat room facility, moderated most likely by a PhD student or ‘Graduate Teaching Assistant’. And as I learned at the 12th WIEF, they can also have textbooks delivered on loan by drone – should they wish!

In turn, transnational education is spreading these approaches and methods around the world. In the Islamic world, Malaysia has invited universities such as Nottingham, Reading, Newcastle, Southampton and Heriot-Watt to bring these innovative approaches to a local context. Meanwhile, countries such as Qatar and Malaysia are creating ‘education cities’ in which a range of foreign and domestic universities develop new faculties in the same physical space, promoting cross-institutional as well as cross-national and cross-disciplinary learning.

And students are on the move, not only to the UK, Europe and US, but in the opposite direction too. It is early days yet, but a number of universities, including my own, are moving towards offering a period abroad to all students, not just in Europe (in fact in our case mainly not in Europe) but also in North America, Asia, the Middle East and Africa. Recent years have seen growing numbers of British students studying alongside counterparts in other parts of the world, whether in summer schools, language classes or regular classes.

In recent years, my students have witnessed first-hand the aftermath of the Fukushima nuclear accident in Japan, and the Arab Spring, both somewhat to the concern of our health and safety officer. Next year, SOAS students will study across the Islamic world, including in Palestine at An-Najah University on the West Bank, and at Mashhad University in eastern Iran. We are also working with the Cambodian government to create a ‘field school’ at Banteay Chhmar for SOAS and Cambodian students of art history, to train the museum curators and art historians of the future.

These are all innovations that are happening already. Meanwhile, there is much talk in the HE sector about ‘disruptive’ innovation in higher education – especially models like the ‘Massive Open Online Courses’ or MOOCs that appear to offer a new model of free online learning that can be ‘cashed in’ for a qualification when the student is ready, but otherwise allow flexible and lifelong learning alongside the demands of work, childcare, and the other elements of modern life. The scope for development here, as a growing number of people gain access to broadband-enabled smart phones and tablets, is huge.

I’m less sure that this is the decisive change, or at least that it represents ‘disruption’ as it is viewed in Silicon Valley. Much of the change of the last two decades has been driven by innovation within the established HE sector; in turn, much of it seems to me to build on the past rather than represent a break from it. Take, for example, the growth of online education and MOOCs. Here, although the leading companies driving innovation are new, such as Coursera and FutureLearn (we are the only UK university to work with both of these platforms), the content is coming from established universities. And one of the first ‘distance learning’ universities, the University of London – through which we at SOAS offer all of our online courses, currently to around 5,000 mainly postgraduate students – is still a market leader.

If I look at our own online courses, in Finance & Management, Development Studies and most recently Diplomacy, I see evolution and changing pedagogy but not rupture. So for example our finance courses involve the delivery of materials to a ‘learner’ for self study with a tutor available for questions and an exam to be sat at the end of the course; whilst our diplomacy courses involve regular interaction with lecturers and other students in small groups through weekly ‘webinars’. Both are sensible models, each arguably more appropriate for the particular student audience they are trying to reach. And of course, all of our students, whether on campus or at a distance, still ask the key question: ‘how do I pass my exam?’

In turn, ‘transnational education’ is not new at all. There has always been student mobility, and indeed distance learning has a very long pedigree. What is new now is the relative ease with which both mobility and learning at a distance can be done – an ease that has made our campuses and teaching styles much more diverse and more innovative. Distance learning is no longer the ‘poor relation’ of on-campus teaching – rather, on-campus delivery has much to learn from the innovative techniques of distance learning.

So what of the next 10 years? I’d like to highlight three areas in which I believe there could be substantial changes, some seemingly more revolutionary than others.

First, the brave new world of ‘bots’. Silicon Valley is working hard on this – you have probably already heard of Siri or Cortana, online help assistants that work with your Apple or Windows machines to allow you to use them more easily. There is an increasing number of personal assistant bots – for example, to help with healthcare enquiries and to support people to self-medicate. They are also emerging in the HE sector, and it is certainly not inconceivable that in 10 years’ time a significant proportion of our teaching assistants will be computers.

For example, one of the more interesting stories of the year was of a class at Georgia Tech in the US – admittedly in Artificial Intelligence – where half of the students were assigned their usual student advisor, and the other half got ‘Jill’, who was a computer (but they didn’t know that). The feedback scores at the end of the year for teaching quality were slightly higher for Jill – indeed, one of the students said he was going to nominate her for a teaching award before he found out she was not human!

Is this a threat? Will teaching by people like me become a thing of the past? Probably not; rather, I see automated student advice as potentially a huge step forward. Student advice in this context means helping students to understand the learning that they are doing in class – it’s online support that comes after the lecture. The creativity involved in learning and teaching, the one-to-one interactions, are not going to go away, but they can be massively enhanced by automated support targeted at students who are struggling (something we can also know more clearly through analysis of data on performance in class assessments – another thing that will likely be generalised in 10 years’ time, as sketched below). Of course, we do have a huge challenge to make this automated advice seem and feel real – most of us don’t like interacting with a computer.
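
To give a flavour of what such analysis might look like – and this is a deliberately simplified, hypothetical sketch, not a description of any real system – one could weight recent assessment marks more heavily, so that a downward trend is flagged before a student’s average actually falls below the pass mark:

```python
# Hypothetical sketch: flag students for targeted support based on
# class assessment marks. The data, threshold and weighting scheme
# are illustrative assumptions only.
marks_by_student = {
    "student_a": [72, 68, 75],
    "student_b": [55, 48, 41],   # sliding downwards
    "student_c": [62, 64, 59],
}

PASS_MARK = 50
EARLY_WARNING_MARGIN = 10  # flag before marks actually hit the floor

def needs_support(marks):
    # Weight later assessments more heavily than earlier ones, so a
    # downward trend is caught even if the plain average looks fine.
    weights = range(1, len(marks) + 1)
    weighted_avg = sum(w * m for w, m in zip(weights, marks)) / sum(weights)
    return weighted_avg < PASS_MARK + EARLY_WARNING_MARGIN

for student, marks in marks_by_student.items():
    if needs_support(marks):
        print(f"{student}: flag for targeted support (marks = {marks})")
```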

A second change, which is also happening but which I believe will accelerate, is distributed or decentralised learning. Let’s be clear – MOOCs will not replace degree programmes, and degrees at most universities will not simply move online. However, there is scope for online learning groups extending beyond institutions and national boundaries. Why should a class at SOAS and a class in Jakarta not interact on a regular basis over the internet? Think of the economies of scale from bringing together students interested in learning an ancient language such as Prakrit or Hittite (two languages taught at SOAS) from different parts of the world. Here the online connection is the missing link in a system where increasing numbers of students get to travel – we can have blended models in which students spend part of their degree in one university, part in another (or others) and part online, interacting as a cohort with students who have started in different places. Some of the more innovative business schools are already delivering this sort of model, and it will grow, and not only in professional education.

And then there is integration – of teaching and research, and of learning and practice. This might seem a strange thing to predict, as the trajectory in the UK is currently in the opposite direction, with the separation of our funding and regulatory bodies into separate research and teaching arms, sitting in different ministries. But I still strongly see this as the future – why?

Because if, as is now widely predicted, computers and artificial intelligence take over the complex but essentially routine tasks – the learning of facts, rules and theorems, the accumulation of knowledge, experience and precedent – then what is left for higher education to teach human students is precisely the creativity that comes from and in research and practice: the thinking differently and laterally, the combining of knowledge from different subjects and disciplines, the physical and mental dexterity, the innovation that is at the core of human research and practice, and that computers will find incredibly hard for years to come.

At SOAS, we are proud to have three gamelans; in ten years’ time, I am confident we will have a Professor of Music who continues to teach students how to play them, and how to compose for them. I believe we will also continue to teach and research Persian poetry, classical Arabic, and Sumerian, and we may also have added Pyu, the ancient language of Burma, which SOAS linguists are currently using computing power to decipher for the first time.

And we will still be teaching students to think, and dream, hopefully in many languages, to make the world a better place. It is that which sets us apart from machines, and is the source of true innovation in education.

The humanities in a tech-rich world

I’ve been running a small experiment in recent weeks amongst friends and colleagues, simply by asking them what they think of a controversial article in the Washington Post by Elizabeth Dwoskin.  The article talks of how poets and playwrights are increasingly working in Silicon Valley’s tech firms to help make machine bots more ‘human’.  The idea is to give personal assistants like Microsoft’s ‘Cortana’ and Apple’s ‘Siri’ back stories and teach them to speak natural language so that their interaction with humans is more, well, ‘human’.

There has been an interesting difference of opinion amongst the very unscientific sample I’ve consulted – even, dare I say, a gendered divide.  Some are excited about the future opportunities this might bring; others are concerned that if we teach bots to be human, we might no longer need humans.  According to other sources, such as Fortune magazine, it is not only personal assistants who might no longer have a job, but also pilots, teachers, lawyers, surgeons, reporters and financial analysts.  I’ll let you guess how the reactions broke down by gender.

So is this a good time to be training for a white collar job in the ‘knowledge’ economy? Is it a good idea to go for a social science, humanities or ‘liberal arts’ degree, when so much knowledge can now be accessed automatically?  You might think not.  We were treated to a careers talk this week by Neville Crawley, an alumnus who now runs tech firm ‘Quid’ in Silicon Valley.  Quid is a pretty amazing company, but also an illustration of the kind of initiative that might make human analytical capacity redundant.

In a nutshell, Quid uses automated and very fast searching of digital content to analyse and visualize the ‘world’s collective intelligence’.  Through searching vast quantities of material in the blink of an eye, and building up a visual picture of similarities between narratives that emerge in different places, it can provide insight into the emergence of news stories, security threats, or market trends that would take a well-trained journalist, security analyst or marketing professional days or weeks to develop manually – and it can do this much more accurately.
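
Quid’s actual technology is of course proprietary and far more sophisticated, but to give a flavour of the underlying idea, here is a toy sketch of one standard technique for spotting ‘similar narratives’: representing each text as a TF-IDF vector and comparing vectors by cosine similarity.

```python
# Toy sketch of narrative similarity via TF-IDF and cosine similarity.
# A textbook technique for illustration only, not Quid's method.
import math
from collections import Counter

docs = [
    "markets fall as oil prices drop sharply",
    "oil prices drop and markets tumble",
    "new museum exhibition opens in london",
]

def tf_idf(texts):
    tokenised = [t.split() for t in texts]
    doc_freq = Counter(word for tokens in tokenised for word in set(tokens))
    n = len(texts)
    # Term frequency scaled by inverse document frequency: words that
    # appear everywhere count for little; distinctive words count a lot.
    return [
        {w: (c / len(tokens)) * math.log(n / doc_freq[w])
         for w, c in Counter(tokens).items()}
        for tokens in tokenised
    ]

def cosine(a, b):
    dot = sum(v * b.get(w, 0.0) for w, v in a.items())
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

vecs = tf_idf(docs)
print(cosine(vecs[0], vecs[1]))  # high: two tellings of the same story
print(cosine(vecs[0], vecs[2]))  # near zero: unrelated narratives
```

Scaled to millions of documents, with far better language modelling, that simple idea is roughly what makes it possible to watch a narrative emerge across the world’s media in close to real time.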

But here’s the thing.  Neville says that whilst at its outset, Quid was a company of software engineers, tackling the computing challenge of how to process natural language really quickly, more recently it is looking for a much more diverse range of graduates as it brings its product to market.  The computational power still has to be trained; more important, it is the use of the technology that Quid needs to understand to gain a return on its investment.

I’m tempted to go further.  Take another example: the vast computational power of internet search engines much less sophisticated than Quid has given most people the power to ‘check’ what they see and what they are told for accuracy, authenticity, or reliability.  Gone are the days when a lecturer could just lecture, and expect students to sit and listen and believe.  Now, whether you like it or not, if you inadvertently get something wrong in a lecture there will be at least a dozen smart students spotting the mistake on their smart phones and emailing you about it later.  Best to encourage them to check – as a means to engage with what you are saying and play an active role in their learning.

If that is true in a lecture, how much more so for the ‘real world’.  Take the entertainment industry.  However compelling a story, however great the acting, if a film or mini-series contains historical inaccuracies, scenes shot out of place, or factual errors, the endless review pages of the internet and social media can shoot your creation out of the sky before the professional reviewers have put pen to paper.

But there is an opportunity here too, as two recent SOAS successes show.  One is Amazon Prime’s recent mini-series The Man in the High Castle, loosely based on the 1962 science fiction novel by Philip K. Dick.  The book and the series imagine a post-war world in which Germany and Japan won WWII and are in charge in North America.  This might not seem fruitful ground for ‘historical accuracy’, and yet it is precisely advice on how to make the series more historically and culturally believable that was offered by SOAS Japan expert Griseldis Kirsch.  Just because a story is untrue in a literal sense does not mean it can get away with being unbelievable.  And our capacity to know if something is believable has never been greater.

Another great series launches in the US this Memorial Day weekend – the long-awaited remake of the famous US mini-series Roots.  Produced by Mark Wolper, the son of the original producer, one of the starting points for the remake was that the original mini-series was quite poor on historical accuracy – a fact that is increasingly evident to today’s savvy TV-watching public. The SOAS contribution to the Roots remake is truly inspiring.  Lucy Duran advised on West African culture and music, composed songs for specific scenes and brought in three top griot musicians from Mali to play in pre-colonial styles such as might have been heard at the time of Kunta Kinte.  She also taught the cast to speak and sing in Mandinka. Meanwhile, Kadialy Kouyate played the kora (and a couple of other roles) in the series, and translated much of the material.  The result is not only visually and audibly stunning, it is also true, in a surprising way, for a piece of fiction.

What can we take from this?  Is artificial intelligence soon going to make us all redundant?  I don’t think so.  On the contrary, the substantial improvements in computational power of recent years are – counter-intuitively – making the arts, humanities and social sciences much more and not less relevant.  For now, that may be less so in London than in Silicon Valley,  perhaps because the London finance sector is more averse to risk than its counterparts in the Bay Area.  But the trend is clear.

Yet for synergies to emerge rather than conflict, social science and humanities students and scholars need to be curious about, not afraid of these new technologies.  Whether it is history graduates working for tech start-ups, or computing engineers working professionally with musicians or political scientists, the opportunities for productive collaboration are substantial indeed.  It starts by just talking to each other.  Better still, there is no need for the mathematically-challenged to learn how to write code or create an algorithm, just a need to have confidence in the specialist value of our own fields.

The Nurse review of UK research councils: where next?

In December last year, the UK government published the long-awaited Nurse review, entitled “Ensuring a Successful UK Research Endeavour”.  The review, written by the eminent scientist Sir Paul Nurse, was tasked with looking at the UK’s Research Councils, which have the responsibility of funding ‘demand-led’ research in Britain across seven broad discipline areas.  It had high-level input from government and from the scientific community.  So what does it say, and how has the UK’s research funding landscape changed in its wake?

When Sir Paul’s review was first announced at the end of 2014, there were arguably few reasons to be optimistic about the future of UK research funding.  Science funding had been ‘ring-fenced’ under the then coalition government’s plans, but was declining in real terms.  ‘Blue skies’ research seemed almost forgotten in a tide of research council priorities, themes and special initiatives.  With the emphasis on STEM, the future also looked particularly bleak for the humanities – a problem in many countries, but particularly so in the context of weak public spending on higher education.

Worse, the UK research councils in particular had suffered significant budget cuts in recent years, leading them to pass an increasing array of costs onto universities.  Thus all grants were already subject to ‘efficiency savings’; so-called ‘demand management’ had sought to shift the burden of peer review away from the research councils and onto universities themselves; whilst ‘doctoral training partnerships’ had passed most of the administrative costs of PhD support back to the universities that provide the training.

With the election of a Conservative government in May last year, the appointment of McKinsey to conduct a separate ‘review’ of research councils over the summer, and a new spending review in the autumn, you would be forgiven for thinking that the days of independent UK research councils were numbered.

Yet three months on from the publication of Sir Paul’s report in December 2015, and despite the announcement of a new research body – ‘Research UK’ – the research councils are with us still.  Not abolished, they are to become more like ‘faculties’ in an umbrella ‘Research UK’ structure.

So how should we view the Nurse review, and what does the future look like now for UK research?

Certainly there is much to commend in Sir Paul’s review.  For a start, we can probably never be reminded too often of the importance of research, and there is much in the report that stresses that importance.  In particular, Sir Paul stresses the need for research funding decisions to sit above the short-term funding priorities of politics and public opinion.  In his invocation of the ‘Haldane principle’ of a century ago – that decisions about funding should be made by those with the expertise and experience to know where it will be best spent – he makes an important point.

Moreover, Sir Paul has clearly not sought to abolish the UK’s research councils, nor is this what observers perceive him to have done.  Indeed, he makes recommendations about how the research councils should continue to provide ‘scientific leadership’, as well as taking the lead on engaging with a range of stakeholders, including the commercial world, government, charities and Europe.

This in itself is perhaps a major achievement of the report – as on efficiency grounds alone, there must have been many in the UK Treasury who were calling for a single research council, with a massively slimmed down structure and simplified mechanisms for distributing research funds, based perhaps on some metric formula.  Instead, we will continue to have the same number of research councils as before, each responding to the specific interests and concerns of their own scientific area, each even with a ministerial appointment for its CEO.

Yet it seems to me that there is little cause to cheer this quasi status quo.  What is wrong, and what could be done about it?

First, Sir Paul has recommended the creation of an overarching body called “Research UK” to bring together the strategic side of our research councils, with direct access to government and its own Chief Financial Officer.  This might have some positive effects, but it seems highly unlikely that it will deliver any more finance to actually support research in the UK.

Unfortunately, that matters – because just a decade after devising ‘full economic costing’ (or FEC) as a mechanism to properly estimate the cost of research so it can be adequately supported, the UK has now retreated so far from FEC that it is almost a joke.  Now, the new mantra from government, including the research councils, is ‘matched funding’ or ‘contribution’ from universities, in which HEIs are expected to support from their own resources grants that are provided by the government.

Yet this is a device that ensures nobody receives full funding.  It implicitly favours large institutions that can generate surpluses over smaller ones that very often generate the best ideas.  My own institution for one would struggle, for example, to find the 50% match funding for PhD scholarships that the Arts & Humanities Research Council is currently consulting on. And even if we could find such funds, we might ask whether so much of our own money should be devoted to the priorities of a UK research council, and implicitly the UK government, rather than our own priorities. For example, the AHRC expects us to use this funding only for British or European students, whereas our priority is to recruit and train the best talent internationally.

And that leads to a second point: the problem that Research UK will certainly not resolve, but exacerbate, is the divide that is opening up in the UK between research and teaching.  It could be argued that one of the key strengths of UK higher education – the thing that attracts academics and students from around the world to work and study in the UK – is the way our ‘research-intensive’ universities combine the best of research and teaching and exploit synergies between the two.

Yet this is rapidly eroding, and at a time when – more than ever before – universities will in practice need to cross-subsidise research from teaching income in order to make up the shortfalls in research funding.

To be clear – such cross-subsidies are justified, at least in moderation.  An inspirational teacher at tertiary level needs to know the research frontier that she is trying to bring her students towards, to have worked herself to ask and solve research questions.  Excellent teaching cannot consistently be delivered if the information being imparted or the problem-solving skills shared have been learned second-hand.

Yet a ‘Research UK’ with its constituent research councils, quite possibly incorporating core funding as well, but separated from an ‘Office for Students’, does not look likely to stand up for synergies between excellent research and teaching.  On the contrary, it looks likely to enhance the sense of a conflict between the two that does not need to exist.

Finally, what about the ‘wider research endeavour’ that is the subject of the third of Sir Paul’s four chapters?  Interestingly, much is said about connections with business, and quite a bit about connections with government.  Yet in spite of the correct observation that “openness to scientific strengths beyond the UK is one of the defining characteristics of the UK research base”, the report says little about Europe, nothing at all about the European Research Council, and nothing much about how international connectivity could be strengthened further.

This is a serious gap.  Working across borders – not only in Europe – is really one of the success stories of science and scientists in the past two decades, and not just in the UK.  The UK in particular runs a trade deficit in manufacturing, yet this is dwarfed by our trade surplus in services, a surplus that exists with most countries and rests in part on the excellence of our higher education institutions, including their research.

The true story of a successful UK research endeavour in the future will not be one of supporting public policy over a range of government departments or training the workforce in the skills necessary for the country’s economy, but one of the UK’s contribution to global knowledge and understanding.  This is an area in which UK performance – despite chronic underfunding – has arguably been little short of outstanding.  But it remains to be seen if the UK’s new-look research councils can deliver the same in the future.

This piece was originally written for the May 2016 edition of Realising Research magazine from the Association of Commonwealth Universities.  See: https://www.acu.ac.uk/membership/member-communities/research-knowledge-information/realising-research/

Europe’s refugee / migrant crisis

It is too early to say if September 2015 was a turning point in western Europe’s attitude towards migrants and refugees, but the tragic and very public death of Aylan Kurdi has certainly had a powerful effect on both public and political opinion.  It should also have a galvanizing effect on those of us who have researched refugee and migration issues for many years – the voice of the academy has been there, in radio and TV interviews, in commentaries on the crisis, and in emerging proposals and some funding for new research, including some from SOAS researchers. But a systematic engagement has yet to develop.

There is much to understand – but one way of approaching the crisis is to think through the historical precedents, and to consider whether they offer us pointers as to the space for political and public action.  This is not easy, as historical precedents are never quite the same as what is going on now.  But surely this is an area for analysis and debate that is currently lacking.

Some have gone down this route. A recent excellent posting by Becky Taylor draws parallels between the public reaction to the Hungarian refugee crisis of 1956 and emerging signs of compassion and solidarity in Europe today, but the Hungarian uprising is hardly comparable in terms of either the geopolitical circumstances (it happened at the height of the Cold War) or the numbers of refugees involved (an order of magnitude lower, at least).

Others have suggested that the closest parallel involves the events at the end of the Second World War, as millions of people found themselves homeless or stateless, or indeed tried to move home.  Certainly the period 1945-51 was a formative one: it was a crisis that gave birth to the UN Refugee Convention and established both attitudes towards refugees and a policy framework to deal with them for at least two decades.  Yet that was also a refugee crisis born of conflict that had engulfed the whole continent, where the sense of responsibility and urgency to find solutions was at a level that far exceeds what is likely to emerge today.

Meanwhile, although the political and economic crises that are producing today’s flows of refugees and migrants have their epicentres outside Europe, if we look to other major refugee crises that have happened outside Europe – whether Afghanistan, Rwanda, Liberia, or more recently the exodus from Iraq following the US-UK intervention of 2003 – we find a crucial difference: relatively few of those displaced made it out of the affected region.

Of course the fact that refugees from these earlier conflicts mostly found asylum – or were ‘contained’, if you prefer – in neighbouring or ‘transit’ countries in Asia, Africa or the Middle East is not in itself a good reason for inaction on the part of European states or disinterest on the part of European publics.  But the fact that many of the countries that were places of first asylum or transit in these earlier crises – Syria, Libya, or Lebanon for example – are either no longer in a position to offer safety or security themselves, or are at the very least fully-stretched, does force us to think differently about ways forward.  And given the extent to which conflict has been ‘hidden’ from Europe by these artificial borders, and the ‘burden’ of hosting refugees (such as it is) has been borne by others, some might argue that this is about time too.

Yet there is a modern-day European parallel that could help us to think about how to respond, in political, policy and indeed research terms, to events in Syria and elsewhere in the world – and that is Bosnia in the 1990s.  The political crisis in Bosnia happened over two decades ago, but the similarities to the current situation in Syria at least are striking: a brutal civil war, fuelled by overt or covert external interventions from various sides – the West, Russia, and the Gulf States; a territorial stalemate in that war which led ordinary citizens who had initially hoped to ‘stick it out’, either at home or close by, to abandon hope for a resolution to the conflict, however imperfect; and a confused and vacillating approach from western states both to the conflict itself, and to the refugee crisis that it generated.

But if this analogy is right, how does it help?  Bosnia was hardly a crowning achievement of European refugee or foreign policy, and many of the debates we are having now about migration and refugees – about the role of trafficking, or the question of burden sharing – were unresolved then, which is perhaps why they are still current today.

I would suggest the analogy does help, though, in three key ways.

First, with the benefit of hindsight, and notwithstanding the many mistakes that were made in relation to the crisis in the wider former Yugoslavia, Bosnia does provide an example in which hundreds of thousands of refugees were accommodated in western Europe at short notice – especially across Germany, Austria and Switzerland, the same countries most affected today – and went home when the crisis was resolved.

I am not suggesting that either the circumstances of their reception – much less than Convention refugee status – or of their return – hardly the ‘voluntary’ return that states and international organisations asserted – were ideal.   Nor am I suggesting that the crisis in Syria is clearly temporary.  However, for those worried that each refugee or migration crisis adds additional people to be housed and found work, schools, healthcare and social care on a permanent basis, the Bosnia crisis does provide an alternative model for what can happen: it gives the lie to the assertion that there is nothing so permanent as a ‘temporary’ migrant.

Second, what made the difference in terms of the resolution of the Bosnian conflict was when western Europe and the US started to engage with the crisis in a more coordinated way.  Initially, the European approach to the collapse of the former Yugoslavia was all over the place – and nationalist politicians within Bosnia exploited these divisions and rivalries.   We set up ‘safe havens’ for displaced people without really understanding how they would be defended, with terrible consequences.  The parallels with Syria are striking.

In the end, it did not need formal military intervention to end the war in Bosnia.  But it did need a coordinated approach, and a clear strategy.  Just such coordination and strategy are clearly lacking in relation to Syria – and our failure to push for a political solution ends up fuelling more violence.

Third, looking back at the Bosnia crisis, one of the problems facing Western diplomacy was that it was always difficult to see which side ‘we’ in the West should be on – as Yugoslavia’s religious, political and economic fault lines mirrored those in the wider Europe.  Indeed, as political solutions were explored and parties finally brought to the negotiating table, those who took part were the nationalists from all sides.  By contrast, those Bosnians who had believed in a multi-ethnic pluralist Bosnia had been systematically sidelined.

This last point sets us the most difficult challenge, since the political, economic, cultural and religious complexities of the current conflict in Syria – and indeed conflicts fuelling refugee crises elsewhere in the world – are no less than that of Bosnia.  Yet grapple with complexity we must – in a way that is informed not by simplistic or ideological narratives, but by integrated understanding of the region’s politics, culture and history.

The changing face and place of HEIs: implications for student engagement

There has been much talk of diversification of the higher education sector in recent years. The outcome of the market was expected to be one in which institutions specialised in what they were good at in order to attract students. Talk of the ‘squeezed middle’, for example, highlighted the problems for institutions trapped between ‘research excellence’ at one end of the market and ‘teaching excellence’ at the other. In this sense, recent remarks by Sir Steve Smith that the government is likely to ask universities to ‘specialize in what they are good at’ – and to target funding and student visa sponsorship accordingly – can be seen as layering yet another pressure on the sector to diversify.

Yet if Sir Steve is right, this will involve quite a change in the direction of travel for universities, with major implications for their place in the economy, and for student engagement. Because the current trend is not to diversification, but to increasing conformity with what the government, the press, ‘public opinion’ and students’ parents appear to want. At the same time, this response to the demand of the market is matched not by growing student satisfaction, but all too often by disengagement.

If we look first at research: whilst the ‘research-intensive’ universities like to talk about being research intensive, and do have different arrangements for research from many post-1992 universities, one of the big stories of the last few decades is how the latter have invested heavily in research across successive RAEs and REFs. As a result, research excellence has been demonstrated across a wide range of UK HEIs – something that, in itself, is not damaging to the student experience.

Yet in terms of teaching as well, in order to play in the game of expansion created by the market in HE, universities need to widen their demand, not narrow it. That has resulted not in institutions focusing on their comparative advantage and driving specialisation targeted at students’ varied interests, but in a rush for the middle of the market, at least in England where this market has been created. That means most English universities at the moment are trying to have not only more research but also more students, and a singular vision of teaching and research excellence underpinned by growth.

Another, more pernicious, pressure for conformity is the growing pressure on all universities to improve graduate outcomes. Of course, helping our students to good graduate outcomes is worthwhile and important. It is also the case – as a recent survey of VCs reinforces – that there are significant innovations to be made in terms of closer integration of study with work, and the greater use of technology to transform learning experiences. But university is about so much more than graduate outcomes – especially those measured just six months after graduation, as the Destinations of Leavers from Higher Education survey currently does. Surely if there is a common purpose to universities, it is to help students to think.

What does all of this have to do with student engagement? The market in HE was supposed to align our institutions more closely with student demand, enhancing student satisfaction and making students more actively engaged with their learning. Yet many students appear to be more and more disengaged from their learning, and/or feel that universities offer poor value for money. Surely one reason for this is that in university management, we have become increasingly driven by policy or funding imperatives that are making universities all more alike, rather than understanding the different historic missions of our institutions, and of different parts of the sector.

To take one example, many of the larger (and indeed smaller) civic universities in cities outside London were set up as a matter of civic pride, to deliver learning and economic and social benefits to the local economy. This is an idea entirely consistent with the current government’s vision of universities at the heart of vibrant regional economies. Indeed, it is an idea with multi-party support, having strong antecedents in recent Labour and coalition administrations. It is an important role for universities, and one with the capacity to enthuse students too. But it is surely not the mission of all universities, whilst even these civics increasingly need to look nationally and internationally for students, partners and funding.

Then there are the more specialist universities and colleges, set up for more specific purposes – from universities of the arts, of veterinary and agricultural science, and of music and performance, to universities like Essex and Sussex that were explicitly set up to challenge the status quo, to think radically and differently. Yet universities like my own, set up in 1916 to engage productively with the British empire and now thoroughly re-imagined as a unique international university with a strong emphasis on language and culture, have seen their specialist funding cut to the bone (in our case, removed altogether from next year). And we also operate within a funding environment and wider discourse that make it difficult to survive without becoming less specialist.

A move towards a more diversified HE sector would be a welcome one, but only if it is based on a renewed confidence in what our purpose is. Most universities do care about their specific histories and missions, but we are too seldom willing to say that this matters, too often swayed by the homogenising tendencies of funding imperatives and discourses about HE as a whole. No wonder students stop engaging, and just put their heads down to get a degree and a job.

Managing academic performance

A recent and fascinating piece in the THE by Rob Briner sets out how he feels universities are ‘mismanaging performance’. This week – and the preceding 18 months for that matter – academic performance has been very much on my mind. SOAS Academic Board has just approved a new ‘academic performance framework’, the development of which I have led through a string of working groups and extensive negotiation with UCU.

Of course, if we aspire to excellence then we need – as academics – to ‘perform’ that excellence. Brilliant ideas are no good if they stay in my head – I need to share them, whether in the classroom, through academic publications, or through other forms of ‘output’ or ‘impact’. Excellence is what all universities seek to reward, whether in the form of first-class degrees for our students, or promotions and academic titles for our staff.

All that implies we have the capacity to recognise excellent performance when we see it, even if the press is less than sure that all the firsts and 2:1s awarded by universities are really worth the name, and academics will sometimes wonder whether promotion panels make the right decisions. But can we ‘manage’ academic performance? And if so, how?

Certainly the starting position of many of my academic colleagues is that academic performance cannot be managed – or at least not easily. The risks of attempts at management are brought out well in Rob Briner’s piece, as well as a more recent report in the THE following the death of an Imperial academic who had missed his grant income target for the year.

In particular, Briner bemoans the adoption of ‘best practice’ from private sector businesses. Goal and target setting, a useful technique in some contexts, is too often translated in the university context into vague, complex, hard, or ‘just plain impossible to achieve’ objectives, set over too long a timescale and in a format that demotivates rather than motivates. This, Briner says, does little to boost performance and, worse, can ‘make academics feel failures’.

Talking to my colleagues, I find this story rings true. Many talk of ‘changing goalposts’, and of long hours spent on what are perceived as ‘pointless tasks’ with little or no reward. A frequent response is that we need to manage workloads rather than performance. As the father of a young child, I have plenty of sympathy with that.

Yet ‘performance’ is not workload – it is not about how much effort we put in, but about what we get out. When I did my undergraduate degree, marks were never awarded on the basis of how long students spent in the library (the lawyers and medics would all have graduated with first-class honours on that measure!). Then, as now, the extent to which my research, writing and teaching is valued and taken seriously depends on whether I have been able to say something new and/or interesting. And that is important – for all universities.

So how can universities encourage and support academics to perform better? And how can university administrations foster an academic environment in which both teaching and research are more consistently ‘excellent’? One, moreover, where the presence of those who genuinely struggle to produce excellent research or teaching does not undermine the wider reputation of an institution, to the detriment of both students and colleagues?

The Academic Performance Framework that has been agreed this week at SOAS is an honest attempt to answer these questions. Across the three areas of teaching & learning, research & enterprise, and administration, management & outreach, it sets out what the School can reasonably expect from all staff as a minimum, but also what relevant committees are implicitly or explicitly looking for in terms of ‘excellence’ when it comes to promotion, accelerated increments or one-off reward payments.

Too often, such criteria are defined only in the vaguest of terms. For example, our School’s current rules require an interpretation of the difference between an ‘important’ contribution to the advancement of a discipline (Reader) and an ‘outstanding’ contribution (Professor). And on teaching, our promotion procedures speak only of ‘innovation’, without saying what that is. This vagueness leaves academics confused about what they should prioritise, and angered when a committee does not share their understanding of what a teaching innovation or an outstanding contribution to research actually is.

It is possible to be more specific. Every historian I have ever spoken to, across the five institutions in which I’ve worked or studied, understands that each promotion essentially comes as the result of publishing a good-quality monograph. Yet becoming more specific in this way also carries myriad pitfalls. Economists and scientists don’t (on the whole) write monographs, and aren’t likely to be promoted if they do – in other words, such recognition is subject-specific. And spelling out what is expected can introduce targets that are utterly unreasonable and/or have perverse effects, especially if targets for teaching excellence are added to those for research.

Take the example of research income – in the news this week because one in six universities in the UK has introduced individual performance targets for research income. Such individualised targets are clear and specific, but simply unreasonable in a context where most funding bodies reject two thirds of well-written and worthwhile research proposals.

An alternative is a target for the volume or value of research grant applications, which at least is in the control of the individual academic. But this creates the risk that poorly thought out applications will be submitted simply to meet a target. The result is to waste the time of the funder and peer reviewers; it does not create the outcome that the university is seeking to achieve.

The proposed solution at SOAS, and one that we will be implementing in the coming year, is to combine income and application targets for Departments (where there is scope across 20-30 academics for a degree of success and failure) with an expectation that all staff regularly submit applications that have passed internal peer review – so that there is at least some measure of quality involved.

In turn, actually securing a grant – especially a larger grant (the framework spells out what we think that is) – is seen explicitly as a measure of achievement, something that can and should count towards promotion and reward. That was not always clear in our previous system – for some staff, writing the next book would always take precedence over securing the funding that might allow that writing to happen.

Academic performance is one of the most important issues facing university administrations in a sector that is increasingly competitive – for research funding, for students, and indeed in other ways. It is important for individual academics too – we all want to work in an environment where our colleagues are creative, engaged and motivated, all possibly synonyms for ‘excellent’.

Our experience over the past 18 months at SOAS shows we can have a conversation about this and try to chart a way forward. We may not have all the answers – indeed, on evidence of teaching excellence, we will continue the conversation next term. And of course, it will take time to find out if we have got it right!

Research in the ‘digital age’

Last week my past as a scholar of refugee studies caught up with me. I was contacted by Professor Barbara Harrell-Bond – the expert on refugee studies, Emeritus Professor at Oxford, and former Distinguished Visiting Professor at the American University in Cairo and Makerere University – who inspired me as a student, like so many others, to study refugees and forced migration.

Back in the 1990s, I agreed to put a digital copy of Barbara’s groundbreaking book, Imposing Aid, on my university’s website. ‘Digitisation’ was then in its infancy, but Barbara was ahead of her time: she had made sure she retained full copyright to her own work, and could publish a digital version of her monograph wherever she wished. Barbara got in touch because she could no longer find her book online – one website update too many had consigned it to an inaccessible archive (it is back now – here).

So Imposing Aid is still available as an open access monograph, nearly 30 years after publication. But what else is, or should be, digital in this ‘digital age’? SOAS itself has some pretty impressive digital collections – such as the Fürer-Haimendorf collection, which comprises photographs, cine films and written materials from South Asia and the Himalayas collected by Christoph von Fürer-Haimendorf, the first Professor of Anthropology at SOAS; and our collection of early Nigerian Qur’anic manuscripts, digitised over a decade ago with funding from the AHRC.

We are also developing the School’s e-repository, buoyed by the recent appointment of Helen Porter as our new Digital Services Support Officer in the SOAS Library – so that within the next year, all SOAS publications should be going on SOAS Research Online, with all journal articles written by SOAS staff available in full-text open access format. But as I get more into this – still as an absolute novice – it is clear that the issues around digital research are hugely complex.

A first issue is that huge spending on digitisation by organisations such as JISC, the research councils, and major foundations in the UK, US and elsewhere has really only scratched the surface of the collections that might benefit from being widely or openly accessible in digital format. For example, within SOAS there is an emerging project to digitise our fabulous Swahili manuscripts collection. Here, as with some other collections, such as the School’s unrivalled collection of Hausa popular literature, earlier funding provided for a fully searchable database, but fell short of making the texts themselves available online in an accessible and discoverable format.

Yet the number of potential collections to digitise, even within SOAS, is huge. Not only are there the collections that already have a database; there are also so-called ‘hidden collections’, which exist in our archives but have only a card catalogue (or indeed no proper catalogue at all), and so are not even visible to the wider world, let alone searchable. Examples are found amongst our substantial collections of NGO and missionary archives, but also in the School’s own archives (although these are being meticulously catalogued at the moment in the run-up to the Centenary).

There are also, of course, collections elsewhere, but where SOAS staff are integral to projects for their digitisation. One example is the Yasna, a central ritual text of Zoroastrianism that is the subject of a current initiative within our Department for the Study of Religions. Within this seemingly endless task, it is difficult to know what to prioritise, or indeed when to stop. What makes a collection more or less suitable or valuable for digitisation?

Next, there is the question of how to make collections accessible and discoverable once they are in digital format. This question takes several forms. In the case of our Swahili manuscripts, for example, an immediate issue is that a good part of the collection is written in Swahili in Arabic rather than Roman script, which makes questions of readability (including machine readability) much more difficult. An ideal digitisation project here would simultaneously transliterate and translate into a language of wider communication, to maximise the accessibility of the collection – yet this of course makes the undertaking that much more vast.
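To see why machine readability matters here, consider a minimal sketch of character-level transliteration. Everything in it is hypothetical and hugely simplified – the mapping table covers only a handful of letters, and real transliteration of Swahili written in Arabic script involves context-sensitive rules, vowel marks and a good deal of scholarly judgement.

```python
# A toy mapping from a few Arabic letters to Roman equivalents.
# Hypothetical and hugely simplified: genuine transliteration of
# Swahili in Arabic script needs context-sensitive rules and
# careful handling of vowel marks.
ARABIC_TO_ROMAN = {
    "\u0628": "b",  # ب
    "\u062a": "t",  # ت
    "\u0633": "s",  # س
    "\u0645": "m",  # م
    "\u0627": "a",  # ا
}

def transliterate(text: str) -> str:
    """Replace each mapped character, passing unknown ones through."""
    return "".join(ARABIC_TO_ROMAN.get(ch, ch) for ch in text)

print(transliterate("\u0633\u0627"))  # prints "sa"
```

Even this toy version makes the underlying point: a manuscript that is merely photographed remains an image, searchable by nobody; a text that is transcribed and transliterated becomes data that both people and machines can query.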

And then there is the vexed question of the medium on which material is held – quite a lot of my own early academic writing is in ‘digital’ format, but buried on 3½” floppy discs that probably already belong in a museum, and are frankly less accessible than if I had kept the work on paper. There is a risk that material we hold on today’s digital media will suffer the same fate in a few years’ time, even though standards exist for digital preservation and migration to new media. But format is not just about where a text is kept – because text itself is not the only form a digital collection can take.
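As an aside on the mechanics of that preservation: one routine safeguard is ‘fixity’ checking – periodically verifying that stored files have not silently changed or decayed. The sketch below is illustrative only; the manifest layout and function names are my own assumptions rather than any particular archival standard, although checksum manifests of this kind are the idea behind schemes such as BagIt.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 checksum of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_manifest(manifest: Path) -> list[str]:
    """Check each '<checksum> <filename>' line of a (hypothetical)
    manifest file; return the names of files missing or altered."""
    failures = []
    for line in manifest.read_text().splitlines():
        if not line.strip():
            continue
        expected, name = line.split(maxsplit=1)
        target = manifest.parent / name
        if not target.is_file() or sha256_of(target) != expected:
            failures.append(name)
    return failures
```

Run regularly, a check like this turns ‘is the archive still intact?’ from a hopeful assumption into a routine, answerable question.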

One interesting example from SOAS is a collection of recordings of Babylonian and Assyrian poetry assembled by our Department of the Languages and Cultures of the Near and Middle East. Although recorded on a dictaphone, and so not of the highest quality, they provide a unique record of how these ancient languages were ‘read’ by scholars in the early 21st century. This is an example of how we can enrich our holdings, making them more discoverable and accessible.

But of course the School’s audio recordings go way beyond that – most obviously in the Endangered Languages Archive, generously funded by the Arcadia Foundation, which over the past decade has built up an incredible corpus of digital recordings of languages, representing not only an important cultural repository but also a fascinating research resource for linguistics scholars more broadly. There are many other recordings too – of languages, of music, and indeed of SOAS lectures (many on old-style audio tape) dating back to the earliest days of the School’s history.

And finally, the point needs to be made that a ‘digital collection’ now means much more than a collection that has been digitised. For over a decade now, public bodies, private organisations and individuals alike have been producing material in digital format. That includes this blog – and indeed any responses to it. Should this be held for the future? And if so, how?

In his recent and excellent book on Chinese internet literature, Michel Hockx describes the challenges of archiving material on the internet that is inherently ephemeral. His solution – an archive providing a ‘snapshot’ of the material at the time he accessed it – solves the immediate problem of recording the evidence on which his argument is based, but does not provide a record equivalent to, say, the Hausa popular literature material that we have archived from an earlier age.

How should the School engage with these collections, whether born digital, already digitised, or still waiting to benefit from digitisation? Over recent years we have developed the SOAS Digital Library to address these issues, but there remains a huge amount to be done. Should prioritisation be based on the intrinsic (or indeed commercial) value of the archival material itself? Or should it be based on the principle that a group of researchers wants to use the material in research now, and can do so more easily if it is digital (for example, by bringing together partners in Asia, Africa or the Middle East with SOAS scholars)?

Whatever the solution, we need to move forward quickly. Given the nature of the SOAS special collections, and our research connections around the world, perhaps no institution in the UK has a more pressing case for its archives to be openly available and fully discoverable. And if we succeed, there is real potential for more ‘global voices’ to be heard, both in the production and in the analysis of our collections.