The promise of the fourth industrial revolution has much to answer for when it comes to education. 5G, the internet of things, autonomous vehicles, self-building houses, virtual reality, AI and big data hit the headlines daily, and governments are understandably keen to ensure that their workforces are future-ready.
In last week’s budget Philip Hammond, the UK Chancellor of the Exchequer, announced a £600-a-head inducement to schools to persuade more pupils to take maths at A level. The money will be available from 2019 and comes from a £177 million fund designed to future-proof the workforce. For Hammond, maths is clearly the answer to our future problems, whatever they may be.
Other governments are issuing their own imperatives to future-ready their students, rolling out coding classes for all or introducing one-to-one digital devices. And yet a recent article by Robert Wolcott, Clinical Professor of Innovation and Entrepreneurship at Northwestern University, has begun to question the emperor’s new clothes.
For Wolcott, coding is about as practically useful as Ancient Greek (I write this, by the way, as a teacher of Latin and Ancient Greek). “Remaining relevant will be a moving target as computer languages and programming environments arise, evolve, and in some cases die.” By the time our sons and daughters leave school, C++ and Python will be dead languages — not simply because they have been replaced by new coding dialects, but because “the need for humans to code will gradually disappear for all but the most specialized situations”.
One of the key industries of the future is AI, and it seems likely that humans will soon be able to describe in natural spoken or written language what they would like their computers to accomplish. The coding language itself will not be important: an algorithm will convert the human instruction into code on their behalf. What students will ultimately need to learn is how to define clearly and communicate precisely what they want their computers to do.
The same argument has been levelled against the introduction of specific digital devices into schools as a way to prepare students for the future. Just as Wolcott predicts that learning coding languages will become redundant as coding is increasingly implemented, and even planned, by AI systems, so too will current devices become redundant.
Learning to program on my Commodore 64 when I was 10 years old did nothing to prepare me for today’s world, in much the same way that learning to word process on my MacBook Air will have little bearing on my use of a digital device in 2037. As a consequence, we need to think increasingly carefully about what schools can usefully do today to prepare students for tomorrow.
In its Future of Jobs report, the World Economic Forum clearly defined what it believes are the top 10 skills students will need to thrive tomorrow — and they are all reassuringly human.