ORBIT’s conference in September 2018 brought together voices from academia, technology, data analytics, policymaking, education and industry to discuss how to benefit from tech and ICT while minimising harms. Lord Tim Clement-Jones opened on an optimistic note with a roundup of the House of Lords report on AI (published 2018), which makes a strong case for the UK leading the way in both AI and its responsible use. The conference then settled in to listen to the experts.
Computers are really quite scary now
Did you know that if you describe a picture to a computer, it can create it? Nor did I, but that’s what Dr Abigail Sellen of Microsoft Research showed us. Before the invention of GANs – generative adversarial networks (I do not pretend to understand what those are) – a computer might be able to describe a picture. But if you described something and asked the computer to create an image, it would not have been able to. Now it can – and Dr Sellen stressed that the computer is not just going away and finding a picture from the internet, it is creating photorealistic images – from scratch – of landscapes, things and people that do not exist in reality. For examples of this, see Dr Sellen’s talk on the conference page.
Nobody agrees on what we mean by artificial intelligence
The highly regarded Elsevier ‘special reports’ are often benchmarks for a particular topic (see here for their report on Gender in Research) – Elsevier performs an annual ‘deep dive’ into the wealth of data available through their various systems, and this year they have focused on artificial intelligence and ethics. One surprising result from this work has been the wide disparity in what is meant by the term ‘artificial intelligence’ across sectors such as the media, academia, teaching and policymaking. As those working at the boundaries of disciplines will know, even apparently simple terms such as ‘risk’ can be misinterpreted across those boundaries – this can be especially problematic in multi-disciplinary areas such as AI and ethics.
Ethics could be the new green
Consumers can vote with their feet. Virginia Dignum’s commonsensical evaluation of the state of development of AI drew parallels with other businesses that employ responsible practices – for example the way in which free range eggs have eclipsed battery-farmed eggs. We can build in certification and businesses will be able to differentiate themselves based on their approach to ethics and responsibility. Regulation and oversight can drive the development of better products in the same kind of way that regulation on car emissions led to more efficient vehicles.
Will teachers be replaced by AI?
A topic that became quite heated in the conference hall was whether teachers might be replaced by tailored AI ‘apps’ that could assess how a pupil’s learning is progressing and create a personalised programme that supported their strengths while improving on their weaknesses. Sir Anthony Seldon, vice-chancellor of Buckingham University, suggested that AI could do away with teaching issues caused by personality clashes and ensure that all children have access to identical, neutral teaching. This was strongly challenged from the floor, as a conference attendee pointed out that he had gone into his particular field after being inspired by a teacher. This story would doubtless be replicated elsewhere – but so would the flipside: that children can be put off a topic for life after a poor teaching experience. The debate clearly has some way to run.
Timing is critical
The faster technology moves, the less time we have to think about, consider and respond to each development before the next one comes along. The dominoes fall faster and faster until we can be caught up in an avalanche. Neil Viner, Development Director of EPSRC, pointed out that sometimes you have to wait until everybody is ready to change – to change the way they work and the way they think – and from the number of conferences, seminars and other events around ethics and responsible research, it would appear that the tech community has reached this point. It was an optimistic note to end on!