At the beginning of October 2018, Digital Agenda hosted its ‘Power & Responsibility’ summit. The event consisted of enlightening and stimulating talks on the topics of regulation, trust and the unintended consequences of technology.
A few very important points were raised. First of all, there is a need to consider and understand the impact of technologies on everyday people. As Adam Thilthorpe (director of external affairs at BCS) highlighted, whenever we use Alexa or Siri, the interaction with the technology disrupts the normal patterns of communication in our homes. This is something that parents should consider when raising their children in an environment surrounded by AI. In this light, we can start asking ourselves whether it will still be appropriate in the future to teach kids to say ‘please’ and ‘thank you’.
When considering how technologies affect everyday people, we also can’t avoid talking about social media. These platforms are so widespread and popular that over the last decade they have become extremely pervasive. The recent and much-cited Cambridge Analytica scandal suddenly emphasised the fact that social media can be used as a tool to harm people. Even though this might now seem like an obvious point, we have to remind ourselves that until a few years (if not just months) ago, the possible harm coming from these platforms was not even considered.
A question that then arises from this discourse is: how can we make sure we address the challenges that technology is bringing? The answer should be that we can do so through global regulations. However, as Chi Onwurah MP highlighted, most of the time we wait for people to die before creating new regulations. Consider the Grenfell Tower disaster in London last year. This has a huge impact on people’s lives, and it is the duty of governments to look after people’s wellbeing. Politicians are now waking up to the challenges of technology, but they don’t yet know how to regulate it. One example is algorithms: the outcomes of algorithms are regulated, but the companies that use them face no clear regulations. Companies should be responsible for the outcomes of their algorithms, but this is not currently the case. How many more scandals will we need to go through before we get effective global regulations? We must act now: NOW is the essential time to develop regulations and to make sure technology empowers consumers and economies rather than harming them.
A final point that I would like to raise is trust: who do we trust when it comes to technology? Mark McGinn from Edelman said that today we are living in a world of distrust. We don’t trust the information we are given; people are seeking the truth and are concerned about whether or not they can access it. We are aware that fake news and disinformation can be used as a weapon against us, and there is a general concern about transparency. A good example here is the (mis)use of social media and fake news in the 2016 American elections.
All these points can make us reflect on how we are moving from ‘tech hype’ to ‘tech fear’, in Eva Appelbaum and Jess Tyrell’s words. In this digital age, however, we must move forward to embrace the power that technology brings, along with its big responsibilities. Policymakers and society must act together to protect people. This brings us back to the concept of Responsible Research and Innovation, which focuses on harnessing the benefits of a technology while taking its risks into account. Taking those risks into account does not mean stopping the development of new technologies: it means reflecting further on the effects of technologies on everyday people (to link back to Adam Thilthorpe) and making sure we create regulations and guidance that help society and the economy grow in a healthy way.