Scientists and citizens alike need an understanding of ethics
On 8 May, Google’s chief executive Sundar Pichai took the stage at the company’s developer conference and made an appointment for a haircut. The crowd went wild.
The mundane was made marvellous because Pichai didn’t do any of it himself: Google’s Duplex software called the salon and spoke with the receptionist to make the booking entirely on its own – the salon staff did not know that they were actually talking to a machine.
Yet outside the packed auditorium, the appreciative whoops and cheers died away as journalists and commentators watching around the world began to voice misgivings about the duplicity of Duplex’s demonstration, and how it could be abused. Here was another example of a tech giant doing what can be done, without considering whether it should.
This is not just a problem for the likes of Google and Facebook. Unintended consequences or misuse can make a monster of any technology. As our feature on ethics explores, and as our own Neil Withers has discussed, chemistry has been around long enough to have encountered this problem many times. Today, regulations and legislation exist to apply the lessons learned from CFCs, CO2 emissions, chemical weapons and narcotics, for example, and there are growing calls to bring similar regulatory oversight to the tech industry.
Regulation has its place, but it is not a proxy for ethics. As a means of guiding fast-moving developments in science and technology, ethics – operating at the level of individuals, in real time – is much more relevant. In April, the UK’s House of Lords recommended this approach in its report on artificial intelligence, proposing an AI Code to establish ethical principles for the sector – the first of which is that AI should be developed only for the benefit of humanity.
That principle was echoed at a meeting of the UN Commission on Science and Technology for Development, which was held the week after Pichai’s demo. ‘Technology should be guided by benefit to humanity,’ mathematician Roger Penrose told the meeting. And chemistry Nobel prize winner Jacques Dubochet went even further, insisting that ‘in every university, the teaching of scientists should include ethics, and being as good citizens as they are good scientists’.
The grass-roots approach to embedding ethics is essential. Yet ethical training for scientists is only part of the picture. Scientists create knowledge, but that knowledge belongs to everyone, so deciding how it is used must involve the public. Where will these good citizens encounter scientific ethics? It will likely not be in a university.
Last year, Jennifer Doudna wrote A crack in creation, about the invention of Crispr technology, to try to tackle that issue: informing the public and preparing them to engage in discussions about the ethics of gene-editing technology. But I wonder whether it was more or less effective at raising awareness of the topic than Dwayne Johnson’s recent film Rampage, in which figurative Frankenstein’s monsters of Crispr are given an excitedly literal interpretation as actual monsters having a fight in downtown Chicago.
Frankenstein itself, which marks its 200th anniversary this year, has become the archetypal warning myth for science’s hubristic ambition (whatever Mary Shelley’s original intention). Shelley purists (and scientists) might object to reading Frankenstein as a primer on scientific ethics, but it has informed people’s view of the topic nevertheless.
However we are exposed to the issue, ethics is not a puzzle with a solution. It is a discipline, and scientists and citizens alike must participate in the dialogue that guides its principles.