Meeting of the Parliament 13 November 2019
The location of the tech companies is only one of the problems. The fact that some of them appear to be run by sociopathic billionaires is a much deeper problem than their mere location.
However, the minister is right to recognise that we need to reach out to others around the world. As I suggested to Mr Lockhart, instead of so vociferously resisting and trying to block international regulation on autonomous lethal weapons, the UK Government could be reaching out and trying to find out how we could achieve that regulation. Most countries around the world want regulation and legislation to deliver a pre-emptive ban on lethal autonomous weapon systems; the UK is one of a small number of countries that are resisting that move. There is scope for international work and for deliberative work; the use of citizens' assemblies might be one way of engaging the public in the wider discussion.
The UK Government’s centre for data ethics and innovation has been referred to, and I welcome the work that it has done; I have looked at some of the papers that it has published so far. I might have time in my closing speech to go into some of the positive ideas and opportunities that it has identified, and into quite how far the centre is from identifying solutions to problems that it has recognised.
There are already elements of an ethical framework in place. The general data protection regulation includes rights for the data subject on automated individual decision making: people have the right not to be subject to decisions that have legal effects on them and that are based solely on automated processing, including profiling. However, we have no way of knowing whether that is happening to us, and we do not have the tools to protect ourselves from it. As others have mentioned, the Close the Gap paper that has been circulated to members introduces arguments about the social bias that could be inherent in those systems.
In terms of economic justice, we have to ask who owns the tech. In terms of security, we have to ask who watches the watchers and where the human agency is. Power has to be accountable, but we do not yet have the tools to make the power that is embedded in AI truly economically or democratically accountable.
I move amendment S5M-19822.1, to insert after “benefit of the public”:
“; recognises that both Scotland and the wider world are yet to meet these preconditions and, in particular, that the development of an ethical framework requires significant debate as well as robust enforcement mechanisms, which are currently absent; further recognises that the concerns that relate to the use of AI include social bias in automated systems, unjust distribution of economic benefits, the integrity of democratic systems, safety and ethics in automated defence and security systems, privacy and the lack of human agency in situations requiring whistleblowing or challenge to corporate interests.”
15:20