The Machine Question, Gunkel
Last updated on 03 Sep 2022.
Review, notes, and reflections on The Machine Question: Critical Perspectives on AI, Robots, and Ethics by David Gunkel.
Introduction
- Who deserves ethical consideration?
- Can machines be held responsible for actions that affect human beings?
- Modern philosophy associates the animal with the machine; Descartes and the animal-machine.
- Slavoj Žižek - there are not only true and false solutions; there are also false questions. Philosophy must make us understand how the very framing of a problem shapes or forecloses its solution.
Moral Agency
- Something is an agent iff it is capable of performing an action.
- Daniel Dennett - intentionality. We must be able to ascribe beliefs, desires, and other intentional states.
- The standard account of moral agency requires an exclusive cut.
- Instrumentalist theory of technology - technologies are merely tools serving their users. Technology is therefore neutral, and responsibility lies with the designer or the user.
- “Computer ethics” is always anthropocentric.
- Assigning responsibility to tools may allow humans to deflect blame onto their tools.
- The concept of “human” is not immutable.
- Anthropocentrism is exclusive and even violent.
- Autonomous technology - technical devices which stand directly against the instrumentalist perspective.
- We can attempt to rework the concept of moral agency so that it is not specious. What is a person? How should a person be represented? “Person” designates the community of moral agents.
- Moral statuses rest on ontological determinations.
- “phenomenal consciousness”
- We can only genuinely engage in observational/phenomenal epistemic claims.
- Descartes - the linking of the animal to the machine.
- One ethical perspective - “only machines can be ethical / rational / free”
- Moral reasoning requires rationality of decision-making; machines are the ‘best’ at this.
- To what extent are machines extensions of aggregate human rationality?
- What constitutes a person? What is personhood? Perhaps it is not only undefinable, but defined by its undefinability.
- Moral agency is ultimately decided by interested and privileged powers. The ethical landscape is never precolonial.
- The currently formulated understanding of moral agency leads to the Hegelian “spurious infinite”.
- The debate over the moral agency of machines seems to enter a dialectical stalemate
- Barbara Johnson - we can both reserve and protect the concept of moral agency from computers, while also recognizing that they do have some legitimate claim to moral behavior.
- Computers “do not have mental states”; they do not have “freedom” or “autonomy”.
- Emphasis on the role of intentionality; machines should not simply behave “from necessity”.
- Against a creationism: machines cannot be imbued with the intentionality of their designers.
- Gunkel: this perspective is still nostalgic for human exceptionalism.
- Other thinkers respond by redefining agency in more sophisticated terms.
- As long as moral agency is linked with personhood, machines will never be moral subjects and continue to be mere tools.
- Provocateur response - “we are not moral agents but robots are” (Nadeau)
- Mindless morality - a morality which sidesteps unresolved issues in the philosophy of mind: autonomy, intentionality, responsibility.
- Revised criteria for agency - interactivity, autonomy, adaptability.
- To skirt philosophical issues, some engineers employ a functionalist morality - Asimov’s fictional laws of robotics, for example. Machine ethics considers the consequences of machines’ actions for human users (flipping the script on computer ethics).
- Is ethics computable? Moral philosophy is arranged along an arguably computational model, one which is rationalistic.
- Machine ethics shifts the locus of moral action from cause to effect: it is ‘purely’ phenomenological rather than deontological. It is therefore thoroughly anthropocentric, functionally a slave ethics, and returns to an instrumentalist politics.
- Mark Coeckelbergh - the creation of psychopathic programs: robots with no capacity for empathy or feeling, but which remain rational.
- Moral philosophy has generally been concerned with the agent. It must make exclusionary decisions; the community of the excluded expands into a spurious infinite.
- We are slowly reconsidering the instrumentalist and anthropocentrist legacy.
Moral Patiency
- Patient-oriented ethics looks at the ways in which beings are constituted as other and how they ought to be treated, rather than at who or what acts.
- Moral patient - a term coined in response to “moral agent”.
- Moral agents are also, according to the ‘standard position’, moral patients. There is a reciprocity between agency and patiency.
Left off on page 100