
Delegating Ethical Decisions to AI: The Mathematical Limits

Photo by Enric Moreu on Unsplash

Machines could be considered more ‘efficient’ versions of ourselves: they are faster, cleaner and, importantly, objective. They do not get tired or emotional, and they do not lie, at least not on purpose. Delegating moral decision-making to artificial intelligence may therefore seem appealingly unbiased, useful for issues such as criminal sentencing or deciding who qualifies for a loan, on the assumption that the machine will be fair. This raises the question, however: can machines make hard ethical decisions the way humans can? Fairness is subjective, so often it cannot be agreed upon, and attempting to encode fairness in logic may be an impossible task in itself.


 

Elad Uzan’s essay on Gödel’s theorem sparks an important discussion: is morality something that should be taken out of human control? The world runs on systems like religion, corporate values and performance frameworks – all offer rules that codify what is right. AI is the latest expression of the human desire for systems, a formal calculator of “ethical” outcomes. However, every formal system claiming ‘completeness’ sits on fragile ground. Nearly a century ago, Gödel proved that any consistent system powerful enough to contain arithmetic cannot also be complete – there will always be statements the system cannot prove that are true nonetheless. Mathematicians call this the incompleteness theorem; philosophers consider it a mirror.

 

There is an illusion of moral certainty. Machines are asked to act ‘morally’ by being fed ethical instructions such as ‘maximise well-being’ or ‘avoid harm’, and allowed to deduce the consequences. This sounds straightforward until the rules collide: when saving five lives requires the death of one, or when protecting privacy risks public safety. These collisions mark the system’s limits. It cannot see beyond its own logic; it can apply its rules, but it cannot question them. This is where humanity differs: we can step outside regimented rules, revise our moral positions and hold contradictory ideas in order to learn. That may be inconsistent, but it is the essence of morality.


When moral decisions are delegated to artificial intelligence, it is tempting to surrender one’s own ethical agency; it might be ‘easier’ to live without the burden of choice, but would it be human? No matter how sophisticated the machine, it will never have access to the whole of moral truth, bound as it is by the parameters of its own logic, even with all of the data in the world at its disposal. The question is not whether the machine is ‘evil’, but whether humanity will lean on it too heavily and come to believe that every decision made by AI is the ‘right’ one. The more confidence placed in artificial intelligence, the easier it becomes to abdicate responsibility: not simply using AI as a tool to think with, but using machines to avoid thinking altogether.


Gödel liberated mathematics by proving that completeness is unattainable, revealing that truth will always exceed the system trying to capture it. AI should be approached with this in mind: not as a prophet capable of discerning right and wrong, but as a reminder of our own unfinishedness. The limits of the machine revive the conversation about ethics – why we cannot always explain and justify certain moments, or recreate a feeling; these things cannot be replicated by a computer. Perhaps the point is not to make AI ‘perfect’, but to make people aware that perfection is not the target.

 


There are many things the machine cannot teach. It does not wake up in the middle of the night, anxious about whether it did the right thing. It cannot feel doubt, guilt or, importantly, the niggling of hindsight – the elements that make people grow in knowledge and wisdom. Maybe those moments of moral discomfort are exactly the recipe for an ethical being; without them, we do not learn. Outsourcing doubt to systems that cannot feel it diminishes not only a person’s moral depth, but their ability to develop wisdom.

In an ideal world, machines would be moral, but the question is not whether they can think ethically; it is whether people still can. Depth psychology would suggest that the elements of being human that defy logic – irrationality, the shadow self, contradiction – are the unprovable truths of the human mind. These traits can be projected onto machines but cannot be accurately reflected back, for there is no accuracy in the inconsistency of a person, the evolving, self-questioning organism. The age of AI challenges humanity to stay in relationship with the unconscious and its moral dissonance: to use the tension as a transformative tool rather than something to offload, to live in dialogue with the shadow, not delete it from the dataset.


For the full essay published by Aeon, 2025, find the link here:



Follow The Heretic for more questions about the true functions and practicality of AI in our daily lives.

 
 
 



All rights reserved by Heresy Consulting Ltd 2023. Copyright is either owned by or licensed to The Heretic, or permitted by the original copyright holder. Reproduction in whole or in part without written permission is strictly prohibited. Heresy Consulting Ltd recognises all copyright contained in this issue and we have made every effort to seek permission and to acknowledge the copyright holder. The Heretic tries to ensure that all information is correct at the time of publishing but cannot be held responsible for any errors or omissions. The views expressed by authors are not necessarily those of the publisher. Registered in England and Wales No. 8528304. Registered Office: The Ashridge Business Centre, 121 High St, Berkhamsted, Herts, HP4 2DJ
