In The AI Mirror: How to Reclaim Our Humanity in the Age of Machine Thinking, Shannon Vallor argues that artificial intelligence acts as a "mirror" reflecting humanity’s virtues, flaws, and ethical dilemmas. AI does not simply change the world externally, she notes; it also transforms our moral character from within by reshaping how we relate to ourselves, to each other, and to society.
AI, in its design and application, mirrors human values, magnifying both the strengths and the shortcomings of those who create and use it. She emphasizes that AI development reflects the priorities and biases embedded in human culture, and that ethical vigilance is needed to ensure these systems align with values such as fairness, compassion, and justice.
Vallor further explores how AI challenges our existing moral frameworks and pushes us to redefine what it means to be human. As AI takes on roles that demand judgment and decision-making, humanity must cultivate new virtues to guide our coexistence with these intelligent systems. She calls these "technomoral virtues": qualities such as humility, empathy, and critical thinking that enable people to thrive alongside AI. Vallor stresses that the choices we make now about AI design and use will shape not just technological outcomes but also the moral fabric of society, and she urges us to consciously develop AI in ways that promote humanity's collective flourishing.
In her concluding chapter, Vallor argues that by using our “powers of autofabrication” wisely, we can “enable and enlarge our freedom and compassion for ourselves and for others.”
We excerpt a key passage from her conclusion here:
"We are indeed machines of a biological sort, like all living things. But we are among those rare machines who make ourselves. We choose every day whether to remain as we are or become something different. In our lives, and in our societies, we even carry out from time to time that union of imagination and expressive action that Nick Cave described as "self-murder" —not an act of suicide, but its opposite. Instead of suicide's hopeless refusal of life and resignation of its power to create new meaning, we sometimes embrace and throw ourselves into that power. We make ourselves and our societies into something that never existed in that shape before, and thereby bring into the world a new value, a new image of the good to be tested by life and chosen again—or not. Perhaps we are not the only machines who do this, but we are the only ones we know.
"Our technologies do not oppose or negate this freedom. They are themselves expressions of it. Human nature isn't opposed to artificiality--the artificial manifests our open nature and makes it concrete. We remade ourselves and the built world with ideas and values woven into things: the till and the axe, the coin and the pen, the wheel and the printing press, the gunpowder and the penicillin, the circuit and the transistor. Technologies are engines of autofabrication. But we sometimes turn those engines against the very freedom that makes them possible. When we use our powers of autofabrication unwisely, we destroy and diminish ourselves and one another. When we use them well, we enable and enlarge our freedom and compassion for ourselves and for others. Most of the time, the result is some mix of the two. But while we can always choose to refuse badly made artifacts and their destructive uses, we cannot refuse technology itself without refusing ourselves."
"Technologies aren't neutral, but they are plastic. AI mirrors driven by the domininant incentives, goals, values, and virtues of our current economic order will function like tractor beams pulling us deeper into a dead-end past. It would be nice if we could just reverse their polarity and ask them to pull us into a sustainable, humane future instead. But a mirror can't know what we ought to sustain, or what kind of future is worthy of being called humane."
"We have to set those goals for ourselves and hitch our powerful Al engines to them, with our own communities in the driver's seat. We also have to try to coordinate the operation and steering of those engines, across sectors, nations, and continents. Only then will the vulnerable and deeply interdependent human family be able to arrive at that future safely together, with democratic norms and considerations of justice guiding us, however imperfectly and inconsistently. As Ursula LeGuin said, the struggle for human freedom and justice is always a "war without end."
"Here's the thing: we aren't tied to our seats. We can grab the wheel. Many of us have been dozing off in the back, lost in what philosopher of technology Langdon Winner calls "technological somnambulism," where we "willingly sleepwalk through the process of reconstituting the conditions of human existence." But there's still time to wake up."
"The true soul of technology is not efficiency but generosity; it is the gift of a future. To perform the necessary services for others to survive; to shield them from harm; to repair and heal; to educate and train; to feed, nurture, and comfort. Al can be remade for a humane future, reconceived as a tool for these ends, measured and valued only to the extent that it can be proven to serve them. If we choose. If we demand."