

Sobremesa(tech) - AI Neutrality Illusion
Type yourself into a search bar. What comes back?
For most people, the answer feels unremarkable. Neutral, even. Just results. Just data. Just the machine doing what machines do.
But search engines, recommendation systems, and hiring algorithms don't just reflect the world as it is; they amplify it. They encode who gets seen, who gets erased, who gets flagged as a risk, and who gets handed an opportunity. And they do it quietly, at scale, with the authority of something that feels like objectivity.
The machine learned from us. And we have never been neutral. The question isn't whether bias exists in AI (it does, and the evidence is overwhelming). The question is why we keep calling it a glitch instead of a design choice.
Foucault showed us that neutrality is just the ideology that won. De Beauvoir told us who gets to be the default — and who gets cast as the exception. Noble brought it into the present tense: when systems of oppression get automated, they don't disappear. They scale.
To guide that conversation, we're joined by Christina Stathopoulos, Data & AI Evangelist. She has spent years helping Fortune 500 organisations move from AI experimentation to responsible, scalable execution. She doesn't just ask whether AI works; she asks whether it works for everyone.
Along the way, we'll ask:
When a search engine buries you, is that a bug or a feature?
When we "de-bias" an algorithm, whose worldview replaces it?
Can a system trained on the past ever build a fairer future?
🍷 The ticket includes a drink and some snacks.
💡 Refund policy: Refunds are not available. If you can no longer attend, you're welcome to sell your ticket in the Sobremesa group or to a friend; just send us a quick message at +34691501916 to let us know.