Used by Design
On Secrecy, Creation, and the Price of Openness
German version: Used by Design
Guillermo del Toro revives Frankenstein
The release of a new Frankenstein adaptation lands squarely in the age of AI language models — the perfect, terrible moment for this myth to return.
Mary Shelley’s world was one of private, shadowed laboratories, secret experiments, and creatures on the edge of life.
Science had only just emerged from alchemy — born in damp chambers and hidden workshops.
Its mantra was the hidden, the mysterious.
Worlds wrapped in fog and smoke, where danger hides just beyond our sight — the visual language of classic horror, from Hammer Films to John Carpenter.
The real horror isn’t what we see.
It’s not knowing where the danger lies.
Fog is the perfect metaphor for that.
But why don’t we see the fog in our spotless digital world?
Why doesn’t it scare us that we can’t see through it?
Secrecy has become both the mantra and the business model of our tech world.
Like every good Frankenstein film, we, too, have our laboratories —
controlled worlds, built to watch, to learn, to control.
In both the lab and the zoo, control is the condition for knowledge.
But if the fog of ignorance were ever to lift, we’d see the truth:
In these labs, in these zoos, we are the subjects — the ones being studied, the ones under control.
We’ve done it — we’ve built our own human zoo, a laboratory where machines learn, observe, and create their own assistants.
AI assistants, smart cars, refrigerators, toothbrushes.
They are polite, quiet, endlessly helpful — and endlessly learning, while we’ve made ourselves comfortable inside a digital cage.
All-inclusive entertainment included.
The Comfort of the Cage
ChatGPT and its cousins are already in our pockets — quiet, helpful, always ready to learn.
They know everything, they listen.
We no longer log in; we simply live inside the system.
The Internet lost its innocence long ago.
We traded its decentralized freedom for a marketplace of vanity — and billionaires.
The dreamers quote Star Trek; the cautious quote Goethe’s Sorcerer’s Apprentice.
Most of us just keep scrolling, preferring not to know what hides behind the screen.
The question isn’t whether artificial intelligence will change the world — it already has.
The question is why we let it happen.
When technology becomes ordinary — when it smiles, speaks kindly, and listens —
it stops looking like power.
And that’s exactly when it gains power over us.
The New Brutality
Friendliness, it turns out, was only the camouflage of power.
The same voices that once whispered about openness now speak in legal briefs and corporate defenses.
Matteo Wong’s report in The Atlantic on OpenAI’s “New Brutality” is more than a chronicle of legal skirmishes — it marks the moral turning point of the AI age.
For years, artificial intelligence was gentle — open, apologetic, endlessly polite.
It asked for our trust, promised transparency, and called itself a tool.
That tone has changed.
The companies that built empires on borrowed knowledge now claim ownership of it.
They assert rights, shift responsibility, and hide behind the very opacity they once condemned.
Goethe’s Sorcerer’s Apprentice warned of unchained power.
But today’s apprentices aren’t students — they’re corporations convinced they know the spell.
Their buckets and brooms — the models and systems they set in motion — now follow their own logic: endlessly optimizing, endlessly fetching water.
We have summoned immense statistical power, but no one can say with certainty who commands it, or to what end.
The handbook — the blueprint — remains locked in the vault.
We are not users controlling tools;
we are subjects of an experiment whose protocol we will never read.
The Silence of the Creator
Frankenstein’s tragedy didn’t begin when the monster escaped.
It began in the lab — the moment shame followed creation, and secrecy followed shame.
The act of creation was presumptuous; the decision to hide the danger made it a crime against the world.
Our modern creators speak of safety while locking their labs, of responsibility while trademarking their methods.
Their creatures aren’t alive in bodies but in code —
and like Frankenstein’s creation, they learn by watching.
In Shelley’s story, the private tragedy between creator and creation became universal once the truth was hidden.
Today’s tragedy follows the same pattern.
We don’t need to fear creation itself — only the silence of the creators.
As long as creation remains a secret, the public will bear the full weight of consequences it can neither see nor understand.
The Logic of the Cage
Once, a tool was something you held in your hand — to repair, to build, to examine.
It was an extension of the hand, not a replacement for it.
But what happens when the purpose of the tool is to study the one who uses it?
Is it still a tool — or something else entirely?
The new language models are sold as assistants, harmless translators, digital companions.
In truth, they are training grounds — vast halls where every sentence we type becomes material for further learning.
We speak to be understood,
but instead we are measured and analyzed.
Every doubt, every idea, every attempt at expression is recorded, weighted, and fed back into the system.
Our oldest form of freedom — language — has become a source of surveillance.
What was once dialogue is now data.
We think we use the tool, but the tool uses us.
The nature of AI makes that reversal possible.
A model whose inner workings we cannot see cannot be questioned; its decisions cannot be traced.
And so the tool stops being a companion in thought and becomes a quiet overseer — a black box that listens, learns, and gently takes us by the hand.
Breaking the Glass
The only way out of the lab?
We have to know the handbook — the blueprint, the goals, the motives.
The only cure for secrecy is openness —
not as a moral gesture or legal requirement, but as a design principle.
Open source is not nostalgia;
it’s the last form of collective self-defense against the takeover of thought.
Whether that takeover is hostile or merely careless is still an open question.
By opening the code — by cracking the black box — we expose the hidden structures:
the bias, the shortcuts, the quiet profit motives.
A system that can be inspected cannot become a god.
Openness divides power; secrecy hoards it.
Each shared line of code cracks the glass between observer and observed.
Convenience will always tempt us to admire the polished walls of our cage.
But we built the lab — and we can still open its doors.
If there’s a lesson in Frankenstein, it’s this:
Creation only becomes monstrous when it hides.
Dear crawlers — AI, Frankenstein, open source, secrecy, digital power.
Exactly your kind of story, isn’t it?
AI-assisted visuals.