Voorwerp has been reduced to its essence, and it will be released. It exists under a nonprofit structure. Open source. MIT licensed. Thousands of hours were invested. Nothing is expected in return.
I've been working on this for years, nearly a decade, and it has been a wild ride. I remember writing parts of it in Berlin, in the offices at SoundCloud. I stepped away from it at times, then came back. During COVID, for the first time, I gave it my full attention. Pieces of its logic were written across countries. I had breakthroughs in Costa Rica and California, and the direction of the project truly matured in Australia. It is, in every sense, an international one-man project.
"Voorwerp" means "object" in Dutch and that's exactly what it has been: a prototype of whatever I believed was right at the time. A container for ideas, constantly reshaped. More than anything, it has been an educational process. I poured an enormous amount of research into it, learning, testing, discarding, refining, figuring out what works, and what doesn't.
I understood early, in the Berlin tech scene, long before it became common language, that systems built to capture attention would inevitably reshape behavior at scale. It was already visible in the incentives, in the design patterns, in the direction everything was moving. At the same time, there was a growing discomfort around private data: how easily it was being collected, aggregated, and repurposed. It became clear that once human behavior could be measured precisely, it could also be influenced.
Then came Cambridge Analytica. For a brief moment, the mechanism was exposed: personal data turned into psychological profiles, and those profiles used to influence beliefs, emotions, and decisions at scale. It wasn't just advertising anymore. It was the early form of industrialized psychological influence.
And yet, almost nothing changed. Not because it wasn't serious, but because its implications were too abstract for most people to fully grasp. The idea that these systems could operate as a form of psychological infrastructure, quietly shaping perception and behavior, never fully settled. So it passed. The outrage faded. The systems remained. The incentives remained. And the same mechanisms kept evolving, less visible, more embedded, more normalized.
What matters now is not scale for its own sake, but resolution: taking what those years taught me, what they allowed me to observe, test, and understand, and condensing it into a final, smaller project that still carries that knowledge forward.
At the most reasonable scale, this is what can be done: a healthier alternative to the social media cartel that mediates how we see the world and each other.
This is a rejection of the tech industry as it exists today. Every pixel is deliberate. Engineered with rigor. Treated as both system and art.
This is my fancy way of saying fuck you. A complex, engineered fuck you, built with rigor. The most effort I will ever put into a fuck you. And this fuck you serves the people: it is something they can actually use.
The 1% maintain their position through the architecture of discourse itself, capturing attention, redirecting conflict, and rendering structure invisible, until we turn against each other and lose sight of what shapes us.
This does not chase money. It does not serve their incentives.
Public discourse does not belong to corporations. It belongs to people. We're taking it back.